soft margin
Recently Published Documents

TOTAL DOCUMENTS: 98 (FIVE YEARS: 6)
H-INDEX: 11 (FIVE YEARS: 1)

2021 ◽ Vol 2021 ◽ pp. 1-10
Author(s): Kang Zhao, Ling Xing, Honghai Wu

Most algorithms for assessing user credibility in social networks quantify user information and then compute a credibility measure by linear summation. This approach, however, ignores the aliasing of user credibility results along the linear-summation dimension, which lowers evaluation accuracy. To solve this problem, we propose a user credibility evaluation method based on a soft-margin support vector machine (SVM). The method transforms the evaluation from a linear-summation dimension to a plane-coordinate dimension, reducing the evaluation errors caused by user aliasing within the classification threshold interval. To quantize user information, a ladder assignment method processes the user's textual and numeric information, and an information-entropy weighting method assigns weights to the different feature items, reducing the errors caused by inconsistent orders of magnitude across different types of user information. Simulation results demonstrate the superiority of the proposed method in user credibility evaluation.
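A rough sketch of the pipeline this abstract describes might look as follows, with scikit-learn's soft-margin SVC standing in for the SVM and the standard entropy weight method standing in for the paper's information-entropy weight assignment; the feature matrix and labels are hypothetical, and the paper's exact ladder-assignment rules are not reproduced here.

```python
import numpy as np
from sklearn.svm import SVC

def entropy_weights(X):
    """Entropy weight method: feature columns whose normalized values
    are more concentrated (lower entropy) receive larger weights."""
    n = X.shape[0]
    P = X / X.sum(axis=0, keepdims=True)           # column-normalize to proportions
    with np.errstate(divide="ignore", invalid="ignore"):
        logs = np.where(P > 0, np.log(P), 0.0)
    e = -(P * logs).sum(axis=0) / np.log(n)        # entropy of each feature column
    d = 1.0 - e                                    # degree of diversification
    return d / d.sum()                             # normalized weights

# Hypothetical quantized user features (e.g. ladder-assigned scores),
# with labels 1 = credible, 0 = not credible.
rng = np.random.default_rng(1)
X = rng.uniform(0.1, 1.0, size=(200, 5))
y = (X @ np.array([0.4, 0.3, 0.1, 0.1, 0.1]) > 0.55).astype(int)

w = entropy_weights(X)
clf = SVC(kernel="linear", C=1.0)                  # soft-margin SVM
clf.fit(X * w, y)                                  # entropy-weighted features
print("weights:", np.round(w, 3), "accuracy:", clf.score(X * w, y))
```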



2021 ◽ pp. 107705
Author(s): Alfredo Marín, Luisa I. Martínez-Merino, Justo Puerto, Antonio M. Rodríguez-Chía


2021 ◽ Vol 219 ◽ pp. 106897
Author(s): Xinmin Tao, Wei Chen, Xiangke Li, Xiaohan Zhang, Yetong Li, ...


Author(s): Yingcheng Su, Yichao Wu, Zhenmao Li, Qiushan Guo, Ken Chen, ...


Author(s): Huajun Wang, Yuanhai Shao, Shenglong Zhou, Ce Zhang, Naihua Xiu


2020 ◽ Vol 2020 (1)
Author(s): Hidayat Ullah, Muhammad Adil Khan, Josip Pečarić

In the present article, we elaborate on the notion of obtaining bounds for the soft margin estimator of "Identification of Patient Zero in Static and Temporal Networks: Robustness and Limitations". To achieve these bounds, we utilize the concavity of the Gaussian weighting function and the well-known Jensen inequality. To acquire more general bounds for the soft margin estimator, we consider general functions defined on rectangles. We also use the behavior of the Jaccard similarity function to extract further bounds for the soft margin estimator.
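For context, the basic tool behind these bounds is Jensen's inequality for a concave function, shown here in its finite weighted form (a minimal textbook statement, not the paper's exact result):

```latex
% Jensen's inequality for a concave function \varphi and
% weights \lambda_i \ge 0 with \sum_i \lambda_i = 1:
\varphi\!\left(\sum_{i=1}^{n} \lambda_i x_i\right) \;\ge\; \sum_{i=1}^{n} \lambda_i\, \varphi(x_i)
```

Taking φ to be the Gaussian weighting function, on a range where it is concave, inequalities of this shape bound the weighted soft margin estimator.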



Quantum ◽ 2020 ◽ Vol 4 ◽ pp. 342
Author(s): Jonathan Allcock, Chang-Yu Hsieh

We propose a quantum algorithm for training nonlinear support vector machines (SVMs) for feature space learning, where classical input data is encoded in the amplitudes of quantum states. Based on the classical SVM-perf algorithm of Joachims (2006), our algorithm has a running time that scales linearly in the number of training examples m (up to polylogarithmic factors) and applies to the standard soft-margin ℓ1-SVM model. In contrast, while classical SVM-perf has demonstrated impressive performance on both linear and nonlinear SVMs, its efficiency is guaranteed only in certain cases: it achieves linear m scaling only for linear SVMs, where classification is performed in the original input data space, or for the special cases of low-rank or shift-invariant kernels. Similarly, previously proposed quantum algorithms either have super-linear scaling in m, or else apply to different SVM models, such as the hard-margin or least-squares ℓ2-SVM, which lack certain desirable properties of the soft-margin ℓ1-SVM model. We classically simulate our algorithm and give evidence that it can perform well in practice, and not only for asymptotically large data sets.
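For reference, the soft-margin ℓ1-SVM referred to throughout is the standard primal with slack variables penalized in the ℓ1 norm (textbook form; the paper's contribution concerns how this model is trained on quantum-encoded data, not the model itself):

```latex
% Soft-margin \ell_1-SVM primal over m training pairs (x_i, y_i),
% with slack variables \xi_i and penalty parameter C > 0:
\min_{w,\, b,\, \xi}\; \frac{1}{2}\lVert w \rVert^2 + C \sum_{i=1}^{m} \xi_i
\quad \text{s.t.}\quad y_i \left( w^{\top} x_i + b \right) \ge 1 - \xi_i,
\;\; \xi_i \ge 0,\;\; i = 1, \dots, m
```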



2020 ◽ Vol 39 (3) ◽ pp. 4505-4513
Author(s): Guishan Dong, Xuewen Mu

The support vector machine is a classification approach in machine learning. A second-order cone optimization formulation of the soft-margin support vector machine can ensure that the misclassification rate of data points does not exceed a given value. In this paper, a novel second-order cone programming formulation is proposed for the soft-margin support vector machine. The formulation uses the ℓ2-norm and two margin variables, one associated with each class, to maximize the margin. Two regularization parameters, α and β, are introduced to control the trade-off between the maximization of the margin variables. Numerical results illustrate that the proposed formulation has better prediction performance and robustness than the other second-order cone programming SVM models used in this article for comparison.
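As a point of reference, the classical soft-margin SVM can itself be posed and solved as a conic program. The sketch below uses the standard single-slack formulation (not the paper's two-margin-variable α/β model, whose details are not given here) with CVXPY, whose solvers handle the resulting second-order cone constraints; the toy data is hypothetical.

```python
import numpy as np
import cvxpy as cp

# Toy two-class data: stands in for any feature matrix X and labels y in {-1, +1}.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1.0, 0.6, size=(20, 2)),
               rng.normal(+1.0, 0.6, size=(20, 2))])
y = np.hstack([-np.ones(20), np.ones(20)])

n, d = X.shape
C = 1.0  # slack penalty controlling the margin/error trade-off

w = cp.Variable(d)
b = cp.Variable()
xi = cp.Variable(n, nonneg=True)  # slack variables

# Standard soft-margin objective: maximize the margin, penalize the slacks.
objective = cp.Minimize(0.5 * cp.sum_squares(w) + C * cp.sum(xi))
constraints = [cp.multiply(y, X @ w + b) >= 1 - xi]

cp.Problem(objective, constraints).solve()
print("training accuracy:", np.mean(np.sign(X @ w.value + b.value) == y))
```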


