The support vector machine learning using the second order cone programming

Author(s): R. Debnath ◽ M. Muramatsu ◽ H. Takahashi

2020 ◽ Vol 39 (3) ◽ pp. 4505-4513
Author(s): Guishan Dong ◽ Xuewen Mu

The support vector machine (SVM) is a classification approach in machine learning. A second-order cone optimization formulation of the soft-margin SVM can ensure that the misclassification rate of data points does not exceed a given value. In this paper, a novel second-order cone programming formulation is proposed for the soft-margin support vector machine. The novel formulation uses the l2-norm and two margin variables, one associated with each class, to maximize the margin. Two regularization parameters, α and β, are introduced to control the trade-off between maximizing the two margin variables. Numerical results illustrate that the proposed second-order cone programming formulation for the soft-margin SVM has better prediction performance and robustness than the other second-order cone programming SVM models used in this article for comparison.
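For background, a standard way to see the soft-margin SVM as a second-order cone program (this is the classical reduction, not the authors' novel two-margin-variable formulation with α and β) is to move the l2-norm of the weight vector into an epigraph variable t, which turns the quadratic objective into a linear one subject to a second-order cone constraint:

```latex
\min_{w,\,b,\,\xi,\,t}\quad t + C \sum_{i=1}^{m} \xi_i
\qquad \text{s.t.} \qquad
y_i\,(w^\top x_i + b) \ge 1 - \xi_i,\quad
\xi_i \ge 0,\quad
\|w\|_2 \le t .
```

Here the constraint \(\|w\|_2 \le t\) is exactly a second-order (Lorentz) cone membership condition, so the whole problem is an SOCP solvable by standard conic solvers.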


Quantum ◽ 2021 ◽ Vol 5 ◽ pp. 427
Author(s): Iordanis Kerenidis ◽ Anupam Prakash ◽ Dániel Szilágyi

We present a quantum interior-point method (IPM) for second-order cone programming (SOCP) that runs in time Õ(n√r ζκ/δ² log(1/ϵ)), where r is the rank and n the dimension of the SOCP, δ bounds the distance of intermediate solutions from the cone boundary, ζ is a parameter upper bounded by √n, and κ is an upper bound on the condition number of matrices arising in the classical IPM for SOCP. The algorithm takes as its input a suitable quantum description of an arbitrary SOCP and outputs a classical description of a δ-approximate ϵ-optimal solution of the given problem.

Furthermore, we perform numerical simulations to determine the values of the aforementioned parameters when solving the SOCP up to a fixed precision ϵ. We present experimental evidence that in this case our quantum algorithm exhibits a polynomial speedup over the best classical algorithms for solving general SOCPs, which run in time O(n^(ω+0.5)) (here, ω is the matrix multiplication exponent, with a value of roughly 2.37 in theory and up to 3 in practice). For the case of random SVM (support vector machine) instances of size O(n), the quantum algorithm scales as O(n^k), where the exponent k is estimated to be 2.59 using a least-squares power law fit. On the same family of random instances, the estimated scaling exponent for an external SOCP solver is 3.31, while that for a state-of-the-art SVM solver is 3.11.
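The scaling exponents quoted above (2.59, 3.31, 3.11) come from a least-squares power law fit: if runtime behaves as t ≈ c·n^k, then log t is linear in log n with slope k. A minimal sketch of that fit, with hypothetical synthetic runtimes in place of the paper's measured data:

```python
import math

def fit_power_law_exponent(sizes, times):
    """Least-squares fit of times ≈ c * sizes**k on a log-log scale.

    Returns the estimated exponent k, i.e. the slope of the ordinary
    least-squares regression of log(time) on log(size).
    """
    xs = [math.log(n) for n in sizes]
    ys = [math.log(t) for t in times]
    x_bar = sum(xs) / len(xs)
    y_bar = sum(ys) / len(ys)
    num = sum((x - x_bar) * (y - y_bar) for x, y in zip(xs, ys))
    den = sum((x - x_bar) ** 2 for x in xs)
    return num / den

# Hypothetical illustration: synthetic runtimes growing exactly as n**2.59,
# the exponent the abstract reports for the quantum algorithm on SVM instances.
sizes = [100, 200, 400, 800, 1600]
times = [1e-6 * n ** 2.59 for n in sizes]
print(round(fit_power_law_exponent(sizes, times), 2))  # 2.59
```

On exact power-law data the slope recovers the exponent exactly; on noisy measured runtimes it gives the least-squares estimate, which is how exponents like 2.59 versus 3.31 are compared across solvers.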

