Convergence Improvement of Active Set Training for Support Vector Regressors

Author(s): Shigeo Abe, Ryousuke Yabuwaki

2021, Vol 40 (1), pp. 1481-1494
Author(s): Geng Deng, Yaoguo Xie, Xindong Wang, Qiang Fu

Many classification problems contain shape information in their input features, such as monotonicity, convexity, and concavity. In this research, we propose a new classifier, called the Shape-Restricted Support Vector Machine (SR-SVM), which exploits component-wise shape information to enhance classification accuracy. A vast research literature exists on monotonic classification, covering monotonic and ordinal shapes; our proposed classifier extends this to convex and concave feature types, and to combinations of these types. While the standard SVM uses linear separating hyperplanes, our SR-SVM constructs non-parametric, nonlinear separating surfaces subject to component-wise shape restrictions. We formulate the SR-SVM classifier as a convex optimization problem and solve it using an active-set algorithm. The approach applies basis function expansions to the input and effectively reuses a standard SVM solver. We illustrate the methodology on simulated and real-world examples, and show that SR-SVM improves classification performance when additional shape information about the inputs is available.
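A minimal sketch of the component-wise shape-restriction idea follows, for a single monotonically increasing feature. The cumulative hinge basis max(0, x - t_k) is nondecreasing in x, so any nonnegative combination of these bases is nondecreasing as well; projecting the weights onto the nonnegative orthant therefore enforces the shape. The function names, knot placement, and the projected-subgradient solver are illustrative assumptions, not the paper's method, which solves the convex problem with an active-set algorithm on top of a standard SVM solver.

import numpy as np

def monotone_basis(x, knots):
    # Cumulative hinge basis: column k is max(0, x - t_k).
    # Any nonnegative combination of these columns is nondecreasing in x.
    return np.maximum(0.0, x[:, None] - knots[None, :])

def fit_shape_restricted_svm(x, y, knots, C=1.0, lr=0.01, epochs=500):
    # Toy projected-subgradient solver (hypothetical name) for a linear SVM
    # whose weights on the expanded basis are constrained to be nonnegative,
    # which restricts the decision score to be monotone in x. y is in {-1, +1}.
    B = monotone_basis(x, knots)              # n x k expanded design matrix
    w = np.zeros(B.shape[1])
    b = 0.0
    for _ in range(epochs):
        viol = y * (B @ w + b) < 1            # examples inside the margin
        # subgradient of 0.5*||w||^2 + C * sum of hinge losses
        gw = w - C * (y[viol, None] * B[viol]).sum(axis=0)
        gb = -C * y[viol].sum()
        w -= lr * gw
        b -= lr * gb
        w = np.maximum(w, 0.0)                # projection: the shape restriction
    return w, b

# Hypothetical usage: a noisy monotone rule in one feature.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 1.0, 200)
y = np.where(x + 0.1 * rng.normal(size=200) > 0.5, 1, -1)
w, b = fit_shape_restricted_svm(x, y, knots=np.linspace(0.0, 1.0, 10))

Convex or concave restrictions would be imposed analogously, by constraining second differences of the coefficients rather than their signs.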


2015, Vol 154, pp. 296-304
Author(s): Bing Li, Shiji Song, Wei Liu, Keyou You

2004, Vol 15 (2), pp. 268-275
Author(s): D.R. Musicant, A. Feinberg

Author(s): Bin Gu, Yingying Shan, Xiang Geng, Guansheng Zheng

Support vector machines have played an important role in machine learning over the last two decades. Traditional SVM solvers (e.g., LIBSVM) do not scale to the current big-data era. Recently, a state-of-the-art solver based on the asynchronous greedy coordinate descent (AsyGCD) algorithm was proposed. However, AsyGCD is still not scalable enough, and it is limited to binary classification. To address these issues, in this paper we propose an asynchronous accelerated greedy coordinate descent algorithm (AsyAGCD) for SVMs. Compared with AsyGCD, AsyAGCD has two advantages: 1) it is an accelerated version of AsyGCD because it uses an active set strategy; specifically, AsyAGCD converges much faster than AsyGCD during the second half of the iterations; 2) it handles more SVM formulations (both binary classification and regression SVMs) than AsyGCD. We compare the computational complexity of AsyGCD and AsyAGCD. Experimental results on a variety of datasets and learning applications confirm that AsyAGCD is much faster than existing SVM solvers (including AsyGCD).
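To make the ingredients concrete, here is a minimal single-threaded sketch of greedy coordinate descent on the dual of an L1-loss linear SVM, with a shrinking-style active set that drops coordinates pinned at their bounds. This is an assumption-laden simplification (synchronous, linear kernel, hypothetical function name), not the asynchronous AsyGCD/AsyAGCD algorithms of the paper.

import numpy as np

def greedy_cd_linear_svm(X, y, C=1.0, max_steps=5000, tol=1e-4):
    # Greedy coordinate descent on the L1-loss linear SVM dual:
    #   min_a 0.5*a'Qa - sum(a),  0 <= a_i <= C,  Q_ij = y_i*y_j*x_i'x_j.
    # A shrinking-style active set removes coordinates stuck at a bound
    # whose gradient points outward, so each greedy pick scans fewer entries.
    n, d = X.shape
    a = np.zeros(n)
    w = np.zeros(d)                               # maintain w = sum_i a_i y_i x_i
    qii = np.maximum((X * X).sum(axis=1), 1e-12)  # diagonal of Q
    active = np.arange(n)
    for _ in range(max_steps):
        G = y[active] * (X[active] @ w) - 1.0     # dual gradient on the active set
        PG = G.copy()                             # projected gradient
        PG[(a[active] <= 0.0) & (G > 0.0)] = 0.0
        PG[(a[active] >= C) & (G < 0.0)] = 0.0
        k = int(np.argmax(np.abs(PG)))            # greedy coordinate selection
        if abs(PG[k]) < tol:
            if len(active) == n:
                break                             # optimal on the full problem
            active = np.arange(n)                 # unshrink and verify
            continue
        i = active[k]
        old = a[i]
        a[i] = np.clip(old - G[k] / qii[i], 0.0, C)  # clipped Newton step
        w += (a[i] - old) * y[i] * X[i]
        keep = (np.abs(PG) >= tol) | ((a[active] > 0.0) & (a[active] < C))
        active = active[keep]
    return w, a

The greedy rule requires a full gradient pass over the active set at every step, which is exactly why shrinking pays off, and why the paper parallelizes the selection asynchronously and adds acceleration on top.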

