A new method of training sample selection for support vector machines

Author(s):  
Dezheng Zhu ◽  
Jiafu Jiang


2013 ◽
Vol 2013 ◽  
pp. 1-7 ◽  
Author(s):  
Xigao Shao ◽  
Kun Wu ◽  
Bifeng Liao

Working set selection is a major step in decomposition methods for training least squares support vector machines (LS-SVMs). In this paper, a new working-set selection technique for sequential minimal optimization (SMO)-type decomposition methods is proposed. With the new method, a single direction is selected per iteration so that the iterates converge to the optimality condition. A simple asymptotic convergence proof for the new algorithm is given. Experimental comparisons demonstrate that the classification accuracy of the new method differs little from that of existing methods, while its training speed is faster.
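As a hedged illustration of the kind of SMO-type decomposition the abstract describes (not the paper's own selection rule), the sketch below implements a generic maximal-violating-pair solver for the LS-SVM dual; `rbf_kernel`, `smo_lssvm`, and all parameter names are our own assumptions:

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Pairwise squared distances -> RBF Gram matrix exp(-gamma * ||xi - xj||^2)
    sq = np.sum(X ** 2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def smo_lssvm(X, y, C=1.0, gamma=1.0, tol=1e-6, max_iter=10000):
    # LS-SVM dual: min 1/2 a^T Q a - y^T a  s.t. sum(a) = 0,
    # with Q = K + I/C; the bias b is the multiplier of the equality constraint.
    n = len(y)
    Q = rbf_kernel(X, gamma) + np.eye(n) / C
    a = np.zeros(n)
    grad = -y.astype(float)              # gradient Q a - y at a = 0
    for _ in range(max_iter):
        i, j = int(np.argmin(grad)), int(np.argmax(grad))
        if grad[j] - grad[i] < tol:      # optimal: all gradient components equal
            break
        # exact minimisation along e_i - e_j, which preserves sum(a) = 0
        t = (grad[j] - grad[i]) / (Q[i, i] + Q[j, j] - 2.0 * Q[i, j])
        a[i] += t
        a[j] -= t
        grad += t * (Q[:, i] - Q[:, j])
    b = -grad.mean()                     # at the optimum, grad = -b * ones
    return a, b
```

The decision function is then f(x) = sum_k a_k K(x, x_k) + b; the maximal-violating-pair choice is one standard working-set rule, used here only to make the decomposition idea concrete.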


2011 ◽  
Vol 2011 ◽  
pp. 1-6 ◽  
Author(s):  
Masaaki Tsujitani ◽  
Yusuke Tanaka

This paper considers applications of resampling methods to support vector machines (SVMs). Leave-one-out cross-validation (CV) is used to determine the optimum tuning parameters, and the deviance is bootstrapped to summarize the goodness of fit of the SVM. Leave-one-out CV is also adapted to estimate the bias of the excess error of a prediction rule constructed from the training samples. We analyze data from a mackerel-egg survey and a liver-disease study.
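Leave-one-out CV for tuning-parameter selection can be sketched generically. The illustration below is our own (not the paper's procedure): `fit_predict` uses a simple LS-SVM/kernel-ridge-style fit as a stand-in for a full SVM, and each (C, gamma) pair is scored by its leave-one-out misclassification rate:

```python
import numpy as np

def rbf(X, Z, gamma):
    d2 = np.sum(X ** 2, 1)[:, None] + np.sum(Z ** 2, 1)[None, :] - 2.0 * X @ Z.T
    return np.exp(-gamma * np.maximum(d2, 0.0))

def fit_predict(Xtr, ytr, Xte, C, gamma):
    # LS-SVM-style fit without bias: solve (K + I/C) a = y, then f(x) = k(x)^T a
    K = rbf(Xtr, Xtr, gamma) + np.eye(len(ytr)) / C
    a = np.linalg.solve(K, ytr)
    return rbf(Xte, Xtr, gamma) @ a

def loo_error(X, y, C, gamma):
    # Leave-one-out CV: train on n-1 samples, test on the single held-out one
    n = len(y)
    errs = 0
    for i in range(n):
        mask = np.arange(n) != i
        pred = fit_predict(X[mask], y[mask], X[i:i + 1], C, gamma)
        errs += np.sign(pred[0]) != y[i]
    return errs / n

def select_params(X, y, Cs=(0.1, 1.0, 10.0), gammas=(0.1, 1.0, 10.0)):
    # Grid search over tuning parameters, ranked by LOO error
    return min((loo_error(X, y, C, g), C, g) for C in Cs for g in gammas)
```

The bootstrap-deviance step from the abstract would wrap a similar loop around resampled training sets; it is omitted here to keep the sketch short.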


2015 ◽  
Vol 36 (13) ◽  
pp. 3331-3344 ◽  
Author(s):  
Xiaoxia Sun ◽  
Liwei Li ◽  
Bing Zhang ◽  
Dongmei Chen ◽  
Lianru Gao

2012 ◽  
Vol 59 (3) ◽  
pp. 1397-1408 ◽  
Author(s):  
Ernesto Vazquez-Sanchez ◽  
Jaime Gomez-Gil ◽  
José Carlos Gamazo-Real ◽  
José Fernando Diez-Higuera

2011 ◽  
Vol 204-210 ◽  
pp. 879-882 ◽
Author(s):  
Kai Li ◽  
Xiao Xia Lu

By combining the fuzzy support vector machine (FSVM) with rough set theory, we propose a rough-margin-based fuzzy support vector machine (RFSVM). It inherits the characteristics of the FSVM method and takes into account the positions of training samples relative to the rough margin in order to reduce overfitting caused by noise or outliers. The proposed algorithm finds the optimal separating hyperplane that maximizes the rough margin, which consists of a lower margin and an upper margin. Points lying within the lower margin receive a larger penalty than those lying between the lower and upper margins. Experiments on several benchmark datasets show that the RFSVM algorithm is effective and feasible compared with existing support vector machines.
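The fuzzy-membership ingredient that RFSVM inherits from FSVM can be illustrated with the common distance-to-class-centre scheme (a standard FSVM heuristic, shown here as an assumption rather than this paper's exact construction): samples far from their class centre, which are likely noise or outliers, receive small membership and hence a reduced slack penalty.

```python
import numpy as np

def fuzzy_memberships(X, y, delta=1e-6):
    # Membership s_i = 1 - d_i / (r + delta), where d_i is the distance of
    # sample i to its class centre and r is the class radius; points far from
    # their centre (likely outliers) get memberships close to zero.
    s = np.empty(len(y), dtype=float)
    for c in np.unique(y):
        idx = y == c
        centre = X[idx].mean(axis=0)
        d = np.linalg.norm(X[idx] - centre, axis=1)
        s[idx] = 1.0 - d / (d.max() + delta)
    return s
```

In an RFSVM-style objective these memberships would scale the slack penalties, on top of the rough-margin rule that points inside the lower margin are penalised more heavily than those between the lower and upper margins.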

