Uncertainty-safe large scale support vector machines
2017 ◽ Vol 109 ◽ pp. 215-230
Author(s): Nicolas Couellan, Wenjuan Wang

2021
Author(s): M. Tanveer, A. Tiwari, R. Choudhary, M. A. Ganaie

2016 ◽ Vol 8 (1)
Author(s): Jonathan Alvarsson, Samuel Lampa, Wesley Schaal, Claes Andersson, Jarl E. S. Wikberg, ...

2017 ◽ Vol 235 ◽ pp. 199-209
Author(s): Hakan Cevikalp, Vojtech Franc

2020 ◽ Vol 10 (19) ◽ pp. 6979
Author(s): Minho Ryu, Kichun Lee

Support vector machines (SVMs) are well-known classifiers prized for their strong classification performance. An SVM is defined by the hyperplane that separates the two classes with the largest margin. Computing this hyperplane, however, requires solving a quadratic programming problem whose storage cost grows with the square of the number of training samples and whose time complexity is, in general, proportional to the cube of that number. It is therefore worth studying how to reduce SVM training time without compromising performance, so that SVMs remain viable for large-scale problems. In this paper, we propose a novel data reduction method that shortens training time by combining decision trees with a new concept, the relative support distance, which selects good support vector candidates in each partition generated by the decision trees. Training on the selected candidates substantially improves training speed for large-scale SVM problems. In experiments, we demonstrate that our approach significantly reduces training time while maintaining classification performance comparable to existing approaches.
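
To make the partition-then-filter idea concrete, here is a minimal sketch in Python with scikit-learn. The abstract does not define the relative support distance, so the score below (each point's distance to its nearest opposite-class point inside a leaf, normalized by the leaf's scale) is an assumed stand-in for the paper's criterion, and the names reduce_training_set, keep_ratio, and max_leaf_nodes are illustrative choices, not the authors'.

    # Sketch of tree-based data reduction for SVM training (assumptions noted above).
    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.svm import SVC
    from sklearn.metrics import pairwise_distances

    def reduce_training_set(X, y, max_leaf_nodes=32, keep_ratio=0.2):
        """Partition data with a decision tree, then keep the fraction of points
        in each mixed leaf lying closest to the opposite class (an assumed proxy
        for the paper's relative-support-distance criterion)."""
        tree = DecisionTreeClassifier(max_leaf_nodes=max_leaf_nodes, random_state=0)
        tree.fit(X, y)
        leaves = tree.apply(X)                       # leaf index of each sample
        keep = np.zeros(len(X), dtype=bool)
        for leaf in np.unique(leaves):
            idx = np.where(leaves == leaf)[0]
            pos, neg = idx[y[idx] == 1], idx[y[idx] == 0]
            if len(pos) == 0 or len(neg) == 0:
                continue                             # pure leaf: far from the boundary
            # Distance to the nearest opposite-class point, made relative by
            # normalizing with the leaf's median cross-class distance.
            d = pairwise_distances(X[pos], X[neg])
            scale = np.median(d) + 1e-12
            pos_score = d.min(axis=1) / scale
            neg_score = d.min(axis=0) / scale
            k_pos = max(1, int(np.ceil(keep_ratio * len(pos))))
            k_neg = max(1, int(np.ceil(keep_ratio * len(neg))))
            keep[pos[np.argsort(pos_score)[:k_pos]]] = True   # closest to boundary
            keep[neg[np.argsort(neg_score)[:k_neg]]] = True
        return keep

    X, y = make_classification(n_samples=20000, n_features=20, random_state=0)
    mask = reduce_training_set(X, y)
    svm = SVC(kernel="rbf").fit(X[mask], y[mask])    # train on far fewer points
    print(f"kept {mask.sum()} of {len(X)} points, "
          f"accuracy on full set {svm.score(X, y):.3f}")

Because SVM training cost grows roughly quadratically to cubically with sample count, keeping on the order of 20% of the points per mixed leaf can cut training time markedly; the keep_ratio parameter trades speed against boundary fidelity.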

