Improving the Generalization Error Bound using Total Margin in Support Vector Machines

2004 ◽  
Vol 17 (1) ◽  
pp. 75-88


2004 ◽  
Vol 13 (04) ◽  
pp. 791-800 ◽  
Author(s):  
Holger Fröhlich ◽  
Olivier Chapelle ◽  
Bernhard Schölkopf

The problem of feature selection is a difficult combinatorial task in machine learning and of high practical relevance, e.g. in bioinformatics. Genetic Algorithms (GAs) offer a natural way to solve this problem. In this paper we present a special Genetic Algorithm that explicitly takes into account the existing bounds on the generalization error for Support Vector Machines (SVMs). This new approach is compared with the traditional method of performing cross-validation and with other existing algorithms for feature selection.
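The abstract's core idea can be sketched as a genetic algorithm evolving feature subsets. This is a minimal illustration only: the toy `fitness` function below is a hypothetical surrogate and stands in for the SVM generalization-error bound the paper actually uses, which is not reproduced here.

```python
# Sketch of GA-based feature selection. The fitness function is a toy
# surrogate (NOT the paper's SVM bound): it rewards a hypothetical set of
# informative features and penalizes subset size.
import random

random.seed(0)

N_FEATURES = 10
INFORMATIVE = {1, 3, 7}   # hypothetical "useful" features for the toy fitness

def fitness(mask):
    # Reward informative features, penalize large subsets.
    hits = sum(1 for i in INFORMATIVE if mask[i])
    return hits - 0.1 * sum(mask)

def crossover(a, b):
    # Single-point crossover of two bitmasks.
    point = random.randrange(1, N_FEATURES)
    return a[:point] + b[point:]

def mutate(mask, rate=0.1):
    # Flip each bit independently with probability `rate`.
    return [bit ^ (random.random() < rate) for bit in mask]

def ga(pop_size=20, generations=30):
    pop = [[random.randint(0, 1) for _ in range(N_FEATURES)]
           for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        parents = pop[: pop_size // 2]          # truncation selection
        children = [mutate(crossover(random.choice(parents),
                                     random.choice(parents)))
                    for _ in range(pop_size - len(parents))]
        pop = parents + children
    return max(pop, key=fitness)

best = ga()
print(sorted(i for i, bit in enumerate(best) if bit))
```

In the paper's setting, `fitness` would instead evaluate an SVM bound (or cross-validation error) on the candidate feature subset.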


2018 ◽  
Vol 28 (4) ◽  
pp. 705-717 ◽  
Author(s):  
Pittipol Kantavat ◽  
Boonserm Kijsirikul ◽  
Patoomsiri Songsiri ◽  
Ken-Ichi Fukui ◽  
Masayuki Numao

Abstract We propose new methods for support vector machines using a tree architecture for multi-class classification. In each node of the tree, we select an appropriate binary classifier, using entropy and generalization error estimation, then group the examples into positive and negative classes based on the selected classifier, and train a new classifier for use in the classification phase. The proposed methods can work in time complexity between O(log₂ N) and O(N), where N is the number of classes. We compare the performance of our methods with traditional techniques on the UCI machine learning repository using 10-fold cross-validation. The experimental results show that the methods are very useful for problems that need fast classification time or those with a large number of classes, since the proposed methods run much faster than the traditional techniques but still provide comparable accuracy.
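The tree architecture described above can be illustrated with a minimal sketch: each internal node holds a binary decision that splits the remaining classes into two groups, so classifying an example costs one binary evaluation per level, i.e. O(log₂ N) for a balanced tree. The threshold rule below is a stub standing in for the paper's entropy- and bound-based node selection, which is not reproduced here.

```python
# Sketch of tree-structured multi-class classification. Each internal node
# uses a stub "binary classifier" (a threshold on a 1-D feature) in place of
# the paper's entropy/generalization-error-based node selection.

class Node:
    def __init__(self, classes):
        self.classes = classes            # class labels reachable below this node
        self.left = self.right = None
        self.threshold = None

def build_tree(classes):
    # Split the class set in half recursively -> depth O(log2 N).
    node = Node(classes)
    if len(classes) > 1:
        mid = len(classes) // 2
        node.threshold = classes[mid]     # stub decision rule on the feature
        node.left = build_tree(classes[:mid])
        node.right = build_tree(classes[mid:])
    return node

def classify(node, x, evals=0):
    # Descend the tree; each step costs one binary-classifier evaluation.
    if node.left is None:
        return node.classes[0], evals
    if x < node.threshold:                # stub binary classifier
        return classify(node.left, x, evals + 1)
    return classify(node.right, x, evals + 1)

root = build_tree(list(range(8)))         # 8 classes: 0..7
label, n_evals = classify(root, 5.2)
print(label, n_evals)                     # -> 5 3: log2(8) = 3 evaluations
```

A one-vs-one scheme would need N(N-1)/2 = 28 classifier evaluations for 8 classes; the tree needs only 3 per example, which is the source of the speed-up the abstract reports.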


2018 ◽  
Author(s):  
Nelson Marcelo Romero Aquino ◽  
Matheus Gutoski ◽  
Leandro Takeshi Hattori ◽  
Heitor Silvério Lopes
