A comparative study of multi-class support vector machines in the unifying framework of large margin classifiers

2005 ◽ Vol 21 (2) ◽ pp. 199-214
Author(s): Yann Guermeur, André Elisseeff, Dominique Zelus

2009 ◽ Vol 18 (1-2) ◽ pp. 227-245
Author(s): Nati Linial, Adi Shraibman

This paper has two main focal points. We first consider an important class of machine learning algorithms: large margin classifiers, such as Support Vector Machines. The notion of margin complexity quantifies the extent to which a given class of functions can be learned by large margin classifiers. We prove that, up to a small multiplicative constant, margin complexity is equal to the inverse of discrepancy. This establishes a strong tie between seemingly very different notions from two distinct areas.

In the same way that matrix rigidity is related to rank, we introduce the notion of rigidity of margin complexity. We prove that sign matrices with small margin complexity rigidity are very rare. This leads to the question of proving lower bounds on the rigidity of margin complexity. Quite surprisingly, this question turns out to be closely related to basic open problems in communication complexity, e.g., whether PSPACE can be separated from the polynomial hierarchy in communication complexity.

Communication is a key ingredient in many types of learning. This explains the relations between the field of learning theory and that of communication complexity [6, 10, 16, 26]. The results of this paper constitute another link in this rich web of relations. These new results have already been applied toward the solution of several open problems in communication complexity [18, 20, 29].
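For orientation, the two quantities in the paper's main equivalence can be written down in standard notation. The sketch below assumes the usual definition of margin complexity for a sign matrix A in {-1, +1}^{m×n}; the exact multiplicative constants hidden in the Θ are suppressed, as in the abstract.

```latex
% Margin complexity: the best (inverse) margin achievable by any
% unit-vector realization of the sign pattern of A; the paper's main
% result ties it to discrepancy up to absolute constants.
\[
  \operatorname{mc}(A)
  \;=\;
  \min_{\substack{\|x_i\| = \|y_j\| = 1 \\ \operatorname{sign}\langle x_i, y_j\rangle \,=\, a_{ij}}}
  \;\max_{i,j}\; \frac{1}{\lvert \langle x_i, y_j \rangle \rvert},
  \qquad
  \operatorname{mc}(A) \;=\; \Theta\!\left(\frac{1}{\operatorname{disc}(A)}\right).
\]
```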


2017 ◽ Vol 29 (11) ◽ pp. 3078-3093
Author(s): Liangzhi Chen, Haizhang Zhang

Support vector machines, which maximize the margin from patterns to the separating hyperplane subject to correct classification, have achieved remarkable success in machine learning. Margin error bounds based on Hilbert spaces have been introduced in the literature to justify the strategy of maximizing the margin in SVMs. Recently, there has been much interest in developing Banach space methods for machine learning, with large margin classification in Banach spaces a focus of such attempts. In this letter, we establish a margin error bound for the SVM on reproducing kernel Banach spaces, thus supplying statistical justification for large margin classification in Banach spaces.
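The abstract does not reproduce the bound itself. For reference, the classical Hilbert-space margin bound below shows the general shape of the statement being extended to reproducing kernel Banach spaces; it is a standard textbook result, not the letter's theorem. With probability at least 1 - δ over a sample of size m, every f in the function class F satisfies:

```latex
% Classical margin error bound (Hilbert-space setting): test error is
% controlled by the fraction of margin violations at level rho plus a
% capacity term; R_m(F) is the Rademacher complexity of the class F.
\[
  \Pr\big( y f(x) \le 0 \big)
  \;\le\;
  \frac{1}{m} \sum_{i=1}^{m} \mathbf{1}\{ y_i f(x_i) < \rho \}
  \;+\; \frac{2}{\rho}\, \mathfrak{R}_m(\mathcal{F})
  \;+\; \sqrt{\frac{\ln(1/\delta)}{2m}} .
\]
```

In the Banach-space setting, one expects the capacity term to be governed by the unit ball of the reproducing kernel Banach space rather than a Hilbert-space ball.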


2016 ◽ Vol 28 (6) ◽ pp. 1217-1247
Author(s): Yunlong Feng, Yuning Yang, Xiaolin Huang, Siamak Mehrkanoon, Johan A. K. Suykens

This letter addresses the robustness problem that arises when learning a large margin classifier in the presence of label noise. We achieve this purpose by proposing robustified large margin support vector machines. The robustness of the proposed robust support vector classifiers (RSVC), which is interpreted from a weighted viewpoint in this work, is due to the use of nonconvex classification losses. Beyond robustness, we show that the proposed RSVC is also smooth, which again benefits from the use of smooth classification losses. The idea behind RSVC comes from M-estimation in statistics, since the proposed robust and smooth classification losses can be taken as one-sided cost functions in robust statistics. Its Fisher consistency and generalization ability are also investigated. Another appealing property of RSVC is that its solution can be obtained by iteratively solving weighted squared-hinge-loss support vector machine problems. We further show that each iteration is a quadratic programming problem in the dual space that can be solved by state-of-the-art methods. We thus propose an iteratively reweighted algorithm and provide a constructive proof of its convergence to a stationary point. The effectiveness of the proposed classifiers is verified on both artificial and real data sets.
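A minimal sketch of such an iteratively reweighted scheme, in Python with scikit-learn. The weight function used here, a smooth bounded down-weighting of high-loss points, is an illustrative stand-in for the weights induced by the paper's nonconvex losses, and the parameter tau is hypothetical; each iteration solves a weighted squared-hinge-loss SVM subproblem, as the abstract describes.

```python
# Sketch of an iteratively reweighted large-margin classifier in the
# spirit of RSVC; the weight function and tau are illustrative choices,
# not the losses from the original paper.
import numpy as np
from sklearn.svm import LinearSVC

def iteratively_reweighted_svm(X, y, n_iter=10, C=1.0, tau=2.0):
    """Fit a robust large-margin classifier by solving a sequence of
    weighted squared-hinge-loss SVM problems.

    y is expected in {-1, +1}. tau controls how aggressively points
    with large losses (likely label noise) are down-weighted.
    """
    n = X.shape[0]
    weights = np.ones(n)                      # start from the plain SVM
    clf = LinearSVC(loss="squared_hinge", C=C)
    for _ in range(n_iter):
        clf.fit(X, y, sample_weight=weights)  # weighted convex subproblem
        margins = y * clf.decision_function(X)
        loss = np.maximum(0.0, 1.0 - margins) ** 2
        # Down-weight points with large loss: a smooth, bounded
        # reweighting that mimics the effect of a nonconvex robust loss.
        weights = 1.0 / (1.0 + loss / tau)
    return clf
```

On data with flipped labels, points that remain badly misclassified across iterations receive progressively smaller weights, so the final hyperplane is driven by the presumably clean majority; this is the mechanism by which nonconvex losses confer robustness.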


2020 ◽ Vol 146 (6) ◽ pp. 04020010
Author(s): Afshin Ashrafzadeh, Ozgur Kişi, Pouya Aghelpour, Seyed Mostafa Biazar, Mohammadreza Askarizad Masouleh
