Formulations of Support Vector Machines: A Note from an Optimization Point of View

2001, Vol 13 (2), pp. 307-317
Author(s): Chih-Jen Lin

In this article, we discuss issues in the formulation of support vector machines (SVMs) from an optimization point of view. First, SVMs map training data into a higher- (possibly infinite-) dimensional space. Primal and dual formulations of the SVM are currently derived in finite-dimensional spaces and then extended to the infinite-dimensional case. We rigorously discuss the primal-dual relation in infinite-dimensional spaces. Second, SVM formulations contain penalty terms, which differ from unconstrained penalty functions in optimization. Traditionally, unconstrained penalty functions approximate a constrained problem as the penalty parameter increases. We are interested in similar properties for SVM formulations. For two of the most popular SVM formulations, we show that one enjoys the properties of exact penalty functions, while the other behaves like a traditional penalty function, converging only as the penalty parameter goes to infinity.
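The penalty behavior above can be illustrated with a minimal sketch (not from the article): subgradient descent on the primal soft-margin objective (1/2)||w||^2 + C * sum_i max(0, 1 - y_i(w.x_i + b)), on a tiny toy dataset. The dataset, learning rate, and epoch count are illustrative assumptions; the point is only that the total hinge penalty shrinks as the penalty parameter C grows.

```python
import numpy as np

def train_linear_svm(X, y, C, lr=0.001, epochs=2000):
    """Subgradient descent on the primal soft-margin objective
    (1/2)||w||^2 + C * sum_i max(0, 1 - y_i (w.x_i + b))."""
    n, d = X.shape
    w, b = np.zeros(d), 0.0
    for _ in range(epochs):
        margins = y * (X @ w + b)
        active = margins < 1          # points with nonzero hinge loss
        w -= lr * (w - C * (y[active, None] * X[active]).sum(axis=0))
        b -= lr * (-C * y[active].sum())
    return w, b

# Tiny linearly separable toy set (illustrative assumption).
X = np.array([[2.0, 2.0], [3.0, 3.0], [-2.0, -2.0], [-3.0, -3.0]])
y = np.array([1.0, 1.0, -1.0, -1.0])

for C in (0.01, 1.0, 100.0):
    w, b = train_linear_svm(X, y, C)
    hinge = np.maximum(0.0, 1.0 - y * (X @ w + b)).sum()
    print(f"C={C}: total hinge loss = {hinge:.3f}")
```

For small C the hinge terms are only weakly penalized, so slack remains at the optimum; as C grows the constraints y_i(w.x_i + b) >= 1 are enforced ever more tightly, mirroring the penalty-parameter discussion in the abstract.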

Author(s): Sadaaki Miyamoto, Youichi Nakayama

We discuss hard c-means clustering using a mapping into a high-dimensional space, as considered in the theory of support vector machines. Two types of iterative algorithms are developed, and the effectiveness of the proposed method is shown by numerical examples.
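A minimal sketch of the kernel trick behind such a method (the article's concrete algorithms are not reproduced here): hard c-means where squared distances to the implicit cluster means in the feature space are computed from kernel values alone, via ||phi(x_i) - m_r||^2 = K_ii - 2 mean_j K_ij + mean_jl K_jl over members j, l of cluster r. The RBF kernel, toy data, and initialization are illustrative assumptions.

```python
import numpy as np

def rbf_kernel(X, gamma=0.5):
    sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def kernel_hard_cmeans(K, c, labels, iters=50):
    """Hard c-means in the kernel-induced feature space; cluster
    means never materialize, only kernel values are used."""
    n = K.shape[0]
    for _ in range(iters):
        dist = np.zeros((n, c))
        for r in range(c):
            idx = np.where(labels == r)[0]
            if len(idx) == 0:
                dist[:, r] = np.inf
                continue
            dist[:, r] = (np.diag(K)
                          - 2 * K[:, idx].mean(axis=1)
                          + K[np.ix_(idx, idx)].mean())
        new = dist.argmin(axis=1)
        if np.array_equal(new, labels):
            break
        labels = new
    return labels

# Two well-separated toy blobs with a deliberately bad initial
# labeling (illustrative assumption); the iteration untangles them.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 0.2, (10, 2)), rng.normal(3, 0.2, (10, 2))])
K = rbf_kernel(X)
init = np.array([0] * 15 + [1] * 5)
labels = kernel_hard_cmeans(K, 2, init)
print(labels)
```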


2003, Vol 16 (3), pp. 305-316
Author(s): Srdjan Stankovic, Milos Stankovic, Maja Stankovic, Milan Milosavljevic

Learning from data is discussed from the point of view of support vector machines. A specific algorithm for solving the polychotomy (multiclass) problem is described. The methodology is illustrated on two complex examples taken from practice.
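The article's specific polychotomy algorithm is not given here, but the standard baseline reduction, one-vs-rest with binary soft-margin SVMs, can be sketched as follows. The three-class toy data and the simple subgradient solver are illustrative assumptions, not the authors' method.

```python
import numpy as np

def train_linear_svm(X, y, C=1.0, lr=0.005, epochs=2000):
    # Subgradient descent on the primal soft-margin objective.
    w, b = np.zeros(X.shape[1]), 0.0
    for _ in range(epochs):
        active = y * (X @ w + b) < 1
        w -= lr * (w - C * (y[active, None] * X[active]).sum(axis=0))
        b -= lr * (-C * y[active].sum())
    return w, b

def one_vs_rest_fit(X, labels):
    # One binary SVM per class: class k vs. everything else.
    return {k: train_linear_svm(X, np.where(labels == k, 1.0, -1.0))
            for k in np.unique(labels)}

def one_vs_rest_predict(models, X):
    # Pick the class whose SVM yields the largest decision value.
    classes = sorted(models)
    scores = np.stack([X @ models[k][0] + models[k][1] for k in classes],
                      axis=1)
    return np.array(classes)[scores.argmax(axis=1)]

# Three well-separated toy classes (illustrative assumption).
rng = np.random.default_rng(1)
X = np.vstack([rng.normal(m, 0.3, (15, 2)) for m in ([0, 0], [4, 0], [0, 4])])
labels = np.repeat([0, 1, 2], 15)
models = one_vs_rest_fit(X, labels)
pred = one_vs_rest_predict(models, X)
print("training accuracy:", (pred == labels).mean())
```

One-vs-rest trains k binary machines for a k-class problem; other reductions (one-vs-one, error-correcting output codes) trade the number of machines against the size of each training set.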


2007, Vol 19 (5), pp. 1155-1178
Author(s): Olivier Chapelle

Most literature on support vector machines (SVMs) concentrates on the dual optimization problem. In this letter, we point out that the primal problem can also be solved efficiently for both linear and nonlinear SVMs, and that there is no reason to ignore this possibility. On the contrary, from the primal point of view, new families of algorithms for large-scale SVM training can be investigated.
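A minimal sketch of the primal approach, assuming a differentiable squared hinge loss and the expansion w = sum_i beta_i phi(x_i), so that a nonlinear SVM can be trained by plain gradient steps on beta (the letter's Newton-type methods and experiments are not reproduced; the toy data, RBF kernel, and step sizes are assumptions):

```python
import numpy as np

def rbf_kernel(A, B, gamma=0.5):
    sq = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * sq)

def primal_kernel_svm(K, y, lam=0.1, lr=0.001, epochs=2000):
    """Gradient descent directly on the primal objective
    lam * beta' K beta + sum_i max(0, 1 - y_i (K beta)_i)^2,
    i.e. regularizer plus squared hinge loss, all in terms of beta."""
    beta = np.zeros(K.shape[0])
    for _ in range(epochs):
        s = K @ beta                          # decision values
        viol = np.maximum(0.0, 1.0 - y * s)   # squared-hinge residuals
        grad = 2 * lam * s - 2 * K @ (y * viol)
        beta -= lr * grad
    return beta

# Two separated toy classes (illustrative assumption).
rng = np.random.default_rng(2)
X = np.vstack([rng.normal(-1, 0.3, (10, 2)), rng.normal(1, 0.3, (10, 2))])
y = np.repeat([-1.0, 1.0], 10)
K = rbf_kernel(X, X)
beta = primal_kernel_svm(K, y)
pred = np.sign(K @ beta)
print("training accuracy:", (pred == y).mean())
```

Because the squared hinge is differentiable, no dual variables or KKT bookkeeping are needed; this is what makes second-order (Newton) primal solvers practical for large-scale training.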


Author(s): Sadaaki Miyamoto, Daisuke Suizu
We study fuzzy c-means clustering algorithms that use a kernel to represent an inner product after mapping into a high-dimensional space. Such kernels have been studied in support vector machines and used by many researchers in pattern classification. The fuzzy c-means algorithms are transformed into kernel-based methods by changing their objective functions, from which new iterative minimization algorithms are derived. Numerical examples show that clusters that cannot be obtained without a kernel are generated.
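A sketch of one way the fuzzy c-means objective can be kernelized, with cluster centers kept implicit in the feature space (an illustration, not the article's exact derivation; the RBF kernel, toy data, and random initialization are assumptions). Each center is a membership-weighted combination of mapped points, so its distances reduce to kernel sums.

```python
import numpy as np

def kernel_fuzzy_cmeans(K, c, m=2.0, iters=50, seed=0):
    """Kernel-based fuzzy c-means: distances to the implicit,
    membership-weighted centers m_r use only kernel values:
    d2(i, r) = K_ii - 2 (K w_r)_i + w_r' K w_r, w_r = normalized U[r]^m."""
    n = K.shape[0]
    rng = np.random.default_rng(seed)
    U = rng.random((c, n))
    U /= U.sum(axis=0)                       # memberships sum to 1 per point
    diagK = np.diag(K)
    for _ in range(iters):
        W = U ** m
        d2 = np.empty((c, n))
        for r in range(c):
            w = W[r] / W[r].sum()
            d2[r] = diagK - 2 * (K @ w) + w @ K @ w
        d2 = np.maximum(d2, 1e-12)
        E = d2 ** (1.0 / (m - 1.0))          # standard FCM membership update
        U = (1.0 / E) / (1.0 / E).sum(axis=0)
    return U

# Two well-separated toy blobs (illustrative assumption).
rng = np.random.default_rng(3)
X = np.vstack([rng.normal(0, 0.2, (10, 2)), rng.normal(3, 0.2, (10, 2))])
sq = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
K = np.exp(-0.5 * sq)
U = kernel_fuzzy_cmeans(K, 2)
print(U.argmax(axis=0))
```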

