Improved Generalized Eigenvalue Proximal Support Vector Machine

2013 ◽ Vol 20 (3) ◽ pp. 213-216 ◽ Author(s): Yuan-Hai Shao, Nai-Yang Deng, Wei-Jie Chen, Zhen Wang
2016 ◽ Vol 9 (6) ◽ pp. 1041-1054 ◽ Author(s): Jun Liang, Fei-yun Zhang, Xiao-xia Xiong, Xiao-bo Chen, Long Chen, ...

Symmetry ◽ 2021 ◽ Vol 13 (5) ◽ pp. 833 ◽ Author(s): Yuanyuan Chen, Zhixia Yang

Functional data analysis has become a research hotspot in the field of data mining. Traditional data mining methods treat functional data as discrete, finite observation sequences, ignoring their continuity. This paper addresses functional data classification by proposing a functional generalized eigenvalue proximal support vector machine (FGEPSVM). Specifically, we seek two nonparallel hyperplanes in function space: a positive functional hyperplane and a negative functional hyperplane. The former is closest to the positive functional data and farthest from the negative functional data, while the latter has the opposite properties. By introducing an orthonormal basis, the problem in function space is transformed into one in vector space. Higher-order derivative information is exploited in two ways: the derivatives are used alone, or a weighted linear combination of the original function and its derivatives is used. Using more of the data's information in this way can be expected to improve classification accuracy. Experiments on artificial datasets and benchmark datasets show the effectiveness of our FGEPSVM for functional data classification.
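The pipeline the abstract describes (project each function onto an orthonormal basis, then fit two nonparallel proximal planes via generalized eigenvalue problems) can be sketched as follows. This is a minimal illustration, not the paper's exact formulation: the Legendre basis, the Tikhonov term `delta`, and all function names are assumptions, and the derivative-information variants are omitted.

```python
# Hedged sketch of a GEPSVM-style classifier on basis coefficients of
# functional data. Assumed choices: Legendre basis on [-1, 1], a small
# Tikhonov regularizer, and toy sine-vs-cosine curves.
import numpy as np
from numpy.polynomial import legendre
from scipy.linalg import eigh

def basis_coefficients(curves, t, n_basis=5):
    """Project sampled curves onto the first n_basis Legendre polynomials,
    turning each function into a finite coefficient vector."""
    B = legendre.legvander(t, n_basis - 1)        # (len(t), n_basis)
    coef, *_ = np.linalg.lstsq(B, curves.T, rcond=None)
    return coef.T                                  # (n_curves, n_basis)

def gepsvm_plane(A, B, delta=1e-4):
    """Plane [w; b] closest to rows of A and farthest from rows of B:
    minimize ||A w + b||^2 / ||B w + b||^2 as a generalized eigenproblem."""
    Ae = np.hstack([A, np.ones((A.shape[0], 1))])
    Be = np.hstack([B, np.ones((B.shape[0], 1))])
    G = Ae.T @ Ae + delta * np.eye(Ae.shape[1])   # Tikhonov term keeps
    H = Be.T @ Be + delta * np.eye(Be.shape[1])   # G and H positive definite
    vals, vecs = eigh(G, H)                        # symmetric generalized EVP
    z = vecs[:, 0]                                 # eigvec of smallest eigenvalue
    return z[:-1], z[-1]                           # w, b

def classify(X, plane_pos, plane_neg):
    """Assign each row to the class whose plane is nearer."""
    def dist(X, w, b):
        return np.abs(X @ w + b) / np.linalg.norm(w)
    return np.where(dist(X, *plane_pos) <= dist(X, *plane_neg), 1, -1)

# Toy functional data: noisy sines (positive class) vs cosines (negative)
t = np.linspace(-1, 1, 50)
rng = np.random.default_rng(0)
pos = np.sin(np.pi * t) + 0.1 * rng.standard_normal((30, t.size))
neg = np.cos(np.pi * t) + 0.1 * rng.standard_normal((30, t.size))
Xp, Xn = basis_coefficients(pos, t), basis_coefficients(neg, t)

plane_pos = gepsvm_plane(Xp, Xn)   # close to positives, far from negatives
plane_neg = gepsvm_plane(Xn, Xp)   # the opposite roles
pred = classify(np.vstack([Xp, Xn]), plane_pos, plane_neg)
```

The basis projection is what reduces the infinite-dimensional problem in function space to an ordinary vector-space GEPSVM; the paper's derivative-augmented variants would simply stack or weight coefficient vectors of the derivatives alongside those of the original function.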

