Training data reduction and nonlinear feature extraction in classification based on greedy Generalized Discriminant Analysis

2013 ◽  
Vol 347-350 ◽  
pp. 2390-2394
Author(s):  
Chun Yang ◽  
Xiao Fang Liu

Nonlinear feature extraction with the standard Kernel Principal Component Analysis (KPCA) method requires large memory and incurs high computational complexity on large datasets. A Greedy Kernel Principal Component Analysis (GKPCA) method is applied to reduce the training data and to handle nonlinear feature extraction on large training sets in classification. First, a subset that approximates the original training data is selected from the full training data using the greedy technique of the GKPCA method. Then, the feature extraction model is trained on this subset instead of the full training data. Finally, the FCM algorithm classifies the features extracted by the GKPCA, KPCA and PCA methods, respectively. The simulation results indicate that both the GKPCA and KPCA methods outperform the PCA method in feature extraction. In addition to retaining the performance of the KPCA method, the GKPCA method reduces computational complexity in classification thanks to the reduced training set.
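
The abstract does not state the greedy selection rule, so the following is a minimal sketch in Python using the criterion common in the greedy KPCA literature: repeatedly pick the point whose image in feature space is worst approximated by the span of the points already selected, then fit KPCA on that subset only. The RBF kernel, its gamma, the subset size m, and the use of scikit-learn's KMeans in place of FCM (which scikit-learn does not provide) are all assumptions for illustration, not the paper's exact procedure.

```python
import numpy as np
from sklearn.decomposition import KernelPCA
from sklearn.cluster import KMeans
from sklearn.metrics.pairwise import rbf_kernel

def greedy_subset(X, m, gamma=1.0):
    """Greedily pick m points whose span in RBF feature space best
    approximates the full training set (maximum-residual rule, assumed)."""
    K = rbf_kernel(X, X, gamma=gamma)           # full kernel matrix
    selected = [int(np.argmax(np.diag(K)))]     # seed point (diag is 1 for RBF)
    for _ in range(m - 1):
        S = selected
        K_SS = K[np.ix_(S, S)]
        K_xS = K[:, S]
        # residual of phi(x) after projection onto span{phi(x_s): s in S}:
        # k(x, x) - k_xS K_SS^{-1} k_xS^T
        alpha = np.linalg.solve(K_SS + 1e-10 * np.eye(len(S)), K_xS.T)
        residual = np.diag(K) - np.einsum('ij,ji->i', K_xS, alpha)
        residual[S] = -np.inf                   # never reselect a chosen point
        selected.append(int(np.argmax(residual)))
    return np.array(selected)

rng = np.random.default_rng(0)
X = rng.normal(size=(500, 10))                  # toy data

idx = greedy_subset(X, m=50, gamma=0.1)
kpca = KernelPCA(n_components=5, kernel='rbf', gamma=0.1).fit(X[idx])  # train on subset only
Z = kpca.transform(X)                           # extract features for all data

# KMeans stands in here for the FCM classification step in the abstract
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(Z)
```

Because KPCA is fit on m points rather than n, the kernel matrix shrinks from n x n to m x m, which is the source of the memory and complexity savings the abstract claims.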


2011 ◽  
Vol 339 ◽  
pp. 571-574
Author(s):  
Xing Zhu Liang ◽  
Jing Zhao Li ◽  
Yu E Lin

Several orthogonal feature extraction algorithms based on locality preserving projection have recently been proposed. However, these methods are still linear techniques in nature. In this paper, we present a nonlinear feature extraction method called Kernel Orthogonal Neighborhood Preserving Discriminant Analysis (KONPDA). A major advantage of the proposed method is that it regards every column of the kernel matrix as a corresponding sample. By then running the algorithm on the kernel matrix, nonlinear features can be extracted. Experimental results on the ORL database indicate that the proposed KONPDA method achieves a higher recognition rate than the ONPDA method and other kernel-based learning algorithms.
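
The column-as-sample trick the abstract describes is the empirical kernel map: each training sample is re-represented by its column of the kernel matrix, after which any linear projection method becomes nonlinear in the original input space. The sketch below illustrates that mechanism, assuming an RBF kernel; scikit-learn's LinearDiscriminantAnalysis stands in for the ONPDA projection step (which has no standard library implementation), and the digits dataset replaces ORL.

```python
import numpy as np
from sklearn.datasets import load_digits
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics.pairwise import rbf_kernel
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

X, y = load_digits(return_X_y=True)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# Empirical kernel map: column i of the training kernel matrix
# becomes the new representation of training sample i.
gamma = 1e-3
K_tr = rbf_kernel(X_tr, X_tr, gamma=gamma)      # (n_tr, n_tr)
K_te = rbf_kernel(X_te, X_tr, gamma=gamma)      # test samples mapped against the training set

# LDA stands in for the ONPDA projection step described in the abstract.
lda = LinearDiscriminantAnalysis(n_components=9).fit(K_tr, y_tr)
Z_tr, Z_te = lda.transform(K_tr), lda.transform(K_te)

acc = KNeighborsClassifier(1).fit(Z_tr, y_tr).score(Z_te, y_te)
print(f"1-NN accuracy in kernelized discriminant space: {acc:.3f}")
```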


Open Physics ◽  
2017 ◽  
Vol 15 (1) ◽  
pp. 270-279
Author(s):  
Quanbao Li ◽  
Fajie Wei ◽  
Shenghan Zhou

The linear discriminant analysis (LDA) is one of the most popular methods for linear feature extraction. It usually performs well when the global data structure is consistent with the local data structure. Other frequently used feature extraction approaches usually require linearity, independence, or large-sample conditions. However, in real-world applications, these assumptions are not always satisfied and cannot always be tested. In this paper, we introduce an adaptive method, local kernel nonparametric discriminant analysis (LKNDA), which integrates conventional discriminant analysis with nonparametric statistics. LKNDA is adept at identifying both complex nonlinear structures and ad hoc rules. Six simulation cases demonstrate that LKNDA has the advantages of both parametric and nonparametric algorithms and achieves higher classification accuracy. The quartic unilateral kernel function may provide better robustness of prediction than other kernel functions. LKNDA offers an alternative solution for discriminant cases with complex nonlinear feature extraction or unknown features. Finally, an application of LKNDA to the complex feature extraction of financial market activities is proposed.
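
The abstract gives no formulas, so the following sketch only illustrates the general idea in the spirit of classical nonparametric discriminant analysis: the between-class scatter is built from each sample's nearest neighbors in the other classes, weighted by a local kernel, so that local structure rather than class means alone shapes the projection. The Gaussian local weight, the neighbor count k, and the toy data are all assumptions; this is not the paper's LKNDA.

```python
import numpy as np
from scipy.linalg import eigh
from sklearn.neighbors import NearestNeighbors

def nda_projection(X, y, n_components=2, k=5):
    """Simplified nonparametric discriminant analysis: between-class scatter
    from each sample's k nearest extra-class neighbors, locally weighted."""
    n, d = X.shape
    Sb, Sw = np.zeros((d, d)), np.zeros((d, d))
    for c in np.unique(y):
        Xc = X[y == c]
        mu_c = Xc.mean(axis=0)
        Sw += (Xc - mu_c).T @ (Xc - mu_c)       # ordinary within-class scatter
        for c2 in np.unique(y):
            if c2 == c:
                continue
            nn = NearestNeighbors(n_neighbors=k).fit(X[y == c2])
            dist, idx = nn.kneighbors(Xc)
            local_mean = X[y == c2][idx].mean(axis=1)  # local extra-class mean
            diff = Xc - local_mean
            w = np.exp(-dist.mean(axis=1) ** 2)        # local kernel weight (assumed form)
            Sb += (diff * w[:, None]).T @ diff
    # generalized eigenproblem Sb v = lambda Sw v; top eigenvectors give the projection
    evals, evecs = eigh(Sb, Sw + 1e-6 * np.eye(d))
    return evecs[:, ::-1][:, :n_components]

rng = np.random.default_rng(1)
X = np.vstack([rng.normal(0, 1, (100, 4)), rng.normal(2, 1, (100, 4))])
y = np.array([0] * 100 + [1] * 100)
W = nda_projection(X, y)
Z = X @ W                                        # discriminant features
```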

