Sparsity Based Feature Extraction for Kernel Minimum Squared Error

Author(s):  
Jiang Jiang ◽  
Xi Chen ◽  
Haitao Gan ◽  
Nong Sang
2014 ◽  
Vol 536-537 ◽  
pp. 450-453 ◽  

In this paper, a sparsity-based model is proposed for feature selection in kernel minimum squared error (KMSE). By imposing a sparsity shrinkage term, we formulate the selection of a subset of training examples as an optimization problem. With only a small portion of the training examples retained, the computational burden of feature extraction is greatly reduced. Experimental results on several benchmark datasets demonstrate the effectiveness and efficiency of our method.
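The idea of a sparsity shrinkage term selecting a subset of training examples can be illustrated with an L1-penalized kernel least-squares problem, solved here by simple iterative soft-thresholding (ISTA). This is a hedged sketch of the general technique, not the authors' exact formulation; the kernel, penalty weight, and toy data are our own assumptions.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    """Gaussian (RBF) kernel matrix between rows of X and Y."""
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def sparse_kmse(K, y, lam=0.5, n_iter=500):
    """ISTA for min_a 0.5*||K a - y||^2 + lam*||a||_1.
    Nonzero entries of a mark the retained training examples."""
    a = np.zeros(K.shape[0])
    lr = 1.0 / np.linalg.norm(K, 2) ** 2  # step from the Lipschitz constant
    for _ in range(n_iter):
        grad = K.T @ (K @ a - y)          # gradient of the squared-error term
        a = a - lr * grad
        # soft-thresholding implements the sparsity shrinkage
        a = np.sign(a) * np.maximum(np.abs(a) - lr * lam, 0.0)
    return a

# Toy two-class data (illustrative only)
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(-1, 0.5, (20, 2)), rng.normal(1, 0.5, (20, 2))])
y = np.hstack([-np.ones(20), np.ones(20)])

K = rbf_kernel(X, X)
alpha = sparse_kmse(K, y)
support = np.flatnonzero(np.abs(alpha) > 1e-6)
print(len(support), "of", len(y), "training examples retained")
```

Only the examples indexed by `support` then need to enter the kernel expansion at test time, which is where the computational saving in feature extraction comes from.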


Author(s):  
David Zhang ◽  
Fengxi Song ◽  
Yong Xu ◽  
Zhizhen Liang

As mentioned in Chapter II, there are two kinds of LDA approaches: classification-oriented LDA and feature extraction-oriented LDA. Most chapters in this section of the book focus on the feature extraction aspect of LDA for small sample size (SSS) problems. In this chapter, by contrast, we present our studies on the pattern classification aspect of LDA for SSS problems, introducing three novel classification-oriented linear discriminant criteria. The first is large margin linear projection (LMLP), which makes full use of the characteristics of SSS problems. The second is the minimum norm minimum squared-error criterion, a modification of the minimum squared-error discriminant criterion. The third is the maximum scatter difference criterion, a modification of the Fisher discriminant criterion.
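A minimal sketch of the minimum-norm idea in the SSS setting: when there are fewer samples than features, the squared-error system is underdetermined and has infinitely many zero-error solutions; the Moore-Penrose pseudoinverse picks the one with smallest norm. The data and dimensions below are our own illustrative assumptions, not the book's.

```python
import numpy as np

# Small-sample-size setting: far more features (d) than samples (n)
rng = np.random.default_rng(1)
n, d = 10, 50
X = rng.normal(size=(n, d))                # n training samples as rows
y = np.hstack([np.ones(5), -np.ones(5)])   # two-class target codes

# Among all w satisfying X w = y (an underdetermined system),
# the pseudoinverse returns the minimum-norm solution — the
# discriminant direction a minimum norm minimum squared-error
# criterion would favor.
w = np.linalg.pinv(X) @ y

print("zero training squared error:", np.allclose(X @ w, y))
print("||w|| =", np.linalg.norm(w))
```

Choosing the minimum-norm solution acts as an implicit regularizer: of all projections that separate the training set perfectly, it prefers the least "extreme" one.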


2016 ◽  
Vol 171 ◽  
pp. 149-155 ◽  
Author(s):  
Haitao Gan ◽  
Rui Huang ◽  
Zhizeng Luo ◽  
Yingle Fan ◽  
Farong Gao
