A Hybrid Dimension Reduction Based Linear Discriminant Analysis for Classification of High-Dimensional Data

Author(s):  
Ezgi Zorarpaci

2019, Vol. 2019, pp. 1-10
Author(s):
Zhibo Guo,
Ying Zhang

High-dimensional data are difficult to process and analyze directly, so an effective dimensionality reduction algorithm is needed to learn a potential subspace that preserves the intrinsic structure of the data while discarding less useful information. Principal component analysis (PCA) and linear discriminant analysis (LDA) are two popular dimensionality reduction methods for preprocessing high-dimensional sensor data. LDA comprises two basic variants: classic linear discriminant analysis and FS (Foley-Sammon) linear discriminant analysis. In this paper, a new method, called similar distribution discriminant analysis (SDDA), is proposed based on the similarity of the samples' distributions, and a procedure for solving the optimal discriminant vectors is given. These discriminant vectors are orthogonal and nearly statistically uncorrelated. SDDA overcomes the disadvantages of PCA and LDA, and the features it extracts are more effective; its recognition performance substantially exceeds that of PCA and LDA. Experiments on the Yale face database, the FERET face database, and the UCI Multiple Features dataset demonstrate that the proposed method is effective and obtains better performance than the compared dimensionality reduction methods.
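The abstract contrasts SDDA with the PCA and LDA baselines. The SDDA algorithm itself is not reproduced here, but the two baselines it is compared against can be sketched with NumPy only: classic PCA projects onto the top eigenvectors of the feature covariance, and two-class Fisher LDA finds the direction maximizing between-class separation relative to within-class scatter. This is a minimal illustrative sketch of those standard methods, not the paper's implementation; the synthetic data and function names are assumptions.

```python
import numpy as np

def pca(X, k):
    """Project X (n_samples x n_features) onto its top-k principal components."""
    Xc = X - X.mean(axis=0)                     # center the data
    cov = np.cov(Xc, rowvar=False)              # feature covariance matrix
    vals, vecs = np.linalg.eigh(cov)            # eigenvalues in ascending order
    W = vecs[:, ::-1][:, :k]                    # top-k eigenvectors as columns
    return Xc @ W, W

def fisher_lda_direction(X, y):
    """Fisher discriminant direction for a binary-labeled dataset."""
    X0, X1 = X[y == 0], X[y == 1]
    m0, m1 = X0.mean(axis=0), X1.mean(axis=0)
    # within-class scatter matrix
    Sw = (X0 - m0).T @ (X0 - m0) + (X1 - m1).T @ (X1 - m1)
    w = np.linalg.solve(Sw, m1 - m0)            # maximizes between/within ratio
    return w / np.linalg.norm(w)

# Synthetic data for illustration only (the paper uses Yale, FERET, UCI data).
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 5))
y = (rng.random(100) > 0.5).astype(int)

Z, W = pca(X, 2)                # 2-D PCA embedding
w = fisher_lda_direction(X, y)  # 1-D LDA direction
print(Z.shape, w.shape)         # (100, 2) (5,)
```

Note that the PCA projection matrix `W` has orthonormal columns, matching the abstract's emphasis on orthogonal discriminant vectors as a desirable property.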


