Cutting L1-Norm Distance Discriminant Analysis with Sample Reconstruction
Dimensionality reduction plays an important role in pattern recognition and computer vision. Recursive discriminative subspace learning with an L1-norm distance constraint (RDSL) was proposed to robustly extract features from contaminated data; it relies on the L1-norm and slack variables to achieve this goal. However, its performance may decline when the data contain many outliers, and the method ignores the global structure of the data. In this paper, we propose cutting L1-norm distance discriminant analysis with sample reconstruction (C-L1-DDA) to address these two problems. We apply the cutting L1-norm to measure within-class and between-class distances, so that outliers are strongly suppressed. Moreover, we use the cutting squared L2-norm to measure reconstruction errors; in this way, outliers are constrained and the global structure of the data is approximately preserved. Finally, we give an alternating iterative algorithm to extract feature vectors. Experimental results on two publicly available real-world databases verify the feasibility and effectiveness of the proposed method.
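The abstract's key idea is that a "cutting" (truncated) norm caps the contribution of any single sample to a scatter measure, so gross outliers cannot dominate it. The sketch below illustrates this effect on the L1 distance only; the function name, the fixed class center, and the truncation threshold `eps` are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def cutting_l1_distance(X, mu, eps=5.0):
    """Truncated ("cutting") L1 distance from each row of X to a center mu.

    Per-sample L1 distances larger than eps are clipped to eps, so a
    single gross outlier contributes at most eps to the total scatter.
    The threshold eps is an assumed hyperparameter for illustration.
    """
    d = np.abs(X - mu).sum(axis=1)   # ordinary per-sample L1 distances
    return np.minimum(d, eps)        # cap ("cut") large distances

# Two inliers near the origin plus one gross outlier
X = np.array([[0.1, 0.0], [0.0, -0.2], [100.0, 100.0]])
mu = np.zeros(2)                     # assumed class center for illustration
d = cutting_l1_distance(X, mu, eps=5.0)
# Inlier distances pass through unchanged; the outlier's distance is capped at eps.
```

The same capping idea applied to squared L2 reconstruction errors would likewise bound each sample's influence while leaving well-reconstructed (inlier) samples untouched.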