TWO-STEP SINGLE PARAMETER REGULARIZATION FISHER DISCRIMINANT METHOD FOR FACE RECOGNITION

Author(s):  
WEN-SHENG CHEN ◽  
PONG CHI YUEN ◽  
JIAN HUANG ◽  
BIN FANG

In face recognition tasks, Fisher discriminant analysis (FDA) is one of the promising methods for dimensionality reduction and discriminant feature extraction. The objective of FDA is to find an optimal projection matrix that maximizes the between-class distance while simultaneously minimizing the within-class distance. The main limitation of traditional FDA is the so-called small sample size (S3) problem, which makes the within-class scatter matrix singular, so that traditional FDA cannot be applied directly for pattern classification. To overcome the S3 problem, this paper proposes a novel two-step single parameter regularization Fisher discriminant (2SRFD) algorithm for face recognition. The first, semi-regularized step is based on a rank lifting theorem and adjusts both the projection directions and their corresponding weights. Our previous three-to-one parameter regularization technique is exploited in the second stage, which changes only the weights of the projection directions. It is shown that the final regularized within-class scatter matrix approaches the original within-class scatter matrix as the single parameter tends to zero. The method also has good computational complexity. The proposed method has been tested and evaluated on three publicly available databases, namely the ORL, CMU PIE and FERET face databases. Compared with existing state-of-the-art FDA-based methods for solving the S3 problem, the proposed 2SRFD approach gives the best performance.
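
The abstract gives no implementation detail beyond the Fisher criterion and the limit property of the regularized within-class scatter. As a rough, hedged illustration (not the authors' 2SRFD algorithm), the NumPy sketch below builds the between- and within-class scatter matrices and stands in for the rank-lifting/two-step weighting with a generic Sw + t·I regularizer that shares the stated limit behaviour as t tends to zero.

    import numpy as np

    def scatter_matrices(X, y):
        """Between-class (Sb) and within-class (Sw) scatter of row-sample matrix X."""
        classes = np.unique(y)
        d = X.shape[1]
        mu = X.mean(axis=0)
        Sb = np.zeros((d, d))
        Sw = np.zeros((d, d))
        for c in classes:
            Xc = X[y == c]
            mc = Xc.mean(axis=0)
            Sb += len(Xc) * np.outer(mc - mu, mc - mu)
            Sw += (Xc - mc).T @ (Xc - mc)
        return Sb, Sw

    def fisher_projection(X, y, t=1e-3, n_components=None):
        """Fisher directions from eig(Sw_reg^{-1} Sb), where Sw_reg = Sw + t*I
        stands in for the paper's two-step regularizer; Sw_reg -> Sw as t -> 0."""
        Sb, Sw = scatter_matrices(X, y)
        Sw_reg = Sw + t * np.eye(Sw.shape[0])
        evals, evecs = np.linalg.eig(np.linalg.solve(Sw_reg, Sb))
        order = np.argsort(-evals.real)
        k = n_components or (len(np.unique(y)) - 1)
        return evecs[:, order[:k]].real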

2011 ◽  
Vol 317-319 ◽  
pp. 150-153
Author(s):  
Wan Li Feng ◽  
Shang Bing Gao

In this paper, a reformative scatter difference discriminant criterion (SDDC) combined with fuzzy set theory is studied. Using the difference between the between-class and within-class scatters as the discriminant criterion effectively avoids the singularity of the within-class scatter matrix caused by the small sample size problem in classical Fisher discriminant analysis. However, the conventional SDDC assumes that every sample is equally relevant to its class. A fuzzy maximum scatter difference analysis (FMSDA) algorithm is therefore proposed, in which the fuzzy k-nearest neighbor (FKNN) rule is used to obtain the distribution information of the original samples; this information is then used to redefine the scatter matrices, which differ from those of the conventional SDDC and are effective for extracting discriminative features from overlapping (outlier) samples. Experiments conducted on the FERET face database demonstrate the effectiveness of the proposed method.
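
As a hedged sketch of the two ingredients named in the abstract, FKNN-based memberships and a scatter-difference criterion, the following NumPy code uses a Keller-style membership rule and a fuzzy-weighted Sb - c*Sw objective; the exact weighting and the constant c used in FMSDA are defined in the paper and may differ.

    import numpy as np

    def fknn_memberships(X, y, k=5):
        """Keller-style fuzzy k-NN memberships: the share of each class among a
        sample's k nearest neighbours, with extra weight on the sample's own class."""
        classes = np.unique(y)
        D = np.linalg.norm(X[:, None, :] - X[None, :, :], axis=2)
        neigh = np.argsort(D, axis=1)[:, 1:k + 1]      # skip the sample itself
        U = np.zeros((len(X), len(classes)))
        for i in range(len(X)):
            counts = np.array([(y[neigh[i]] == c).sum() for c in classes]) / k
            U[i] = 0.49 * counts
            U[i, np.searchsorted(classes, y[i])] += 0.51
        return U

    def fuzzy_scatter_difference_directions(X, y, U, c=1.0, n_components=2):
        """Directions maximizing a fuzzy scatter difference Sb_f - c*Sw_f; no
        inverse of Sw is needed, so the SSS singularity does not arise."""
        classes = np.unique(y)
        d = X.shape[1]
        mu = X.mean(axis=0)
        Sb = np.zeros((d, d))
        Sw = np.zeros((d, d))
        for j in range(len(classes)):
            w = U[:, j]
            mc = (w[:, None] * X).sum(axis=0) / w.sum()
            Xc = X - mc
            Sb += w.sum() * np.outer(mc - mu, mc - mu)
            Sw += (w[:, None] * Xc).T @ Xc
        evals, evecs = np.linalg.eigh(Sb - c * Sw)
        return evecs[:, np.argsort(-evals)[:n_components]]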


Author(s):  
WEN-SHENG CHEN ◽  
PONG C. YUEN ◽  
JIAN HUANG

This paper presents a new regularization technique to deal with the small sample size (S3) problem in linear discriminant analysis (LDA) based face recognition. Regularization of the within-class scatter matrix Sw has been shown to be a good direction for solving the S3 problem, because the solution is found in the full space instead of a subspace. The main limitation of regularization is the very high computation required to determine the optimal parameters. In view of this limitation, this paper re-defines the three-parameter regularization of the within-class scatter matrix in a form suitable for parameter reduction. Based on this new definition, we derive an explicit single-parameter (t) expression for determining the three parameters and develop a one-parameter regularization of the within-class scatter matrix. A simple and efficient method is developed to determine the value of t. It is also proven that the new regularized within-class scatter matrix approaches the original within-class scatter matrix Sw as the single parameter tends to zero. A novel one-parameter regularization linear discriminant analysis (1PRLDA) algorithm is then developed. The proposed 1PRLDA method for face recognition has been evaluated on two publicly available databases, namely the ORL and FERET databases. The average recognition accuracies over 50 runs are 96.65% for ORL and 94.00% for FERET. Compared with existing LDA-based methods for solving the S3 problem, the proposed 1PRLDA method gives the best performance.
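
The abstract states the key property (the regularized matrix returns to Sw as t tends to zero) but not the explicit formula mapping t to the three parameters. The sketch below is therefore a placeholder spectrum-level regularizer with the same limit behaviour, not the paper's 1PRLDA formula.

    import numpy as np

    def regularize_within_scatter(Sw, t):
        """Placeholder single-parameter regularizer: each eigenvalue of Sw is
        lifted by t times the largest eigenvalue, so zero eigenvalues become
        invertible while the result converges to Sw as t -> 0."""
        evals, evecs = np.linalg.eigh(Sw)
        evals = np.clip(evals, 0.0, None)        # remove small numerical negatives
        lifted = evals + t * evals.max()
        return (evecs * lifted) @ evecs.T

The regularized matrix can then replace Sw in the usual generalized Fisher eigenproblem (as in the sketch after the first abstract).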


Author(s):  
WEN-SHENG CHEN ◽  
JIAN HUANG ◽  
JIN ZOU ◽  
BIN FANG

Linear Discriminant Analysis (LDA) is a popular statistical method for both feature extraction and dimensionality reduction in face recognition. The major drawback of LDA is the so-called small sample size (S3) problem, which occurs whenever the total number of training samples is smaller than the dimension of the feature space. In this situation the within-class scatter matrix Sw becomes singular and LDA cannot be applied directly. To overcome the S3 problem, this paper proposes a novel wavelet-face based subspace LDA algorithm. Wavelet-face feature extraction and dimensionality reduction are based on a two-level D4-filter wavelet transform and on discarding the null space of the total scatter matrix St. It is shown that the resulting projection matrix satisfies the uncorrelated constraint conditions, and hence is optimal in the sense of statistical uncorrelation. The proposed method has been evaluated on two publicly available databases, namely the ORL and FERET databases. Compared with existing LDA-based methods for solving the S3 problem, our method gives the best performance.
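
As a hedged sketch of the two steps named here, a two-level wavelet-face decomposition and removal of the null space of the total scatter matrix St, the code below uses PyWavelets for the transform ('db2' is the four-tap Daubechies filter usually written D4) and an SVD of the centred data for the subspace step; it is an illustration under those assumptions, not the authors' exact pipeline.

    import numpy as np
    import pywt

    def wavelet_face(img, levels=2, wavelet="db2"):
        """Two-level wavelet decomposition of a face image; only the coarse
        approximation subband is kept as the reduced 'wavelet-face' feature."""
        coeffs = pywt.wavedec2(img, wavelet=wavelet, level=levels)
        return coeffs[0].ravel()

    def discard_null_space_of_St(X):
        """Project row-sample matrix X onto the range of the total scatter St
        (computed via SVD of the centred data), discarding St's null space
        before LDA is applied."""
        Xc = X - X.mean(axis=0)
        U, s, Vt = np.linalg.svd(Xc, full_matrices=False)
        keep = s > 1e-10 * s.max()
        P = Vt[keep].T                           # orthonormal basis of range(St)
        return X @ P, P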


Author(s):  
HAITAO ZHAO ◽  
PONG C. YUEN ◽  
JINGYU YANG

Fisher Linear Discriminant Analysis (LDA) has been successfully used as a data discrimination technique for face recognition. This paper develops a novel subspace approach to determining the optimal projection that effectively solves the small sample size problem and eliminates the possibility of losing discriminative information. Through theoretical derivation, we compare our method with typical PCA-based LDA methods and also show the relationship between our new method and the perturbation-based method. The feasibility of the new algorithm is demonstrated by comprehensive evaluation and comparison experiments with existing LDA-based methods.


2020 ◽  
Vol 11 (2) ◽  
pp. 118-133
Author(s):  
Souheila Benkhaira ◽  
Abdesslem Layeb

Regularized LDA (R-LDA) is one of the most successful holistic approaches introduced to overcome the small sample size (SSS) problem of the LDA method, which is often encountered in face recognition (FR) tasks. R-LDA is based on reducing the high variance of the principal components of the within-class scatter matrix in order to optimize the regularized Fisher criterion. In this article, the authors assume that some of these components carry no significant information and can be discarded. To this end, the authors propose CS-RLDA, which uses a cuckoo search (CS) algorithm to select the optimal eigenvectors of the within-class scatter matrix. However, the CS algorithm has a slow convergence speed. To deal with this problem, and to create more diversity and a better trade-off between exploitation and exploration around the best solutions, the authors modify the basic cuckoo algorithm with a mutation operator. Experimental results on the ORL and UMIST databases indicate that the proposed method enhances FR performance.
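
A full CS-RLDA implementation is not given in the abstract; the sketch below is a simplified binary cuckoo search with an added mutation-style perturbation, where each nest is a 0/1 mask over candidate eigenvectors of the within-class scatter matrix and `fitness` is any user-supplied score (for example, validation recognition accuracy under the regularized Fisher criterion). The Lévy-flight step is approximated by random bit flips, which is a deliberate simplification.

    import numpy as np

    def cuckoo_select_eigenvectors(fitness, n_dims, n_nests=15, pa=0.25,
                                   p_flip=0.1, iters=50, seed=0):
        """Binary cuckoo search with mutation: returns the best 0/1 mask over
        n_dims candidate eigenvectors according to `fitness(mask) -> float`."""
        rng = np.random.default_rng(seed)
        nests = rng.integers(0, 2, size=(n_nests, n_dims))
        scores = np.array([fitness(m) for m in nests], dtype=float)
        for _ in range(iters):
            for i in range(n_nests):
                cand = nests[i].copy()
                cand[rng.random(n_dims) < p_flip] ^= 1   # mutation / "Levy" step
                s = fitness(cand)
                j = rng.integers(n_nests)                # random nest to challenge
                if s > scores[j]:
                    nests[j], scores[j] = cand, s
            # abandon a fraction pa of the worst nests and rebuild them randomly
            worst = np.argsort(scores)[: max(1, int(pa * n_nests))]
            nests[worst] = rng.integers(0, 2, size=(len(worst), n_dims))
            scores[worst] = [fitness(m) for m in nests[worst]]
        best = int(np.argmax(scores))
        return nests[best].astype(bool), float(scores[best])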


2014 ◽  
Vol 889-890 ◽  
pp. 1065-1068
Author(s):  
Yu’e Lin ◽  
Xing Zhu Liang ◽  
Hua Ping Zhou

In recent years, feature extraction algorithms based on manifold learning, which project the original data into a lower-dimensional feature space while preserving the local neighborhood structure, have drawn much attention. Among them, Marginal Fisher Analysis (MFA) achieves high performance for face recognition. However, MFA suffers from the small sample size problem and is still a linear technique. This paper develops a new nonlinear feature extraction algorithm, called Kernel Null Space Marginal Fisher Analysis (KNSMFA). KNSMFA is based on a new optimization criterion under which all discriminant vectors are calculated in the null space of the within-class scatter matrix. KNSMFA not only exploits nonlinear features but also overcomes the small sample size problem. Experimental results on the ORL database indicate that the proposed method achieves a higher recognition rate than MFA and several existing kernel feature extraction algorithms.
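
The kernel and marginal-graph machinery of KNSMFA is beyond an abstract-level sketch, but the central step, computing discriminant vectors inside the null space of the within-class scatter, can be illustrated in the linear case as follows; the actual method replaces Sb with MFA's intrinsic/penalty graph scatters and works in a kernel-induced feature space.

    import numpy as np

    def null_space_discriminants(X, y, n_components=None, tol=1e-10):
        """Linear illustration of the null-space step: restrict to null(Sw) and
        maximize between-class scatter there (KNSMFA uses MFA graph scatters
        in a kernel feature space instead)."""
        classes = np.unique(y)
        d = X.shape[1]
        mu = X.mean(axis=0)
        Sb = np.zeros((d, d))
        Sw = np.zeros((d, d))
        for c in classes:
            Xc = X[y == c]
            mc = Xc.mean(axis=0)
            Sb += len(Xc) * np.outer(mc - mu, mc - mu)
            Sw += (Xc - mc).T @ (Xc - mc)
        w_vals, w_vecs = np.linalg.eigh(Sw)
        N = w_vecs[:, w_vals < tol * max(w_vals.max(), 1.0)]  # basis of null(Sw)
        b_vals, b_vecs = np.linalg.eigh(N.T @ Sb @ N)
        k = n_components or (len(classes) - 1)
        return N @ b_vecs[:, np.argsort(-b_vals)[:k]]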


2021 ◽  
Vol 25 (5) ◽  
pp. 1273-1290
Author(s):  
Shuangxi Wang ◽  
Hongwei Ge ◽  
Jinlong Yang ◽  
Shuzhi Su

It is an open question how to learn an over-complete dictionary from a limited number of face samples, and the inherent attributes of the samples are often underutilized. In addition, recognition performance may be adversely affected by noise and outliers, and a linear classifier based on strict binary labels is not well suited to face recognition. To solve these problems, we propose a virtual-sample based robust block-diagonal dictionary learning method for face recognition. In the proposed model, the original samples and virtual samples are combined to alleviate the small sample size problem, and both a structure constraint and a low-rank constraint are exploited to preserve the intrinsic attributes of the samples. In addition, the fidelity term effectively reduces the negative effects of noise and outliers, and ε-dragging is utilized to improve the performance of the linear classifier. Finally, extensive experiments are conducted in comparison with many state-of-the-art methods on benchmark face datasets, and the results demonstrate the efficacy of the proposed method.
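
Two of the ingredients named here can be sketched compactly: virtual-sample augmentation and the ε-dragging relaxation of binary labels. The functions below are hedged illustrations; mirroring is only one common way to create virtual face samples, and in the actual model the dragging margins are learned jointly with the dictionary and classifier rather than fixed in advance.

    import numpy as np

    def mirror_virtual_samples(images):
        """Create virtual samples by horizontal mirroring; `images` is assumed
        to be an array of shape (n_samples, height, width)."""
        return np.concatenate([images, images[:, :, ::-1]], axis=0)

    def epsilon_dragging_targets(Y, M):
        """epsilon-dragging: relax strict 0/1 targets Y by a nonnegative margin
        matrix M of the same shape, pushing same-class outputs above 1 and
        other-class outputs below 0 to enlarge the classification margin."""
        B = np.where(Y > 0, 1.0, -1.0)           # dragging directions
        return Y + B * M                         # relaxed regression targets

Here Y is the n-by-C one-hot label matrix and M is a nonnegative matrix of the same shape.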

