Improved principal component analysis and linear regression classification for face recognition

2018 ◽ Vol 145 ◽ pp. 175-182
Author(s): Yani Zhu, Chaoyang Zhu, Xiaoxin Li
2017 ◽ Vol 2017 ◽ pp. 1-9
Author(s): Tai-Xiang Jiang, Ting-Zhu Huang, Xi-Le Zhao, Tian-Hui Ma

We propose a patch-based principal component analysis (PCA) method for face recognition. Many PCA-based face recognition methods exploit the correlation between pixels, columns, or rows, but local spatial information is left unused or underused. We argue that patches are more meaningful basic units for face recognition than pixels, columns, or rows, since faces are discerned by patches containing features such as eyes and noses. To compute the correlation between patches, each face image is divided into patches, each patch is converted to a column vector, and these vectors are assembled into a new “image matrix.” By substituting this new “image matrix” for the image in the two-dimensional PCA (2DPCA) framework, we compute the correlation of the patches directly through the total scatter. Maximizing the total scatter of the projected samples yields the projection matrix used for feature extraction, and classification is then performed with a nearest neighbor classifier. Extensive experiments on the ORL and FERET face databases illustrate the performance of patch-based PCA: it improves accuracy over one-dimensional PCA, two-dimensional PCA, and two-directional two-dimensional PCA.
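The pipeline described above (non-overlapping patch extraction, total-scatter maximization in the 2DPCA framework, and nearest neighbor matching) can be sketched roughly as follows. This is a minimal NumPy illustration under stated assumptions, not the authors' code: the function names, the non-overlapping patch layout, and the scatter normalization are all choices made here for clarity.

```python
import numpy as np

def to_patch_matrix(img, ph, pw):
    """Divide an image into non-overlapping ph x pw patches and stack
    each flattened patch as a column of a new 'image matrix'."""
    H, W = img.shape
    cols = []
    for r in range(0, H - ph + 1, ph):
        for c in range(0, W - pw + 1, pw):
            cols.append(img[r:r + ph, c:c + pw].reshape(-1))
    return np.stack(cols, axis=1)          # shape: (ph*pw, n_patches)

def patch_pca_fit(images, ph, pw, d):
    """Learn a d-column projection matrix by maximizing the total
    scatter of the patch matrices, as in the 2DPCA framework."""
    A = np.stack([to_patch_matrix(im, ph, pw) for im in images])
    centered = A - A.mean(axis=0)
    # total scatter: average of (A_i - mean)^T (A_i - mean)
    G = np.einsum('ikj,ikl->jl', centered, centered) / len(images)
    vals, vecs = np.linalg.eigh(G)         # ascending eigenvalues
    return vecs[:, ::-1][:, :d]            # top-d eigenvectors

def project(img, W, ph, pw):
    """Feature extraction: project the patch matrix onto W."""
    return to_patch_matrix(img, ph, pw) @ W

def nearest_neighbor(query_feat, train_feats, labels):
    """Classify by Frobenius distance to the training features."""
    dists = [np.linalg.norm(query_feat - f) for f in train_feats]
    return labels[int(np.argmin(dists))]
```

A query image is classified by computing its feature matrix with `project` and returning the label of the closest training feature, mirroring the nearest neighbor step of the method.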


Author(s):  
Peter Hall

This article discusses the methodology and theory of principal component analysis (PCA) for functional data. It first provides an overview of PCA in the context of finite-dimensional data and infinite-dimensional data, focusing on functional linear regression, before considering the applications of PCA for functional data analysis, principally in cases of dimension reduction. It then describes adaptive methods for prediction and weighted least squares in functional linear regression. It also examines the role of principal components in the assessment of density for functional data, showing how principal component functions are linked to the amount of probability mass contained in a small ball around a given, fixed function, and how this property can be used to define a simple, easily estimable density surrogate. The article concludes by explaining the use of PCA for estimating log-density.
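The dimension-reduction role of PCA for functional data can be illustrated with curves observed on a common grid: the principal component functions are the eigenfunctions of the empirical covariance, and each curve is summarized by its scores on the leading components (a truncated Karhunen-Loève expansion). The sketch below is an illustrative discretized computation, not a method from the article; the function name and grid-based setup are assumptions.

```python
import numpy as np

def functional_pca(curves, n_components):
    """Empirical functional PCA for curves sampled on a common grid.
    Eigenvectors of the sample covariance play the role of principal
    component functions; scores are projections onto them."""
    X = np.asarray(curves)              # shape: (n_curves, n_grid)
    mean = X.mean(axis=0)
    Xc = X - mean
    cov = Xc.T @ Xc / len(X)            # discretized covariance operator
    vals, vecs = np.linalg.eigh(cov)    # ascending eigenvalues
    order = np.argsort(vals)[::-1][:n_components]
    phis = vecs[:, order]               # principal component functions
    scores = Xc @ phis                  # low-dimensional representation
    return mean, phis, scores
```

Each curve is then approximated by `mean + scores @ phis.T`; when the curves truly lie in a low-dimensional subspace, this truncated expansion reconstructs them exactly, which is the sense in which PCA reduces the infinite-dimensional problem to a finite one.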

