Enhancing Optical Correlation Decision Performance for Face Recognition by Using a Nonparametric Kernel Smoothing Classification

Sensors ◽  
2019 ◽  
Vol 19 (23) ◽  
pp. 5092
Author(s):  
Matthieu Saumard ◽  
Marwa Elbouz ◽  
Michaël Aron ◽  
Ayman Alfalou ◽  
Christian Brosseau

Optical correlation has a rich history in image recognition applications from a database. In practice, it is simple to implement optically using two lenses or numerically using two Fourier transforms. Although correlation is a reliable method for image recognition, decision making can be compromised depending on the location, height, and shape of the correlation peak within the correlation plane. In addition, correlation is very sensitive to image rotation and scale. To overcome these issues, we propose a method of nonparametric modelling of the correlation plane. Our method is based on a kernel estimate of the regression function used to classify the individual images in the correlation plane. The basic idea is to improve the decision by taking into account the shape and distribution of the energy in the correlation plane. The method relies on computing the Hausdorff distance between the target correlation plane (of the image to recognize) and the correlation planes obtained from the database images. Our method is tested on a face recognition application using the Pointing Head Pose Image Database (PHPID). Overall, the results demonstrate good performance compared to competing methods in terms of detection rate and a very low false alarm rate.
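A minimal sketch of the two ingredients named above, not the authors' exact pipeline: the correlation plane is computed numerically with two Fourier transforms, and two planes are compared with a Hausdorff distance. Reducing each plane to the set of coordinates where the normalized energy exceeds a threshold (here 0.5) is an illustrative assumption.

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

def correlation_plane(target, reference):
    """Correlate two same-size grayscale images via two FFTs (matched filter)."""
    spectrum = np.fft.fft2(target) * np.conj(np.fft.fft2(reference))
    plane = np.abs(np.fft.ifft2(spectrum))
    return plane / plane.max()          # normalize the peak energy to 1

def peak_support(plane, level=0.5):
    """Coordinates where the normalized correlation energy exceeds `level`."""
    return np.argwhere(plane > level).astype(float)

def hausdorff_score(target_plane, db_plane):
    """Symmetric Hausdorff distance between the two energy supports."""
    u, v = peak_support(target_plane), peak_support(db_plane)
    return max(directed_hausdorff(u, v)[0], directed_hausdorff(v, u)[0])

# Classification idea: assign the target image to the database entry whose
# correlation plane minimizes the Hausdorff score.
```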

Author(s):  
Ryo Okui ◽  
Takahide Yanagi

This paper proposes nonparametric kernel-smoothing estimation for panel data to examine the degree of heterogeneity across cross-sectional units. We first estimate the sample mean, autocovariances, and autocorrelations for each unit and then apply kernel smoothing to compute their density functions. The dependence of the kernel estimator on the bandwidth makes an asymptotic bias of very high order affect the required condition on the relative magnitudes of the cross-sectional sample size (N) and the time-series length (T). In particular, it makes the condition on N and T stronger and more complicated than those typically observed in the long-panel literature without kernel smoothing. We also consider a split-panel jackknife method to correct bias and to construct confidence intervals. An empirical application illustrates our procedure.
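A rough sketch of the estimation idea under simple assumptions (Gaussian kernel with SciPy's default rule-of-thumb bandwidth, lag-1 autocorrelation only): compute each unit's statistics from an N x T panel and kernel-smooth their cross-sectional distributions. The bias correction and confidence intervals discussed above are omitted.

```python
import numpy as np
from scipy.stats import gaussian_kde

def unit_statistics(panel):
    """panel: N x T array. Returns per-unit means and lag-1 autocorrelations."""
    means = panel.mean(axis=1)
    centered = panel - means[:, None]
    gamma0 = (centered ** 2).mean(axis=1)                       # variance
    gamma1 = (centered[:, 1:] * centered[:, :-1]).mean(axis=1)  # lag-1 autocovariance
    return means, gamma1 / gamma0

def heterogeneity_densities(panel, grid):
    """Kernel density estimates of the cross-sectional distributions of the
    unit means and lag-1 autocorrelations, evaluated on `grid`."""
    means, rho1 = unit_statistics(panel)
    return gaussian_kde(means)(grid), gaussian_kde(rho1)(grid)
```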


2017 ◽  
Vol 7 (1.1) ◽  
pp. 213
Author(s):  
Sheela Rani ◽  
Vuyyuru Tejaswi ◽  
Bonthu Rohitha ◽  
Bhimavarapu Akhil

Face recognition has turned out to be one of the most important and interesting areas of research. A face recognition framework is a computer application capable of identifying or verifying a human face in a digital image or in video frames. One approach is to match selected facial features against the images in a database. It is commonly used in security systems and can be combined with other biometrics, for example fingerprint or iris recognition. An image can be viewed as a collection of edges: the curved portions where the brightness of the image changes sharply are known as edges. A similar idea is used in face detection, where the intensity of facial colours is treated as a consistent value. Face recognition involves comparing an input image with a database of stored faces in order to identify the individual in the given input image. The entire procedure comprises three phases, face detection, feature extraction, and recognition, and different strategies are required according to the specified requirements.
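An illustrative version of the three-phase pipeline (detection, feature extraction, recognition), sketched with OpenCV's Haar cascade detector and a simple nearest-neighbour match; the feature choice (a resized intensity vector) is an assumption made for brevity, not a method described in the abstract above.

```python
import cv2
import numpy as np

detector = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def extract_face_feature(image_bgr, size=(64, 64)):
    """Phase 1 and 2: detect the largest face, return a normalized intensity vector."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    faces = detector.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    if len(faces) == 0:
        return None
    x, y, w, h = max(faces, key=lambda f: f[2] * f[3])   # keep the largest detection
    patch = cv2.resize(gray[y:y + h, x:x + w], size).astype(np.float32)
    return patch.ravel() / 255.0

def recognize(query_bgr, database):
    """Phase 3: database is a list of (label, feature); return the closest label."""
    q = extract_face_feature(query_bgr)
    if q is None:
        return None
    return min(database, key=lambda item: np.linalg.norm(item[1] - q))[0]
```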


VLSI Design ◽  
2011 ◽  
Vol 2011 ◽  
pp. 1-17
Author(s):  
Soumya Pandit ◽  
Chittaranjan Mandal ◽  
Amit Patra

This paper presents a systematic methodology for the generation of high-level performance models for analog component blocks. The transistor sizes of the circuit-level implementations of the component blocks, along with a set of geometry constraints applied over them, define the sample space. A Halton sequence generator is used as the sampling algorithm. Performance data are generated by simulating each sampled circuit configuration through SPICE. A least-squares support vector machine (LS-SVM) is used as the regression function. Optimal values of the model hyperparameters are determined through a grid-search-based technique and a genetic-algorithm- (GA-) based technique. The high-level models of the individual component blocks are combined analytically to construct the high-level model of a complete system. The constructed performance models have been used to implement a GA-based high-level topology sizing process. The advantages of the present methodology are that the constructed models are accurate with respect to real circuit-level simulation results, fast to evaluate, and generalize well. In addition, the model construction time is low, and the construction process does not require detailed knowledge of circuit design. The entire methodology has been demonstrated with a set of numerical results.
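A hedged sketch of the surrogate-modelling flow described above: Halton samples over a transistor-size space, a placeholder `simulate` function standing in for the SPICE runs, and kernel ridge regression (a close relative of LS-SVM) tuned by grid search. The size bounds, kernel choice, and grid values are all illustrative assumptions, and the GA-based hyperparameter search is not shown.

```python
import numpy as np
from scipy.stats import qmc
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import GridSearchCV

def simulate(sizes):
    """Placeholder for the SPICE evaluation of one circuit configuration."""
    return np.sum(np.sqrt(sizes), axis=1)             # dummy performance response

# Sample the design space with a Halton sequence.
dim, n_samples = 4, 200
sampler = qmc.Halton(d=dim, scramble=True)
lower, upper = np.full(dim, 0.5), np.full(dim, 10.0)  # illustrative size bounds (um)
X = qmc.scale(sampler.random(n_samples), lower, upper)
y = simulate(X)

# Fit the performance model; hyperparameters chosen by grid search.
search = GridSearchCV(
    KernelRidge(kernel="rbf"),
    {"alpha": [1e-3, 1e-2, 1e-1], "gamma": [0.01, 0.1, 1.0]},
    cv=5)
search.fit(X, y)
print("best hyperparameters:", search.best_params_)
```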

