Facial emotion recognition using Min-Max similarity classifier

2017 ◽  
Author(s):  
Olga Krestinskaya ◽  
Alex Pappachen James

Recognition of human emotions from imaging templates is useful in a wide variety of human-computer interaction and intelligent systems applications. However, automatic recognition of facial expressions with image template matching techniques suffers from the natural variability of facial features and recording conditions. Despite the progress achieved in facial emotion recognition in recent years, an effective and computationally simple feature selection and classification technique for emotion recognition remains an open problem. In this paper, we propose an efficient and straightforward facial emotion recognition algorithm that reduces the problem of inter-class pixel mismatch during classification. The proposed method applies pixel normalization to remove intensity offsets, followed by a Min-Max metric in a nearest neighbor classifier that is capable of suppressing feature outliers. The results indicate an improvement of recognition performance from 92.85% to 98.57% for the proposed Min-Max classification method when tested on the JAFFE database. The proposed emotion recognition technique outperforms existing template matching methods.
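The abstract describes the pipeline only at a high level: pixel normalization followed by a Min-Max metric inside a nearest-neighbor classifier. As a minimal, hypothetical sketch (not the authors' implementation), assuming min-max intensity scaling for the normalization step and the standard Min-Max (Ruzicka) similarity, i.e. the ratio of the sums of elementwise minima and maxima:

```python
import numpy as np

def min_max_similarity(x, y):
    # Min-Max (Ruzicka) similarity: sum of elementwise minima over
    # sum of elementwise maxima; equals 1.0 for identical vectors.
    return np.minimum(x, y).sum() / np.maximum(x, y).sum()

def normalize(img):
    # Pixel normalization to remove intensity offsets
    # (assumed here to be min-max scaling to [0, 1]).
    img = img.astype(float)
    return (img - img.min()) / (img.max() - img.min() + 1e-12)

def classify(test_img, train_imgs, train_labels):
    # Nearest-neighbor decision: return the label of the training
    # template with the highest Min-Max similarity to the test image.
    t = normalize(test_img).ravel()
    sims = [min_max_similarity(t, normalize(g).ravel()) for g in train_imgs]
    return train_labels[int(np.argmax(sims))]
```

Because the metric saturates (each pixel contributes at most its larger value to the denominator), a few strongly mismatched pixels cannot dominate the score, which is one plausible reading of the claimed outlier suppression.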



PLoS ONE ◽  
2021 ◽  
Vol 16 (12) ◽  
pp. e0260814
Author(s):  
Nazire Duran ◽  
Anthony P. Atkinson

Certain facial features provide useful information for recognition of facial expressions. In two experiments, we investigated whether foveating informative features of briefly presented expressions improves recognition accuracy and whether these features are targeted reflexively when not foveated. Angry, fearful, surprised, and sad or disgusted expressions were presented briefly at locations which would ensure foveation of specific features. Foveating the mouth of fearful, surprised and disgusted expressions improved emotion recognition compared to foveating an eye or cheek or the central brow. Foveating the brow led to equivocal results in anger recognition across the two experiments, which might be due to the different combination of emotions used. There was no consistent evidence suggesting that reflexive first saccades targeted emotion-relevant features; instead, they targeted the closest feature to initial fixation. In a third experiment, angry, fearful, surprised and disgusted expressions were presented for 5 seconds. Duration of task-related fixations in the eyes, brow, nose and mouth regions was modulated by the presented expression. Moreover, longer fixation at the mouth positively correlated with anger and disgust accuracy both when these expressions were freely viewed (Experiment 2b) and when briefly presented at the mouth (Experiment 2a). Finally, an overall preference to fixate the mouth across all expressions correlated positively with anger and disgust accuracy. These findings suggest that foveal processing of informative features is functional/contributory to emotion recognition, but they are not automatically sought out when not foveated, and that facial emotion recognition performance is related to idiosyncratic gaze behaviour.


2016 ◽  
Vol 28 (7) ◽  
pp. 1165-1179 ◽  
Author(s):  
J. Pietschnig ◽  
L. Schröder ◽  
I. Ratheiser ◽  
I. Kryspin-Exner ◽  
M. Pflüger ◽  
...  

ABSTRACT
Background: Impairments in facial emotion recognition (FER) have been detected in patients with Parkinson disease (PD). In the present study, we aimed to assess differences in emotion recognition performance between PD patient groups with and without mild cognitive impairment (MCI) and healthy controls.
Methods: Performance on a concise emotion recognition test battery (VERT-K) of three groups of 97 PD patients was compared with an age-equivalent sample of 168 healthy controls. Patients were categorized into groups according to two well-established classifications of MCI: Petersen's criteria (cognitively intact vs. amnestic MCI, aMCI, vs. non-amnestic MCI, non-aMCI) and Litvan's criteria (cognitively intact vs. single-domain MCI, sMCI, vs. multi-domain MCI, mMCI). Patients and controls underwent individual assessments using a comprehensive neuropsychological test battery examining attention, executive functioning, language, and memory (Neuropsychological Test Battery Vienna, NTBV), the Beck Depression Inventory, and a measure of premorbid IQ (WST).
Results: Cognitively intact PD patients and patients with MCI in PD (PD-MCI) showed significantly worse emotion recognition performance than healthy controls. Between-groups effect sizes were substantial, showing non-trivial effects in all comparisons (Cohen's ds from 0.31 to 1.22). Moreover, emotion recognition performance was higher in women, positively associated with premorbid IQ, and negatively associated with age. Depressive symptoms were not related to FER.
Conclusions: The present investigation yields further evidence for impaired FER in PD. Interestingly, our data suggest FER deficits even in cognitively intact PD patients, indicating FER dysfunction prior to the development of overt cognitive dysfunction. Age showed a negative association, whereas IQ showed a positive association with FER.


Author(s):  
Zhengxu Lian ◽  
Yingjie Guo ◽  
Xinyu Cao ◽  
Wendi Li

A wearable device system was proposed in the present work to assist users with facial emotion recognition disorders. The proposed system comprehensively analyzes the user's own stress status, the emotions of people around the user, and the surrounding environment. The system consists of a multi-dimensional physiological signal acquisition module, an image acquisition and transmission module, a user interface on the user's mobile terminal, and a cloud database for data storage. Moreover, a deep learning based multi-modal physiological signal stress recognition algorithm and a facial emotion recognition algorithm were designed and implemented in the system. Both algorithms were tested on publicly available data sets, and the experimental results showed that they realize the expected functions of the system well.


2020 ◽  
pp. 1-12
Author(s):  
Yan Gong ◽  
Sha Rina

Because of the limitations of the learning environment and the lack of guidance, students' autonomous learning of foreign languages after class is often ineffective. To improve the efficiency of autonomous foreign language learning, this paper builds a foreign language self-learning system based on a facial emotion recognition algorithm and a cloud computing platform. The system uses emotion recognition to identify students' states and guide them, thereby improving autonomous learning, while the cloud computing platform improves the system's operating efficiency. In addition, the paper performs facial emotion matching tailored to the needs of autonomous learning, builds the corresponding functional modules of the system according to those requirements, and designs a three-level network structure to balance detection performance against real-time performance. To verify the system, an experiment measured the accuracy of emotion recognition in the autonomous learning state and the improvement in students' English achieved through autonomous learning. The results show that the foreign language autonomous learning system constructed in this paper performs well.


Sensors ◽  
2019 ◽  
Vol 19 (8) ◽  
pp. 1897 ◽  
Author(s):  
Dhwani Mehta ◽  
Mohammad Faridul Haque Siddiqui ◽  
Ahmad Y. Javaid

Over the past two decades, automatic facial emotion recognition has received enormous attention, owing to the growing need for behavioral biometric systems and human–machine interaction, in which facial emotion recognition and the intensity of emotion play vital roles. Existing works usually do not encode the intensity of the observed facial emotion, and even fewer model the multi-class facial behavior data jointly. Our work recognizes the emotion along with the respective intensity of that emotion. The algorithms used in this comparative study are Gabor filters, Histograms of Oriented Gradients (HOG), and Local Binary Patterns (LBP) for feature extraction; for classification, we used Support Vector Machine (SVM), Random Forest (RF), and k-Nearest Neighbor (kNN) classifiers. This attains emotion recognition and intensity estimation of each recognized emotion. We compare these classifiers for facial emotion recognition, along with the intensity estimation of those emotions, across several databases. The results verify that the comparative study can serve as a basis for real-time recognition of facial emotion and emotion intensity.
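The pipeline shape this study compares (hand-crafted descriptor, then a conventional classifier) can be illustrated with a minimal NumPy sketch. This is not the authors' implementation: it uses a deliberately simplified HOG-style descriptor (one global, magnitude-weighted histogram of unsigned gradient orientations, with no cells or blocks) and a plain kNN classifier.

```python
import numpy as np

def hog_like(img, bins=8):
    # Simplified HOG-style descriptor: a single global histogram of
    # gradient orientations, weighted by gradient magnitude.
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.arctan2(gy, gx) % np.pi  # unsigned orientation in [0, pi)
    hist, _ = np.histogram(ang, bins=bins, range=(0.0, np.pi), weights=mag)
    return hist / (hist.sum() + 1e-12)

def knn_predict(x, X, y, k=1):
    # Plain kNN: majority label among the k training descriptors in X
    # closest to x by Euclidean distance.
    d = np.linalg.norm(X - x, axis=1)
    idx = np.argsort(d)[:k]
    vals, counts = np.unique(y[idx], return_counts=True)
    return vals[np.argmax(counts)]
```

In a real comparative study the descriptor would be computed per cell with block normalization (as in standard HOG), and the kNN would be swapped for SVM or RF to fill out the comparison table; the interface above stays the same.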


2018 ◽  
Vol 1 (2) ◽  
pp. 53-60
Author(s):  
Giuseppe Carrà ◽  
Giulia Brambilla ◽  
Manuela Caslini ◽  
Francesca Parma ◽  
Alessandro Chinello ◽  
...  

Abstract
Objectives: Since evidence on executive control among women with Anorexia or Bulimia Nervosa (AN/BN) is somewhat inconclusive, we aimed to explore whether performance in set-shifting in AN/BN might be influenced by Facial Emotion Recognition (FER).
Methods: We randomly recruited women with a diagnosis of AN or BN from an Eating Disorders Outpatient Clinic in Italy, as well as healthy controls (HCs). We evaluated with established tools: diagnosis (Eating Disorder Examination, EDE-17.0), executive control (Intra-Extra Dimensional Set Shift, IED), and FER (Ekman 60 Faces Test, EK-60F). Univariate distributions by diagnostic subgroup were assessed on sociodemographic and clinical variables, which were selected for subsequent multiple linear regression analyses.
Results: Women with AN performed significantly worse than HCs on IED adjusted total errors. HCs scored significantly better than AN and BN on the EK-60F fear subscale. Although IED set-shifting was associated (p = 0.008) with AN, after controlling for age, the EK-60F fear subscale, alexithymia, and depression (i.e., clinically relevant covariates identified a priori from the literature or associated with AN/BN at the univariate level), this association could not be confirmed.
Conclusions: Impaired executive control may not be a distinctive feature in women with AN, since several clinical characteristics, including fear recognition ability, are likely to play an important role. This has significant implications for relevant interventions in AN, which should also aim at improving socio-emotional processing.


2014 ◽  
Vol 20 (10) ◽  
pp. 1004-1014 ◽  
Author(s):  
Cinzia Cecchetto ◽  
Marilena Aiello ◽  
Delia D’Amico ◽  
Daniela Cutuli ◽  
Daniela Cargnelutti ◽  
...  

Abstract
Multiple sclerosis (MS) may be associated with impaired perception of facial emotions. However, emotion recognition mediated by bodily postures has never been examined in these patients. Moreover, several studies have suggested a relation between emotion recognition impairments and alexithymia, in line with the idea that the ability to recognize emotions requires individuals to be able to understand their own emotions. Although a deficit in emotion recognition has been observed in MS patients, the association between impaired emotion recognition and alexithymia has received little attention. The aim of this study was, first, to investigate MS patients' abilities to recognize emotions mediated by both facial and bodily expressions and, second, to examine whether any observed deficits in emotion recognition could be explained by the presence of alexithymia. Thirty patients with MS and 30 healthy matched controls performed experimental tasks assessing emotion discrimination and recognition of facial expressions and bodily postures. Moreover, they completed questionnaires evaluating alexithymia, depression, and fatigue. First, facial emotion recognition and, to a lesser extent, bodily emotion recognition can be impaired in MS patients. In particular, patients with higher disability showed an impairment in emotion recognition compared with patients with lower disability and controls. Second, their deficit in emotion recognition was not predicted by alexithymia; instead, the disease's characteristics and performance on some cognitive tasks significantly correlated with emotion recognition. Impaired facial emotion recognition is a cognitive signature of MS that is not dependent on alexithymia. (JINS, 2014, 19, 1–11)

