Age Bias in Emotion Detection: An Analysis of Facial Emotion Recognition Performance on Young, Middle-Aged, and Older Adults

Author(s): Eugenia Kim, De'Aira Bryant, Deepak Srikanth, Ayanna Howard
2016, Vol 28 (7), pp. 1165-1179
Author(s): J. Pietschnig, L. Schröder, I. Ratheiser, I. Kryspin-Exner, M. Pflüger, ...

ABSTRACT
Background: Impairments in facial emotion recognition (FER) have been detected in patients with Parkinson disease (PD). Here, we aimed to assess differences in emotion recognition performance between PD patient groups with and without mild cognitive impairment (MCI) and healthy controls.
Methods: Performance on a concise emotion recognition test battery (VERT-K) of three groups of 97 PD patients was compared with an age-equivalent sample of 168 healthy controls. Patients were categorized into groups according to two well-established classifications of MCI: Petersen's criteria (cognitively intact vs. amnestic MCI, aMCI, vs. non-amnestic MCI, non-aMCI) and Litvan's criteria (cognitively intact vs. single-domain MCI, sMCI, vs. multi-domain MCI, mMCI). Patients and controls underwent individual assessment using a comprehensive neuropsychological test battery examining attention, executive functioning, language, and memory (Neuropsychological Test Battery Vienna, NTBV), the Beck Depression Inventory, and a measure of premorbid IQ (WST).
Results: Cognitively intact PD patients and patients with MCI in PD (PD-MCI) showed significantly worse emotion recognition performance than healthy controls. Between-groups effect sizes were substantial, showing non-trivial effects in all comparisons (Cohen's ds from 0.31 to 1.22). Moreover, emotion recognition performance was higher in women, positively associated with premorbid IQ, and negatively associated with age. Depressive symptoms were not related to FER.
Conclusions: The present investigation yields further evidence for impaired FER in PD. Interestingly, our data suggest FER deficits even in cognitively intact PD patients, indicating FER dysfunction prior to the development of overt cognitive dysfunction. Age showed a negative association, whereas IQ showed a positive association, with FER.
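
The between-groups differences above are summarised as Cohen's d, the standardized mean difference between two groups. Purely as an illustration of how such an effect size is computed (a minimal sketch with made-up summary values, not the study's data or analysis code):

# Illustrative computation of Cohen's d (pooled-SD version); the numbers are
# placeholders, not values from the VERT-K study.
import math

def cohens_d(mean1, mean2, sd1, sd2, n1, n2):
    """Standardized mean difference using the pooled standard deviation."""
    pooled_sd = math.sqrt(((n1 - 1) * sd1**2 + (n2 - 1) * sd2**2) / (n1 + n2 - 2))
    return (mean1 - mean2) / pooled_sd

# Hypothetical comparison: healthy controls vs. a PD-MCI subgroup.
d = cohens_d(mean1=28.4, mean2=25.1, sd1=3.0, sd2=3.4, n1=168, n2=30)
print(f"Cohen's d = {d:.2f}")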


2019, Vol 76, pp. e12
Author(s): Dominique Piber, Naomi I. Eisenberger, Richard Olmstead, Joshua Hyong-Jin Cho, Teresa E. Seeman, ...


PLoS ONE, 2021, Vol 16 (12), pp. e0260814
Author(s): Nazire Duran, Anthony P. Atkinson

Certain facial features provide useful information for recognition of facial expressions. In two experiments, we investigated whether foveating informative features of briefly presented expressions improves recognition accuracy and whether these features are targeted reflexively when not foveated. Angry, fearful, surprised, and sad or disgusted expressions were presented briefly at locations that ensured foveation of specific features. Foveating the mouth of fearful, surprised and disgusted expressions improved emotion recognition compared to foveating an eye, a cheek, or the central brow. Foveating the brow led to equivocal results in anger recognition across the two experiments, which might be due to the different combinations of emotions used. There was no consistent evidence that reflexive first saccades targeted emotion-relevant features; instead, they targeted the feature closest to the initial fixation. In a third experiment, angry, fearful, surprised and disgusted expressions were presented for 5 seconds. The duration of task-related fixations in the eyes, brow, nose and mouth regions was modulated by the presented expression. Moreover, longer fixation at the mouth positively correlated with anger and disgust accuracy both when these expressions were freely viewed (Experiment 2b) and when briefly presented at the mouth (Experiment 2a). Finally, an overall preference to fixate the mouth across all expressions correlated positively with anger and disgust accuracy. These findings suggest that foveal processing of informative features is functional and contributory to emotion recognition, but that these features are not automatically sought out when not foveated, and that facial emotion recognition performance is related to idiosyncratic gaze behaviour.
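
The fixation-accuracy relationships described above are correlations between per-participant gaze measures and recognition accuracy. Purely as an illustration of that kind of analysis (the variable names and values below are synthetic, not data from these experiments):

# Illustrative correlation between a gaze measure and recognition accuracy
# (synthetic data; not from the experiments described above).
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)

# Hypothetical per-participant measures: proportion of viewing time spent
# fixating the mouth, and anger recognition accuracy.
mouth_fixation = rng.uniform(0.1, 0.6, size=40)
anger_accuracy = 0.5 + 0.4 * mouth_fixation + rng.normal(0.0, 0.05, size=40)

r, p = stats.pearsonr(mouth_fixation, anger_accuracy)
print(f"Pearson r = {r:.2f}, p = {p:.3g}")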


Brain Injury, 2018, Vol 33 (3), pp. 322-332
Author(s): Lindsey Byom, Melissa Duff, Bilge Mutlu, Lyn Turkstra

Electronics, 2021, Vol 10 (11), pp. 1289
Author(s): Navjot Rathour, Sultan S. Alshamrani, Rajesh Singh, Anita Gehlot, Mamoon Rashid, ...

Facial emotion recognition (FER) is the process of identifying human emotions from facial expressions. It is often difficult to gauge an individual's stress and anxiety levels from computer-vision imagery alone. However, advances in the Internet of Medical Things (IoMT) have yielded impressive results in gathering various forms of emotional and physical health-related data, and recent deep learning (DL) algorithms can run in resource-constrained edge environments, allowing data from IoMT devices to be processed locally at the edge. This article presents an IoMT-based facial emotion detection and recognition system implemented in real time on a small, powerful, yet resource-constrained device, the Raspberry Pi, using deep convolutional neural networks. For this purpose, we conducted an empirical study of participants' facial emotions alongside their emotional state as measured by physiological sensors. We then propose a model for real-time emotion detection on the Raspberry Pi paired with an Intel Movidius NCS2 co-processor. Facial emotion detection test accuracy ranged from 56% to 73% across the models evaluated; the best model reached 73% on the FER-2013 dataset, compared with the state-of-the-art results cited, which reached at most 64%. A t-test was performed to test for significant differences in systolic blood pressure, diastolic blood pressure, and heart rate of individuals watching three different kinds of stimuli (angry, happy, and neutral).
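
The system described above classifies faces into discrete emotions with a deep convolutional neural network operating on FER-2013-style inputs (48x48 grayscale crops, seven emotion classes). The sketch below is a generic illustration of such a classifier, not the authors' architecture, which is not reproduced here; the layer sizes are assumptions. On a Raspberry Pi with an NCS2 co-processor, a trained model of this kind would typically be converted for the accelerator (e.g., with Intel's OpenVINO toolkit) rather than run in TensorFlow directly.

# Minimal sketch of a FER-2013-style emotion classifier (illustrative only;
# not the architecture from the article). Assumes TensorFlow/Keras is installed.
import numpy as np
import tensorflow as tf
from tensorflow.keras import layers, models

EMOTIONS = ["angry", "disgust", "fear", "happy", "sad", "surprise", "neutral"]

def build_model() -> tf.keras.Model:
    """Small CNN over 48x48 grayscale face crops, one softmax unit per emotion."""
    return models.Sequential([
        layers.Input(shape=(48, 48, 1)),
        layers.Conv2D(32, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dense(len(EMOTIONS), activation="softmax"),
    ])

model = build_model()
model.compile(optimizer="adam", loss="categorical_crossentropy", metrics=["accuracy"])

# Run one forward pass on a dummy face crop to check input/output shapes.
dummy_face = np.random.rand(1, 48, 48, 1).astype("float32")
probs = model.predict(dummy_face, verbose=0)[0]
print(EMOTIONS[int(np.argmax(probs))])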


2018, Vol 1 (2), pp. 53-60
Author(s): Giuseppe Carrà, Giulia Brambilla, Manuela Caslini, Francesca Parma, Alessandro Chinello, ...

Abstract
Objectives: Since evidence on executive control among women with anorexia or bulimia nervosa (AN/BN) is somewhat inconclusive, we aimed to explore whether set-shifting performance in AN/BN might be influenced by facial emotion recognition (FER).
Methods: We randomly recruited women with a diagnosis of AN or BN from an eating disorders outpatient clinic in Italy, as well as healthy controls (HCs). We used established tools to assess diagnosis (Eating Disorder Examination, EDE 17.0), executive control (Intra-Extra Dimensional Set Shift, IED), and FER (Ekman 60 Faces Test, EK-60F). Univariate distributions of sociodemographic and clinical variables by diagnostic subgroup were assessed, and relevant variables were selected for subsequent multiple linear regression analyses.
Results: Women with AN performed significantly worse than HCs on IED adjusted total errors. HCs scored significantly better than both the AN and BN groups on the EK-60F fear subscale. Although IED set-shifting was associated with AN (p = 0.008), this association could not be confirmed after controlling for age, the EK-60F fear subscale, alexithymia, and depression (i.e., clinically relevant covariates identified a priori from the literature or associated with AN/BN at the univariate level).
Conclusions: Impaired executive control may not be a distinctive feature of women with AN, since several clinical characteristics, including fear recognition ability, are likely to play an important role. This has significant implications for interventions in AN, which should also aim at improving socio-emotional processing.
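
The covariate-adjusted analysis above corresponds to a multiple linear regression of set-shifting errors on diagnostic group plus the a priori covariates. A minimal sketch of that kind of model, using hypothetical column names and synthetic data (not the study's dataset or code):

# Illustrative multiple linear regression with covariate adjustment
# (hypothetical variable names and synthetic data; not the study's analysis).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 90
df = pd.DataFrame({
    "ied_errors": rng.normal(20, 5, n),              # IED adjusted total errors
    "diagnosis": rng.choice(["HC", "AN", "BN"], n),  # diagnostic group
    "age": rng.normal(25, 6, n),
    "ek60f_fear": rng.integers(0, 11, n),            # fear subscale score
    "alexithymia": rng.normal(45, 10, n),
    "depression": rng.normal(10, 6, n),
})

# Group effect on set-shifting errors, controlling for the a priori covariates.
model = smf.ols(
    "ied_errors ~ C(diagnosis, Treatment('HC')) + age + ek60f_fear + alexithymia + depression",
    data=df,
).fit()
print(model.summary())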


2015, Vol 28 (3), pp. 477-485
Author(s): J. Pietschnig, R. Aigner-Wöber, N. Reischenböck, I. Kryspin-Exner, D. Moser, ...

ABSTRACT
Background: Deficits in facial emotion recognition (FER) have been shown to substantially impair several aspects of affected individuals' everyday lives (e.g. social functioning). Here, we aimed to assess differences in emotion recognition performance between three patient groups with mild forms of cognitive impairment and healthy controls.
Methods: Performance on a concise emotion recognition test battery (VERT-K) of 68 patients with subjective cognitive decline (SCD), 44 patients with non-amnestic mild cognitive impairment (non-aMCI), and 25 patients with amnestic mild cognitive impairment (aMCI) was compared with an age-equivalent sample of 138 healthy controls, all of whom were recruited within the framework of the Vienna Conversion to Dementia Study. Additionally, patients and controls underwent individual assessment using a comprehensive neuropsychological test battery examining attention, executive functioning, language, and memory (NTBV), the Beck Depression Inventory (BDI), and a measure of premorbid IQ (WST).
Results: Type of diagnosis showed a significant effect on emotion recognition performance, with results deteriorating progressively as severity of diagnosis increased. Between-groups effect sizes were substantial, showing non-trivial effects in all comparisons (Cohen's ds from −0.30 to −0.83) except for SCD versus controls. Moreover, emotion recognition performance was higher in women and positively associated with premorbid IQ.
Conclusions: Our findings indicate substantial effects of progressive neurological damage on emotion recognition in patients. Importantly, emotion recognition deficits were observable in non-amnestic patients as well, conceivably suggesting an association between decreased recognition performance and global cognitive decline. Premorbid IQ appears to act as a protective factor, yielding smaller deficits in patients with higher IQs.
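
The "significant effect of type of diagnosis" above is the kind of result a one-way between-groups comparison yields. Purely for illustration (the group labels match the study, but the scores below are synthetic, not the study's data), a minimal sketch:

# Illustrative one-way ANOVA across diagnostic groups (synthetic data only).
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)

# Hypothetical VERT-K emotion recognition scores per group.
controls = rng.normal(30, 3, 138)
scd      = rng.normal(29, 3, 68)
non_amci = rng.normal(28, 3, 44)
amci     = rng.normal(27, 3, 25)

f_stat, p_value = stats.f_oneway(controls, scd, non_amci, amci)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")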


2008, Vol 20 (4), pp. 721-733
Author(s): Andrea S. Heberlein, Alisa A. Padon, Seth J. Gillihan, Martha J. Farah, Lesley K. Fellows

The ventromedial prefrontal cortex has been implicated in a variety of emotion processes. However, findings regarding the role of this region specifically in emotion recognition have been mixed. We used a sensitive facial emotion recognition task to compare the emotion recognition performance of 7 subjects with lesions confined to ventromedial prefrontal regions, 8 subjects with lesions elsewhere in prefrontal cortex, and 16 healthy control subjects. We found that emotion recognition was impaired following ventromedial, but not dorsal or lateral, prefrontal damage. This impairment appeared to be quite general, with lower overall ratings or more confusion between all six emotions examined. We also explored the relationship between emotion recognition performance and the ability of the same patients to experience transient happiness and sadness during a laboratory mood induction. We found some support for a relationship between sadness recognition and experience. Taken together, our results indicate that the ventromedial frontal lobe plays a crucial role in facial emotion recognition, and suggest that this deficit may be related to the subjective experience of emotion.

