Face masks impair basic emotion recognition: Group effects and individual variability (Accepted for Publication in Social Psychology)

2021 ◽  
Author(s):  
Sarah McCrackin ◽  
Jelena Ristic ◽  
Florence Mayrand ◽  
Francesca Capozzi

With the widespread adoption of masks, there is a need for understanding how facial obstruction affects emotion recognition. We asked 120 participants to identify emotions from faces with and without masks. We also examined if recognition performance was related to autistic traits and personality. Masks impacted recognition of expressions with diagnostic lower face features the most and those with diagnostic upper face features the least. Persons with higher autistic traits were worse at identifying unmasked expressions, while persons with lower extraversion and higher agreeableness were better at recognizing masked expressions. These results show that different features play different roles in emotion recognition and suggest that obscuring features affects social communication differently as a function of autistic traits and personality.

2021 ◽  
Author(s):  
Sarah McCrackin ◽  
Francesca Capozzi ◽  
Florence Mayrand ◽  
Jelena Ristic

With widespread adoption of mask wearing, the 2020 Covid-19 pandemic highlighted a need for a deeper understanding of how facial feature obstruction affects emotion recognition. Here we asked participants (n=120) to identify disgusted, angry, sad, neutral, surprised, happy, and fearful emotions from faces with and without masks, and examined if recognition performance was related to their level of social competence and personality traits. Performance was reduced for all masked relative to unmasked emotions. Masks impacted recognition of expressions with diagnostic lower face features the most (disgust, anger) and those with diagnostic upper face features the least (fear, surprise). Recognition performance also varied at the individual level. Persons with higher overall social competence were better at identifying unmasked expressions, while persons with lower trait extraversion and higher trait agreeableness were better at recognizing masked expressions. These results reveal novel insights about the role of face features in emotion recognition and show that obscuring facial features affects social communication differently as a function of individual social competence and personality traits.


Sensors ◽  
2020 ◽  
Vol 20 (4) ◽  
pp. 1199
Author(s):  
Seho Park ◽  
Kunyoung Lee ◽  
Jae-A Lim ◽  
Hyunwoong Ko ◽  
Taehoon Kim ◽  
...  

Research on emotion recognition from facial expressions has found evidence of different muscle movements between genuine and posed smiles. To further examine the distinct movement intensities of each facial segment, we explored differences in facial expressions between spontaneous and posed smiles using three-dimensional facial landmarks. Machine analysis was adopted to measure changes in the dynamics of 68 segmented facial regions. A total of 57 normal adults (19 men, 38 women) who displayed adequate posed and spontaneous facial expressions of happiness were included in the analyses. The results indicate that spontaneous smiles have higher intensities for the upper face than the lower face. Posed smiles, on the other hand, showed higher intensities in the lower part of the face. Furthermore, the 3D facial landmark technique revealed that the left eyebrow displayed stronger intensity during spontaneous smiles than the right eyebrow. These findings suggest a potential application of landmark-based emotion recognition: spontaneous smiles can be distinguished from posed smiles by measuring the relative intensities of the upper and lower face, with a focus on left-sided asymmetry in the upper region.
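The upper- vs lower-face comparison described above can be sketched as follows. This is a minimal illustration, not the authors' pipeline: the landmark arrays are simulated, and the region indices follow the common 68-point convention (17-26 eyebrows, 36-47 eyes, 48-67 mouth), which we assume here.

```python
import numpy as np

# Simulated 68 x 3 arrays of 3D facial landmarks (all values made up).
rng = np.random.default_rng(0)
neutral = rng.normal(size=(68, 3))
smile = neutral.copy()
smile[17:27] += 0.4   # simulate strong eyebrow / upper-face movement
smile[48:68] += 0.1   # simulate weaker mouth / lower-face movement

def region_intensity(a, b, idx):
    """Mean Euclidean displacement of the landmarks in `idx` between two frames."""
    return np.linalg.norm(b[idx] - a[idx], axis=1).mean()

upper = region_intensity(neutral, smile, np.r_[17:27, 36:48])  # brows + eyes
lower = region_intensity(neutral, smile, np.r_[48:68])         # mouth

# Per the findings above, a spontaneous smile would show upper > lower.
print(upper > lower)
```

A classifier distinguishing spontaneous from posed smiles could then use the ratio or difference of these two region intensities as a feature.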


Autism ◽  
2020 ◽  
Vol 24 (8) ◽  
pp. 2304-2309 ◽  
Author(s):  
Alex Bertrams ◽  
Katja Schlegel

People with diagnosed autism or high in autistic traits have been found to have difficulties with recognizing emotions from nonverbal expressions. In this study, we investigated whether speeded reasoning (reasoning performance under time pressure) moderates the inverse relationship between autistic traits and emotion recognition performance. We expected the negative correlation between autistic traits and emotion recognition to be less strong when speeded reasoning was high. The underlying assumption is that people high in autistic traits can compensate for their low intuition in recognizing emotions through quick analytical information processing. A paid online sample (N = 217) completed the 10-item version of the Autism Spectrum Quotient, two emotion recognition tests using videos with sound (Geneva Emotion Recognition Test) and pictures (Reading the Mind in the Eyes Test), and Baddeley’s Grammatical Reasoning Test to measure speeded reasoning. As expected, the inverse relationship between autistic traits and emotion recognition performance was less pronounced for individuals with high compared to low speeded reasoning ability. These results suggest that a high ability in making quick mental inferences may (partly) compensate for difficulties with intuitive emotion recognition related to autistic traits.
Lay abstract: Autistic people typically have difficulty recognizing other people’s emotions and processing nonverbal cues in an automatic, intuitive fashion. This usually also applies to people who, regardless of an official diagnosis of autism, score high on autism questionnaires. However, some autistic people do not seem to have any problems with emotion recognition. One explanation may be that these individuals are able to compensate for their lack of intuitive or automatic processing through a quick, conscious, and deliberate analysis of the emotional cues in faces, voices, and body movements.
On these grounds, we assumed that the higher autistic people’s ability to reason quickly (i.e., to make quick logical inferences), the fewer problems they should have with determining other people’s emotions. In our study, we asked workers on the crowdsourcing marketplace MTurk to complete a questionnaire about their autistic traits, to perform emotion recognition tests, and to complete a test of the ability to reason under time constraints. In our sample of 217 people, we found the expected pattern. Overall, those who had higher values on the autism questionnaire scored lower on the emotion recognition tests. However, when reasoning ability was taken into account, a more nuanced picture emerged: participants with high values both on the autism questionnaire and on the reasoning test recognized emotions as well as individuals with low autistic traits did. Our results suggest that fast analytic information processing may help autistic people compensate for problems in recognizing others’ emotions.
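The moderation described above is conventionally tested with an interaction term in a regression. A minimal sketch with simulated data (the variable names `aq`, `reasoning`, `emo` and all effect sizes are ours, not the study's):

```python
import numpy as np

# Simulate the reported pattern: a negative main effect of autistic traits on
# emotion recognition, plus a positive traits x reasoning interaction
# (the negative slope flattens when speeded reasoning is high).
rng = np.random.default_rng(42)
n = 217
aq = rng.normal(size=n)          # autistic traits (standardized)
reasoning = rng.normal(size=n)   # speeded reasoning (standardized)
emo = (-0.3 * aq + 0.2 * reasoning + 0.25 * aq * reasoning
       + rng.normal(scale=0.5, size=n))

# OLS with an interaction term: emo ~ 1 + aq + reasoning + aq*reasoning
X = np.column_stack([np.ones(n), aq, reasoning, aq * reasoning])
beta, *_ = np.linalg.lstsq(X, emo, rcond=None)
b0, b_aq, b_reas, b_int = beta

# Simple slopes of aq at -1 SD and +1 SD of reasoning:
slope_low = b_aq + b_int * (-1)
slope_high = b_aq + b_int * (+1)
print(slope_low, slope_high)
```

With a positive interaction coefficient, the slope of autistic traits is less negative at high reasoning ability, which is the compensation pattern the abstract describes.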


2020 ◽  
Vol 9 (4) ◽  
pp. 1057 ◽  
Author(s):  
Jess Kerr-Gaffney ◽  
Luke Mason ◽  
Emily Jones ◽  
Hannah Hayward ◽  
Jumana Ahmad ◽  
...  

Difficulties in socio-emotional functioning are proposed to contribute to the development and maintenance of anorexia nervosa (AN). This study aimed to examine emotion recognition abilities in individuals in the acute and recovered stages of AN compared to healthy controls (HCs). A second aim was to examine whether attention to faces and comorbid psychopathology predicted emotion recognition abilities. The films expressions task was administered to 148 participants (46 AN, 51 recovered AN, 51 HC) to assess emotion recognition, during which attention to faces was recorded using eye-tracking. Comorbid psychopathology was assessed using self-report questionnaires and the Autism Diagnostic Observation Schedule–2nd edition (ADOS-2). No significant differences in emotion recognition abilities or attention to faces were found between groups. However, individuals with a lifetime history of AN who scored above the clinical cut-off on the ADOS-2 displayed poorer emotion recognition performance than those scoring below cut-off and HCs. ADOS-2 scores significantly predicted emotion recognition abilities while controlling for group membership and intelligence. Difficulties in emotion recognition appear to be associated with high autism spectrum disorder (ASD) traits, rather than a feature of AN. Whether individuals with AN and high ASD traits may require different treatment strategies or adaptations is a question for future research.


2019 ◽  
Author(s):  
Alex Bertrams ◽  
Katja Schlegel

People high in autistic-like traits have been found to have difficulties with recognizing emotions from nonverbal expressions. However, findings on the autism–emotion recognition relationship are inconsistent. In the present study, we investigated whether speeded reasoning ability (reasoning performance under time pressure) moderates the inverse relationship between autistic-like traits and emotion recognition performance. We expected the negative correlation between autistic-like traits and emotion recognition to be less strong when speeded reasoning ability was high. MTurkers (N = 217) completed the ten-item version of the Autism Spectrum Quotient (AQ-10), two emotion recognition tests using videos with sound (Geneva Emotion Recognition Test, GERT-S) and pictures (Reading the Mind in the Eyes Test, RMET), and Baddeley's Grammatical Reasoning test to measure speeded reasoning. As expected, the higher the ability in speeded reasoning, the weaker the association between higher autistic-like traits and lower emotion recognition performance. These results suggest that a high ability in making quick mental inferences may (partly) compensate for difficulties with intuitive emotion recognition related to autistic-like traits.


Sensors ◽  
2020 ◽  
Vol 21 (1) ◽  
pp. 52
Author(s):  
Tianyi Zhang ◽  
Abdallah El Ali ◽  
Chen Wang ◽  
Alan Hanjalic ◽  
Pablo Cesar

Recognizing user emotions while they watch short-form videos anytime and anywhere is essential for facilitating video content customization and personalization. However, most works either classify a single emotion per video stimulus, or are restricted to static desktop environments. To address this, we propose a correlation-based emotion recognition algorithm (CorrNet) to recognize the valence and arousal (V-A) of each instance (fine-grained segment of signals) using only wearable, physiological signals (e.g., electrodermal activity, heart rate). CorrNet takes advantage of features both inside each instance (intra-modality features) and between different instances for the same video stimulus (correlation-based features). We first test our approach on an indoor-desktop affect dataset (CASE), and thereafter on an outdoor-mobile affect dataset (MERCA), which we collected using a smart wristband and a wearable eye tracker. Results show that for subject-independent binary classification (high-low), CorrNet yields promising recognition accuracies: 76.37% and 74.03% for V-A on CASE, and 70.29% and 68.15% for V-A on MERCA. Our findings show that: (1) instance segment lengths between 1 and 4 s result in the highest recognition accuracies; (2) accuracies between laboratory-grade and wearable sensors are comparable, even under low sampling rates (≤64 Hz); and (3) large amounts of neutral V-A labels, an artifact of continuous affect annotation, result in varied recognition performance.
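The two feature families the abstract names can be sketched as follows. This is a minimal illustration under our own assumptions, not the authors' CorrNet implementation: signals are simulated EDA-like traces, and "correlation-based features" are approximated here as each instance's mean Pearson correlation with the other instances of the same stimulus.

```python
import numpy as np

rng = np.random.default_rng(7)
fs = 64                                     # Hz, within the low rates discussed
instances = rng.normal(size=(10, 2 * fs))   # 10 instances of a 2 s segment

# Intra-modality features: simple per-instance statistics.
intra = np.column_stack([instances.mean(axis=1),
                         instances.std(axis=1),
                         np.ptp(instances, axis=1)])

# Correlation-based features: mean correlation with the other instances
# recorded for the same video stimulus (diagonal excluded).
corr = np.corrcoef(instances)               # 10 x 10 correlation matrix
np.fill_diagonal(corr, np.nan)
corr_feat = np.nanmean(corr, axis=1)        # one feature per instance

features = np.column_stack([intra, corr_feat])
print(features.shape)
```

A downstream classifier would then map each instance's feature row to a high/low valence or arousal label.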


2021 ◽  
Vol 14 ◽  
pp. 117954762199457
Author(s):  
Daniele Emedoli ◽  
Maddalena Arosio ◽  
Andrea Tettamanti ◽  
Sandro Iannaccone

Background: Buccofacial Apraxia is defined as the inability to perform voluntary movements of the larynx, pharynx, mandible, tongue, lips and cheeks, while automatic or reflexive control of these structures is preserved. Buccofacial Apraxia frequently co-occurs with aphasia and apraxia of speech, and it has been reported as almost exclusively resulting from a lesion of the left hemisphere. Recent studies have demonstrated the benefit of treating apraxia using motor training principles such as Augmented Feedback or Action Observation Therapy. In light of this, the study describes a treatment based on immersive Action Observation Therapy and Virtual Reality Augmented Feedback in a case of Buccofacial Apraxia. Participant and Methods: The participant is a right-handed 58-year-old male. He underwent a neurosurgical intervention of craniotomy and exeresis of an infra axial expansive lesion in the frontoparietal convexity compatible with an atypical meningioma. Buccofacial Apraxia was diagnosed by a neurologist and evaluated with the Upper and Lower Face Apraxia Test. Buccofacial Apraxia was also quantified with a dedicated camera and purpose-built software able to detect the range of motion of automatic face movements and the range of the same movements on voluntary request. In order to improve voluntary movements, the participant completed fifteen 1-hour rehabilitation sessions, each composed of 20 minutes of immersive Action Observation Therapy followed by 40 minutes of Virtual Reality Augmented Feedback, 5 days a week, for 3 consecutive weeks. Results: After treatment, the participant achieved great improvements in the quality and range of facial movements, performing most facial expressions (e.g., kiss, smile, lateral displacement of the angle of the mouth) without unsolicited movement. Furthermore, the Upper and Lower Face Apraxia Test showed an improvement of 118% for Upper Face movements and of 200% for Lower Face movements.
Conclusion: Performing voluntary movements in a Virtual Reality environment with Augmented Feedback, in addition to Action Observation Therapy, improved performance of facial gestures and consolidated activations of the central nervous system based on principles of experience-dependent neural plasticity.


2016 ◽  
Vol 28 (7) ◽  
pp. 1165-1179 ◽  
Author(s):  
J. Pietschnig ◽  
L. Schröder ◽  
I. Ratheiser ◽  
I. Kryspin-Exner ◽  
M. Pflüger ◽  
...  

Background: Impairments in facial emotion recognition (FER) have been detected in patients with Parkinson disease (PD). Here, we aim at assessing differences in emotion recognition performance between PD patient groups with and without mild forms of cognitive impairment (MCI) and healthy controls.
Methods: Performance on a concise emotion recognition test battery (VERT-K) of 97 PD patients, divided into three groups, was compared with an age-equivalent sample of 168 healthy controls. Patients were categorized into groups according to two well-established classifications of MCI: Petersen's criteria (cognitively intact vs. amnestic MCI, aMCI, vs. non-amnestic MCI, non-aMCI) and Litvan's criteria (cognitively intact vs. single-domain MCI, sMCI, vs. multi-domain MCI, mMCI). Patients and controls underwent individual assessments using a comprehensive neuropsychological test battery examining attention, executive functioning, language, and memory (Neuropsychological Test Battery Vienna, NTBV), the Beck Depression Inventory, and a measure of premorbid IQ (WST).
Results: Cognitively intact PD patients and patients with MCI in PD (PD-MCI) showed significantly worse emotion recognition performance when compared to healthy controls. Between-groups effect sizes were substantial, showing non-trivial effects in all comparisons (Cohen's d from 0.31 to 1.22). Moreover, emotion recognition performance was higher in women, positively associated with premorbid IQ, and negatively associated with age. Depressive symptoms were not related to FER.
Conclusions: The present investigation yields further evidence for impaired FER in PD. Interestingly, our data suggest FER deficits even in cognitively intact PD patients, indicating FER dysfunction prior to the development of overt cognitive dysfunction. Age showed a negative association, whereas IQ showed a positive association with FER.
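The between-group effect sizes above are reported as Cohen's d. A minimal illustration of the standard pooled-SD formula, with made-up emotion-recognition scores for two small groups (the numbers are not from the study):

```python
import numpy as np

def cohens_d(a, b):
    """Cohen's d using the pooled standard deviation of two independent groups."""
    na, nb = len(a), len(b)
    pooled = np.sqrt(((na - 1) * np.var(a, ddof=1) + (nb - 1) * np.var(b, ddof=1))
                     / (na + nb - 2))
    return (np.mean(a) - np.mean(b)) / pooled

controls = np.array([28.0, 30.0, 27.0, 31.0, 29.0, 30.0])  # hypothetical scores
patients = np.array([25.0, 27.0, 24.0, 26.0, 28.0, 24.0])  # hypothetical scores
print(round(cohens_d(controls, patients), 2))
```

A positive d here means the first group scored higher; values around 0.2, 0.5, and 0.8 are conventionally read as small, medium, and large effects.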

