Assessment of perception of morphed facial expressions using the Emotion Recognition Task: Normative data from healthy participants aged 8-75

2013 ◽  
Vol 8 (1) ◽  
pp. 75-93 ◽  
Author(s):  
Roy P.C. Kessels ◽  
Barbara Montagne ◽  
Angelique W. Hendriks ◽  
David I. Perrett ◽  
Edward H.F. de Haan


2020 ◽
Author(s):  
Parama Gupta ◽  
Deepshikha Ray ◽  
Sukanto Sarkar

The current study explored the mediating role of Neuroticism and Psychoticism in healthy young adult participants who performed a facial emotion recognition task.


2020 ◽  
Author(s):  
Connor Tom Keating ◽  
Sophie L Sowden ◽  
Dagmar S Fraser ◽  
Jennifer L Cook

Abstract: A burgeoning literature suggests that alexithymia, and not autism, is responsible for the difficulties with static emotion recognition that are documented in the autistic population. Here we investigate whether alexithymia can also account for difficulties with dynamic facial expressions. Autistic and control adults (N = 60), matched on age, gender, non-verbal reasoning ability and alexithymia, completed an emotion recognition task which employed dynamic point-light displays of emotional facial expressions that varied in speed and spatial exaggeration. The ASD group exhibited significantly lower recognition accuracy for angry, but not happy or sad, expressions with normal speed and spatial exaggeration. The level of autistic, and not alexithymic, traits was a significant predictor of accuracy for angry expressions with normal speed and spatial exaggeration.
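As a rough illustration of the kind of trait-based analysis reported above (not the authors' actual pipeline), the sketch below regresses recognition accuracy for angry expressions on autistic and alexithymic trait scores with ordinary least squares in Python; the data file and column names are hypothetical placeholders.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical per-participant data: trait scores plus accuracy for angry
# expressions at normal speed and spatial exaggeration.
df = pd.read_csv("emotion_recognition_scores.csv")

# Does autistic-trait level predict accuracy over and above alexithymic traits?
model = smf.ols("accuracy_angry ~ autistic_traits + alexithymic_traits", data=df).fit()
print(model.summary())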


2018 ◽  
Vol 8 (12) ◽  
pp. 219 ◽  
Author(s):  
Mayra Gutiérrez-Muñoz ◽  
Martha Fajardo-Araujo ◽  
Erika González-Pérez ◽  
Victor Aguirre-Arzola ◽  
Silvia Solís-Ortiz

Polymorphisms of the estrogen receptor ESR1 and ESR2 genes have been linked with cognitive deficits and affective disorders. The effects of these genetic variants on emotional processing in females with low estrogen levels are not well known. The aim was to explore the impact of the ESR1 and ESR2 genes on responses to a facial emotion recognition task in females. Postmenopausal healthy female volunteers were genotyped for the XbaI and PvuII polymorphisms of ESR1 and the rs1256030 polymorphism of ESR2. The effect of these polymorphisms on the recognition of facial expressions of happiness, sadness, disgust, anger, surprise, and fear was analyzed. Females carrying the P allele of the PvuII polymorphism or the X allele of the XbaI polymorphism of ESR1 recognized facial expressions of sadness more easily than women carrying the p allele or the x allele: they displayed higher accuracy, faster response times, more correct responses, and fewer omissions in completing the task, with a large effect size. Women carrying the C allele of ESR2 showed a faster response time for recognizing facial expressions of anger. These findings link ESR1 and ESR2 polymorphisms to the facial emotion recognition of negative emotions.
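As a hedged sketch of how a genotype-group comparison with an effect size could be computed (the original study's exact statistics are not reproduced here), the Python snippet below contrasts sadness-recognition accuracy between hypothetical PvuII P-allele carriers and non-carriers using Welch's t-test and Cohen's d; the data file and column names are illustrative assumptions.

import numpy as np
import pandas as pd
from scipy import stats

df = pd.read_csv("esr_genotype_emotion.csv")  # hypothetical data file
carriers = df.loc[df["pvuii_p_carrier"] == 1, "sadness_accuracy"]
noncarriers = df.loc[df["pvuii_p_carrier"] == 0, "sadness_accuracy"]

# Welch's t-test (unequal variances) between genotype groups
t, p = stats.ttest_ind(carriers, noncarriers, equal_var=False)

# Cohen's d from the pooled standard deviation
pooled_sd = np.sqrt((carriers.var(ddof=1) + noncarriers.var(ddof=1)) / 2)
d = (carriers.mean() - noncarriers.mean()) / pooled_sd
print(f"t = {t:.2f}, p = {p:.3f}, d = {d:.2f}")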


2019 ◽  
Author(s):  
Paddy Ross ◽  
Tessa R. Flack

Emotion perception research has largely been dominated by work on facial expressions, but emotion is also strongly conveyed from the body. Research exploring emotion recognition from the body tends to refer to ‘the body’ as a whole entity. However, the body is made up of different components (hands, arms, trunk etc.), all of which could be differentially contributing to emotion recognition. We know that the hands can help to convey actions, and in particular are important for social communication through gestures, but we currently do not know to what extent the hands influence emotion recognition from the body. Here, 93 adults viewed static emotional body stimuli with either the hands, arms, or both components removed and completed a forced-choice emotion recognition task. Removing the hands significantly reduced recognition accuracy for fear and anger, but made no significant difference to the recognition of happiness and sadness. Removing the arms had no effect on emotion recognition accuracy compared to the full-body stimuli. These results suggest the hands may play a key role in the recognition of emotions from the body.


2021 ◽  
Author(s):  
Kai Klepzig ◽  
Julia Wendt ◽  
Bettina Sarnowski ◽  
Alfons O. Hamm ◽  
Martin Lotze

Abstract: Single-case studies of patients with unilateral insular lesions have reported deficits in emotion recognition from facial expressions. However, there is no consensus on either the actual extent of these impairments or the role of lesion lateralization. To investigate associations between brain lesions and impairments in a facial emotion recognition task, we used voxel-based lesion-symptom mapping (VLSM) in a group of 29 stroke patients in the chronic stage, 16 with left and 13 with right hemispheric lesions. Recognition accuracy was impaired for fearful and angry expressions in patients with left hemispheric lesions compared to 14 matched healthy controls. VLSM analyses revealed that lesions centered on the left insula were associated with impaired recognition of emotional facial expressions. We demonstrate here a critical role for the left insula in decoding unpleasant emotions from facial expressions and thereby provide further evidence that the insular cortex has a broader role not restricted to disgust processing.
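For readers unfamiliar with voxel-based lesion-symptom mapping, the following is a minimal sketch of its core logic only, under simplified assumptions (random illustrative data, a flattened brain mask, and no registration or permutation-based correction, which real VLSM toolboxes handle): at each voxel, patients with and without a lesion there are compared on the behavioural score.

import numpy as np
from scipy import stats

n_patients, n_voxels = 29, 10_000                       # illustrative sizes
lesion = np.random.rand(n_patients, n_voxels) > 0.95    # binary lesion maps (illustrative)
score = np.random.rand(n_patients)                      # e.g. fear-recognition accuracy

t_map = np.full(n_voxels, np.nan)
for v in range(n_voxels):
    lesioned = score[lesion[:, v]]
    spared = score[~lesion[:, v]]
    if len(lesioned) >= 4:                              # require minimum lesion overlap here
        t_map[v], _ = stats.ttest_ind(spared, lesioned, equal_var=False)
# Large positive t-values mark voxels where damage goes with lower accuracy.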


Perception ◽  
2019 ◽  
Vol 49 (1) ◽  
pp. 98-112 ◽  
Author(s):  
Paddy Ross ◽  
Tessa Flack

Emotion perception research has largely been dominated by work on facial expressions, but emotion is also strongly conveyed from the body. Research exploring emotion recognition from the body tends to refer to “the body” as a whole entity. However, the body is made up of different components (hands, arms, trunk, etc.), all of which could be differentially contributing to emotion recognition. We know that the hands can help to convey actions and, in particular, are important for social communication through gestures, but we currently do not know to what extent the hands influence emotion recognition from the body. Here, 93 adults viewed static emotional body stimuli with either the hands, arms, or both components removed and completed a forced-choice emotion recognition task. Removing the hands significantly reduced recognition accuracy for fear and anger but made no significant difference to the recognition of happiness and sadness. Removing the arms had no effect on emotion recognition accuracy compared with the full-body stimuli. These results suggest the hands may play a key role in the recognition of emotions from the body.
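As a minimal sketch of how such a forced-choice task could be scored (not the published analysis), the snippet below computes per-participant accuracy for each emotion and stimulus condition (full body, hands removed, arms removed) from hypothetical trial-level data; the file and column names are assumptions.

import pandas as pd

trials = pd.read_csv("body_emotion_trials.csv")  # hypothetical trial-level data
trials["correct"] = (trials["response"] == trials["target_emotion"]).astype(int)

# Mean accuracy per participant, condition and emotion, then averaged over participants
accuracy = (
    trials.groupby(["participant", "condition", "target_emotion"])["correct"]
    .mean()
    .unstack("condition")
)
print(accuracy.groupby(level="target_emotion").mean())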


2019 ◽  
Vol 25 (05) ◽  
pp. 453-461 ◽  
Author(s):  
Katherine Osborne-Crowley ◽  
Sophie C. Andrews ◽  
Izelle Labuschagne ◽  
Akshay Nair ◽  
Rachael Scahill ◽  
...  

Abstract: Objectives: Previous research has demonstrated an association between emotion recognition and apathy in several neurological conditions involving fronto-striatal pathology, including Parkinson’s disease and brain injury. In line with these findings, we aimed to determine whether apathetic participants with early Huntington’s disease (HD) were more impaired on an emotion recognition task than non-apathetic participants and healthy controls. Methods: We included 43 participants from the TRACK-HD study who reported apathy on the Problem Behaviours Assessment – short version (PBA-S), 67 participants who reported no apathy, and 107 controls matched for age, sex, and level of education. During their baseline TRACK-HD visit, participants completed a battery of cognitive and psychological tests, including an emotion recognition task and the Hospital Anxiety and Depression Scale (HADS), and were assessed on the PBA-S. Results: Compared to the non-apathetic group and the control group, the apathetic group was impaired on the recognition of happy facial expressions, after controlling for depression symptomatology on the HADS and general disease progression (Unified Huntington’s Disease Rating Scale total motor score). This was despite no difference between the apathetic and non-apathetic groups in overall cognitive functioning as assessed by a cognitive composite score. Conclusions: Impairment in the recognition of happy expressions may be part of the clinical picture of apathy in HD. While a shared reliance on frontostriatal pathways may broadly explain the associations between emotion recognition and apathy found across several patient groups, further work is needed to determine what relationships exist between recognition of specific emotions, distinct subtypes of apathy, and underlying neuropathology. (JINS, 2019, 25, 453–461)
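As a hedged sketch of a group comparison that adjusts for covariates in the way described above (an ANCOVA-style OLS model, not necessarily the authors' exact procedure), the snippet below models happiness-recognition accuracy as a function of group while controlling for HADS depression and UHDRS total motor score; the file and column names are hypothetical.

import pandas as pd
import statsmodels.formula.api as smf
from statsmodels.stats.anova import anova_lm

df = pd.read_csv("trackhd_emotion.csv")  # hypothetical data file
model = smf.ols("happy_accuracy ~ C(group) + hads_depression + uhdrs_motor", data=df).fit()
print(anova_lm(model, typ=2))  # group effect adjusted for the two covariates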


2021 ◽  
Vol 12 ◽  
Author(s):  
Teresa Cordeiro ◽  
Júlia Botelho ◽  
Catarina Mendonça

The aim of this study was to assess the relationship between children’s self-concept and their ability to recognize emotions in others from facial expressions. It is hypothesized that children use their self-representations to interpret depictions of emotion in others and that a higher self-concept might be associated with earlier development of emotion recognition skills. A total of 54 children aged between 5 and 11 years participated in this study. Self-concept was assessed in all children using the Piers-Harris Self-Concept Scale for Children (Piers-Harris 2). To assess emotion recognition, a computerized instrument, the Penn Emotion Recognition Task (PERT), was applied. Despite the small sample of children, the results show clear statistical effects. Emotion recognition ability is directly correlated with the self-concept for intellectual/school status. The ability to correctly identify emotions from facial expressions is affected by general self-concept, intellectual/school status, and the stimulus features of gender, intensity, and emotion. Further analysis shows that children’s general self-concept particularly affects the ability to identify happy faces. Children with a higher intellectual status score recognize happy and neutral faces more easily. We conclude that self-concept in children relates to the ability to recognize emotions in others, particularly positive emotions. These findings provide some support for the simulation theory of social cognition, in which children use their own self-representations to interpret mental states in others. The effect of the self-concept for intellectual status on emotion recognition might also indicate that intellectual abilities act as a mediator between self-concept and emotion recognition, but further studies are needed.
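To make the correlational analyses concrete, here is a minimal sketch in Python, assuming a hypothetical per-child data file with Piers-Harris subscale scores and task accuracies; it simply computes Pearson correlations of the kind reported above.

import pandas as pd
from scipy import stats

df = pd.read_csv("selfconcept_emotion.csv")  # hypothetical data file

# Intellectual/school status vs. overall emotion-recognition accuracy
r, p = stats.pearsonr(df["intellectual_school_status"], df["overall_accuracy"])
print(f"overall: r = {r:.2f}, p = {p:.3f}")

# Emotion-specific follow-up: general self-concept vs. accuracy for happy faces
r_h, p_h = stats.pearsonr(df["general_self_concept"], df["happy_accuracy"])
print(f"happy faces: r = {r_h:.2f}, p = {p_h:.3f}")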


2020 ◽  
Vol 15 (12) ◽  
pp. 1336-1350
Author(s):  
Karolina I Rokita ◽  
Laurena Holleran ◽  
Maria R Dauvermann ◽  
David Mothersill ◽  
Jessica Holland ◽  
...  

Abstract: Childhood trauma, and in particular physical neglect, has been repeatedly associated with lower performance on measures of social cognition (e.g. emotion recognition tasks) in both psychiatric and non-clinical populations. The neural mechanisms underpinning this association have remained unclear. Here, we investigated whether volumetric changes in three stress-sensitive regions, the amygdala, hippocampus and anterior cingulate cortex (ACC), mediate the association between childhood trauma and emotion recognition in a healthy participant sample (N = 112) and a clinical sample of patients with schizophrenia (N = 46). Direct effects of childhood trauma, specifically physical neglect, on Emotion Recognition Task performance were observed in the whole sample. In healthy participants, reduced total and left ACC volumes were found to fully mediate the associations of both physical neglect and the total childhood trauma score with emotion recognition. No mediating effects of hippocampal or amygdala volumes were observed in either group. These results suggest that reduced ACC volume may represent part of the mechanism by which early life adversity results in poorer social cognitive function. Confirmation of the causal basis of this association would highlight the importance of resilience-building interventions to mitigate the detrimental effects of childhood trauma on brain structure and function.
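As a rough sketch of a single-mediator model like the one described (physical neglect -> ACC volume -> emotion-recognition accuracy), the snippet below estimates a bootstrapped indirect effect with ordinary least squares; the data file, column names and the two-regression formulation are simplifying assumptions, not the published analysis.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("trauma_acc_emotion.csv")  # hypothetical data file

def indirect_effect(data):
    # a-path: trauma -> mediator; b-path: mediator -> outcome, controlling for trauma
    a = smf.ols("acc_volume ~ physical_neglect", data=data).fit().params["physical_neglect"]
    b = smf.ols("emotion_accuracy ~ acc_volume + physical_neglect",
                data=data).fit().params["acc_volume"]
    return a * b

# Percentile bootstrap for the indirect (a*b) effect
boot = [indirect_effect(df.sample(frac=1, replace=True, random_state=i)) for i in range(2000)]
lo, hi = np.percentile(boot, [2.5, 97.5])
print(f"indirect effect = {np.mean(boot):.3f}, 95% CI [{lo:.3f}, {hi:.3f}]")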

