Differences Between Autistic and Non-Autistic Adults in the Recognition of Anger from Facial Motion Remain after Controlling for Alexithymia

Author(s):  
Connor T. Keating ◽  
Dagmar S. Fraser ◽  
Sophie Sowden ◽  
Jennifer L. Cook

Abstract To date, studies have not established whether autistic and non-autistic individuals differ in emotion recognition from facial motion cues when matched in terms of alexithymia. Here, autistic and non-autistic adults (N = 60), matched on age, gender, non-verbal reasoning ability and alexithymia, completed an emotion recognition task which employed dynamic point-light displays of emotional facial expressions manipulated in terms of speed and spatial exaggeration. Autistic participants exhibited significantly lower accuracy for angry, but not happy or sad, facial motion with unmanipulated speed and spatial exaggeration. Autistic, and not alexithymic, traits were predictive of accuracy for angry facial motion with unmanipulated speed and spatial exaggeration. Alexithymic traits, in contrast, were predictive of the magnitude of both correct and incorrect emotion ratings.
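
To make the stimulus manipulation concrete, the following is a minimal sketch of how a point-light facial-motion clip could be rescaled in speed and spatial exaggeration. It assumes marker trajectories stored as a (frames × markers × 2) NumPy array; the function names and the example file are hypothetical, not taken from the study materials.

```python
import numpy as np

def exaggerate_spatially(frames, factor):
    """Scale each frame's marker displacement from the first (neutral) frame.

    frames: array of shape (n_frames, n_markers, 2) of point-light coordinates.
    factor: e.g. 0.5, 1.0 or 1.5 for 50%, 100% or 150% spatial exaggeration.
    """
    neutral = frames[0]
    return neutral + factor * (frames - neutral)

def change_speed(frames, factor):
    """Resample the sequence in time so it plays at `factor` x normal speed."""
    n_frames = frames.shape[0]
    n_out = max(2, int(round(n_frames / factor)))
    old_t = np.linspace(0.0, 1.0, n_frames)
    new_t = np.linspace(0.0, 1.0, n_out)
    flat = frames.reshape(n_frames, -1)
    resampled = np.stack(
        [np.interp(new_t, old_t, flat[:, i]) for i in range(flat.shape[1])],
        axis=1,
    )
    return resampled.reshape(n_out, *frames.shape[1:])

# Example (hypothetical file): a 1.5x-speed, 0.5x-exaggeration angry clip.
# angry = np.load("angry_pld.npy")
# variant = change_speed(exaggerate_spatially(angry, 0.5), 1.5)
```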

2020 ◽  
Author(s):  
Connor Tom Keating ◽  
Sophie L Sowden ◽  
Dagmar S Fraser ◽  
Jennifer L Cook

Abstract A burgeoning literature suggests that alexithymia, and not autism, is responsible for the difficulties with static emotion recognition that are documented in the autistic population. Here we investigate whether alexithymia can also account for difficulties with dynamic facial expressions. Autistic and control adults (N = 60), matched on age, gender, non-verbal reasoning ability and alexithymia, completed an emotion recognition task which employed dynamic point-light displays of emotional facial expressions that varied in speed and spatial exaggeration. The ASD group exhibited significantly lower recognition accuracy for angry, but not happy or sad, expressions with normal speed and spatial exaggeration. The level of autistic, and not alexithymic, traits was a significant predictor of accuracy for angry expressions with normal speed and spatial exaggeration.


2021 ◽  
Author(s):  
Kai Klepzig ◽  
Julia Wendt ◽  
Bettina Sarnowski ◽  
Alfons O. Hamm ◽  
Martin Lotze

Abstract Single-case studies of patients with unilateral insular lesions have reported deficits in emotion recognition from facial expressions. However, there is no consensus on either the actual extent of these impairments or the role of lesion lateralization. To investigate associations between brain lesions and impairments in a facial emotion recognition task, we used voxel-based lesion-symptom mapping (VLSM) in a group of 29 stroke patients in the chronic stage, 16 with left and 13 with right hemispheric lesions. Recognition accuracy was impaired for fearful and angry expressions in patients with left hemispheric lesions compared to 14 matched healthy controls. VLSM analyses revealed that lesions centered on the left insula were associated with impaired recognition of emotional facial expressions. We thus demonstrate a critical role for the left insula in decoding unpleasant emotions from facial expressions, providing further evidence that the role of the insular cortex is not restricted to disgust processing.
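
As a rough illustration of the VLSM logic (a voxelwise comparison of the behavioural score between patients with and without a lesion in each voxel), here is a minimal sketch. The array layout, the minimum-patients cutoff and the use of Welch's t-test are assumptions rather than the authors' pipeline, which would also involve appropriate correction for multiple comparisons.

```python
import numpy as np
from scipy import stats

def vlsm(lesion_maps, scores, min_patients=4):
    """Voxelwise lesion-symptom mapping sketch.

    lesion_maps: boolean array (n_patients, n_voxels), True where a voxel is lesioned.
    scores: behavioural score per patient (e.g., anger recognition accuracy).
    Returns a t-value and p-value per voxel (NaN where too few patients are lesioned).
    """
    n_voxels = lesion_maps.shape[1]
    t_vals = np.full(n_voxels, np.nan)
    p_vals = np.full(n_voxels, np.nan)
    for v in range(n_voxels):
        lesioned = scores[lesion_maps[:, v]]
        spared = scores[~lesion_maps[:, v]]
        if len(lesioned) >= min_patients and len(spared) >= min_patients:
            # Lower scores in the lesioned group yield negative t-values.
            t_vals[v], p_vals[v] = stats.ttest_ind(lesioned, spared, equal_var=False)
    return t_vals, p_vals
```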


2020 ◽  
Author(s):  
Connor Tom Keating ◽  
Sophie L Sowden ◽  
Dagmar S Fraser ◽  
Jennifer L Cook

Abstract
Background: For many years, research has suggested that autistic individuals have difficulties recognising the emotions of other people. However, a burgeoning literature argues that these difficulties may be better explained by co-occurring alexithymia rather than autistic characteristics. Importantly, extant studies in this field have focused on the recognition of emotion from static images. Here we investigated whether there are differences with respect to emotion recognition from dynamic facial stimuli between autistic and non-autistic groups matched on alexithymia.
Methods: 29 control and 31 autistic adults, matched on age, gender, non-verbal reasoning ability and alexithymia, completed a facial emotion recognition task which employed dynamic point-light displays of happy, angry and sad facial expressions. Stimuli were manipulated such that expressions were reproduced at 50%, 100% and 150% of their normal speed and spatial extent.
Results: The ASD group exhibited significantly lower emotion recognition accuracy for angry, but not happy or sad, expressions at the normal (100%) spatial and speed level. Whilst the control group exhibited increasing accuracy across all levels of the speed manipulation, the ASD group only showed improvement from the 100% to the 150% level. Non-verbal reasoning and level of autistic traits (and not age, gender or alexithymia) were significant predictors of accuracy for angry videos at the 100% spatial and speed level.
Limitations: Due to COVID-19 restrictions, only 22 members of the ASD group completed the ADOS-2 assessments, and 7 of those who did scored below the threshold for an autism or ASD diagnosis. Therefore, our ASD group may display less frequent or lower-intensity autistic behaviours than would typically be seen in an ASD population. The Toronto Alexithymia Scale (TAS), whose construct validity has recently been questioned, was used to measure alexithymia.
Conclusions: Since our participants were matched on alexithymia, and we identified that level of autistic traits (and not alexithymic traits) was a significant predictor of the accuracy of angry expression recognition at the normal level, we conclude that a difficulty with recognising angry expressions is relevant to autism and cannot be explained by alexithymia. Future research should elucidate why autistic individuals exhibit differences with angry expressions in particular.
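
The predictor analysis reported in the Results can be illustrated with a minimal regression sketch. This is not the authors' analysis script: the CSV file, the column names (aq_total for autistic traits, tas_total for alexithymia) and the ordinary-least-squares specification are assumptions for illustration only.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Assumed layout: one row per participant, with accuracy for angry expressions
# at the 100% speed/spatial level plus the candidate predictors.
df = pd.read_csv("emotion_recognition.csv")   # hypothetical file and columns

model = smf.ols(
    "angry_accuracy_100 ~ age + C(gender) + nonverbal_reasoning + aq_total + tas_total",
    data=df,
).fit()
print(model.summary())   # inspect which predictors account for anger accuracy
```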


2017 ◽  
Vol 29 (5) ◽  
pp. 1749-1761 ◽  
Author(s):  
Johanna Bick ◽  
Rhiannon Luyster ◽  
Nathan A. Fox ◽  
Charles H. Zeanah ◽  
Charles A. Nelson

Abstract We examined facial emotion recognition in 12-year-olds in a longitudinally followed sample of children with and without exposure to early life psychosocial deprivation (institutional care). Half of the institutionally reared children were randomized into foster care homes during the first years of life. Facial emotion recognition was examined in a behavioral task using morphed images. This same task had been administered when children were 8 years old. Neutral facial expressions were morphed with happy, sad, angry, and fearful emotional facial expressions, and children were asked to identify the emotion of each face, which varied in intensity. Consistent with our previous report, we show that some areas of emotion processing, involving the recognition of happy and fearful faces, are affected by early deprivation, whereas other areas, involving the recognition of sad and angry faces, appear to be unaffected. We also show that early intervention can have a lasting positive impact, normalizing developmental trajectories of processing negative emotions (fear) into the late childhood/preadolescent period.
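
Intensity-graded morphs of the kind described here are often approximated, at their simplest, by blending a neutral and an emotional image; the sketch below shows only that idea. True morphing software also warps facial geometry, and the file names here are hypothetical.

```python
import numpy as np
from PIL import Image

def morph(neutral_path, emotional_path, intensity):
    """Pixel-wise blend of a neutral and an emotional face; intensity in [0, 1]."""
    neutral = np.asarray(Image.open(neutral_path), dtype=float)
    emotional = np.asarray(Image.open(emotional_path), dtype=float)
    blended = (1.0 - intensity) * neutral + intensity * emotional
    return Image.fromarray(blended.astype(np.uint8))

# e.g. a 40%-intensity fearful face (hypothetical file names):
# morph("neutral.png", "fear.png", 0.4).save("fear_40.png")
```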


Author(s):  
Izabela Krejtz ◽  
Krzysztof Krejtz ◽  
Katarzyna Wisiecka ◽  
Marta Abramczyk ◽  
Michał Olszanowski ◽  
...  

Abstract The enhancement hypothesis suggests that deaf individuals are more vigilant to visual emotional cues than hearing individuals. The present eye-tracking study examined ambient–focal visual attention when encoding affect from dynamically changing emotional facial expressions. Deaf (n = 17) and hearing (n = 17) individuals watched emotional facial expressions that in 10-s animations morphed from a neutral expression to one of happiness, sadness, or anger. The task was to recognize emotion as quickly as possible. Deaf participants tended to be faster than hearing participants in affect recognition, but the groups did not differ in accuracy. In general, happy faces were more accurately and more quickly recognized than faces expressing anger or sadness. Both groups demonstrated longer average fixation duration when recognizing happiness in comparison to anger and sadness. Deaf individuals directed their first fixations less often to the mouth region than the hearing group. During the last stages of emotion recognition, deaf participants exhibited more focal viewing of happy faces than negative faces. This pattern was not observed among hearing individuals. The analysis of visual gaze dynamics, switching between ambient and focal attention, was useful in studying the depth of cognitive processing of emotional information among deaf and hearing individuals.
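
Ambient versus focal viewing of this kind is commonly quantified with a coefficient that contrasts each fixation's duration with the amplitude of the saccade that follows it (coefficient K in the eye-tracking literature). The sketch below assumes those two series have already been extracted from the gaze record and is illustrative only, not the authors' analysis code.

```python
import numpy as np

def coefficient_k(fix_durations, saccade_amplitudes):
    """Ambient/focal coefficient: z-scored fixation duration minus z-scored
    amplitude of the following saccade, averaged over fixations.

    fix_durations: durations of fixations 1..n-1 (e.g., in ms).
    saccade_amplitudes: amplitudes of the saccades that follow them (e.g., in deg).
    A positive mean suggests focal viewing; a negative mean suggests ambient viewing.
    """
    d = np.asarray(fix_durations, dtype=float)
    a = np.asarray(saccade_amplitudes, dtype=float)
    k = (d - d.mean()) / d.std() - (a - a.mean()) / a.std()
    return k.mean()
```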


2013 ◽  
Vol 16 ◽  
Author(s):  
Esther Lázaro ◽  
Imanol Amayra ◽  
Juan Francisco López-Paz ◽  
Amaia Jometón ◽  
Natalia Martín ◽  
...  

Abstract The assessment of facial expression is an important aspect of a clinical neurological examination, both as an indicator of a mood disorder and as a sign of neurological damage. To date, although studies have been conducted on certain psychosocial aspects of myasthenia, such as quality of life and anxiety, and on neuropsychological aspects such as memory, no studies have directly assessed facial emotion recognition accuracy. The aim of this study was to assess the facial emotion recognition accuracy (fear, surprise, sadness, happiness, anger, and disgust), empathy, and reaction time of patients with myasthenia. Thirty-five patients with myasthenia and 36 healthy controls were tested for their ability to differentiate emotional facial expressions. Participants were matched with respect to age, gender, and education level. Their ability to differentiate emotional facial expressions was evaluated using the computer-based program Feel Test. The data showed that myasthenic patients scored significantly lower (p < 0.05) than healthy controls on the total Feel score and on fear and surprise recognition, and showed longer reaction times. The findings suggest that the ability to recognize facial affect may be reduced in individuals with myasthenia.


2013 ◽  
Vol 8 (1) ◽  
pp. 75-93 ◽  
Author(s):  
Roy P.C. Kessels ◽  
Barbara Montagne ◽  
Angelique W. Hendriks ◽  
David I. Perrett ◽  
Edward H.F. de Haan

2021 ◽  
Vol 12 ◽  
Author(s):  
Agnes Bohne ◽  
Dag Nordahl ◽  
Åsne A. W. Lindahl ◽  
Pål Ulvenes ◽  
Catharina E. A. Wang ◽  
...  

Processing of emotional facial expressions is of great importance in interpersonal relationships. Aberrant engagement with facial expressions, particularly engagement with sad faces, loss of engagement with happy faces, and enhanced memory of sadness, have been found in depression. Since most studies used adult faces, we here examined whether such biases also occur in the processing of infant faces in those with depression or depressive symptoms. In study 1, we recruited 25 inpatient women with major depression and 25 matched controls. In study 2, we extracted a sample of expecting parents from the NorBaby study, where 29 reported elevated levels of depressive symptoms and 29 were matched controls. In both studies, we assessed attentional bias with a dot-probe task using happy, sad and neutral infant faces, and facial memory bias with a recognition task using happy, sad, angry, afraid, surprised, disgusted and neutral infant and adult faces. Participants also completed the Ruminative Responses Scale and the Beck Depression Inventory-II. In study 1, we found no group difference in either attention to or memory accuracy for emotional infant faces. Neither attention nor recognition was associated with rumination. In study 2, we found that the group with depressive symptoms disengaged more slowly than healthy controls from sad infant faces, and this was related to rumination. The results emphasize the importance of emotionally self-relevant material when examining cognitive processing in depression. Together, these studies demonstrate that a mood-congruent attentional bias to infant faces is present in expecting parents with depressive symptoms, but not in inpatients with Major Depressive Disorder who do not have younger children.
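
A dot-probe attentional-bias (or disengagement) index is typically the response-time difference between trials where the probe appears away from the emotional face and trials where it replaces it. The sketch below assumes trial-level data in a CSV with hypothetical column names; it is not the authors' analysis code.

```python
import pandas as pd

# Assumed layout: one row per dot-probe trial with columns
#   participant, emotion ("sad"/"happy"), congruent (1 = probe replaced the
#   emotional face, 0 = probe appeared at the other location), rt (ms), correct (1/0).
trials = pd.read_csv("dot_probe_trials.csv")     # hypothetical file and columns

valid = trials[trials["correct"] == 1]
mean_rt = (
    valid.groupby(["participant", "emotion", "congruent"])["rt"]
    .mean()
    .unstack("congruent")
)

# Positive bias = slower responses when the probe appears away from the
# emotional face, i.e. attention was captured by (or slow to disengage from) it.
bias = mean_rt[0] - mean_rt[1]
print(bias.groupby(level="emotion").describe())
```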


2018 ◽  
Vol 8 (12) ◽  
pp. 219 ◽  
Author(s):  
Mayra Gutiérrez-Muñoz ◽  
Martha Fajardo-Araujo ◽  
Erika González-Pérez ◽  
Victor Aguirre-Arzola ◽  
Silvia Solís-Ortiz

Polymorphisms of the estrogen receptor genes ESR1 and ESR2 have been linked with cognitive deficits and affective disorders. The effects of these genetic variants on emotional processing in females with low estrogen levels are not well known. The aim was to explore the impact of the ESR1 and ESR2 genes on responses in a facial emotion recognition task in females. Postmenopausal healthy female volunteers were genotyped for the XbaI and PvuII polymorphisms of ESR1 and the rs1256030 polymorphism of ESR2. The effect of these polymorphisms on the recognition of happy, sad, disgusted, angry, surprised, and fearful facial expressions was analyzed. Females carrying the P allele of the PvuII polymorphism or the X allele of the XbaI polymorphism of ESR1 recognized facial expressions of sadness more easily than women carrying the p allele or the x allele. They displayed higher accuracy, faster response times, more correct responses, and fewer omissions in completing the task, with a large effect size. Women carrying the C allele of ESR2 showed faster response times for recognizing facial expressions of anger. These findings link ESR1 and ESR2 polymorphisms to the facial recognition of negative emotions.
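
As a concrete illustration of this kind of genotype-stratified comparison, the sketch below contrasts carriers and non-carriers of the ESR1 PvuII P allele on sadness-recognition measures; the data file, column names and genotype coding are assumptions, not the authors' materials.

```python
import pandas as pd
from scipy import stats

df = pd.read_csv("facial_emotion_genotypes.csv")    # hypothetical file and columns

# Carrier status for the PvuII P allele of ESR1 (genotypes coded PP, Pp, pp assumed).
df["p_carrier"] = df["pvuii_genotype"].isin(["PP", "Pp"])

carriers = df[df["p_carrier"]]
non_carriers = df[~df["p_carrier"]]

for outcome in ["sadness_accuracy", "sadness_rt"]:
    t, p = stats.ttest_ind(carriers[outcome], non_carriers[outcome], equal_var=False)
    print(f"{outcome}: t = {t:.2f}, p = {p:.3f}")
```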

