Neurobehavioral correlates of impaired emotion recognition in pediatric PTSD

2021 ◽  
pp. 1-11
Author(s):  
Sara A. Heyn ◽  
Collin Schmit ◽  
Taylor J. Keding ◽  
Richard Wolf ◽  
Ryan J. Herringa

Abstract Despite broad evidence suggesting that adversity-exposed youth experience an impaired ability to recognize emotion in others, the underlying biological mechanisms remain elusive. This study uses a multimethod approach to target the neurological substrates of this phenomenon in a well-phenotyped sample of youth meeting diagnostic criteria for posttraumatic stress disorder (PTSD). Twenty-one PTSD-afflicted youth and 23 typically developing (TD) controls completed clinical interview schedules, an emotion recognition task with eye-tracking, and an implicit emotion processing task during functional magnetic resonance imaging (fMRI). PTSD was associated with decreased accuracy in identification of angry, disgust, and neutral faces as compared to TD youth. Of note, these impairments occurred despite the normal deployment of visual attention in youth with PTSD relative to TD youth. Correlation with a related fMRI task revealed a group by accuracy interaction for amygdala–hippocampus functional connectivity (FC) for angry expressions, where TD youth showed a positive relationship between anger accuracy and amygdala–hippocampus FC; this relationship was reversed in youth with PTSD. These findings are a novel characterization of impaired threat recognition within a well-phenotyped population of severe pediatric PTSD. Further, the differential amygdala–hippocampus FC identified in youth with PTSD may imply aberrant efficiency of emotional contextualization circuits.

2021 ◽  
pp. 147715352110026
Author(s):  
Y Mao ◽  
S Fotios

Obstacle detection and facial emotion recognition are two critical visual tasks for pedestrians. In previous studies, the effect of changes in lighting was tested for these as individual tasks, where the task to be performed next in a sequence was known. In natural situations, a pedestrian is required to attend to multiple tasks, perhaps simultaneously, or at least does not know which of several possible tasks would next require their attention. This multi-tasking might impair performance on any one task and affect evaluation of optimal lighting conditions. In two experiments, obstacle detection and facial emotion recognition tasks were performed in parallel under different illuminances. Comparison of these results with previous studies, where these same tasks were performed individually, suggests that multi-tasking impaired performance on the peripheral detection task but not the on-axis facial emotion recognition task.


2011 ◽  
Vol 198 (4) ◽  
pp. 302-308 ◽  
Author(s):  
Ian M. Anderson ◽  
Clare Shippen ◽  
Gabriella Juhasz ◽  
Diana Chase ◽  
Emma Thomas ◽  
...  

Background: Negative biases in emotional processing are well recognised in people who are currently depressed but are less well described in those with a history of depression, where such biases may contribute to vulnerability to relapse. Aims: To compare accuracy, discrimination and bias in face emotion recognition in those with current and remitted depression. Method: The sample comprised a control group (n = 101), a currently depressed group (n = 30) and a remitted depression group (n = 99). Participants provided valid data on a computerised face emotion recognition task following standardised assessment of diagnosis and mood symptoms. Results: In the control group women were more accurate in recognising emotions than men owing to greater discrimination. Among participants with depression, those in remission correctly identified more emotions than controls owing to increased response bias, whereas those currently depressed recognised fewer emotions owing to decreased discrimination. These effects were most marked for anger, fear and sadness but there was no significant emotion × group interaction, and a similar pattern tended to be seen for happiness although not for surprise or disgust. These differences were confined to participants who were antidepressant-free, with those taking antidepressants having similar results to the control group. Conclusions: Abnormalities in face emotion recognition differ between people with current depression and those in remission. Reduced discrimination in depressed participants may reflect withdrawal from the emotions of others, whereas the increased bias in those with a history of depression could contribute to vulnerability to relapse. The normal face emotion recognition seen in those taking medication may relate to the known effects of antidepressants on emotional processing and could contribute to their ability to protect against depressive relapse.


2014 ◽  
Vol 2014 ◽  
pp. 1-11 ◽  
Author(s):  
Dina Tell ◽  
Denise Davidson ◽  
Linda A. Camras

The effects of eye gaze direction and expression intensity on emotion recognition in children with autism disorder and typically developing children were investigated. Children with autism disorder and typically developing children identified happy and angry expressions equally well. Children with autism disorder, however, were less accurate in identifying fear expressions across intensities and eye gaze directions. Children with autism disorder rated expressions with direct eyes, and expressions at 50% intensity, as more intense than typically developing children did. A trend was also found for sad expressions, as children with autism disorder were less accurate in recognizing sadness at 100% intensity with direct eyes than typically developing children. Although the present research showed that children with autism disorder are sensitive to eye gaze direction, impairments in the recognition of fear, and possibly sadness, exist. Furthermore, children with autism disorder and typically developing children perceive the intensity of emotional expressions differently.


2022 ◽  
Vol 12 (2) ◽  
pp. 807
Author(s):  
Huafei Xiao ◽  
Wenbo Li ◽  
Guanzhong Zeng ◽  
Yingzhang Wu ◽  
Jiyong Xue ◽  
...  

With the development of intelligent automotive human-machine systems, driver emotion detection and recognition has become an emerging research topic. Facial expression-based emotion recognition approaches have achieved outstanding results on laboratory-controlled data. However, these studies cannot represent the environment of real driving situations. To address this, this paper proposes a facial expression-based on-road driver emotion recognition network called FERDERnet. This method divides the on-road driver facial expression recognition task into three modules: a face detection module that detects the driver’s face, an augmentation-based resampling module that performs data augmentation and resampling, and an emotion recognition module that adopts a deep convolutional neural network pre-trained on the FER and CK+ datasets and then fine-tuned as a backbone for driver emotion recognition. This method adopts five different backbone networks as well as an ensemble method. Furthermore, to evaluate the proposed method, this paper collected an on-road driver facial expression dataset, which contains various road scenarios and the corresponding driver’s facial expressions during the driving task, and performed experiments on it. Based on efficiency and accuracy, the proposed FERDERnet with Xception backbone was effective in identifying on-road driver facial expressions and obtained superior performance compared to the baseline networks and some state-of-the-art networks.
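The three-module design described above can be sketched in outline. This is an illustrative sketch only: the function names, the centre-crop face "detector", the naive oversampling strategy, and the score-averaging ensemble are assumptions chosen for exposition, not the authors' actual FERDERnet implementation (which uses trained face detectors and fine-tuned CNN backbones such as Xception).

```python
from collections import Counter

def detect_face(frame):
    """Face detection module (stub): crop the central region of a 2-D
    frame (list of rows), standing in for a trained face detector."""
    h, w = len(frame), len(frame[0])
    return [row[w // 4: 3 * w // 4] for row in frame[h // 4: 3 * h // 4]]

def augment_resample(faces, labels):
    """Augmentation-based resampling module (assumed strategy):
    oversample minority emotion classes until all classes are balanced."""
    counts = Counter(labels)
    target = max(counts.values())
    out_faces, out_labels = list(faces), list(labels)
    for cls, n in counts.items():
        idx = [i for i, lab in enumerate(labels) if lab == cls]
        for k in range(target - n):
            out_faces.append(faces[idx[k % len(idx)]])
            out_labels.append(cls)
    return out_faces, out_labels

def ensemble_predict(face, backbones):
    """Emotion recognition module: average per-class scores from several
    backbones (each modelled here as a scoring function) and return the
    index of the highest-scoring emotion class."""
    scores = [b(face) for b in backbones]
    n_cls = len(scores[0])
    avg = [sum(s[c] for s in scores) / len(scores) for c in range(n_cls)]
    return max(range(n_cls), key=avg.__getitem__)
```

In this sketch the ensemble simply averages class scores; the paper's ensemble over five backbones may combine predictions differently.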


2011 ◽  
Vol 42 (2) ◽  
pp. 419-426 ◽  
Author(s):  
E. Wingbermühle ◽  
J. I. M. Egger ◽  
W. M. A. Verhoeven ◽  
I. van der Burgt ◽  
R. P. C. Kessels

Background: Noonan syndrome (NS) is a common genetic disorder, characterized by short stature, facial dysmorphia, congenital heart defects and a mildly lowered IQ. Impairments in psychosocial functioning have often been suggested, without, however, systematic investigation in a clinical group. In this study, different aspects of affective processing, social cognition and behaviour, in addition to personal well-being, were assessed in a large group of patients with NS. Method: Forty adult patients with NS were compared with 40 healthy controls, matched with respect to age, sex, intelligence and education level. Facial emotion recognition was measured with the Emotion Recognition Task (ERT), alexithymia with both the 20-item Toronto Alexithymia Scale (TAS-20) and the Bermond–Vorst Alexithymia Questionnaire (BVAQ), and mentalizing with the Theory of Mind (ToM) test. The Symptom Checklist-90 Revised (SCL-90-R) and the Scale for Interpersonal Behaviour (SIB) were used to record aspects of psychological well-being and social interaction. Results: Patients showed higher levels of cognitive alexithymia than controls. They also experienced more social distress, but the frequency of engaging in social situations did not differ. Facial emotion recognition was only slightly impaired. Conclusions: Higher levels of alexithymia and social discomfort are part of the behavioural phenotype of NS. However, patients with NS have relatively intact perception of emotions in others and unimpaired mentalizing. These results provide insight into the underlying mechanisms of social daily life functioning in this patient group.


2018 ◽  
Vol 84 (2) ◽  
pp. 296-305 ◽  
Author(s):  
Stéphanie Val ◽  
Marian Poley ◽  
Krueger Anna ◽  
Gustavo Nino ◽  
Kristy Brown ◽  
...  

2021 ◽  
Author(s):  
Lisa S. Furlong ◽  
Susan L. Rossell ◽  
James A. Karantonis ◽  
Vanessa L. Cropley ◽  
Matthew Hughes ◽  
...  

2021 ◽  
Author(s):  
Maxime Montembeault ◽  
Estefania Brando ◽  
Kim Charest ◽  
Alexandra Tremblay ◽  
Élaine Roger ◽  
...  

Background. Studies suggest that emotion recognition and empathy are impaired in patients with multiple sclerosis (pwMS). Nonetheless, most studies of emotion recognition have used facial stimuli, are restricted to young samples, and rely on self-report assessments of empathy. The aims of this study are to determine the impact of MS and age on multimodal emotion recognition (facial emotions and vocal emotional bursts) and on socioemotional sensitivity (as reported by the participants and their informants). We also aim to investigate the associations between emotion recognition, socioemotional sensitivity, and cognitive measures. Methods. We recruited 13 young healthy controls (HC), 14 young pwMS, 14 elderly HC and 15 elderly pwMS. They underwent a short neuropsychological battery and an experimental emotion recognition task including facial emotions and vocal emotional bursts. Both participants and their study informants completed the Revised Self-Monitoring Scale (RSMS) to assess the participant’s socioemotional sensitivity. Results. There was a significant effect of age and group on recognition of both facial emotions and vocal emotional bursts, with HC performing significantly better than pwMS and young participants performing better than elderly participants (no interaction effect). The same effects were observed on self-reported socioemotional sensitivity. However, lower socioemotional sensitivity in pwMS was not reported by the informants. Finally, multimodal emotion recognition did not correlate with socioemotional sensitivity, but it correlated with global cognitive severity. Conclusion. PwMS present with multimodal emotion perception deficits. Our results extend previous findings of decreased emotion perception and empathy to a group of elderly pwMS, in which advancing age does not accentuate these deficits. However, the decreased socioemotional sensitivity reported by pwMS does not appear to be observed by their relatives, nor to correlate with their emotion perception impairments. Future studies should investigate the real-life impacts of emotion perception deficits in pwMS.


2019 ◽  
Vol 10 (1) ◽  
Author(s):  
Florina Uzefovsky ◽  
Richard A. I. Bethlehem ◽  
Simone Shamay-Tsoory ◽  
Amber Ruigrok ◽  
Rosemary Holt ◽  
...  
