Hierarchical Network with Label Embedding for Contextual Emotion Recognition

Research ◽  
2021 ◽  
Vol 2021 ◽  
pp. 1-9
Author(s):  
Jiawen Deng ◽  
Fuji Ren

Emotion recognition has been widely used in applications such as mental health monitoring and emotional management. Usually, emotion recognition is regarded as a text classification task. However, emotion recognition is a more complex problem, and the relations among the emotions expressed in a text are nonnegligible. In this paper, a hierarchical model with label embedding is proposed for contextual emotion recognition. Specifically, a hierarchical model is used to learn the emotional representation of a given sentence from its contextual information. To support emotion correlation-based recognition, a label embedding matrix is trained by joint learning, which contributes to the final prediction. Comparison experiments are conducted on the Chinese emotional corpus RenCECps, and the experimental results indicate that our approach performs well on the textual emotion recognition task.
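The label-embedding idea in this abstract can be illustrated with a minimal sketch. Everything below is an assumption for illustration, not the paper's implementation: the dimensions, the cosine-similarity correlation matrix, and the function names are hypothetical, and the sentence representation stands in for the output of the hierarchical encoder.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (not from the paper): 8 emotion labels and
# 16-dim sentence representations produced by the hierarchical encoder.
n_labels, hidden = 8, 16

# Jointly learned label embedding matrix: one row per emotion.
# Row-wise similarities play the role of emotion correlations.
label_emb = rng.normal(size=(n_labels, hidden))

def predict_emotions(sentence_repr, threshold=0.0):
    """Score each emotion by compatibility between the contextual sentence
    representation and its label embedding, then let each label borrow
    evidence from correlated labels before thresholding."""
    # Direct compatibility between the sentence and each label.
    direct = label_emb @ sentence_repr                      # shape (n_labels,)
    # Label-label correlation matrix from L2-normalized embeddings.
    norm = label_emb / np.linalg.norm(label_emb, axis=1, keepdims=True)
    corr = norm @ norm.T                                    # (n_labels, n_labels)
    # Correlation-aware scores: related labels reinforce one another.
    scores = corr @ direct
    return scores > threshold                               # multi-label decision

sentence = rng.normal(size=hidden)
print(predict_emotions(sentence))
```

In this sketch the correlation step is what distinguishes label-embedding recognition from plain per-label classification: a sentence scored highly for one emotion also raises the scores of emotions whose embeddings lie nearby.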

2021 ◽  
pp. 147715352110026
Author(s):  
Y Mao ◽  
S Fotios

Obstacle detection and facial emotion recognition are two critical visual tasks for pedestrians. In previous studies, the effect of changes in lighting was tested for these as individual tasks, where the task to be performed next in a sequence was known. In natural situations, a pedestrian is required to attend to multiple tasks, perhaps simultaneously, or at least does not know which of several possible tasks would next require their attention. This multi-tasking might impair performance on any one task and affect evaluation of optimal lighting conditions. In two experiments, obstacle detection and facial emotion recognition tasks were performed in parallel under different illuminances. Comparison of these results with previous studies, where these same tasks were performed individually, suggests that multi-tasking impaired performance on the peripheral detection task but not the on-axis facial emotion recognition task.


2021 ◽  
pp. 1-11
Author(s):  
Sara A. Heyn ◽  
Collin Schmit ◽  
Taylor J. Keding ◽  
Richard Wolf ◽  
Ryan J. Herringa

Abstract Despite broad evidence suggesting that adversity-exposed youth experience an impaired ability to recognize emotion in others, the underlying biological mechanisms remain elusive. This study uses a multimethod approach to target the neurological substrates of this phenomenon in a well-phenotyped sample of youth meeting diagnostic criteria for posttraumatic stress disorder (PTSD). Twenty-one PTSD-afflicted youth and 23 typically developing (TD) controls completed clinical interview schedules, an emotion recognition task with eye-tracking, and an implicit emotion processing task during functional magnetic resonance imaging (fMRI). PTSD was associated with decreased accuracy in identification of angry, disgust, and neutral faces as compared to TD youth. Of note, these impairments occurred despite the normal deployment of visual attention in youth with PTSD relative to TD youth. Correlation with a related fMRI task revealed a group by accuracy interaction for amygdala–hippocampus functional connectivity (FC) for angry expressions, where TD youth showed a positive relationship between anger accuracy and amygdala–hippocampus FC; this relationship was reversed in youth with PTSD. These findings are a novel characterization of impaired threat recognition within a well-phenotyped population of severe pediatric PTSD. Further, the differential amygdala–hippocampus FC identified in youth with PTSD may imply aberrant efficiency of emotional contextualization circuits.


2011 ◽  
Vol 198 (4) ◽  
pp. 302-308 ◽  
Author(s):  
Ian M. Anderson ◽  
Clare Shippen ◽  
Gabriella Juhasz ◽  
Diana Chase ◽  
Emma Thomas ◽  
...  

Background: Negative biases in emotional processing are well recognised in people who are currently depressed but are less well described in those with a history of depression, where such biases may contribute to vulnerability to relapse. Aims: To compare accuracy, discrimination and bias in face emotion recognition in those with current and remitted depression. Method: The sample comprised a control group (n = 101), a currently depressed group (n = 30) and a remitted depression group (n = 99). Participants provided valid data after receiving a computerised face emotion recognition task following standardised assessment of diagnosis and mood symptoms. Results: In the control group women were more accurate in recognising emotions than men owing to greater discrimination. Among participants with depression, those in remission correctly identified more emotions than controls owing to increased response bias, whereas those currently depressed recognised fewer emotions owing to decreased discrimination. These effects were most marked for anger, fear and sadness but there was no significant emotion × group interaction, and a similar pattern tended to be seen for happiness although not for surprise or disgust. These differences were confined to participants who were antidepressant-free, with those taking antidepressants having similar results to the control group. Conclusions: Abnormalities in face emotion recognition differ between people with current depression and those in remission. Reduced discrimination in depressed participants may reflect withdrawal from the emotions of others, whereas the increased bias in those with a history of depression could contribute to vulnerability to relapse. The normal face emotion recognition seen in those taking medication may relate to the known effects of antidepressants on emotional processing and could contribute to their ability to protect against depressive relapse.


2022 ◽  
Vol 12 (2) ◽  
pp. 807
Author(s):  
Huafei Xiao ◽  
Wenbo Li ◽  
Guanzhong Zeng ◽  
Yingzhang Wu ◽  
Jiyong Xue ◽  
...  

With the development of intelligent automotive human-machine systems, driver emotion detection and recognition has become an emerging research topic. Facial expression-based emotion recognition approaches have achieved outstanding results on laboratory-controlled data. However, these studies cannot represent the environment of real driving situations. To address this, this paper proposes a facial expression-based on-road driver emotion recognition network called FERDERnet. This method divides the on-road driver facial expression recognition task into three modules: a face detection module that detects the driver’s face, an augmentation-based resampling module that performs data augmentation and resampling, and an emotion recognition module that adopts a deep convolutional neural network pre-trained on FER and CK+ datasets and then fine-tuned as a backbone for driver emotion recognition. This method adopts five different backbone networks as well as an ensemble method. Furthermore, to evaluate the proposed method, this paper collected an on-road driver facial expression dataset, which contains various road scenarios and the corresponding driver’s facial expressions during the driving task. Experiments were performed on this dataset. Based on efficiency and accuracy, the proposed FERDERnet with Xception backbone was effective in identifying on-road driver facial expressions and obtained superior performance compared to the baseline networks and some state-of-the-art networks.
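The three-module decomposition described in this abstract can be sketched as a simple pipeline. All module internals below are placeholders, not the paper's implementation: in FERDERnet the detector is a face detector, the resampler performs augmentation-based class rebalancing, and the backbone is a fine-tuned CNN (e.g. Xception); the function names and label set here are hypothetical.

```python
from collections import Counter
import random

# Illustrative label set, not the paper's taxonomy.
EMOTIONS = ["anger", "happiness", "neutral", "sadness"]

def detect_face(frame):
    # Placeholder: a real face detection module would return a cropped face region.
    return {"face": frame}

def resample(samples):
    """Placeholder augmentation-based resampling: duplicate minority-class
    samples until every class reaches the majority-class count. A real
    module would generate augmented variants rather than copies."""
    counts = Counter(label for _, label in samples)
    target = max(counts.values())
    out = list(samples)
    for label, count in counts.items():
        pool = [s for s in samples if s[1] == label]
        out.extend(random.choices(pool, k=target - count))
    return out

def recognize(face):
    # Placeholder backbone: a real module would run a fine-tuned CNN
    # over the cropped face and return the predicted emotion.
    return random.choice(EMOTIONS)

def ferder_pipeline(frame):
    # Inference path: detect the face, then classify its expression.
    return recognize(detect_face(frame)["face"])
```

The resampling step runs at training time on the labeled dataset, while the detect-then-recognize path is the inference pipeline; keeping the three concerns in separate modules is what lets the paper swap in five different backbones and ensemble them.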


2011 ◽  
Vol 42 (2) ◽  
pp. 419-426 ◽  
Author(s):  
E. Wingbermühle ◽  
J. I. M. Egger ◽  
W. M. A. Verhoeven ◽  
I. van der Burgt ◽  
R. P. C. Kessels

Background: Noonan syndrome (NS) is a common genetic disorder, characterized by short stature, facial dysmorphia, congenital heart defects and a mildly lowered IQ. Impairments in psychosocial functioning have often been suggested, without, however, systematic investigation in a clinical group. In this study, different aspects of affective processing, social cognition and behaviour, in addition to personal well-being, were assessed in a large group of patients with NS. Method: Forty adult patients with NS were compared with 40 healthy controls, matched with respect to age, sex, intelligence and education level. Facial emotion recognition was measured with the Emotion Recognition Task (ERT), alexithymia with both the 20-item Toronto Alexithymia Scale (TAS-20) and the Bermond–Vorst Alexithymia Questionnaire (BVAQ), and mentalizing with the Theory of Mind (ToM) test. The Symptom Checklist-90 Revised (SCL-90-R) and the Scale for Interpersonal Behaviour (SIB) were used to record aspects of psychological well-being and social interaction. Results: Patients showed higher levels of cognitive alexithymia than controls. They also experienced more social distress, but the frequency of engaging in social situations did not differ. Facial emotion recognition was only slightly impaired. Conclusions: Higher levels of alexithymia and social discomfort are part of the behavioural phenotype of NS. However, patients with NS have relatively intact perception of emotions in others and unimpaired mentalizing. These results provide insight into the underlying mechanisms of social daily life functioning in this patient group.


2021 ◽  
Author(s):  
Maxime Montembeault ◽  
Estefania Brando ◽  
Kim Charest ◽  
Alexandra Tremblay ◽  
Élaine Roger ◽  
...  

Background. Studies suggest that emotion recognition and empathy are impaired in patients with multiple sclerosis (pwMS). Nonetheless, most studies of emotion recognition have used facial stimuli, are restricted to young samples, and rely on self-report assessments of empathy. The aims of this study are to determine the impact of MS and age on multimodal emotion recognition (facial emotions and vocal emotional bursts) and on socioemotional sensitivity (as reported by the participants and their informants). We also aim to investigate the associations between emotion recognition, socioemotional sensitivity, and cognitive measures. Methods. We recruited 13 young healthy controls (HC), 14 young pwMS, 14 elderly HC and 15 elderly pwMS. They underwent a short neuropsychological battery and an experimental emotion recognition task including facial emotions and vocal emotional bursts. Both participants and their study informants completed the Revised Self-Monitoring Scale (RSMS) to assess the participant’s socioemotional sensitivity. Results. There was a significant effect of age and group on recognition of both facial emotions and emotional vocal bursts, HC performing significantly better than pwMS, and young participants performing better than elderly participants (no interaction effect). The same effects were observed on self-reported socioemotional sensitivity. However, lower socioemotional sensitivity in pwMS was not reported by the informants. Finally, multimodal emotion recognition did not correlate with socioemotional sensitivity, but it correlated with global cognitive severity. Conclusion. PwMS present with multimodal emotion perception deficits. Our results extend previous findings of decreased emotion perception and empathy to a group of elderly pwMS, in which advancing age does not accentuate these deficits.
However, the decreased socioemotional sensitivity reported by pwMS does not appear to be observed by their relatives, nor to correlate with their emotion perception impairments. Future studies should investigate the real-life impacts of emotion perception deficits in pwMS.


2019 ◽  
Vol 10 (1) ◽  
Author(s):  
Florina Uzefovsky ◽  
Richard A. I. Bethlehem ◽  
Simone Shamay-Tsoory ◽  
Amber Ruigrok ◽  
Rosemary Holt ◽  
...  

2019 ◽  
Vol 25 (08) ◽  
pp. 884-889 ◽  
Author(s):  
Sally A. Grace ◽  
Wei Lin Toh ◽  
Ben Buchanan ◽  
David J. Castle ◽  
Susan L. Rossell

Abstract Objectives: Patients with body dysmorphic disorder (BDD) have difficulty in recognising facial emotions, and there is evidence to suggest that there is a specific deficit in identifying negative facial emotions, such as sadness and anger. Methods: This study investigated facial emotion recognition in 19 individuals with BDD compared with 21 healthy control participants who completed a facial emotion recognition task, in which they were asked to identify emotional expressions portrayed in neutral, happy, sad, fearful, or angry faces. Results: Compared to the healthy control participants, the BDD patients were generally less accurate in identifying all facial emotions but showed specific deficits for negative emotions. The BDD group made significantly more errors when identifying neutral, angry, and sad faces than healthy controls; and were significantly slower at identifying neutral, angry, and happy faces. Conclusions: These findings add to previous face-processing literature in BDD, suggesting deficits in identifying negative facial emotions. There are treatment implications as future interventions would do well to target such deficits.


2020 ◽  
Vol 11 ◽  
Author(s):  
Elisabet Serrat ◽  
Anna Amadó ◽  
Carles Rostan ◽  
Beatriz Caparrós ◽  
Francesc Sidera

This study aims to further understand children’s capacity to identify and reason about pretend emotions by analyzing which sources of information they take into account when interpreting emotions simulated in pretend play contexts. A total of 79 children aged 3 to 8 participated in the final sample of the study. They were divided into a young group (ages 3 to 5) and an older group (ages 6 to 8). The children were administered a facial emotion recognition task, a pretend emotions task, and a non-verbal cognitive ability test. In the pretend emotions task, the children were asked whether the protagonist of silent videos, who was displaying pretend emotions (pretend anger and pretend sadness), was displaying a real or a pretend emotion, and to justify their answer. The results show significant differences in the children’s capacity to identify and justify pretend emotions according to age and type of emotion. The data suggest that young children recognize pretend sadness, but have more difficulty detecting pretend anger. In addition, children seem to find facial information more useful for detecting pretend sadness than pretend anger, and they more often interpret the emotional expression of the characters in terms of pretend play. This research provides new data about the recognition of negative emotional expressions of sadness and anger and the type of information children take into account to justify their interpretation of pretend emotions, which draws not only on emotional expression but also on contextual information.

