Hybrid Fusion Based Approach for Multimodal Emotion Recognition with Insufficient Labeled Data

Author(s):  
Puneet Kumar ◽  
Vedanti Khokher ◽  
Yukti Gupta ◽  
Balasubramanian Raman
IEEE Access ◽  
2020 ◽  
Vol 8 ◽  
pp. 168865-168878
Author(s):  
Yucel Cimtay ◽  
Erhan Ekmekcioglu ◽  
Seyma Caglar-Ozhan

2021 ◽  
Vol 25 (4) ◽  
pp. 1031-1045
Author(s):  
Helang Lai ◽  
Keke Wu ◽  
Lingli Li

Emotion recognition in conversations is crucial to improving the overall experience of human-computer interaction. A promising direction in this field is to develop a model that can effectively extract adequate context for a test utterance. We introduce a novel model, termed hierarchical memory networks (HMN), to address the problem of recognizing utterance-level emotions. HMN divides the context into different aspects and employs different step lengths to represent the weights of these aspects. To model self-dependencies, HMN uses independent local memory networks for each aspect. To capture interpersonal dependencies, HMN employs global memory networks that integrate the local outputs into global memory stores. These stores generate contextual summaries and help to find the context utterance most emotionally relevant to the test utterance. With an attention-based multi-hop scheme, the stores are then iteratively merged with the test utterance using an addition operation. Experiments on the IEMOCAP dataset show that our model outperforms the compared methods in accuracy.
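The attention-based multi-hop merging described in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the dot-product scoring function, the number of hops, and the toy NumPy representations are all assumptions, since the abstract only specifies that a contextual summary is attended over memory and merged with the test utterance by addition at each hop.

```python
import numpy as np

def softmax(x):
    # numerically stable softmax over a 1-D score vector
    e = np.exp(x - x.max())
    return e / e.sum()

def multi_hop_attention(query, memory, hops=3):
    """Sketch of an attention-based multi-hop read over a context memory.

    query  : (d,)   representation of the test utterance
    memory : (n, d) representations of n context utterances (a memory store)

    At each hop, attention weights over the memory produce a contextual
    summary, which is merged into the query by addition (as in the
    abstract); the merged vector becomes the next hop's query.
    """
    q = query.copy()
    for _ in range(hops):
        scores = memory @ q        # relevance of each stored utterance (assumed dot-product scoring)
        weights = softmax(scores)  # attention distribution over the memory
        summary = weights @ memory # contextual summary of the memory store
        q = q + summary            # merge by addition; next-hop query
    return q

# Toy usage: 4 context utterances with hypothetical 8-dim representations.
rng = np.random.default_rng(0)
memory = rng.normal(size=(4, 8))
query = rng.normal(size=8)
out = multi_hop_attention(query, memory)
```

In a full model, the refined vector `out` would feed a classifier over emotion labels; here it simply demonstrates the iterative read-and-add loop.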


2018 ◽  
Vol E101.D (8) ◽  
pp. 2092-2100
Author(s):  
Nurul LUBIS ◽  
Dessi LESTARI ◽  
Sakriani SAKTI ◽  
Ayu PURWARIANTI ◽  
Satoshi NAKAMURA

2021 ◽  
Author(s):  
Maxime Montembeault ◽  
Estefania Brando ◽  
Kim Charest ◽  
Alexandra Tremblay ◽  
Élaine Roger ◽  
...  

Background. Studies suggest that emotion recognition and empathy are impaired in patients with MS (pwMS). Nonetheless, most studies of emotion recognition have used facial stimuli, are restricted to young samples, and rely on self-report assessments of empathy. The aims of this study are to determine the impact of MS and age on multimodal emotion recognition (facial emotions and vocal emotional bursts) and on socioemotional sensitivity (as reported by the participants and their informants). We also aim to investigate the associations between emotion recognition, socioemotional sensitivity, and cognitive measures. Methods. We recruited 13 young healthy controls (HC), 14 young pwMS, 14 elderly HC, and 15 elderly pwMS. They underwent a short neuropsychological battery and an experimental emotion recognition task including facial emotions and vocal emotional bursts. Both participants and their study informants completed the Revised Self-Monitoring Scale (RSMS) to assess the participant's socioemotional sensitivity. Results. There was a significant effect of age and group on recognition of both facial emotions and emotional vocal bursts, with HC performing significantly better than pwMS and young participants performing better than elderly participants (no interaction effect). The same effects were observed for self-reported socioemotional sensitivity. However, lower socioemotional sensitivity in pwMS was not reported by the informants. Finally, multimodal emotion recognition did not correlate with socioemotional sensitivity, but it did correlate with global cognitive severity. Conclusion. PwMS present with multimodal emotion perception deficits. Our results extend previous findings of decreased emotion perception and empathy to a group of elderly pwMS, in whom advancing age does not accentuate these deficits. However, the decreased socioemotional sensitivity reported by pwMS does not appear to be observed by their relatives, nor to correlate with their emotion perception impairments. Future studies should investigate the real-life impacts of emotion perception deficits in pwMS.

