Emotion Perception
Recently Published Documents

TOTAL DOCUMENTS: 557 (169 in the last five years)
H-INDEX: 48 (4 in the last five years)

2022, Vol 96, pp. 104406
Author(s): Tao Yang, Lulu Zhang, Guangzheng Xu, Zeyun Yang, Yifan Luo, ...

Languages, 2022, Vol 7 (1), pp. 12
Author(s): Peiyao Chen, Ashley Chung-Fat-Yim, Viorica Marian

Emotion perception frequently involves the integration of visual and auditory information. During multisensory emotion perception, the attention devoted to each modality can be measured by comparing trials in which the facial expression and speech input exhibit the same emotion (congruent) with trials in which they exhibit different emotions (incongruent); the resulting congruency effect reveals which modality has the strongest influence. Previous cross-cultural studies have found that individuals from Western cultures are more distracted by information in the visual modality (i.e., visual interference), whereas individuals from Eastern cultures are more distracted by information in the auditory modality (i.e., auditory interference). These results suggest that culture shapes modality interference in multisensory emotion perception. It is unclear, however, how emotion perception is influenced by cultural immersion and exposure following migration to a new country with distinct social norms. In the present study, we investigated how the amount of daily exposure to a new culture and the length of immersion impact multisensory emotion perception in Chinese-English bilinguals who moved from China to the United States. In an emotion recognition task, participants viewed facial expressions and heard emotional but meaningless speech either from their previous Eastern culture (i.e., Asian face, Mandarin speech) or from their new Western culture (i.e., Caucasian face, English speech) and were asked to identify the emotion from either the face or the voice while ignoring the other modality. Analyses of daily cultural exposure revealed that bilinguals with low daily exposure to U.S. culture experienced greater interference from the auditory modality, whereas bilinguals with high daily exposure to U.S. culture experienced greater interference from the visual modality. These results demonstrate that everyday exposure to new cultural norms increases the likelihood of showing the modality interference pattern that is more common in the new culture. Analyses of immersion duration revealed that bilinguals who had spent more time in the United States were equally distracted by faces and voices, whereas bilinguals who had spent less time in the United States experienced greater visual interference when evaluating emotional information from the West, possibly due to over-compensation when evaluating emotional information from the less familiar culture. These findings suggest that the amount of daily exposure to a new culture and the length of cultural immersion influence multisensory emotion perception in bilingual immigrants. While increased daily exposure to the new culture aids adaptation to new cultural norms, increased length of cultural immersion leads to similar patterns of modality interference across the old and new cultures. We conclude that cultural experience shapes the way we perceive and evaluate the emotions of others.
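As a rough illustration of the congruency-effect logic described above, the sketch below computes modality interference from trial-level data. The DataFrame columns, the use of reaction times, and the function name are hypothetical assumptions for illustration; the study's actual analysis may differ.

```python
# Hypothetical sketch: quantifying modality interference as the
# congruency effect (incongruent minus congruent trials) per attended
# modality. Column names and the use of reaction times are assumptions,
# not the authors' actual pipeline.
import pandas as pd

def congruency_effect(trials: pd.DataFrame, attended: str) -> float:
    """Mean RT on incongruent trials minus mean RT on congruent trials,
    restricted to trials where `attended` ('face' or 'voice') was the
    to-be-judged modality."""
    subset = trials[trials["attended_modality"] == attended]
    incongruent = subset.loc[~subset["congruent"], "rt"].mean()
    congruent = subset.loc[subset["congruent"], "rt"].mean()
    return incongruent - congruent

# A larger effect when attending the voice means the face interfered
# more (visual interference); a larger effect when attending the face
# means the voice interfered more (auditory interference):
# visual_interference = congruency_effect(trials, attended="voice")
# auditory_interference = congruency_effect(trials, attended="face")
```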


Author(s): Maxime Montembeault, Estefania Brando, Kim Charest, Alexandra Tremblay, Élaine Roger, ...

2021, Vol 3
Author(s): Jingyao Wu, Ting Dang, Vidhyasaharan Sethu, Eliathamby Ambikairajah

People perceive emotions via multiple cues, predominantly speech and visual cues, and many emotion recognition systems accordingly use both audio and visual input. Moreover, static aspects of emotion (e.g., whether a speaker's arousal is high or low) and dynamic aspects of emotion (e.g., whether the speaker is becoming more aroused) may be conveyed by different expressive cues, and these two aspects are integrated to provide a unified sense of emotion state. However, existing multimodal systems focus on only a single aspect of emotion perception, and the contributions of different modalities to modeling the static and dynamic aspects are not well explored. In this paper, we investigate the relative salience of the audio and video modalities for emotion state prediction and emotion change prediction using a Multimodal Markovian affect model. Experiments conducted on the RECOLA database showed that audio is better at modeling the emotion state of arousal and video is better for the emotion state of valence, whereas audio outperforms video in modeling emotion changes for both arousal and valence.
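The abstract does not give the model's internals, but a minimal sketch of a first-order Markovian affect model with weighted audio-video fusion conveys the general idea. The two-state discretization, the transition probabilities, and the fusion weight below are illustrative assumptions, not the authors' specification.

```python
# Minimal sketch of a first-order Markovian affect model with two
# observation streams. All numbers here are illustrative assumptions.
import numpy as np

states = ["low", "high"]              # discretized arousal (or valence)
transition = np.array([[0.9, 0.1],    # P(next | current): emotion
                       [0.1, 0.9]])   # states tend to persist over time

def fused_likelihood(p_audio, p_video, w_audio=0.6):
    """Weighted fusion of per-modality state likelihoods; the relative
    weight stands in for each modality's salience for this aspect."""
    return w_audio * p_audio + (1.0 - w_audio) * p_video

def forward_step(belief, p_audio, p_video, w_audio=0.6):
    """One filtering step: propagate the belief through the transition
    model, then reweight by the fused observation likelihood."""
    predicted = transition.T @ belief
    posterior = predicted * fused_likelihood(p_audio, p_video, w_audio)
    return posterior / posterior.sum()

# Emotion *change* can then be read off the belief trajectory, e.g. a
# shift of probability mass from "low" to "high" arousal across steps.
belief = np.array([0.5, 0.5])
belief = forward_step(belief, p_audio=np.array([0.2, 0.8]),
                      p_video=np.array([0.4, 0.6]))
print(dict(zip(states, belief)))
```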


2021, Vol 295, pp. 717-723
Author(s): Amy T. Peters, Xinguo Ren, Katie L. Bessette, Nevita George, Leah R. Kling, ...

2021, Vol 162, pp. 108052
Author(s): Nicholas E. Souter, Kristen A. Lindquist, Elizabeth Jefferies

PLoS ONE, 2021, Vol 16 (10), pp. e0258470
Author(s): Elyssa M. Barrick, Mark A. Thornton, Diana I. Tamir

Faces are one of the key ways that we obtain social information about others. They allow people to identify individuals, understand conversational cues, and make judgements about others' mental states. When the COVID-19 pandemic hit the United States, widespread mask-wearing practices were implemented, causing a shift in the way Americans typically interact. This introduction of masks into social exchanges posed a potential challenge: how would people make these important inferences about others when a large source of information was no longer available? We conducted two studies that investigated the impact of mask exposure on emotion perception. In particular, we measured how participants used facial landmarks (visual cues) and the expressed valence and arousal (affective cues) to make similarity judgements about pairs of emotion faces. Study 1 found that in August 2020, participants with higher levels of mask exposure used cues from the eyes to a greater extent when judging emotion similarity than participants with less mask exposure. Study 2 measured participants' emotion perception in both April and September 2020 (before and after widespread mask adoption) in the same group of participants to examine changes in the use of facial cues over time. Results revealed an overall increase in the use of visual cues from April to September. Further, as mask exposure increased, people with the most social interaction showed the largest increase in the use of visual facial cues. These results provide evidence that a shift has occurred in how people process faces: the more people interact with others who are wearing masks, the more they learn to focus on visual cues from the eye area of the face.
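One plausible way to quantify cue use in such a similarity-judgement task is to regress each pair's judged similarity on its distance in visual (landmark) space and in affective (valence-arousal) space. The sketch below does this with ordinary least squares; the function and its inputs are assumptions for illustration, not the paper's reported pipeline.

```python
# Illustrative sketch: estimating how much visual and affective cues
# each predict judged similarity between pairs of emotion faces.
# Feature names and the OLS approach are assumptions, not the authors'
# exact analysis.
import numpy as np

def cue_weights(landmark_dist, affect_dist, judged_similarity):
    """Regress judged similarity on landmark-space and valence-arousal
    distances; a more negative weight means that cue type drove
    dissimilarity judgements more strongly."""
    X = np.column_stack([np.ones_like(landmark_dist),
                         landmark_dist, affect_dist])
    beta, *_ = np.linalg.lstsq(X, judged_similarity, rcond=None)
    return {"visual": beta[1], "affective": beta[2]}
```

Comparing these weights across groups (high versus low mask exposure) or time points (April versus September) would then index the shift toward visual cues that the abstract describes.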

