Dynamic human and avatar facial expressions elicit differential brain responses

2020 ◽ Vol 15 (3) ◽ pp. 303–317
Author(s): Lorena C. Kegel, Peter Brugger, Sascha Frühholz, Thomas Grunwald, Peter Hilfiker, et al.

Computer-generated characters, so-called avatars, are widely used in advertising, entertainment, and human–computer interaction, or as research tools to investigate human emotion perception. However, brain responses to avatar and human faces have scarcely been studied to date. As such, it remains unclear whether dynamic facial expressions of avatars evoke different brain responses than dynamic facial expressions of humans. In this study, we designed anthropomorphic avatars animated with motion tracking and tested whether the human brain processes fearful and neutral expressions in human and avatar faces differently. Our fMRI results showed that fearful human expressions evoked stronger responses than fearful avatar expressions in the ventral anterior and posterior cingulate gyrus, the anterior insula, the anterior and posterior superior temporal sulcus, and the inferior frontal gyrus. Fearful expressions in human and avatar faces evoked similar responses in the amygdala. We did not find different responses to neutral human and avatar expressions. Our results highlight differences, but also similarities, in the processing of fearful human and fearful avatar expressions, even when the avatars are designed to be highly anthropomorphic and animated with motion tracking. This has important consequences for research using dynamic avatars, especially when the processes under investigation involve cortical as well as subcortical regions.
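As a rough illustration of the kind of second-level, region-of-interest comparison summarized above (fearful human vs. fearful avatar responses), the following Python sketch runs a paired t-test over hypothetical per-subject beta estimates. The subject count, region label, and values are illustrative assumptions, not the study's data or pipeline.

```python
import numpy as np
from scipy import stats

# Hypothetical per-subject ROI-averaged beta estimates (e.g., anterior insula)
# for fearful human vs. fearful avatar expressions; values are illustrative.
rng = np.random.default_rng(0)
n_subjects = 20
beta_human_fear = rng.normal(loc=0.8, scale=0.3, size=n_subjects)
beta_avatar_fear = rng.normal(loc=0.5, scale=0.3, size=n_subjects)

# Paired t-test across subjects for the human > avatar (fear) contrast
t, p = stats.ttest_rel(beta_human_fear, beta_avatar_fear)
print(f"human vs. avatar (fear): t({n_subjects - 1}) = {t:.2f}, p = {p:.4f}")
```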

Pain ◽ 2006 ◽ Vol 126 (1) ◽ pp. 309–318
Author(s): Daniela Simon, Kenneth D. Craig, Wolfgang H.R. Miltner, Pierre Rainville

2021 ◽ Vol 15
Author(s): Teresa Sollfrank, Oona Kohnen, Peter Hilfiker, Lorena C. Kegel, Hennric Jokeit, et al.

This study aimed to examine whether the cortical processing of emotional faces is modulated by the computerization of face stimuli ("avatars") in a group of 25 healthy participants. Subjects passively viewed 128 static and dynamic facial expressions of female and male actors and their respective avatars in neutral or fearful conditions. Event-related potentials (ERPs), as well as alpha and theta event-related synchronization and desynchronization (ERD/ERS), were derived from the EEG recorded during the task. All ERP features, except for the very early N100, differed in their response to avatar and actor faces. Whereas the N170 showed differences only for the neutral avatar condition, later potentials (N300 and LPP) differed in both emotional conditions (neutral and fear) and between the presented agents (actor and avatar). In addition, we found that avatar faces elicited significantly stronger reactions than actor faces in theta and alpha oscillations. Theta frequencies in particular responded specifically to visual emotional stimulation and were sensitive to the emotional content of the face, whereas the alpha frequency was modulated by all stimulus types. We conclude that computerized avatar faces affect both ERP components and ERD/ERS and evoke neural effects that differ from those elicited by real faces. This was true even though the avatars were replicas of the human faces and contained similar characteristics in their expressions.
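The ERP and theta/alpha ERD/ERS measures described above can be computed with standard EEG tooling. Below is a minimal sketch using MNE-Python on synthetic epochs; the channel names, epoch counts, frequency bands, and baseline window are illustrative assumptions, not the study's exact pipeline.

```python
import numpy as np
import mne

# Synthetic epoched EEG: 40 epochs x 3 channels x 2 s at 250 Hz (stand-in for
# the epochs one would extract around stimulus onset in the real recording).
sfreq = 250.0
info = mne.create_info(ch_names=["Cz", "Pz", "Oz"], sfreq=sfreq, ch_types="eeg")
rng = np.random.default_rng(0)
data = rng.standard_normal((40, 3, 500)) * 1e-6
epochs = mne.EpochsArray(data, info, tmin=-0.5)

# ERP: average across epochs (N100/N170/N300/LPP would be read off this waveform)
erp = epochs.average()

# Time-frequency power via Morlet wavelets covering theta (4-7 Hz) and alpha (8-13 Hz)
freqs = np.arange(4.0, 14.0, 1.0)
power = mne.time_frequency.tfr_morlet(epochs, freqs=freqs,
                                      n_cycles=freqs / 2.0, return_itc=False)

# ERD/ERS: percent power change relative to the pre-stimulus baseline
power.apply_baseline(baseline=(-0.5, 0.0), mode="percent")
theta_erd_ers = power.data[:, freqs <= 7, :].mean(axis=1)  # channels x times
alpha_erd_ers = power.data[:, freqs >= 8, :].mean(axis=1)
```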


2012 ◽ Vol 24 (2) ◽ pp. 507–520
Author(s): Elaine Foley, Gina Rippon, Ngoc Jade Thai, Olivia Longe, Carl Senior

Very little is known about the neural structures involved in the perception of realistic dynamic facial expressions. In the present study, a unique set of naturalistic dynamic facial emotional expressions was created. Through fMRI and connectivity analysis, a dynamic face perception network was identified that extends Haxby et al.'s [Haxby, J. V., Hoffman, E. A., & Gobbini, M. I. The distributed human neural system for face perception. Trends in Cognitive Sciences, 4, 223–233, 2000] distributed neural system for face perception. This network includes early visual regions, such as the inferior occipital gyrus, identified as insensitive to motion or affect but sensitive to the visual stimulus; the STS, identified as specifically sensitive to motion; and the amygdala, recruited to process affect. Measures of effective connectivity between these regions revealed that dynamic facial stimuli were associated with specific increases in connectivity between early visual regions, such as the inferior occipital gyrus, and the STS, along with coupling between the STS and the amygdala, as well as the inferior frontal gyrus. These findings support the presence of a distributed network of cortical regions that mediates the perception of different dynamic facial expressions.
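The condition-dependent coupling reported here (stronger inter-regional connectivity for dynamic faces) can be illustrated with a very simple regression-based index. The sketch below is not the study's effective-connectivity method; it only shows how coupling between two ROI time series (e.g., inferior occipital gyrus and STS) might be compared across conditions, with all time series synthesized for the example.

```python
import numpy as np
from scipy import stats

# Synthetic ROI time series per condition (dynamic vs. static faces).
rng = np.random.default_rng(1)
n_tp = 200
iog_dyn = rng.standard_normal(n_tp)
sts_dyn = 0.6 * iog_dyn + rng.standard_normal(n_tp)    # stronger coupling
iog_stat = rng.standard_normal(n_tp)
sts_stat = 0.2 * iog_stat + rng.standard_normal(n_tp)  # weaker coupling

def coupling(x, y):
    """Slope of y regressed on x: a crude index of inter-regional coupling."""
    slope, _, r, p, _ = stats.linregress(x, y)
    return slope, r

b_dyn, r_dyn = coupling(iog_dyn, sts_dyn)
b_stat, r_stat = coupling(iog_stat, sts_stat)
print(f"IOG->STS coupling  dynamic: b={b_dyn:.2f} (r={r_dyn:.2f})  "
      f"static: b={b_stat:.2f} (r={r_stat:.2f})")
```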


i-Perception ◽ 2018 ◽ Vol 9 (4) ◽ pp. 204166951878652
Author(s): Leonor Philip, Jean-Claude Martin, Céline Clavel

Facial expressions of emotion provide relevant cues for understanding social interactions and the affective processes involved in emotion perception. Virtual human faces are useful for conducting controlled experiments. However, little is known regarding possible differences between the physiological responses elicited by virtual versus real human facial expressions. The aim of the current study was to determine whether virtual and real emotional faces elicit the same rapid facial reactions during the perception of facial expressions of joy, anger, and sadness. Facial electromyography (corrugator supercilii, zygomaticus major, and depressor anguli oris) was recorded in 30 participants during the presentation of dynamic or static, virtual or real faces. For the perception of dynamic facial expressions of joy and anger, analyses of the electromyography data revealed that rapid facial reactions were stronger when participants were presented with real faces than with virtual faces. These results suggest that the processes underlying the perception of virtual versus real emotional faces might differ.
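Rapid facial reactions of this kind are usually quantified by filtering, rectifying, and baseline-correcting the EMG signal. The following Python sketch shows one common way to do this; the sampling rate, filter band, smoothing window, and synthetic traces are assumptions for illustration, not the study's recorded data or exact settings.

```python
import numpy as np
from scipy import signal

def emg_response(raw, fs=1000.0, baseline_s=0.5):
    """Band-pass (20-400 Hz), rectify, and smooth a facial EMG trace, then
    return post-stimulus activity as change from the pre-stimulus baseline."""
    b, a = signal.butter(4, [20, 400], btype="bandpass", fs=fs)
    rectified = np.abs(signal.filtfilt(b, a, raw))
    win = int(0.1 * fs)                                   # 100-ms envelope
    envelope = np.convolve(rectified, np.ones(win) / win, mode="same")
    n_base = int(baseline_s * fs)
    return envelope[n_base:].mean() - envelope[:n_base].mean()

# Synthetic single trials (0.5 s baseline + 1 s stimulus at 1 kHz) standing in
# for zygomaticus or corrugator recordings to real vs. virtual faces.
rng = np.random.default_rng(2)
real_face_trial = rng.standard_normal(1500) * 5e-6
virtual_face_trial = rng.standard_normal(1500) * 5e-6
print(emg_response(real_face_trial), emg_response(virtual_face_trial))
```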


Author(s): Dongxue Qin, Haotian Qian, Shouliang Qi, Yueyang Teng, Jianlin Wu

Type 2 diabetes mellitus (T2DM) increases the risk of cognitive impairment (CI); however, the underlying pathophysiological mechanisms are still not well understood. We aimed to clarify the alterations in spontaneous brain activity and functional connectivity implicated in the CI of T2DM by analyzing resting-state functional MRI (rs-fMRI) data. In total, 22 T2DM patients with cognitive impairment (T2DM-CI) and 31 T2DM patients with normal cognition (T2DM-NC) were included in this study. The whole-brain amplitude of low-frequency fluctuation (ALFF), regional homogeneity (ReHo), and functional connectivity (FC) with the posterior cingulate cortex (PCC) as a seed region were compared between the T2DM-CI and T2DM-NC groups. Compared with T2DM-NC, T2DM-CI showed decreased ALFF in the precuneus, posterior cingulate gyrus, middle occipital gyrus, and left superior/middle frontal gyrus, but increased ALFF in the left middle frontal gyrus and left superior temporal gyrus. In T2DM-CI, ReHo decreased in the bilateral posterior cingulate gyrus, right precuneus, and right inferior frontal gyrus, but increased in the middle frontal gyrus and right superior occipital gyrus. Higher FC between the PCC and the bilateral inferior parietal lobule and right middle/inferior frontal gyrus, and lower FC between the PCC and the bilateral precuneus and right superior frontal gyrus, were observed in the T2DM-CI group. Compared with T2DM-NC, patients with T2DM-CI thus presented altered ALFF, ReHo, and FC in and between important brain regions. These alterations may be implicated in the cognitive impairment of T2DM and constitute a potential imaging basis of its pathophysiology.
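For readers unfamiliar with the measures named above, ALFF is the amplitude of the BOLD signal in the low-frequency band (typically 0.01–0.08 Hz), and seed-based FC is the correlation between a seed time series and other regions. The sketch below illustrates both on synthetic time series; the TR, band limits, and region names are illustrative assumptions, not the study's parameters.

```python
import numpy as np

TR = 2.0      # repetition time in seconds (illustrative)
N_TP = 240    # number of volumes (illustrative)

def alff(ts, tr=TR, band=(0.01, 0.08)):
    """Amplitude of low-frequency fluctuation: mean FFT amplitude in the band."""
    ts = ts - ts.mean()
    freqs = np.fft.rfftfreq(len(ts), d=tr)
    amp = np.abs(np.fft.rfft(ts)) / len(ts)
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return amp[mask].mean()

def seed_fc(seed_ts, target_ts):
    """Seed-based functional connectivity as a Pearson correlation."""
    return np.corrcoef(seed_ts, target_ts)[0, 1]

# Synthetic PCC seed and target time series standing in for preprocessed rs-fMRI.
rng = np.random.default_rng(3)
pcc = rng.standard_normal(N_TP)
precuneus = 0.5 * pcc + rng.standard_normal(N_TP)
print("ALFF(PCC):", alff(pcc))
print("FC(PCC, precuneus):", seed_fc(pcc, precuneus))
```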


Author(s): Hong Zhang, Yaoru Sun

Neural activation of the motor cortex has been consistently reported during the emotional processing of facial expressions, but it is poorly understood whether and how the motor system influences the activity of limbic areas while participants perceive emotional expressions. In this study, we proposed that motor activations evoked by emotional processing influence activations in limbic areas such as the amygdala during the perception of facial expressions. To examine this issue, a masked priming paradigm was adopted in our fMRI experiment to modulate activation within the motor cortex while healthy participants perceived sad or happy facial expressions. We found that the first presented stimulus (masked prime) in each trial reduced activations in the premotor cortex and inferior frontal gyrus when the movement of facial muscles implied by the arrows on the prime stimulus was consistent with that implied by the target facial expression (compatible condition), but increased activations in these two areas when the implied movements were inconsistent (incompatible condition). The superior temporal gyrus, middle cingulate gyrus, and amygdala showed a similar response tendency to that in the motor cortex. Moreover, psychophysiological interaction (PPI) analysis showed that both the right middle cingulate gyrus and the bilateral superior temporal gyrus were more closely linked to the premotor cortex and inferior frontal gyrus during incompatible trials than during compatible trials. Together with the significant activation correlations between the motor cortex and the limbic areas, these results reveal a modulatory effect of the motor cortex on brain regions related to emotion perception, suggesting that motor representations of facial movements can affect emotion experience. Our results provide new evidence for the functional role of the motor system in the perception of facial emotions and could contribute to understanding the deficits in social interaction of patients with autism or schizophrenia.
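A PPI analysis of the kind mentioned above tests whether the coupling between a seed region and a target region changes with the psychological context. Below is a minimal GLM-style sketch of that logic on synthetic data; the block structure, seed and target labels, and regressor construction (no HRF convolution or deconvolution) are simplifying assumptions, not the study's actual model.

```python
import numpy as np

rng = np.random.default_rng(4)
n_tp = 300

# Psychological regressor: incompatible (+1) vs. compatible (-1) blocks
psych = np.tile(np.repeat([1.0, -1.0], 15), 10)
# Physiological regressor: premotor-cortex seed time series (synthetic)
seed = rng.standard_normal(n_tp)
# Interaction (PPI) regressor
ppi = psych * seed

# Target region (e.g., middle cingulate) with a built-in PPI effect
target = 0.4 * ppi + 0.3 * seed + rng.standard_normal(n_tp)

# GLM: target ~ psych + seed + ppi + intercept; the PPI beta indexes
# condition-dependent coupling between seed and target.
X = np.column_stack([psych, seed, ppi, np.ones(n_tp)])
betas, *_ = np.linalg.lstsq(X, target, rcond=None)
print("PPI beta (condition-dependent coupling):", round(float(betas[2]), 3))
```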


2021 ◽ Vol 5 (3) ◽ pp. 13
Author(s): Heting Wang, Vidya Gaddy, James Ross Beveridge, Francisco R. Ortega

The role of affect has long been studied in human–computer interaction. Unlike previous studies that focused on the seven basic emotions, this work introduced an avatar named Diana who expresses a higher level of emotional intelligence. To adapt to users' various affects during interaction, Diana simulates emotions with dynamic facial expressions. When two people collaborated to build blocks, their affects were recognized and labeled using the Affdex SDK, and a descriptive analysis was provided. When participants turned to collaborate with Diana, their subjective responses were collected and the time to completion was recorded. Three modes of Diana were involved: a flat-faced Diana, a Diana that used mimicry facial expressions, and a Diana that used emotionally responsive facial expressions. Twenty-one responses were collected through a five-point Likert-scale questionnaire and the NASA TLX. Results from the questionnaires were not statistically different between modes. However, the emotionally responsive Diana obtained more positive responses, and people spent the longest time with the mimicry Diana. In post-study comments, most participants perceived the facial expressions on Diana's face as natural, while four mentioned uncomfortable feelings caused by the Uncanny Valley effect.
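A between-mode comparison of ordinal questionnaire ratings like the one reported here is often run with a nonparametric test. The sketch below uses a Kruskal-Wallis H-test from SciPy on made-up five-point Likert ratings for the three Diana modes; the numbers are illustrative, not the study's responses.

```python
import numpy as np
from scipy import stats

# Made-up five-point Likert ratings for the three interaction modes.
flat       = np.array([3, 3, 4, 2, 3, 4, 3])
mimicry    = np.array([4, 3, 4, 3, 4, 3, 4])
responsive = np.array([4, 4, 5, 4, 3, 4, 4])

# Kruskal-Wallis H-test: nonparametric comparison suited to ordinal ratings.
h, p = stats.kruskal(flat, mimicry, responsive)
print(f"H = {h:.2f}, p = {p:.3f}")  # p > .05 would mirror the reported null result
```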


2021 ◽ Vol 151 ◽ pp. 107734
Author(s): Katia M. Harlé, Alan N. Simmons, Jessica Bomyea, Andrea D. Spadoni, Charles T. Taylor
