Sensorimotor Activity and Network Connectivity to Dynamic and Static Emotional Faces in 7-Month-Old Infants

2021, Vol. 11 (11), pp. 1396
Author(s):  
Ermanno Quadrelli ◽  
Elisa Roberti ◽  
Silvia Polver ◽  
Hermann Bulf ◽  
Chiara Turati

The present study investigated whether, as in adults, 7-month-old infants' sensorimotor brain areas are recruited in response to the observation of emotional facial expressions. Activity of the sensorimotor cortex, as indexed by µ rhythm suppression, was recorded using electroencephalography (EEG) while infants observed neutral, angry, and happy facial expressions in either a static (N = 19) or a dynamic (N = 19) condition. Graph theory analysis was used to investigate to what extent neural activity was functionally localized in specific cortical areas. Happy facial expressions elicited greater sensorimotor activation than angry faces in the dynamic condition, while no difference was found among the three expressions in the static condition. Results also revealed that happy, but not angry or neutral, expressions elicited significant right-lateralized activation in the dynamic condition. Furthermore, dynamic emotional faces were processed more efficiently, eliciting higher global efficiency and lower network diameter than static faces. Overall, the current results suggest that, unlike neutral and angry faces, happy expressions elicit sensorimotor activity at 7 months, and that dynamic emotional faces are more efficiently processed by functional brain networks. Finally, the current data provide evidence of right-lateralized activity for the processing of happy facial expressions.
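
For readers unfamiliar with the two graph metrics reported here, the sketch below is a minimal illustration (not the authors' pipeline) of how global efficiency and network diameter can be computed with networkx on a thresholded functional-connectivity matrix; the synthetic matrix, channel count, and 20% density threshold are all assumptions.

```python
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)

# Stand-in for a channels x channels EEG connectivity matrix
# (e.g., spectral coherence between electrode pairs).
n_channels = 32
conn = rng.random((n_channels, n_channels))
conn = (conn + conn.T) / 2              # symmetrize
np.fill_diagonal(conn, 0.0)

# Keep only the strongest 20% of connections (arbitrary threshold).
upper = conn[np.triu_indices(n_channels, k=1)]
adjacency = (conn >= np.quantile(upper, 0.80)).astype(int)

G = nx.from_numpy_array(adjacency)

# Global efficiency: mean inverse shortest-path length over node pairs.
efficiency = nx.global_efficiency(G)

# Diameter: the longest shortest path; defined only for a connected
# graph, so fall back to the largest connected component if needed.
core = G.subgraph(max(nx.connected_components(G), key=len))
diameter = nx.diameter(core)

print(f"global efficiency = {efficiency:.3f}, diameter = {diameter}")
```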

2016, Vol. 29 (8), pp. 749-771
Author(s):  
Min Hooi Yong ◽  
Ted Ruffman

Dogs respond to human emotional expressions. However, it is unknown whether dogs can match emotional faces to voices in an intermodal matching task or whether they show preferences for looking at certain emotional facial expressions over others, similar to human infants. We presented 52 domestic dogs and 24 seven-month-old human infants with two different human emotional facial expressions of the same gender simultaneously, while they listened to a human voice expressing an emotion that matched one of them. Consistent with most matching studies, neither dogs nor infants looked longer at the matching emotional stimuli, yet dogs and humans demonstrated an identical pattern of looking less at sad faces when paired with happy or angry faces (irrespective of the vocal stimulus), with no preference for happy versus angry faces. Discussion focuses on why dogs and infants might have an aversion to sad faces, or alternatively, a heightened interest in angry and happy faces.


2021, Vol. 12
Author(s):  
Yuko Yamashita ◽  
Tetsuya Yamamoto

Emotional contagion is a phenomenon by which an individual's emotions directly trigger similar emotions in others. We explored the possibility that perceiving others' emotional facial expressions affects mood in people with subthreshold depression (sD). Forty-nine participants were divided into the following four groups: participants with no depression (ND) presented with happy faces; ND participants presented with sad faces; sD participants presented with happy faces; and sD participants presented with sad faces. Participants answered an inventory about their emotional states before and after viewing the emotional faces, to investigate the influence of emotional contagion on their mood. Regardless of depressive tendency, the groups presented with happy faces exhibited a slight increase in the happy mood score and a decrease in the sad mood score. The groups presented with sad faces exhibited an increased sad mood score and a decreased happy mood score. These results demonstrate that emotional contagion affects mood in people with sD, as well as in individuals with ND, and indicate that emotional contagion could relieve depressive moods in people with sD. From the viewpoint of emotional contagion, this highlights the importance of the emotional facial expressions of those around people with sD, such as family and friends.
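
The contagion measure described above boils down to pre/post change scores on happy and sad mood scales for each group. A minimal sketch of that arithmetic follows; the group labels mirror the design, but every number is invented for illustration.

```python
# Pre/post mood scores per group as (pre, post) pairs; all values fake.
groups = {
    "ND, happy faces": {"happy": (4.1, 4.4), "sad": (2.0, 1.8)},
    "ND, sad faces":   {"happy": (4.0, 3.7), "sad": (2.1, 2.6)},
    "sD, happy faces": {"happy": (3.1, 3.4), "sad": (3.0, 2.7)},
    "sD, sad faces":   {"happy": (3.2, 2.9), "sad": (2.8, 3.3)},
}

for name, scales in groups.items():
    # Change score: post-viewing score minus pre-viewing score.
    changes = {scale: post - pre for scale, (pre, post) in scales.items()}
    summary = ", ".join(f"d_{scale} = {d:+.1f}" for scale, d in changes.items())
    print(f"{name}: {summary}")
```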


2021, Vol. 11 (9), pp. 1203
Author(s):  
Sara Borgomaneri ◽  
Francesca Vitale ◽  
Simone Battaglia ◽  
Alessio Avenanti

The ability to rapidly process others' emotional signals is crucial for adaptive social interactions. However, it remains unclear how observing emotional facial expressions affects the reactivity of the human motor cortex. To provide insight into this issue, we employed single-pulse transcranial magnetic stimulation (TMS) to investigate corticospinal motor excitability. Healthy participants observed pictures of happy, fearful, and neutral facial expressions while receiving TMS over the left or right motor cortex at 150 and 300 ms after picture onset. In the early phase (150 ms), we observed an enhancement of corticospinal excitability for the observation of happy and fearful emotional faces compared to neutral expressions, specifically in the right hemisphere. Interindividual differences in the disposition to experience aversive feelings (personal distress) in interpersonal emotional contexts predicted the early increase in corticospinal excitability for emotional faces. No differences in corticospinal excitability were observed at the later time point (300 ms) or in the left M1. These findings support the notion that emotion perception primes the body for action and highlight the role of the right hemisphere in implementing a rapid and transient facilitatory response to emotionally arousing stimuli, such as emotional facial expressions.
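
Corticospinal excitability in single-pulse TMS studies of this kind is conventionally indexed by the peak-to-peak amplitude of the motor-evoked potential (MEP) recorded from the EMG shortly after the pulse. The sketch below illustrates that computation on synthetic traces; the 15-50 ms window, sampling rate, and happy/neutral ratio are assumptions rather than the authors' exact parameters.

```python
import numpy as np

def mep_peak_to_peak(emg, sfreq, pulse_sample, t_start=0.015, t_end=0.050):
    """Peak-to-peak EMG amplitude 15-50 ms after the TMS pulse."""
    lo = pulse_sample + int(t_start * sfreq)
    hi = pulse_sample + int(t_end * sfreq)
    window = emg[lo:hi]
    return window.max() - window.min()

# Fake single-trial EMG traces (trials x samples) for two conditions.
rng = np.random.default_rng(1)
sfreq, pulse = 5000, 1000
happy = rng.normal(0, 0.05, (30, 2000))
neutral = rng.normal(0, 0.05, (30, 2000))

amp_happy = np.array([mep_peak_to_peak(tr, sfreq, pulse) for tr in happy])
amp_neutral = np.array([mep_peak_to_peak(tr, sfreq, pulse) for tr in neutral])

# Express emotional-face MEPs as a ratio of the neutral baseline.
print(f"happy/neutral MEP ratio = {amp_happy.mean() / amp_neutral.mean():.2f}")
```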


2021
Author(s):  
Helio Clemente Cuve ◽  
Santiago Castiello ◽  
Brook Shiferaw ◽  
Eri Ichijo ◽  
Caroline Catmur ◽  
...  

Recognition of emotional facial expressions is considered to be atypical in autism. This difficulty is thought to be due to the way that facial expressions are visually explored. Evidence for atypical visual exploration of emotional faces in autism is, however, equivocal. We propose that, where observed, atypical visual exploration of emotional facial expressions is due to alexithymia, a distinct but frequently co-occurring condition. In this eye-tracking study we tested the alexithymia hypothesis using a number of recent methodological advances to study eye gaze during several emotion processing tasks (emotion recognition, intensity judgements, free gaze) in 25 adults with, and 45 without, autism. A multilevel polynomial modelling strategy was used to describe the spatiotemporal dynamics of eye gaze to emotional facial expressions. Converging evidence from traditional and novel analysis methods revealed that atypical gaze to the eyes is best predicted by alexithymia in both autistic and non-autistic individuals. Information-theoretic metrics also revealed differential effects of task on gaze patterns as a function of alexithymia, but not autism. These findings highlight factors underlying atypical emotion processing in autistic individuals, with wide-ranging implications for emotion research.
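
One information-theoretic gaze measure of the kind alluded to above is the Shannon entropy of the distribution of fixations over areas of interest (AOIs): low entropy indicates gaze concentrated on few regions, high entropy more dispersed exploration. A minimal sketch follows; the AOI labels and fixation sequence are fabricated for illustration.

```python
import numpy as np
from collections import Counter

def gaze_entropy(fixation_aois):
    """Shannon entropy (bits) of the fixation distribution over AOIs."""
    counts = np.array(list(Counter(fixation_aois).values()), dtype=float)
    p = counts / counts.sum()
    return -(p * np.log2(p)).sum()

# Hypothetical fixation sequence for one trial.
trial = ["eyes", "eyes", "mouth", "nose", "eyes", "mouth", "eyes"]
print(f"stationary gaze entropy = {gaze_entropy(trial):.2f} bits")
```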


2011, Vol. 41 (11), pp. 2253-2264
Author(s):  
L. R. Demenescu ◽  
R. Renken ◽  
R. Kortekaas ◽  
M.-J. van Tol ◽  
J. B. C. Marsman ◽  
...  

Background: Depression has been associated with limbic hyperactivation and frontal hypoactivation in response to negative facial stimuli. Anxiety disorders have also been associated with increased activation of emotional structures such as the amygdala and insula. This study examined to what extent activation of brain regions involved in the perception of emotional faces is specific to depression and anxiety disorders in a large community-based sample of out-patients. Method: An event-related functional magnetic resonance imaging (fMRI) paradigm was used, including angry, fearful, sad, happy, and neutral facial expressions. One hundred and eighty-two out-patients (59 with depression, 57 with an anxiety disorder, and 66 with co-morbid depression and anxiety) and 56 healthy controls selected from the Netherlands Study of Depression and Anxiety (NESDA) were included in the present study. Whole-brain analyses were conducted. The temporal profile of amygdala activation was also investigated. Results: Facial expressions activated the amygdala and fusiform gyrus in depressed patients with or without anxiety and in healthy controls, relative to scrambled faces, but this was less evident in patients with anxiety disorders. The response shape of the amygdala did not differ between groups. Depressed patients showed dorsolateral prefrontal cortex (PFC) hyperactivation in response to happy faces compared to healthy controls. Conclusions: We suggest that stronger frontal activation to happy faces in depressed patients may reflect increased demands on effortful emotion regulation processes triggered by mood-incongruent stimuli. The lack of strong differences in neural activation to negative emotional faces, relative to healthy controls, may be characteristic of the mild-to-moderate severity of illness in this sample and may be indicative of a certain cognitive-emotional processing reserve.
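
As a rough illustration of the event-related paradigm described (emphatically not the NESDA pipeline), the sketch below fits a first-level GLM with nilearn on random stand-in data and computes a faces-minus-scrambled contrast; the conditions, onsets, image dimensions, and TR are all assumptions.

```python
import numpy as np
import pandas as pd
import nibabel as nib
from nilearn.glm.first_level import FirstLevelModel

rng = np.random.default_rng(5)
t_r, n_vols = 2.0, 60

# Random positive "BOLD" data standing in for a real 4D EPI run.
data = rng.normal(loc=100.0, scale=1.0, size=(8, 8, 8, n_vols))
bold = nib.Nifti1Image(data.astype("float32"), np.eye(4))

# Invented event timings for three conditions.
events = pd.DataFrame({
    "onset": [4.0, 24.0, 44.0, 64.0, 84.0, 104.0],
    "duration": [2.0] * 6,
    "trial_type": ["happy", "scrambled", "sad", "scrambled",
                   "happy", "sad"],
})

# mask_img=False skips brain masking, which random data would confuse.
model = FirstLevelModel(t_r=t_r, hrf_model="glover", mask_img=False)
model = model.fit(bold, events=events)

# Faces-minus-scrambled contrast, returned as a z-statistic map.
zmap = model.compute_contrast("happy + sad - 2 * scrambled")
print(zmap.shape)
```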


2021, Vol. 15
Author(s):  
Teresa Sollfrank ◽  
Oona Kohnen ◽  
Peter Hilfiker ◽  
Lorena C. Kegel ◽  
Hennric Jokeit ◽  
...  

This study aimed to examine whether the cortical processing of emotional faces is modulated by the computerization of face stimuli ("avatars") in a group of 25 healthy participants. Subjects passively viewed 128 static and dynamic facial expressions of female and male actors and their respective avatars in neutral or fearful conditions. Event-related potentials (ERPs), as well as alpha and theta event-related synchronization and desynchronization (ERD/ERS), were derived from the EEG recorded during the task. All ERP features, except for the very early N100, differed in their response to avatar and actor faces. Whereas the N170 showed differences only in the neutral avatar condition, later potentials (N300 and LPP) differed across both emotional conditions (neutral and fear) and the presented agents (actor and avatar). In addition, we found that avatar faces elicited significantly stronger reactions than actor faces in the theta and alpha oscillations. Theta EEG frequencies in particular responded specifically to visual emotional stimulation and proved sensitive to the emotional content of the face, whereas the alpha frequency was modulated by all stimulus types. We conclude that computerized avatar faces affect both ERP components and ERD/ERS, and evoke neural effects different from those elicited by real faces. This was true even though the avatars were replicas of the human faces and their expressions shared similar characteristics.
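
ERD/ERS is conventionally defined as the percentage change in band power relative to a pre-stimulus baseline, ERD/ERS% = (A − R)/R × 100, where R is baseline power and A is post-stimulus power. The sketch below illustrates the computation on synthetic EEG; the theta (4-7 Hz) and alpha (8-12 Hz) bands and all signals are assumptions, not the authors' exact parameters.

```python
import numpy as np

def band_power(x, sfreq, fmin, fmax):
    """Mean power in [fmin, fmax] Hz via a simple FFT periodogram."""
    freqs = np.fft.rfftfreq(x.size, d=1.0 / sfreq)
    psd = np.abs(np.fft.rfft(x)) ** 2 / x.size
    mask = (freqs >= fmin) & (freqs <= fmax)
    return psd[mask].mean()

rng = np.random.default_rng(2)
sfreq = 250
baseline = rng.normal(size=sfreq)   # 1 s of fake pre-stimulus EEG
post = rng.normal(size=sfreq)       # 1 s of fake post-stimulus EEG

for name, (fmin, fmax) in {"theta": (4, 7), "alpha": (8, 12)}.items():
    R = band_power(baseline, sfreq, fmin, fmax)
    A = band_power(post, sfreq, fmin, fmax)
    erd = (A - R) / R * 100.0       # negative = desynchronization (ERD)
    print(f"{name}: ERD/ERS = {erd:+.1f}%")
```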


2020
Author(s):  
Sandra Naumann ◽  
Mareike Bayer ◽  
Isabel Dziobek

This study aimed to expand the understanding of the neural-temporal trajectories of emotion processing in preschoolers using electrophysiological measures. In particular, we looked at neural responses to the repetition of emotional faces. EEG was recorded while children observed sequentially presented pairs of faces. In some trials, the pair of faces was identical, while in others the faces differed with regard to the emotional expression displayed (happy, fearful or neutral). We detected greater P1 and P3 amplitudes to angry compared to neutral facial expressions, but similar amplitudes for happy compared to neutral faces. We did not observe modulations of the N170 by emotional facial expressions. When investigating preschoolers' sensitivity to the repetition of emotional facial expressions, we found no ERP amplitude differences for repeated vs. new emotional facial expressions. Overall, the results support the idea that basic mechanisms of emotion processing are developed in the preschool period. The trajectory of ERP components was similar to what has been reported for younger and older age groups, suggesting consistency in the order and relative timing of different stages of emotion processing. Additionally, the findings suggest that enhanced early neural activation for angry vs. neutral faces is related to increased empathic behavior. More work is needed to determine whether the repetition of an emotion leads to more effective processing during development.
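
The ERP components reported here (P1, N170, P3) are commonly quantified as the mean voltage within a component-specific time window, averaged over trials. The sketch below illustrates this on fabricated epochs; the window boundaries are typical textbook values, not the authors' exact parameters.

```python
import numpy as np

def mean_amplitude(epochs, times, t_start, t_end):
    """Mean amplitude (averaged over trials) within [t_start, t_end] s."""
    mask = (times >= t_start) & (times <= t_end)
    return epochs[:, mask].mean()

rng = np.random.default_rng(3)
sfreq = 500
times = np.arange(-0.1, 0.8, 1.0 / sfreq)    # epoch: -100 to 800 ms
epochs = rng.normal(size=(40, times.size))   # fake trials x samples

# Illustrative component windows (seconds).
windows = {"P1": (0.08, 0.13), "N170": (0.15, 0.20), "P3": (0.30, 0.50)}
for comp, (t0, t1) in windows.items():
    print(f"{comp}: {mean_amplitude(epochs, times, t0, t1):+.3f} uV")
```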


2011, Vol. 26 (S2), pp. 1880
Author(s):  
S. Surguladze ◽  
J. Radua ◽  
W. El Hage ◽  
M.L. Phillips

Introduction: It has previously been shown that genes implicated in psychiatric disorders modulate the blood oxygenation level-dependent (BOLD) effect in brain regions. Such studies add to our knowledge of vulnerability to these disorders. Objectives: This study investigated genetic modulation of brain networks associated with emotion processing. Its aim was to examine the effect of two genetic markers (5HTTLPR and COMT) on BOLD-effect connectivity in healthy individuals. Methods: Ninety-one participants took part in four fMRI experiments (at 3 T) featuring dynamic facial expressions of fear, anger, sadness, or happiness. We explored the effect of genetic polymorphisms on an empirically defined brain network commonly associated with responses to emotional expressions of any kind. Connectivity was examined by means of Granger causality analysis, which allows estimation of the directionality of information flow between the defined brain regions. Results: Perception of dynamic emotional facial expressions was commonly associated with activation of the bilateral fusiform gyrus, right superior temporal sulcus, bilateral dorsolateral prefrontal cortex, and right amygdala. Genetic modulation of this network was observed only in the experiments with fearful facial expressions. There was an interaction between the effects of the two genetic polymorphisms on the measures of connectivity (p = 0.0002, adjusted R² = 18%). This was accounted for by lower connectivity in individuals lacking both copies of the COMT Val allele who also lacked both copies of the L allele of the 5HTTLPR gene. Conclusions: Our results clarify the mechanism by which brain network reactivity to emotional signals is associated with genetic polymorphisms.
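
The directional-connectivity logic described above can be illustrated with pairwise Granger causality between two region-of-interest time series. The sketch below uses statsmodels on simulated data with a built-in one-sample lag from "fusiform" to "amygdala"; the region names, lag order, and data are illustrative assumptions, not the authors' method in detail.

```python
import numpy as np
from statsmodels.tsa.stattools import grangercausalitytests

rng = np.random.default_rng(4)
n = 200
fusiform = rng.normal(size=n)
# Build an amygdala series that lags the fusiform series by one sample,
# so the test should detect fusiform -> amygdala information flow.
amygdala = 0.6 * np.roll(fusiform, 1) + rng.normal(scale=0.5, size=n)

# Column order: [effect, cause]; the test asks whether column 2
# Granger-causes column 1 at lags 1..maxlag.
data = np.column_stack([amygdala, fusiform])
results = grangercausalitytests(data, maxlag=2, verbose=False)

# p-value of the sum-of-squared-residuals F-test at lag 1.
p = results[1][0]["ssr_ftest"][1]
print(f"fusiform -> amygdala, lag 1: p = {p:.4f}")
```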


2021, Vol. 12
Author(s):  
Fangbing Qu ◽  
Xiaojia Shi ◽  
Aozi Zhang ◽  
Changwei Gu

Time perception is a fundamental aspect of young children's daily lives and is affected by a number of factors. The present study aimed to investigate the precise developmental course of young children's time perception between 3 and 5 years of age and the effects of emotion localization on their time perception. A total of 120 children were tested using an adapted time perception task with black squares (Experiment 1) and emotional facial expressions (Experiment 2). Results suggested that children's time perception was influenced by stimulus duration and improved gradually with increasing age. Both accuracy and reaction time were affected by the presentation sequence of emotional faces, indicating an effect of emotion localization. To summarize, young children's time perception showed effects of age, stimulus duration, and emotion localization.
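
The dependent measures named above (accuracy and reaction time, broken down by age and stimulus duration) amount to a grouped summary of the trial data. A minimal sketch over a fabricated trial table follows; every value is invented for illustration.

```python
import pandas as pd

# Fake trial-level data: one row per trial.
trials = pd.DataFrame({
    "age": [3, 3, 4, 4, 5, 5, 3, 4, 5],
    "duration_s": [0.5, 2.0, 0.5, 2.0, 0.5, 2.0, 2.0, 0.5, 0.5],
    "correct": [1, 0, 1, 1, 1, 1, 1, 0, 1],
    "rt_ms": [1450, 1720, 1210, 1330, 980, 1100, 1650, 1300, 1010],
})

# Accuracy and mean RT per age group and stimulus duration.
summary = trials.groupby(["age", "duration_s"]).agg(
    accuracy=("correct", "mean"), mean_rt_ms=("rt_ms", "mean"))
print(summary)
```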

