Preschoolers’ sensitivity to emotional facial expressions and their repetition: An ERP study

2020 · Author(s): Sandra Naumann, Mareike Bayer, Isabel Dziobek

This study aimed to expand the understanding of the neural-temporal trajectories of emotion processing in preschoolers using electrophysiological measures. In particular, we looked at neural responses to the repetition of emotional faces. EEG was recorded while children observed sequentially presented pairs of faces. In some trials, the pair of faces was identical, while in others they differed with regard to the emotional expression displayed (happy, fearful or neutral). We detected greater P1 and P3 amplitudes to angry compared to neutral facial expressions, but similar amplitudes for happy compared to neutral faces. We did not observe modulations of the N170 by emotional facial expressions. When investigating preschoolers’ sensitivity to the repetition of emotional facial expressions, we found no ERP amplitude differences for repeated vs. new emotional facial expressions. Overall, the results support the idea that basic mechanisms of emotion processing are developed in the preschool period. The trajectory of ERP components was similar to what has been reported for younger and older age groups, suggesting consistency in the order and relative timing of different stages of emotion processing. Additionally, findings suggest that enhanced early neural activation for angry vs. neutral faces is related to increased empathic behavior. More work is needed to determine whether the repetition of an emotion leads to more effective processing during development.
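
By way of illustration only, the following Python sketch shows the kind of computation behind such ERP comparisons: averaging epoched EEG within component time windows and contrasting conditions. The array layout, sampling rate, window bounds, channel indices, and random data are assumptions for the sketch, not the authors' pipeline.

```python
import numpy as np

# Hypothetical epoched EEG: (n_trials, n_channels, n_samples), sampled at 500 Hz,
# with each epoch spanning -200 ms to 798 ms around face onset.
sfreq = 500.0
times = -0.2 + np.arange(500) / sfreq

def mean_window_amplitude(epochs, times, window, channels):
    """Mean amplitude over a time window and channel subset, per trial."""
    t_mask = (times >= window[0]) & (times <= window[1])
    return epochs[:, channels, :][:, :, t_mask].mean(axis=(1, 2))

# Assumed component windows (seconds) and occipito-temporal channel indices.
windows = {"P1": (0.08, 0.13), "N170": (0.13, 0.20), "P3": (0.30, 0.50)}
occipito_temporal = [55, 56, 57, 58]

rng = np.random.default_rng(0)
epochs_angry = rng.normal(size=(40, 64, 500))    # placeholder data
epochs_neutral = rng.normal(size=(40, 64, 500))  # placeholder data

for name, win in windows.items():
    amp_angry = mean_window_amplitude(epochs_angry, times, win, occipito_temporal)
    amp_neutral = mean_window_amplitude(epochs_neutral, times, win, occipito_temporal)
    print(name, amp_angry.mean() - amp_neutral.mean())
```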

2021 · Author(s): Helio Clemente Cuve, Santiago Castiello, Brook Shiferaw, Eri Ichijo, Caroline Catmur, ...

Recognition of emotional facial expressions is considered to be atypical in autism. This difficulty is thought to be due to the way that facial expressions are visually explored. Evidence for atypical visual exploration of emotional faces in autism is, however, equivocal. We propose that, where observed, atypical visual exploration of emotional facial expressions is due to alexithymia, a distinct but frequently co-occurring condition. In this eye-tracking study we tested the alexithymia hypothesis using a number of recent methodological advances to study eye gaze during several emotion processing tasks (emotion recognition, intensity judgements, free gaze), in 25 adults with, and 45 without, autism. A multilevel polynomial modelling strategy was used to describe the spatiotemporal dynamics of eye gaze to emotional facial expressions. Converging evidence from traditional and novel analysis methods revealed that atypical gaze to the eyes is best predicted by alexithymia in both autistic and non-autistic individuals. Information theoretic metrics also revealed differential effects of task on gaze patterns as a function of alexithymia, but not autism. These findings highlight factors underlying atypical emotion processing in autistic individuals, with wide-ranging implications for emotion research.
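
As a rough illustration of a multilevel polynomial (growth-curve) model of gaze over time, the sketch below fits linear and quadratic time terms interacting with alexithymia, with a random intercept per participant. The simulated data, column names, and formula are assumptions, not the authors' actual specification.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Simulated long-format gaze data: proportion of fixation time on the eye region
# per participant and time bin, with one alexithymia score per participant (all assumed).
rng = np.random.default_rng(1)
participants = np.repeat(np.arange(70), 20)          # 70 participants x 20 time bins
time = np.tile(np.linspace(-1, 1, 20), 70)           # centered time course
alexithymia = np.repeat(rng.normal(size=70), 20)     # standardized questionnaire score
prop_eyes = 0.5 - 0.1 * alexithymia - 0.05 * time**2 + rng.normal(0, 0.05, 1400)

df = pd.DataFrame({
    "participant": participants,
    "time1": time,
    "time2": time**2,            # quadratic term of the polynomial time course
    "alexithymia": alexithymia,
    "prop_eyes": prop_eyes,
})

# Random intercept per participant; fixed linear and quadratic time terms
# interacting with alexithymia.
model = smf.mixedlm("prop_eyes ~ (time1 + time2) * alexithymia",
                    data=df, groups=df["participant"])
print(model.fit().summary())
```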


2017 · Vol 29 (5) · pp. 1749-1761 · Author(s): Johanna Bick, Rhiannon Luyster, Nathan A. Fox, Charles H. Zeanah, Charles A. Nelson

We examined facial emotion recognition in 12-year-olds in a longitudinally followed sample of children with and without exposure to early life psychosocial deprivation (institutional care). Half of the institutionally reared children were randomized into foster care homes during the first years of life. Facial emotion recognition was examined in a behavioral task using morphed images. This same task had been administered when children were 8 years old. Neutral facial expressions were morphed with happy, sad, angry, and fearful emotional facial expressions, and children were asked to identify the emotion of each face, which varied in intensity. Consistent with our previous report, we show that some areas of emotion processing, involving the recognition of happy and fearful faces, are affected by early deprivation, whereas other areas, involving the recognition of sad and angry faces, appear to be unaffected. We also show that early intervention can have a lasting positive impact, normalizing developmental trajectories of processing negative emotions (fear) into the late childhood/preadolescent period.
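
Intensity-graded morphs of the kind described here are often approximated by interpolating between a neutral and an emotional photograph of the same actor. The sketch below shows a simple pixel-level cross-fade (dedicated morphing software also warps facial landmarks, so this is only an approximation); the file names and intensity steps are placeholders.

```python
import numpy as np
from PIL import Image

def blend_morph(neutral_path, emotion_path, intensity):
    """Cross-fade a neutral and an emotional face at a given intensity in [0, 1].

    Assumes both images have the same size and color mode.
    """
    neutral = np.asarray(Image.open(neutral_path), dtype=np.float32)
    emotion = np.asarray(Image.open(emotion_path), dtype=np.float32)
    blended = (1.0 - intensity) * neutral + intensity * emotion
    return Image.fromarray(blended.astype(np.uint8))

# Placeholder file names; generate e.g. 20%, 40%, ..., 100% fear intensity.
for pct in (20, 40, 60, 80, 100):
    morph = blend_morph("actor01_neutral.png", "actor01_fear.png", pct / 100)
    morph.save(f"actor01_fear_{pct:03d}.png")
```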


2013 · Vol 25 (4) · pp. 547-557 · Author(s): Maital Neta, William M. Kelley, Paul J. Whalen

Extant research has examined the process of decision making under uncertainty, specifically in situations of ambiguity. However, much of this work has been conducted in the context of semantic and low-level visual processing. An open question is whether ambiguity in social signals (e.g., emotional facial expressions) is processed similarly or whether a unique set of processors comes online to resolve ambiguity in a social context. Our work has examined ambiguity using surprised facial expressions, as they have predicted both positive and negative outcomes in the past. Specifically, whereas some people tended to interpret surprise as negatively valenced, others tended toward a more positive interpretation. Here, we examined neural responses to social ambiguity using faces (surprise) and nonface emotional scenes (International Affective Picture System). Moreover, we examined whether these effects are specific to ambiguity resolution (i.e., judgments about the ambiguity) or whether similar effects would be demonstrated for incidental judgments (e.g., nonvalence judgments about ambiguously valenced stimuli). We found that a distinct task control (i.e., cingulo-opercular) network was more active when resolving ambiguity. We also found that activity in the ventral amygdala was greater in response to faces and scenes that were rated explicitly along the dimension of valence, consistent with findings that the ventral amygdala tracks valence. Taken together, there is a complex neural architecture that supports decision making in the presence of ambiguity: (a) a core set of cortical structures engaged for explicit ambiguity processing across stimulus boundaries and (b) other dedicated circuits for biologically relevant learning situations involving faces.
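
The positive versus negative interpretation tendency for surprised faces is commonly summarized as a valence-bias score, i.e., the percentage of surprised-face trials a participant rates as negative. A minimal sketch with placeholder ratings follows; it is not the authors' scoring code.

```python
import numpy as np

# Placeholder ratings for surprised-face trials: 1 = rated negative, 0 = rated positive.
ratings_by_participant = {
    "p01": np.array([1, 1, 0, 1, 1, 0, 1, 1]),
    "p02": np.array([0, 0, 1, 0, 0, 0, 1, 0]),
}

for participant, ratings in ratings_by_participant.items():
    valence_bias = 100.0 * ratings.mean()   # percent of trials rated negative
    tendency = "negative" if valence_bias > 50 else "positive"
    print(f"{participant}: {valence_bias:.0f}% negative ratings ({tendency} interpreter)")
```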


1984 · Vol 7 (2) · pp. 193-214 · Author(s): Merry Bullock, James A. Russell

A structural model of emotions was used to reveal patterns in how children interpret the emotional facial expressions of others. Three-, four-, and five-year-olds, and adults (n = 38 in each group) were asked to match 15 emotion-descriptive words (happy, excited, surprised, afraid, scared, angry, mad, disgusted, miserable, sad, sleepy, calm, relaxed, wide awake, and, as a check on response bias, insipid) with still photographs of actors showing different facial expressions. Whereas prior research had indicated that preschool-aged children are "inaccurate" in associating labels with faces, our results indicated that such research may have severely underestimated children's knowledge of emotions. In this study, children used terms systematically to refer to a specifiable range of expressions, centered around a focal point. Multidimensional scaling of the word/facial expression associations yielded a two-dimensional structure able to account for the interrelationships among emotions, and this structure was the same for all age groups. The nature of this structure, the blurry boundaries between emotion words, and developmental shifts in the referents of emotion words suggested the primacy of two dimensions, pleasure-displeasure and arousal-sleep, in children's interpretation of emotion.
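
The multidimensional scaling step can be sketched as follows: given a matrix of how often each emotion word was matched to each facial expression, derive pairwise dissimilarities between the words' association profiles and project them into two dimensions. The random data and the correlation-based dissimilarity below are illustrative choices, not the original analysis.

```python
import numpy as np
from sklearn.manifold import MDS

emotion_words = ["happy", "excited", "surprised", "afraid", "angry",
                 "sad", "sleepy", "calm", "relaxed", "miserable"]

# Placeholder matrix: rows = words, columns = facial expressions, values = how often
# children matched the word to that expression.
rng = np.random.default_rng(2)
associations = rng.random((len(emotion_words), 15))

# Correlation-based dissimilarity between the words' association profiles.
dissimilarity = 1.0 - np.corrcoef(associations)

mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(dissimilarity)
for word, (x, y) in zip(emotion_words, coords):
    print(f"{word:>10s}: ({x:+.2f}, {y:+.2f})")
```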


2016 · Vol 29 (8) · pp. 749-771 · Author(s): Min Hooi Yong, Ted Ruffman

Dogs respond to human emotional expressions. However, it is unknown whether dogs can match emotional faces to voices in an intermodal matching task or whether they show preferences for looking at certain emotional facial expressions over others, similar to human infants. We presented 52 domestic dogs and 24 seven-month-old human infants with two different human emotional facial expressions of the same gender simultaneously, while listening to a human voice expressing an emotion that matched one of them. Consistent with most matching studies, neither dogs nor infants looked longer at the matching emotional stimuli, yet dogs and humans demonstrated an identical pattern of looking less at sad faces when paired with happy or angry faces (irrespective of the vocal stimulus), with no preference for happy versus angry faces. Discussion focuses on why dogs and infants might have an aversion to sad faces, or alternatively, heightened interest in angry and happy faces.
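
In intermodal matching paradigms, the dependent measure is typically the proportion of looking time directed at the face that matches the voice. The sketch below computes that proportion from placeholder per-trial looking times; values near 0.5 correspond to the absence of a matching preference reported here.

```python
import numpy as np

# Placeholder looking times (seconds) per trial: one value for the face matching
# the voice and one for the non-matching face.
looking_match = np.array([3.2, 2.8, 4.1, 3.0, 2.5])
looking_nonmatch = np.array([3.5, 3.1, 3.8, 2.9, 3.0])

prop_match = looking_match / (looking_match + looking_nonmatch)
print(f"mean proportion to matching face: {prop_match.mean():.2f}")
# Values near 0.5 indicate no preference for the matching face.
```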


2021 · Vol 12 · Author(s): Yuko Yamashita, Tetsuya Yamamoto

Emotional contagion is a phenomenon by which an individual’s emotions directly trigger similar emotions in others. We explored the possibility that perceiving others’ emotional facial expressions affects mood in people with subthreshold depression (sD). Forty-nine participants were divided into four groups: participants with no depression (ND) presented with happy faces; ND participants presented with sad faces; sD participants presented with happy faces; and sD participants presented with sad faces. Participants completed an inventory about their emotional states before and after viewing the emotional faces to investigate the influence of emotional contagion on their mood. Regardless of depressive tendency, the groups presented with happy faces exhibited a slight increase in happy mood scores and a decrease in sad mood scores. The groups presented with sad faces exhibited increased sad mood scores and decreased happy mood scores. These results demonstrate that emotional contagion affects mood in people with sD as well as in individuals with ND, and they indicate that emotional contagion could relieve depressive moods in people with sD. From the viewpoint of emotional contagion, this underscores the importance of the emotional facial expressions of those around people with sD, such as family and friends.
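
The pre/post mood comparison reduces to change scores within each group. A minimal sketch with simulated ratings and an assumed group size is shown below; it is not the authors' analysis script.

```python
import numpy as np
import pandas as pd
from scipy import stats

# Simulated pre/post sad-mood ratings for one group (e.g., sD participants
# shown happy faces); values and group size are placeholders.
rng = np.random.default_rng(3)
pre = rng.normal(loc=4.0, scale=1.0, size=12)
post = pre - rng.normal(loc=0.5, scale=0.4, size=12)   # mood improves slightly

df = pd.DataFrame({"pre": pre, "post": post})
df["change"] = df["post"] - df["pre"]

# Paired comparison of pre vs. post ratings within the group.
t, p = stats.ttest_rel(df["post"], df["pre"])
print(f"mean change = {df['change'].mean():+.2f}, t = {t:.2f}, p = {p:.3f}")
```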


2011 · Vol 41 (11) · pp. 2375-2384 · Author(s): F. Ashworth, A. Pringle, R. Norbury, C. J. Harmer, P. J. Cowen, ...

Background: Processing emotional facial expressions is of interest in eating disorders (EDs), as impairments in recognizing and understanding social cues might underlie the interpersonal difficulties experienced by these patients. Disgust and anger are of particular theoretical and clinical interest. The current study investigated the neural response to facial expressions of anger and disgust in bulimia nervosa (BN). Method: Participants were 12 medication-free women with BN in an acute episode (mean age 24 years) and 16 age-, gender- and IQ-matched healthy volunteers (HVs). Functional magnetic resonance imaging (fMRI) was used to examine neural responses to angry and disgusted facial expressions. Results: Compared with HVs, patients with BN had a decreased neural response in the precuneus to facial expressions of both anger and disgust and a decreased neural response to angry facial expressions in the right amygdala. Conclusions: The neural response to emotional facial expressions in BN differs from that found in HVs. The precuneus response may be consistent with the application of mentalization theory to EDs, and the amygdala response with relevant ED theory. The findings are preliminary, but novel, and require replication in a larger sample.


2021 · Vol 11 (9) · pp. 1203 · Author(s): Sara Borgomaneri, Francesca Vitale, Simone Battaglia, Alessio Avenanti

The ability to rapidly process others’ emotional signals is crucial for adaptive social interactions. However, to date it is still unclear how observing emotional facial expressions affects the reactivity of the human motor cortex. To provide insights on this issue, we employed single-pulse transcranial magnetic stimulation (TMS) to investigate corticospinal motor excitability. Healthy participants observed happy, fearful and neutral pictures of facial expressions while receiving TMS over the left or right motor cortex at 150 and 300 ms after picture onset. In the early phase (150 ms), we observed an enhancement of corticospinal excitability for the observation of happy and fearful emotional faces compared to neutral expressions, specifically in the right hemisphere. Interindividual differences in the disposition to experience aversive feelings (personal distress) in interpersonal emotional contexts predicted the early increase in corticospinal excitability for emotional faces. No differences in corticospinal excitability were observed at the later time point (300 ms) or in the left M1. These findings support the notion that emotion perception primes the body for action and highlight the role of the right hemisphere in implementing a rapid and transient facilitatory response to emotionally arousing stimuli, such as emotional facial expressions.
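
Corticospinal excitability in such TMS paradigms is usually indexed by motor-evoked potential (MEP) amplitudes expressed relative to the neutral condition. The sketch below illustrates that normalization with placeholder amplitudes and an assumed trial count; it is not the authors' analysis code.

```python
import numpy as np

# Placeholder peak-to-peak MEP amplitudes (mV) per condition at one TMS timing
# (e.g., 150 ms after face onset), 20 trials each.
rng = np.random.default_rng(4)
mep = {
    "neutral": rng.normal(1.0, 0.2, 20),
    "happy":   rng.normal(1.2, 0.2, 20),
    "fearful": rng.normal(1.25, 0.2, 20),
}

baseline = mep["neutral"].mean()
for condition, amplitudes in mep.items():
    # Facilitation index: mean MEP amplitude as a percentage of the neutral mean.
    facilitation = 100.0 * amplitudes.mean() / baseline
    print(f"{condition:>8s}: {facilitation:.1f}% of neutral")
```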


2011 · Vol 41 (11) · pp. 2253-2264 · Author(s): L. R. Demenescu, R. Renken, R. Kortekaas, M.-J. van Tol, J. B. C. Marsman, ...

Background: Depression has been associated with limbic hyperactivation and frontal hypoactivation in response to negative facial stimuli. Anxiety disorders have also been associated with increased activation of emotional structures such as the amygdala and insula. This study examined to what extent activation of brain regions involved in perception of emotional faces is specific to depression and anxiety disorders in a large community-based sample of out-patients. Method: An event-related functional magnetic resonance imaging (fMRI) paradigm was used including angry, fearful, sad, happy and neutral facial expressions. One hundred and eighty-two out-patients (59 depressed, 57 anxiety and 66 co-morbid depression-anxiety) and 56 healthy controls selected from the Netherlands Study of Depression and Anxiety (NESDA) were included in the present study. Whole-brain analyses were conducted. The temporal profile of amygdala activation was also investigated. Results: Facial expressions activated the amygdala and fusiform gyrus in depressed patients with or without anxiety and in healthy controls, relative to scrambled faces, but this was less evident in patients with anxiety disorders. The response shape of the amygdala did not differ between groups. Depressed patients showed dorsolateral prefrontal cortex (PFC) hyperactivation in response to happy faces compared to healthy controls. Conclusions: We suggest that stronger frontal activation to happy faces in depressed patients may reflect increased demands on effortful emotion regulation processes triggered by mood-incongruent stimuli. The lack of strong differences in neural activation to negative emotional faces, relative to healthy controls, may be characteristic of the mild-to-moderate severity of illness in this sample and may be indicative of a certain cognitive-emotional processing reserve.
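
Group effects such as the dorsolateral PFC difference reported here are typically tested on participant-level contrast estimates (e.g., happy faces > scrambled faces) extracted from a region of interest. The sketch below illustrates such a two-sample comparison with simulated estimates; the values are placeholders, and only the group sizes mirror those reported.

```python
import numpy as np
from scipy import stats

# Simulated participant-level contrast estimates (happy > scrambled) extracted
# from a dorsolateral PFC region of interest.
rng = np.random.default_rng(5)
depressed = rng.normal(loc=0.6, scale=0.4, size=59)
controls = rng.normal(loc=0.3, scale=0.4, size=56)

# Two-sample comparison of the group means.
t, p = stats.ttest_ind(depressed, controls)
print(f"depressed vs. controls: t = {t:.2f}, p = {p:.3f}")
```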

