Different neural and cognitive response to emotional faces in healthy monozygotic twins at risk of depression

2014 ◽  
Vol 45 (7) ◽  
pp. 1447-1458 ◽  
Author(s):  
K. W. Miskowiak ◽  
L. Glerup ◽  
C. Vestbo ◽  
C. J. Harmer ◽  
A. Reinecke ◽  
...  

Background: Negative cognitive bias and aberrant neural processing of emotional faces are trait-marks of depression. Yet it is unclear whether these changes constitute an endophenotype for depression and are also present in healthy individuals with hereditary risk for depression.
Method: Thirty healthy, never-depressed monozygotic (MZ) twins with a co-twin history of depression (high-risk group: n = 13) or without co-twin history of depression (low-risk group: n = 17) were enrolled in a functional magnetic resonance imaging (fMRI) study. During fMRI, participants viewed fearful and happy faces while performing a gender discrimination task. After the scan, they were given a faces dot-probe task, a facial expression recognition task and questionnaires assessing mood, personality traits and coping strategies.
Results: High-risk twins showed increased neural response to happy and fearful faces in dorsal anterior cingulate cortex (ACC), dorsomedial prefrontal cortex (dmPFC), pre-supplementary motor area and occipito-parietal regions compared to low-risk twins. They also displayed stronger negative coupling between amygdala and pregenual ACC, dmPFC and temporo-parietal regions during emotional face processing. These task-related changes in neural responses in high-risk twins were accompanied by impaired gender discrimination performance during face processing. They also displayed increased attention vigilance for fearful faces and were slower at recognizing facial expressions relative to low-risk controls. These effects occurred in the absence of differences between groups in mood, subjective state or coping.
Conclusions: Different neural response and functional connectivity within fronto-limbic and occipito-parietal regions during emotional face processing and enhanced fear vigilance may be key endophenotypes for depression.

2017 ◽  
Vol 47 (13) ◽  
pp. 2345-2357 ◽  
Author(s):  
K. W. Miskowiak ◽  
A. M. B. Svendsen ◽  
C. J. Harmer ◽  
R. Elliott ◽  
J. Macoveanu ◽  
...  

Background: Negative bias and aberrant neural processing of emotional faces are trait-marks of depression but findings in healthy high-risk groups are conflicting.
Methods: Healthy middle-aged dizygotic twins (N = 42) underwent functional magnetic resonance imaging (fMRI): 22 twins had a co-twin history of depression (high-risk) and 20 were without co-twin history of depression (low-risk). During fMRI, participants viewed fearful and happy faces while performing a gender discrimination task. After the scan, they were given a faces dot-probe task, a facial expression recognition task and questionnaires assessing mood, personality traits and coping.
Results: Unexpectedly, high-risk twins showed reduced fear vigilance and lower recognition of fear and happiness relative to low-risk twins. During face processing in the scanner, high-risk twins displayed distinct negative functional coupling between the amygdala and ventral prefrontal cortex and pregenual anterior cingulate. This was accompanied by greater fear-specific fronto-temporal response and reduced fronto-occipital response to all emotional faces relative to baseline. The risk groups showed no differences in mood, subjective state or coping.
Conclusions: Less susceptibility to fearful faces and negative cortico-limbic coupling during emotional face processing may reflect neurocognitive compensatory mechanisms in middle-aged dizygotic twins who remain healthy despite their familial risk of depression.


2021 ◽  
Author(s):  
Kristina Safar

Experience is suggested to shape the development of emotion processing abilities in infancy. The current dissertation investigated the influence of familiarity with particular face types and emotional faces on emotional face processing within the first year of life using a variety of metrics. The first study examined whether experience with a particular face type (own- vs. other-race faces) affected 6- and 9-month-old infants’ attentional looking preference to fearful facial expressions in a visual paired-comparison (VPC) task. Six-month-old infants showed an attentional preference for fearful over happy facial expressions when expressed by own-race faces, but not other-race faces, whereas 9-month-old infants showed an attentional preference for fearful expressions when expressed by both own-race and other-race faces, suggesting that experience influences how infants deploy their attention to different facial expressions. Using a longitudinal design, the second study examined whether exposure to emotional faces via picture book training at 3 months of age affected infants’ allocation of attention to fearful over happy facial expressions in both a VPC and ERP task at 5 months of age. In the VPC task, 3- and 5-month-olds without exposure to emotional faces demonstrated greater allocation of attention to fearful facial expressions. Differential exposure to emotional faces revealed a potential effect of training: 5-month-old infants who experienced fearful faces showed an attenuated preference for fearful facial expressions compared to infants who experienced happy faces or no training. Three- and 5-month-old infants did not, however, show differential neural processing of happy and fearful facial expressions. The third study examined whether 5- and 7-month-old infants can match fearful and happy faces and voices in an intermodal preference task, and whether exposure to happy or fearful faces influences this ability.
Neither 5- nor 7-month-old infants showed intermodal matching of happy or fearful facial expressions, regardless of exposure to emotional faces. Overall, results from this series of studies add to our understanding of how experience influences the development of emotional face processing in infancy.



2010 ◽  
Vol 22 (3) ◽  
pp. 474-481 ◽  
Author(s):  
Disa Anna Sauter ◽  
Martin Eimer

The rapid detection of affective signals from conspecifics is crucial for the survival of humans and other animals; if those around you are scared, there is reason for you to be alert and to prepare for impending danger. Previous research has shown that the human brain detects emotional faces within 150 msec of exposure, indicating a rapid differentiation of visual social signals based on emotional content. Here we use event-related brain potential (ERP) measures to show for the first time that this mechanism extends to the auditory domain, using human nonverbal vocalizations, such as screams. An early fronto-central positivity to fearful vocalizations compared with spectrally rotated and thus acoustically matched versions of the same sounds started 150 msec after stimulus onset. This effect was also observed for other vocalized emotions (achievement and disgust), but not for affectively neutral vocalizations, and was linked to the perceived arousal of an emotion category. That the timing, polarity, and scalp distribution of this new ERP correlate are similar to ERP markers of emotional face processing suggests that common supramodal brain mechanisms may be involved in the rapid detection of affectively relevant visual and auditory signals.


2019 ◽  
Vol 9 (1) ◽  
Author(s):  
Nathan Caruana ◽  
Christine Inkley ◽  
Marwa El Zein ◽  
Kiley Seymour

The human brain has evolved specialised mechanisms to enable the rapid detection of threat cues, including emotional face expressions (e.g., fear and anger). However, contextual cues – such as gaze direction – influence the ability to recognise emotional expressions. For instance, anger paired with direct gaze, and fear paired with averted gaze are more accurately recognised compared to alternate conjunctions of these features. It is argued that this is because gaze direction conveys the relevance and locus of the threat to the observer. Here, we used continuous flash suppression (CFS) to assess whether the modulatory effect of gaze direction on emotional face processing occurs outside of conscious awareness. Previous research using CFS has demonstrated that fearful facial expressions are prioritised by the visual system and gain privileged access to awareness over other expressed emotions. We hypothesised that if the modulatory effects of gaze on emotional face processing occur also at this level, then the gaze-emotion conjunctions signalling self-relevant threat will reach awareness faster than those that do not. We report that fearful faces gain privileged access to awareness over angry faces, but that gaze direction does not modulate this effect. Thus, our findings suggest that previously reported effects of gaze direction on emotional face processing are likely to occur once the face is detected, where the self-relevance and locus of the threat can be consciously appraised.


2020 ◽  
Vol 10 (1) ◽  
Author(s):  
Adrián Alacreu-Crespo ◽  
Emilie Olié ◽  
Emmanuelle Le Bars ◽  
Fabienne Cyprien ◽  
Jérémy Deverdun ◽  
...  

Emotional feedback, such as faces showing emotions, can influence decision making. Decision making and emotional face processing, mainly mediated by the prefrontal and cingulate cortices, are impaired in suicide attempters. Here, we used functional MRI (fMRI) to study prefrontal activation in suicide attempters during a modified version of the Iowa Gambling Task (IGT) that included emotional face feedback. We randomly distributed the 116 euthymic women (n = 45 suicide attempters, n = 41 affective controls with history of depression without suicide attempt, and n = 30 healthy controls) included in the study in three emotional IGT groups: concordant (safe and risky choices followed by happy and angry faces, respectively), discordant (safe and risky choices followed by angry and happy faces, respectively), and neutral condition (safe and risky choices followed by neutral faces). Considering the two IGT phases (ambiguous and risky), we then analyzed five regions of interest during the risky vs. safe choices: orbitofrontal (OFC), anterior cingulate (ACC), ventrolateral (VLPFC), medial (MPFC) and dorsal prefrontal (DPFC) cortices. We found: (1) impaired decision making and increased DPFC and OFC activation in suicide attempters vs. controls in the discordant condition during the risky phase; (2) reduced VLPFC activation in suicide attempters in the concordant condition during the ambiguous phase; and (3) decreased OFC, ACC and DPFC activation in both control groups in the concordant condition during the ambiguous phase. Suicide attempters showed prefrontal alterations during reward-learning decision making with emotional feedback. Suicide attempters may guide their decisions to avoid social negative feedback despite the expected outcome.


2008 ◽  
Vol 39 (01) ◽  
Author(s):  
M Adamaszek ◽  
M Weymar ◽  
J Berneiser ◽  
A Dressel ◽  
C Kessler ◽  
...  

2021 ◽  
Vol 42 (7) ◽  
pp. 2099-2114 ◽  
Author(s):  
Charis Styliadis ◽  
Rachel Leung ◽  
Selin Özcan ◽  
Eric A. Moulton ◽  
Elizabeth Pang ◽  
...  
