Using Facial Expressions to Probe Brain Circuitry Associated With Anxiety and Depression

Author(s):  
Johnna R. Swartz ◽  
Lisa M. Shin ◽  
Brenda Lee ◽  
Ahmad R. Hariri

Emotional facial expressions are processed by a distributed corticolimbic brain circuit including the amygdala, which plays a central role in detecting and responding to emotional expressions, and the prefrontal cortex, which evaluates, integrates, and regulates responses to emotional expressions. Using functional magnetic resonance imaging (fMRI) to probe circuit function can reveal insights into the pathophysiology of mood and anxiety disorders. In this chapter, we review fMRI research into corticolimbic circuit processing of emotional facial expressions in social anxiety disorder, posttraumatic stress disorder, generalized anxiety disorder, panic disorder, specific phobia, and major depressive disorder. We conclude by reviewing recent research examining how variability in circuit function may help predict the future experience of symptoms in young adults and at-risk adolescents, as well as how such variability relates to personality traits associated with psychopathology risk.

2002 ◽  
Vol 19 (2) ◽  
pp. 90-101 ◽  
Author(s):  
Raelene L. de Ross ◽  
Eleonora Gullone ◽  
Bruce F. Chorpita

Abstract: The Revised Child Anxiety and Depression Scale (RCADS) is a 47-item self-report measure intended to assess children's symptoms corresponding to selected DSM-IV anxiety and major depressive disorders. The scale comprises six subscales: Separation Anxiety Disorder, Social Phobia, Obsessive-Compulsive Disorder, Panic Disorder, Generalised Anxiety Disorder, and Major Depressive Disorder. To date, only one normative study of youth has been published, with results providing strong initial support for the reliability and validity of this new measure (Chorpita, Yim, Moffitt, Umemoto, & Francis, 2000). The present investigation provides additional psychometric data derived from an Australian sample of 405 youth aged 8 to 18 years. In general, the data were consistent with those reported in the initial normative study. Internal consistency for the overall scale and its subscales was adequate. Good convergent validity was demonstrated through moderate to strong correlations between the RCADS subscales and scores on the Revised Children's Manifest Anxiety Scale (RCMAS) and the Children's Depression Inventory (CDI). Confirmatory factor analysis suggested reasonable fit for the six-factor model of Chorpita et al. (2000). Notwithstanding the need for additional validation, it is concluded that the RCADS is a promising instrument for use in both clinical and research settings.


2007 ◽  
Vol 38 (10) ◽  
pp. 1475-1483 ◽  
Author(s):  
K. S. Kendler ◽  
L. J. Halberstadt ◽  
F. Butera ◽  
J. Myers ◽  
T. Bouchard ◽  
...  

Background: While the role of genetic factors in self-report measures of emotion has been frequently studied, we know little about the degree to which genetic factors influence emotional facial expressions. Method: Twenty-eight pairs of monozygotic (MZ) and dizygotic (DZ) twins from the Minnesota Study of Twins Reared Apart were shown three emotion-inducing films, and their facial responses were recorded. These recordings were blindly scored by trained raters. Ranked correlations between twins were calculated, controlling for age and sex. Results: Twin pairs were significantly correlated for facial expressions of general positive emotions, happiness, surprise, and anger, but not for general negative emotions, sadness, disgust, or average emotional intensity. MZ pairs (n=18) were more correlated than DZ pairs (n=10) for most but not all emotional expressions. Conclusions: Since these twin pairs had minimal contact with each other prior to testing, these results support significant genetic effects on the facial display of at least some human emotions in response to standardized stimuli. The small sample size resulted in estimated twin correlations with very wide confidence intervals.
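The core quantity in the twin analysis above is a rank correlation between co-twins' scores. A minimal NumPy sketch of Spearman's rank correlation is shown below; it is illustrative only (the variable names and data are hypothetical, and the original analysis additionally controlled for age and sex and handled ties, which this version omits):

```python
import numpy as np

def spearman_rank_corr(x, y):
    """Spearman rank correlation: the Pearson correlation of the ranks.
    No tie handling in this sketch."""
    rx = np.argsort(np.argsort(x)).astype(float)
    ry = np.argsort(np.argsort(y)).astype(float)
    rx -= rx.mean()
    ry -= ry.mean()
    return float((rx @ ry) / np.sqrt((rx @ rx) * (ry @ ry)))

# Hypothetical expression-intensity scores for twin A and twin B
# across five twin pairs
twin_a = np.array([3.0, 1.5, 4.2, 2.8, 5.0])
twin_b = np.array([2.9, 1.8, 4.0, 3.1, 4.7])
rho = spearman_rank_corr(twin_a, twin_b)  # high rho = similar rank order
```

In practice one would use `scipy.stats.spearmanr`, which handles ties; the point here is only that twin similarity is summarized as agreement in rank order rather than in raw scores.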


2016 ◽  
Vol 29 (8) ◽  
pp. 749-771 ◽  
Author(s):  
Min Hooi Yong ◽  
Ted Ruffman

Dogs respond to human emotional expressions. However, it is unknown whether dogs can match emotional faces to voices in an intermodal matching task, or whether they show preferences for looking at certain emotional facial expressions over others, similar to human infants. We presented 52 domestic dogs and 24 seven-month-old human infants with two different human emotional facial expressions of the same gender simultaneously, while they listened to a human voice expressing an emotion that matched one of them. Consistent with most matching studies, neither dogs nor infants looked longer at the matching emotional stimuli, yet dogs and humans demonstrated an identical pattern of looking less at sad faces when paired with happy or angry faces (irrespective of the vocal stimulus), with no preference for happy versus angry faces. Discussion focuses on why dogs and infants might have an aversion to sad faces or, alternatively, a heightened interest in angry and happy faces.


2020 ◽  
Author(s):  
Sjoerd Stuit ◽  
Timo Kootstra ◽  
David Terburg ◽  
Carlijn van den Boomen ◽  
Maarten van der Smagt ◽  
...  

Abstract: Emotional facial expressions are important visual communication signals that indicate a sender's intent and emotional state to an observer. As such, it is not surprising that reactions to different expressions are thought to be automatic and independent of awareness. What is surprising is that studies show inconsistent results concerning such automatic reactions, particularly when using different face stimuli. We argue that automatic reactions to facial expressions can be better explained, and better understood, in terms of quantitative descriptions of their visual features rather than in terms of the semantic labels (e.g. angry) of the expressions. Here, we focused on overall spatial frequency (SF) and localized Histograms of Oriented Gradients (HOG) features. We used machine learning classification to reveal the SF and HOG features that are sufficient for classification of the first selected face out of two simultaneously presented faces. In other words, we show which visual features predict selection between two faces. Interestingly, the identified features serve as better predictors than the semantic labels of the expressions. We therefore propose that our modelling approach can further specify which visual features drive the behavioural effects related to emotional expressions, which can help resolve the inconsistencies found in this line of research.
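To make the feature-based approach concrete: a HOG descriptor summarizes an image as local histograms of gradient orientations, and a classifier then operates on those vectors instead of on semantic labels. The sketch below is a rough, NumPy-only illustration of that idea, not the authors' pipeline; the function names, the toy nearest-centroid classifier, and the parameter choices are all assumptions for illustration (real work would use, e.g., `skimage.feature.hog` on actual face images):

```python
import numpy as np

def hog_features(img, n_bins=8, cell=8):
    """Minimal HOG-style descriptor: per-cell histograms of unsigned
    gradient orientations, weighted by gradient magnitude."""
    gy, gx = np.gradient(img.astype(float))
    mag = np.hypot(gx, gy)
    ang = np.mod(np.arctan2(gy, gx), np.pi)  # unsigned orientation in [0, pi)
    bins = np.minimum((ang / np.pi * n_bins).astype(int), n_bins - 1)
    h, w = img.shape
    feats = []
    for i in range(0, h - cell + 1, cell):          # non-overlapping cells
        for j in range(0, w - cell + 1, cell):
            cb = bins[i:i + cell, j:j + cell].ravel()
            cm = mag[i:i + cell, j:j + cell].ravel()
            hist = np.bincount(cb, weights=cm, minlength=n_bins)
            feats.append(hist / (np.linalg.norm(hist) + 1e-9))
    return np.concatenate(feats)

def predict(x, centroids):
    """Toy classifier: assign x to the nearest class centroid."""
    return min(centroids, key=lambda k: np.linalg.norm(x - centroids[k]))
```

Applied to face images labelled by which face was selected first, this kind of feature vector is what the classifier learns from, so the learned weights indicate which local orientation-energy patterns predict selection.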


Author(s):  
Julia K. Langer ◽  
Thomas L. Rodebaugh

Social anxiety disorder (SAD) and major depressive disorder (MDD) are prevalent disorders that exhibit a high rate of co-occurrence. Furthermore, these disorders have been shown to be associated with each other, suggesting that the presence of one disorder increases risk for the other disorder. In this chapter, we discuss relevant theories that attempt to explain why SAD and MDD are related. We propose that the available evidence provides support for conceptualizing the comorbidity of SAD and MDD as resulting from a shared underlying vulnerability. There is evidence that this underlying vulnerability is genetic in nature and related to trait-like constructs such as positive and negative affect. We also discuss the possibility that the underlying vulnerability may confer tendencies toward certain patterns of thinking. Finally, we discuss theories that propose additional causal pathways between the disorders such as direct pathways from one disorder to the other. We advocate for a psychoevolutionary conceptualization that links the findings on the underlying cognitions to the shared relation of lower positive affect and the findings on peer victimization. We suggest that, in addition to a shared underlying vulnerability, the symptoms of social anxiety and depression may function as a part of a behavior trap in which attempts to cope with perceived social exclusion lead to even higher levels of social anxiety and depression. Finally, we make recommendations for the best methods for assessing SAD and MDD as well as suggestions for treating individuals with both disorders.


2002 ◽  
Vol 8 (1) ◽  
pp. 130-135 ◽  
Author(s):  
JOCELYN M. KEILLOR ◽  
ANNA M. BARRETT ◽  
GREGORY P. CRUCIAN ◽  
SARAH KORTENKAMP ◽
KENNETH M. HEILMAN

The facial feedback hypothesis suggests that facial expressions are either necessary or sufficient to produce emotional experience. Researchers have noted that the ideal test of the necessity aspect of this hypothesis would be an evaluation of emotional experience in a patient suffering from a bilateral facial paralysis; however, this condition is rare and no such report has been documented. We examined the role of facial expressions in the determination of emotion by studying a patient (F.P.) suffering from a bilateral facial paralysis. Despite her inability to convey emotions through facial expressions, F.P. reported normal emotional experience. When F.P. viewed emotionally evocative slides her reactions were not dampened relative to the normative sample. F.P. retained her ability to detect, discriminate, and image emotional expressions. These findings are not consistent with theories stating that feedback from an active face is necessary to experience emotion, or to process emotional facial expressions. (JINS, 2002, 8, 130–135.)


2006 ◽  
Vol 47 (11) ◽  
pp. 1107-1115 ◽  
Author(s):  
Cecile D. Ladouceur ◽  
Ronald E. Dahl ◽  
Douglas E. Williamson ◽  
Boris Birmaher ◽  
David A. Axelson ◽  
...  

2021 ◽  
Vol 54 (1) ◽  
pp. 47-58 ◽
Author(s):  
Uwe Altmann ◽  
Catharina Friemann ◽  
Theresa S. Frank ◽  
Mareike C. Sittler ◽  
Désirée Schoenherr ◽  
...  

Introduction: Adult attachment is commonly associated with emotion regulation. Less is known about the nonverbal embodiment of adult attachment. Objective: We hypothesized that dismissing attachment is related to less movement and fewer facial expressions of emotions, whereas preoccupied attachment is associated with more negative emotional facial expressions. Moreover, the interaction of attachment and the presence of an anxiety disorder (AD) was explored. Methods: The sample included 95 individuals: 21 with AD without comorbidity, 21 with AD and comorbid major depression (AD-CD), and 53 healthy controls. We analyzed nonverbal behavior during the part of the Adult Attachment Interview (AAI) that asks about the family and parental figures. The movements of the interviewees were captured via Motion Energy Analysis. Facial expressions were coded according to the Facial Action Coding System using the OpenFace software. We compared individuals with secure, dismissing, and preoccupied states of mind (assessed with the AAI) with regard to the frequency and complexity of movements and the frequency of facial expressions such as happy, sad, and contemptuous. Results: As expected, dismissingly attached individuals moved less often and with lower complexity than securely attached individuals. For emotional facial expressions, a main effect of disorder group and attachment-by-disorder interaction effects were found. In the AD-CD group, dismissingly attached patients showed comparatively fewer happy facial expressions than securely attached individuals. Conclusions: Reduced movement seems to be specifically related to dismissing attachment when interviewees talk about significant parental figures. Attachment-related facial expressions of emotions occurred when maladaptive emotion regulation strategies were intensified by a psychological disorder.
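For readers unfamiliar with this kind of pipeline: OpenFace writes per-frame action-unit columns to a CSV file (e.g. AU06_c for the coded presence of AU6). The sketch below is a rough illustration of how expression frequencies could be counted from such output, not the authors' analysis code; the AU-to-expression mapping shown follows a common EMFACS-style convention (AU6 cheek raiser + AU12 lip corner puller for happy) and is not taken from the paper:

```python
import csv

# EMFACS-style mapping from action units to a few expressions
# (illustrative convention, not the study's exact coding scheme)
EXPRESSIONS = {
    "happy": ("AU06_c", "AU12_c"),
    "sad": ("AU01_c", "AU04_c", "AU15_c"),
}

def count_expression_frames(csv_path):
    """Count frames in an OpenFace-style output CSV in which every AU of
    an expression pattern is coded present (column value 1)."""
    counts = {name: 0 for name in EXPRESSIONS}
    with open(csv_path, newline="") as f:
        # skipinitialspace handles the padded column headers OpenFace writes
        for row in csv.DictReader(f, skipinitialspace=True):
            for name, aus in EXPRESSIONS.items():
                if all(float(row[au]) == 1.0 for au in aus):
                    counts[name] += 1
    return counts
```

Per-participant counts of this kind are the sort of frequency measure that can then be compared across attachment and disorder groups.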


10.2196/22423 ◽  
2020 ◽  
Vol 7 (12) ◽  
pp. e22423 ◽
Author(s):  
Vincent Israel Opoku Agyapong ◽
Marianne Hrabok ◽  
Wesley Vuong ◽  
Reham Shalaby ◽  
Jasmine Marie Noble ◽  
...  

Background: In addition to the obvious physical medical impact of COVID-19, the disease poses evident threats to people’s mental health, psychological safety, and well-being. Provision of support for these challenges is complicated by the high number of people requiring support and the need to maintain physical distancing. Text4Hope, a daily supportive SMS text messaging program, was launched in Canada to mitigate the negative mental health impacts of the pandemic among Canadians. Objective: This paper describes the changes in the stress, anxiety, and depression levels of subscribers to the Text4Hope program after 6 weeks of exposure to daily supportive SMS text messages. Methods: We used self-administered, empirically supported web-based questionnaires to assess the demographic and clinical characteristics of Text4Hope subscribers. Perceived stress, anxiety, and depression were measured with the 10-Item Perceived Stress Scale (PSS-10), the Generalized Anxiety Disorder–7 (GAD-7) scale, and the Patient Health Questionnaire–9 (PHQ-9) scale at baseline and at the 6-week time point. Moderate or high perceived stress, likely generalized anxiety disorder, and likely major depressive disorder were assessed using cutoff scores of ≥14 for the PSS-10, ≥10 for the GAD-7, and ≥10 for the PHQ-9, respectively. At 6 weeks into the program, 766 participants had completed the questionnaires at both time points. Results: At the 6-week time point, there were statistically significant reductions in mean scores on the PSS-10 and GAD-7 scales but not on the PHQ-9 scale. Effect sizes were small overall. There were statistically significant reductions in the prevalence rates of moderate or high stress and likely generalized anxiety disorder, but not of likely major depressive disorder, for the group that completed both the baseline and 6-week assessments. The largest reductions in mean scores and prevalence rates were for anxiety (18.7% and 13.5%, respectively).
Conclusions: Text4Hope is a convenient, cost-effective, and accessible means of implementing a population-level psychological intervention. This service demonstrated significant reductions in anxiety and stress levels during the COVID-19 pandemic and could be used as a population-level mental health intervention during natural disasters and other emergencies. International Registered Report Identifier (IRRID): RR2-10.2196/19292
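The caseness classification in the Methods reduces to simple cutoff comparisons on the three scale scores. A minimal sketch using the cutoffs reported in the abstract (the function name and input format are illustrative, not from the study):

```python
# Cutoffs reported in the study: PSS-10 >= 14 (moderate/high stress),
# GAD-7 >= 10 (likely GAD), PHQ-9 >= 10 (likely MDD)
CUTOFFS = {"PSS-10": 14, "GAD-7": 10, "PHQ-9": 10}

def screen(scores):
    """Map raw scale scores to at-or-above-cutoff flags."""
    return {scale: scores[scale] >= cut
            for scale, cut in CUTOFFS.items() if scale in scores}

# Hypothetical subscriber: stressed and likely depressed, but below
# the GAD-7 anxiety cutoff
flags = screen({"PSS-10": 20, "GAD-7": 7, "PHQ-9": 12})
# flags -> {"PSS-10": True, "GAD-7": False, "PHQ-9": True}
```

Prevalence rates at each time point are then just the proportion of participants flagged True on each scale.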

