Volition inhibitory to emotional stimuli in depression: Evidence from the antisaccade tasks

2011
Vol 26 (S2)
pp. 707-707
Author(s):
X. Luo
R. Chen
W. Guo
H. Zhang
R. Zhou

Introduction: Most previous research has indicated that impaired inhibition of emotional stimuli may be one of the important cognitive characteristics of depressed individuals. Antisaccade paradigms, composed of a prosaccade task (PS) and an antisaccade task (AS), are often used to investigate response inhibition.
Aims: This study aimed to investigate volitional inhibition of emotional stimuli in undergraduates with depressed mood (DM).
Methods: Subjects were grouped into 21 DM and 25 non-depressed undergraduates (ND) based on the Beck Depression Inventory and the Self-Rating Depression Scale. Antisaccade tasks were conducted to examine inhibition abilities by varying the arousal level of volition (low vs. high), with happy, neutral, and sad facial expressions as stimuli.
Results: At the low volition level in the AS condition, correct saccade latencies in the DM group were significantly slower than in the ND group, and the DM group had reliably higher direction error rates in response to emotional facial expressions, especially sad expressions. However, all of these differences disappeared in the high-volition-level antisaccade tasks. Amplitude error data were not influenced by emotional facial expressions, and there were no group differences across tasks.
Conclusions: These results indicate that the DM group showed slower cognitive processing and impaired inhibition of emotional faces compared with the ND group, particularly for sad faces, but these deficits disappeared at the high arousal level of volition. This suggests that training depressed individuals' volitional level of inhibition could prove an effective strategy for alleviating depression.
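The two antisaccade measures reported above, direction error rate and correct saccade latency, can be sketched as a simple scoring routine. The trial field names below are illustrative, not the authors' actual data format.

```python
# Sketch: scoring antisaccade (AS) trials. A direction error is a saccade
# toward the peripheral stimulus; a correct response goes to the mirror
# location on the opposite side. Trial fields are hypothetical.

def score_antisaccade(trials):
    """Each trial: 'stim_side' and 'saccade_side' ('left'/'right'),
    plus 'latency' in ms. Returns (direction error rate,
    mean latency of correct saccades)."""
    errors = [t for t in trials if t["saccade_side"] == t["stim_side"]]
    correct = [t for t in trials if t["saccade_side"] != t["stim_side"]]
    error_rate = len(errors) / len(trials)
    mean_correct_latency = sum(t["latency"] for t in correct) / len(correct)
    return error_rate, mean_correct_latency
```

In the study's design, these two quantities would be computed separately per group (DM vs. ND), per volition level, and per facial-expression condition.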

2020
Author(s):
Motonori Yamaguchi
Jack Dylan Moore
Sarah Hendry
Felicity Wolohan

The emotional basis of cognitive control has been investigated in the flanker task with various procedures and materials across different studies. The present study examined the issue with the same flanker task but with different types of emotional stimuli and designs. In seven experiments, the flanker effect and its sequential modulation according to the preceding trial type were assessed. Experiments 1 and 2 used affective pictures and emotional facial expressions as emotional stimuli, with positive and negative stimuli intermixed. There was little evidence that emotional stimuli influenced cognitive control. Experiments 3 and 4 used the same affective pictures and facial expressions, but positive and negative stimuli were separated between different participant groups. Emotional stimuli reduced the flanker effect as well as its sequential modulation, regardless of valence. Experiments 5 and 6 used affective pictures but manipulated arousal and valence orthogonally. The results did not replicate the reduced flanker effect or sequential modulation by valence, nor did they show consistent effects of arousal. Experiment 7 used a mood induction technique and showed that sequential modulation was positively correlated with valence rating (the higher, the more positive) but negatively correlated with arousal rating. These results are inconsistent with several previous findings and are difficult to reconcile within a single theoretical framework, confirming the elusive nature of the emotional basis of cognitive control in the flanker task.
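The flanker effect and its sequential modulation (the congruency sequence effect) can be sketched as a computation over trial-level reaction times. The trial structure and field names here are hypothetical, not the authors' data format.

```python
# Sketch: the flanker effect is the RT cost of incongruent vs. congruent
# flankers; its sequential modulation is the reduction of that cost after
# an incongruent trial compared with after a congruent trial.

def mean(xs):
    return sum(xs) / len(xs)

def flanker_effects(trials):
    """trials: list of dicts with 'rt' (ms), 'congruent' (bool) for the
    current trial, and 'prev_congruent' (bool) for the preceding trial."""
    def avg(cur_congruent, prev_congruent):
        rts = [t["rt"] for t in trials
               if t["congruent"] == cur_congruent
               and t["prev_congruent"] == prev_congruent]
        return mean(rts)
    # Flanker effect after congruent vs. after incongruent trials
    effect_after_c = avg(False, True) - avg(True, True)
    effect_after_i = avg(False, False) - avg(True, False)
    # Sequential modulation: shrinkage of the effect after incongruent trials
    return effect_after_c, effect_after_i, effect_after_c - effect_after_i
```

A larger positive third value indicates stronger sequential modulation, the quantity the experiments above compared across emotional-stimulus conditions.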


Author(s):
Izabela Krejtz
Krzysztof Krejtz
Katarzyna Wisiecka
Marta Abramczyk
Michał Olszanowski
...

Abstract The enhancement hypothesis suggests that deaf individuals are more vigilant to visual emotional cues than hearing individuals. The present eye-tracking study examined ambient–focal visual attention when encoding affect from dynamically changing emotional facial expressions. Deaf (n = 17) and hearing (n = 17) individuals watched emotional facial expressions that in 10-s animations morphed from a neutral expression to one of happiness, sadness, or anger. The task was to recognize emotion as quickly as possible. Deaf participants tended to be faster than hearing participants in affect recognition, but the groups did not differ in accuracy. In general, happy faces were more accurately and more quickly recognized than faces expressing anger or sadness. Both groups demonstrated longer average fixation duration when recognizing happiness in comparison to anger and sadness. Deaf individuals directed their first fixations less often to the mouth region than the hearing group. During the last stages of emotion recognition, deaf participants exhibited more focal viewing of happy faces than negative faces. This pattern was not observed among hearing individuals. The analysis of visual gaze dynamics, switching between ambient and focal attention, was useful in studying the depth of cognitive processing of emotional information among deaf and hearing individuals.
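The switching between ambient and focal attention mentioned above is commonly quantified with the coefficient K of Krejtz and colleagues. The sketch below is one plausible per-fixation formulation (standardized fixation duration minus standardized amplitude of the following saccade); the exact windowing and aggregation used in the study are not specified here.

```python
# Sketch: per-fixation ambient/focal coefficient K. Positive K_i suggests
# focal processing (long fixation, short following saccade); negative K_i
# suggests ambient scanning (short fixation, long following saccade).
from statistics import mean, pstdev

def coefficient_k(durations, amplitudes):
    """durations[i]: duration of fixation i; amplitudes[i]: amplitude of
    the saccade following fixation i. Returns the list of K_i values."""
    md, sd = mean(durations), pstdev(durations)
    ma, sa = mean(amplitudes), pstdev(amplitudes)
    return [(d - md) / sd - (a - ma) / sa
            for d, a in zip(durations, amplitudes)]
```

Because the z-scores are centered within the sequence, analyses typically track how K evolves over time (e.g., across stages of emotion recognition) rather than its overall mean.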


2016
Vol 29 (8)
pp. 749-771
Author(s):
Min Hooi Yong
Ted Ruffman

Dogs respond to human emotional expressions. However, it is unknown whether dogs can match emotional faces to voices in an intermodal matching task, or whether they show preferences for looking at certain emotional facial expressions over others, similar to human infants. We presented 52 domestic dogs and 24 seven-month-old human infants with two different human emotional facial expressions of the same gender simultaneously while they listened to a human voice expressing an emotion that matched one of them. Consistent with most matching studies, neither dogs nor infants looked longer at the matching emotional stimuli; yet dogs and humans demonstrated an identical pattern of looking less at sad faces when paired with happy or angry faces (irrespective of the vocal stimulus), with no preference for happy versus angry faces. Discussion focuses on why dogs and infants might have an aversion to sad faces or, alternatively, a heightened interest in angry and happy faces.


2021
Vol 12
Author(s):
Agnes Bohne
Dag Nordahl
Åsne A. W. Lindahl
Pål Ulvenes
Catharina E. A. Wang
...

Processing of emotional facial expressions is of great importance in interpersonal relationships. Aberrant engagement with facial expressions, particularly engagement with sad faces, loss of engagement with happy faces, and enhanced memory of sadness, have been found in depression. Since most studies used adult faces, we here examined whether such biases also occur in the processing of infant faces in those with depression or depressive symptoms. In Study 1, we recruited 25 inpatient women with major depression and 25 matched controls. In Study 2, we extracted a sample of expectant parents from the NorBaby study, where 29 reported elevated levels of depressive symptoms and 29 were matched controls. In both studies, we assessed attentional bias with a dot-probe task using happy, sad, and neutral infant faces, and facial memory bias with a recognition task using happy, sad, angry, afraid, surprised, disgusted, and neutral infant and adult faces. Participants also completed the Ruminative Responses Scale and the Beck Depression Inventory-II. In Study 1, we found no group difference in either attention to or memory accuracy for emotional infant faces, and neither attention nor recognition was associated with rumination. In Study 2, we found that the group with depressive symptoms disengaged more slowly than healthy controls from sad infant faces, and this was related to rumination. The results emphasize the importance of emotionally self-relevant material when examining cognitive processing in depression. Together, these studies demonstrate that a mood-congruent attentional bias to infant faces is present in expectant parents with depressive symptoms, but not in inpatients with Major Depressive Disorder who do not have young children.
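The dot-probe attentional bias used above can be sketched as a reaction-time difference score. The convention below (positive = vigilance toward the emotional face) is standard for this task, but the exact scoring the authors used, including the disengagement variant, is an assumption here.

```python
# Sketch: dot-probe bias index. A probe replaces one of two simultaneously
# presented faces (emotional vs. neutral). Faster responses when the probe
# replaces the emotional face (congruent trials) indicate that attention
# was already there.

def bias_index(congruent_rts, incongruent_rts):
    """RT lists in ms. Positive values = vigilance toward the emotional
    face; negative values = avoidance."""
    mean = lambda xs: sum(xs) / len(xs)
    return mean(incongruent_rts) - mean(congruent_rts)
```

Slowed disengagement, the effect found in Study 2, shows up as elevated RTs specifically on trials where the probe appears opposite a sad face after attention has settled on it.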


2017
Vol 1 (S1)
pp. 64-64
Author(s):
Kelly Rowe Bijanki
Jon Willie
Helen Mayberg
Jess Fiedorowicz
Christopher Kovach
...

OBJECTIVES/SPECIFIC AIMS: Deep brain stimulation is currently being evaluated as an experimental therapy for various psychiatric disorders, as well as being investigated as a method for mapping emotional brain functions. This growing area of research requires sensitive measures to quantify effects of stimulation on emotional processing. The current study examined the effects of acute stimulation to 2 limbic regions—the subcallosal cingulate (SCC) and the amygdala—on bias in the perception and evaluation of emotional facial expressions. We hypothesized that transient electrical stimulation to the limbic system would produce acute reductions in negative bias, consistent with its antidepressant effects in patients with severe depression. METHODS/STUDY POPULATION: The current study uses a novel affective bias task, developed to rapidly and covertly quantify emotional state. Over 4–6 minutes, patients rate the intensity and valence of static images of emotional facial expressions. We examined effects of electrical brain stimulation in 2 groups: patients with treatment-refractory depression undergoing SCC DBS therapy, and epilepsy patients undergoing amygdala stimulation via stereo-EEG electrodes during inpatient intracranial monitoring. DBS patients completed the task under stimulation and sham conditions during monthly visits over the first 6 months of therapy, as well as daily during a 1-week, blinded period of DBS discontinuation at the 6-month time point. Epilepsy patients completed the task under stimulation and sham conditions at a single visit. Mixed linear models and paired-samples t-tests were used to investigate effects of stimulation, as well as depression scale scores, on affective bias ratings.
RESULTS/ANTICIPATED RESULTS: Four SCC DBS patients showed significant effects of stimulation (p<0.0001) and depressive state (p<0.0001) on affective bias scores across 6 months of chronic DBS therapy, where emotional faces were perceived as less sad with stimulation ON, as well as during visits in which patients were nondepressed (typically later in the treatment course). Furthermore, 2 DBS patients showed rapid negative shifts in bias following acute blinded discontinuation of chronic stimulation, an effect which persisted over the 1-week period of discontinuation (t29=−2.58, p=0.015), in the absence of any self-reported change in mood. Likewise, 6 epilepsy patients showed significant positive shifts in affective bias with acute amygdala stimulation (t5=−4.75, p=0.005). Current analyses are investigating electrophysiological, autonomic, and facial motor correlates of affective bias in these patients. DISCUSSION/SIGNIFICANCE OF IMPACT: The affective bias task revealed rapid, significant changes with stimulation at 2 limbic targets—one a white matter hub and one a nuclear subcortical structure—suggesting the task’s utility as an emotional outcome measure in brain stimulation studies. These stimulation-sensitive measures may provide a new metric to track treatment response to deep brain stimulation therapy for affective disorders. Future studies will determine whether affective bias can predict neuropsychiatric complications in patients undergoing stimulation mapping of brain circuitry ahead of resection surgery for epilepsy.


2018
Vol 24 (7)
pp. 673-683
Author(s):
Frédérike Carrier-Toutant
Samuel Guay
Christelle Beaulieu
Édith Léveillé
Alexandre Turcotte-Giroux
...

Abstract
Objectives: Concussions affect the processing of emotional stimuli. This study aimed to investigate how sex interacts with concussion effects on early event-related brain potential (ERP) measures (P1, N1) of emotional facial expression (EFE) processing in asymptomatic, multi-concussed athletes during an EFE identification task. Methods: Forty control athletes (20 females and 20 males) and 43 multi-concussed athletes (22 females and 21 males), recruited more than 3 months after their last concussion, were tested. Participants completed the Beck Depression Inventory II, the Beck Anxiety Inventory, the Post-Concussion Symptom Scale, and an Emotional Facial Expression Identification Task. Pictures of male and female faces expressing neutral, angry, and happy emotions were randomly presented, and the emotion depicted had to be identified as fast as possible during EEG acquisition. Results: Relative to controls, concussed athletes of both sexes exhibited a significant suppression of P1 amplitude recorded from the dominant right hemisphere while performing the emotional facial expression identification task. The present study also highlighted a sex-specific suppression of N1 component amplitude after concussion, which affected male athletes. Conclusions: These findings suggest that repeated concussions alter the typical pattern of right-hemisphere response dominance to EFE in early stages of EFE processing, and that the neurophysiological mechanisms underlying the processing of emotional stimuli are distinctively affected across sexes. (JINS, 2018, 24, 1–11)


2009
Vol 37 (4)
pp. 491-501
Author(s):
Wataru Sato
Sakiko Yoshikawa

The perceptual/cognitive processing of emotional facial expressions is enhanced relative to that of neutral facial expressions. To investigate whether this enhancement can be attributed to the expression of emotion or to the visual properties of the facial expressions, we used computer morphing to develop a form of control stimuli. These "anti-expressions" changed the features of emotional facial expressions in the opposite direction from neutral expressions, by amounts equivalent to the differences between emotional and neutral expressions. To examine whether anti-expressions are usable as emotionally neutral faces, 35 participants were asked to categorize, and to rate on the valence and arousal dimensions, normal and anti-expressions of six basic emotions. The results indicate that anti-expressions were assessed as neutral for anger, disgust, fear, and happiness, and that these can be used as control stimuli matching the visual properties of emotional facial expressions.
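The anti-expression construction described above has a simple geometric core: each facial feature is displaced from the neutral face by the same magnitude as in the emotional expression, but in the opposite direction. A minimal sketch over landmark coordinates (real morphing software also warps the image texture, which is omitted here):

```python
# Sketch: anti-expression landmarks. If an emotional expression moves a
# landmark from its neutral position by delta = emotional - neutral, the
# anti-expression moves it by -delta:
#   anti = neutral - (emotional - neutral) = 2*neutral - emotional.
# Landmarks are (x, y) tuples in corresponding order across faces.

def anti_expression(neutral, emotional):
    return [(2 * nx - ex, 2 * ny - ey)
            for (nx, ny), (ex, ey) in zip(neutral, emotional)]
```

Because the displacement magnitudes are identical, an anti-expression matches the emotional expression in low-level visual change while (as the study tested) carrying little or no emotional meaning.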


2016
Author(s):
Martial Mermillod
Delphine Grynberg
Magdalena Rychlowska
Nicolas Vermeulen
Paula M. Niedenthal
...

Abstract
In the past decade, different studies have suggested that high-order factors can influence the perceptual processing of emotional stimuli. In this study, we aimed to evaluate the effect of congruent vs. incongruent social information (positive, negative, or no information related to the character of the target) on subjective (perceived and felt valence and arousal), physiological (facial mimicry), and neural (P100 and N170) responses to dynamic emotional facial expressions (EFE) that varied from neutral to one of the six basic emotions. Across three studies, the results showed (1) reduced valence and arousal evaluations of EFE when associated with incongruent social information (Study 1), and (2) increased electromyographic responses (Study 2) and significant modulation of the P100 and N170 components (Study 3) when EFE were associated with social (positive or negative) information vs. no information. These studies revealed that positive or negative social information reduced subjective responses to incongruent EFE and produced a similar neural and physiological boost of the early perceptual processing of EFE, irrespective of congruency. In conclusion, these findings suggest that social context (positive or negative) heightens the need to be alert to any subsequent cues.


2019
Vol 28 (4)
pp. 1411-1431
Author(s):
Lauren Bislick
William D. Hula

Purpose: This retrospective analysis examined group differences in error rate across 4 contextual variables (clusters vs. singletons, syllable position, number of syllables, and articulatory phonetic features) in adults with apraxia of speech (AOS) and adults with aphasia only. Group differences in the distribution of error type across contextual variables were also examined.
Method: Ten individuals with acquired AOS and aphasia and 11 individuals with aphasia participated in this study. In the context of a 2-group experimental design, the influence of the 4 contextual variables on error rate and error type distribution was examined via repetition of 29 multisyllabic words. Error rates were analyzed using Bayesian methods, whereas the distribution of error type was examined via descriptive statistics.
Results: There were 4 findings of robust differences between the 2 groups. These differences were found for syllable position, number of syllables, manner of articulation, and voicing. Group differences were less robust for clusters versus singletons and place of articulation. Results of error type distribution show a high proportion of distortion and substitution errors in speakers with AOS and a high proportion of substitution and omission errors in speakers with aphasia.
Conclusion: Findings add to the continued effort to improve the understanding and assessment of AOS and aphasia. Several contextual variables more consistently influenced breakdown in participants with AOS compared to participants with aphasia and should be considered during the diagnostic process.
Supplemental Material: https://doi.org/10.23641/asha.9701690


2003
Vol 17 (3)
pp. 113-123
Author(s):
Jukka M. Leppänen
Mirja Tenhunen
Jari K. Hietanen

Abstract Several studies have shown faster choice-reaction times to positive than to negative facial expressions. The present study examined whether this effect is exclusively due to faster cognitive processing of positive stimuli (i.e., processes leading up to, and including, response selection), or whether it also involves faster motor execution of the selected response. In two experiments, response selection (onset of the lateralized readiness potential, LRP) and response execution (LRP onset-response onset) times for positive (happy) and negative (disgusted/angry) faces were examined. Shorter response selection times for positive than for negative faces were found in both experiments but there was no difference in response execution times. Together, these results suggest that the happy-face advantage occurs primarily at premotoric processing stages. Implications that the happy-face advantage may reflect an interaction between emotional and cognitive factors are discussed.
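The LRP-based logic above splits choice reaction time at the onset of the lateralized readiness potential: everything before LRP onset is response selection, everything after is motor execution. A minimal sketch with illustrative timestamps in milliseconds:

```python
# Sketch: decomposing choice RT with the lateralized readiness potential
# (LRP). Response selection = stimulus onset -> LRP onset; response
# execution = LRP onset -> response onset. All times in ms on a common clock.

def decompose_rt(stimulus_onset, lrp_onset, response_onset):
    selection = lrp_onset - stimulus_onset
    execution = response_onset - lrp_onset
    return selection, execution
```

Under this decomposition, the study's finding is that happy faces shorten the first interval (selection) while leaving the second (execution) unchanged, placing the happy-face advantage at premotoric stages.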

