Effects of Repeated Concussions and Sex on Early Processing of Emotional Facial Expressions as Revealed by Electrophysiology

2018 ◽  
Vol 24 (7) ◽  
pp. 673-683 ◽  
Author(s):  
Frédérike Carrier-Toutant ◽  
Samuel Guay ◽  
Christelle Beaulieu ◽  
Édith Léveillé ◽  
Alexandre Turcotte-Giroux ◽  
...  

Abstract. Objectives: Concussions affect the processing of emotional stimuli. This study aimed to investigate how sex interacts with concussion effects on early event-related brain potential (ERP) measures (P1, N1) of emotional facial expression (EFE) processing in asymptomatic, multi-concussed athletes during an EFE identification task. Methods: Forty control athletes (20 females and 20 males) and 43 multi-concussed athletes (22 females and 21 males), recruited more than 3 months after their last concussion, were tested. Participants completed the Beck Depression Inventory II, the Beck Anxiety Inventory, the Post-Concussion Symptom Scale, and an Emotional Facial Expression Identification Task. Pictures of male and female faces expressing neutral, angry, and happy emotions were randomly presented, and the depicted emotion had to be identified as fast as possible during EEG acquisition. Results: Relative to controls, concussed athletes of both sexes exhibited a significant suppression of P1 amplitude recorded from the dominant right hemisphere while performing the emotional facial expression identification task. The present study also highlighted a sex-specific suppression of N1 component amplitude after concussion, which affected male athletes. Conclusions: These findings suggest that repeated concussions alter the typical pattern of right-hemisphere response dominance to EFE in early stages of EFE processing and that the neurophysiological mechanisms underlying the processing of emotional stimuli are distinctively affected across sexes. (JINS, 2018, 24, 1–11)
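The P1/N1 amplitude comparison described above reduces to extracting peak amplitudes within fixed post-stimulus windows. The sketch below illustrates that step on a simulated waveform; the window boundaries and the toy ERP are assumptions for illustration, not values from the study.

```python
import numpy as np

def peak_amplitude(erp, times, t_min, t_max, polarity=1):
    """Peak amplitude (µV) of an ERP waveform within a time window (s).
    polarity=+1 finds the positive peak (e.g., P1), -1 the negative peak (e.g., N1)."""
    mask = (times >= t_min) & (times <= t_max)
    windowed = erp[mask]
    return windowed.max() if polarity > 0 else windowed.min()

# Simulated single-channel ERP: 600 ms epoch sampled at 1 kHz.
times = np.arange(0, 0.6, 0.001)
# Toy waveform: a positive deflection near 100 ms (P1-like) and a
# negative deflection near 170 ms (N1-like), built from two Gaussians.
erp = (3.0 * np.exp(-((times - 0.10) ** 2) / (2 * 0.015 ** 2))
       - 4.0 * np.exp(-((times - 0.17) ** 2) / (2 * 0.020 ** 2)))

p1 = peak_amplitude(erp, times, 0.08, 0.13, polarity=1)   # positive peak
n1 = peak_amplitude(erp, times, 0.14, 0.20, polarity=-1)  # negative peak
```

In a real analysis the same windowed-peak step would be applied per participant, hemisphere, and condition before the group comparison (e.g., controls versus concussed athletes).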

2021 ◽  
Vol 15 ◽  
Author(s):  
E. Darcy Burgund

Major theories of hemisphere asymmetries in facial expression processing predict right hemisphere dominance for negative facial expressions of disgust, fear, and sadness; however, some studies observe left hemisphere dominance for one or more of these expressions. Research suggests that tasks requiring the identification of six basic emotional facial expressions (angry, disgusted, fearful, happy, sad, and surprised) are more likely to produce left hemisphere involvement than tasks that do not require expression identification. The present research investigated this possibility in two experiments that presented six basic emotional facial expressions to the right or left hemisphere using a divided-visual field paradigm. In Experiment 1, participants identified emotional expressions by pushing a key corresponding to one of six labels. In Experiment 2, participants detected emotional expressions by pushing a key corresponding to whether an expression was emotional or not. In line with predictions, fearful facial expressions exhibited a left hemisphere advantage during the identification task but not during the detection task. In contrast to predictions, sad expressions exhibited a left hemisphere advantage during both identification and detection tasks. In addition, happy facial expressions exhibited a left hemisphere advantage during the detection task but not during the identification task. Only angry facial expressions exhibited a right hemisphere advantage, and this was only observed when data from both experiments were combined. Together, results highlight the influence of task demands on hemisphere asymmetries in facial expression processing and suggest a greater role for the left hemisphere in negative expressions than predicted by previous theories.
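The divided-visual-field logic above rests on contralateral projection: right-visual-field (RVF) stimuli reach the left hemisphere first, and left-visual-field (LVF) stimuli the right hemisphere, so a hemisphere advantage is typically indexed as an RT difference between hemifields. A minimal sketch, with invented reaction times (not data from the study):

```python
# Hemisphere-advantage index for a divided-visual-field task.
# RVF stimuli project initially to the left hemisphere (LH), LVF stimuli
# to the right hemisphere (RH), so faster RVF responses are read as an
# LH advantage for that expression.
def hemisphere_advantage(rt_lvf_ms, rt_rvf_ms):
    """Positive values indicate a left-hemisphere (RVF) advantage, in ms."""
    return rt_lvf_ms - rt_rvf_ms

# Illustrative (made-up) mean RTs per expression, in ms.
mean_rt = {
    "fearful": {"LVF": 842.0, "RVF": 810.0},  # faster RVF -> LH advantage
    "angry":   {"LVF": 798.0, "RVF": 815.0},  # faster LVF -> RH advantage
}

advantage = {emotion: hemisphere_advantage(rt["LVF"], rt["RVF"])
             for emotion, rt in mean_rt.items()}
```

Accuracy differences between hemifields can be scored the same way, with the sign convention flipped (higher RVF accuracy indicating an LH advantage).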


2020 ◽  
Author(s):  
Motonori Yamaguchi ◽  
Jack Dylan Moore ◽  
Sarah Hendry ◽  
Felicity Wolohan

The emotional basis of cognitive control has been investigated in the flanker task with various procedures and materials across different studies. The present study examined the issue with the same flanker task but with different types of emotional stimuli and designs. In seven experiments, the flanker effect and its sequential modulation according to the preceding trial type were assessed. Experiments 1 and 2 used affective pictures and emotional facial expressions as emotional stimuli, and positive and negative stimuli were intermixed. There was little evidence that emotional stimuli influenced cognitive control. Experiments 3 and 4 used the same affective pictures and facial expressions, but positive and negative stimuli were separated between different participant groups. Emotional stimuli reduced the flanker effect as well as its sequential modulation regardless of valence. Experiments 5 and 6 used affective pictures but manipulated arousal and valence of stimuli orthogonally. The results did not replicate the reduced flanker effect or sequential modulation by valence, nor did they show consistent effects of arousal. Experiment 7 used a mood induction technique and showed that sequential modulation was positively correlated with valence rating (the higher the more positive) but was negatively correlated with arousal rating. These results are inconsistent with several previous findings and are difficult to reconcile within a single theoretical framework, confirming the elusive nature of the emotional basis of cognitive control in the flanker task.
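The two dependent measures above, the flanker effect and its sequential modulation (the congruency sequence effect), can be computed directly from condition-mean RTs. The cell means below are invented for illustration, not values from the study:

```python
# Mean RTs (ms) keyed by (previous-trial congruency, current-trial congruency).
# Values are invented for illustration only.
mean_rt = {
    ("congruent", "congruent"): 520.0,
    ("congruent", "incongruent"): 590.0,    # larger flanker effect after congruent
    ("incongruent", "congruent"): 535.0,
    ("incongruent", "incongruent"): 575.0,  # smaller effect after incongruent
}

def flanker_effect(rts, previous):
    """Incongruent minus congruent RT on trials following `previous` trials."""
    return rts[(previous, "incongruent")] - rts[(previous, "congruent")]

after_congruent = flanker_effect(mean_rt, "congruent")      # 70 ms
after_incongruent = flanker_effect(mean_rt, "incongruent")  # 40 ms
# Sequential modulation: shrinkage of the flanker effect after incongruent trials.
sequential_modulation = after_congruent - after_incongruent  # 30 ms
```

The question the seven experiments probe is whether emotional stimuli change `after_congruent - after_incongruent`, i.e., whether affect modulates trial-to-trial control adjustments rather than the flanker effect alone.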


2016 ◽  
Vol 29 (8) ◽  
pp. 749-771 ◽  
Author(s):  
Min Hooi Yong ◽  
Ted Ruffman

Dogs respond to human emotional expressions. However, it is unknown whether dogs can match emotional faces to voices in an intermodal matching task or whether they show preferences for looking at certain emotional facial expressions over others, similar to human infants. We presented 52 domestic dogs and 24 seven-month-old human infants with two different human emotional facial expressions of the same gender simultaneously, while listening to a human voice expressing an emotion that matched one of them. Consistent with most matching studies, neither dogs nor infants looked longer at the matching emotional stimuli, yet dogs and humans demonstrated an identical pattern of looking less at sad faces when paired with happy or angry faces (irrespective of the vocal stimulus), with no preference for happy versus angry faces. Discussion focuses on why dogs and infants might have an aversion to sad faces, or alternatively, heightened interest in angry and happy faces.


Author(s):  
Chiara Ferrari ◽  
Lucile Gamond ◽  
Marcello Gallucci ◽  
Tomaso Vecchi ◽  
Zaira Cattaneo

Abstract. Converging neuroimaging and patient data suggest that the dorsolateral prefrontal cortex (DLPFC) is involved in emotional processing. However, it is still not clear whether the DLPFC in the left and right hemisphere is differentially involved in emotion recognition depending on the emotion considered. Here we used transcranial magnetic stimulation (TMS) to shed light on the possible causal role of the left and right DLPFC in encoding valence of positive and negative emotional facial expressions. Participants were required to indicate whether a series of faces displayed a positive or negative expression, while TMS was delivered over the right DLPFC, the left DLPFC, and a control site (vertex). Interfering with activity in both the left and right DLPFC delayed valence categorization (compared to control stimulation) to a similar extent irrespective of emotion type. Overall, we failed to demonstrate any valence-related lateralization in the DLPFC by using TMS. Possible methodological limitations are discussed.


2011 ◽  
Vol 26 (S2) ◽  
pp. 707-707
Author(s):  
X. Luo ◽  
R. Chen ◽  
W. Guo ◽  
H. Zhang ◽  
R. Zhou

Introduction: Most previous research indicates that impaired inhibition of emotional stimuli is one of the important cognitive characteristics of depressed individuals. Antisaccade paradigms, which comprise a prosaccade task (PS) and an antisaccade task (AS), are often used to investigate response inhibition. Aims: This study aimed to investigate volitional inhibition of emotional stimuli in undergraduates with depressed mood (DM). Methods: Subjects were grouped as 21 DM and 25 non-depressed undergraduates (ND) based on the Beck Depression Inventory and the Self-rating Depression Scale. Antisaccade tasks were conducted to examine inhibition abilities by varying the volitional arousal level (low and high) of the tasks, with happy, neutral, and sad facial expressions as stimuli. Results: At the low volition level in the AS condition, correct saccade latencies in the DM group were significantly slower than in the ND group, and the DM group had reliably higher direction error rates in response to emotional facial expressions, especially sad expressions. However, all of these differences disappeared in the high-volition antisaccade tasks. Amplitude error data were not influenced by emotional facial expressions, and there were no group differences across tasks. Conclusions: These results indicate that the DM group showed slower cognitive processing and impaired inhibition of emotional faces relative to the ND group, particularly for sad faces, but that these abilities recovered at the high volitional arousal level, suggesting that training depressed individuals' volitional level of inhibition could prove an effective strategy for alleviating depression.


2021 ◽  
Vol 11 (9) ◽  
pp. 1203 ◽  
Author(s):  
Sara Borgomaneri ◽  
Francesca Vitale ◽  
Simone Battaglia ◽  
Alessio Avenanti

The ability to rapidly process others’ emotional signals is crucial for adaptive social interactions. However, to date it is still unclear how observing emotional facial expressions affects the reactivity of the human motor cortex. To provide insights on this issue, we employed single-pulse transcranial magnetic stimulation (TMS) to investigate corticospinal motor excitability. Healthy participants observed happy, fearful and neutral pictures of facial expressions while receiving TMS over the left or right motor cortex at 150 and 300 ms after picture onset. In the early phase (150 ms), we observed an enhancement of corticospinal excitability for the observation of happy and fearful emotional faces compared to neutral expressions, specifically in the right hemisphere. Interindividual differences in the disposition to experience aversive feelings (personal distress) in interpersonal emotional contexts predicted the early increase in corticospinal excitability for emotional faces. No differences in corticospinal excitability were observed at the later time (300 ms) or in the left M1. These findings support the notion that emotion perception primes the body for action and highlight the role of the right hemisphere in implementing a rapid and transient facilitatory response to emotionally arousing stimuli, such as emotional facial expressions.


Author(s):  
Maida Koso-Drljević ◽  
Meri Miličević

The aim of the study was to test two assumptions about the lateralization of emotional facial expression processing, the assumption of right hemisphere dominance and the valence assumption, and to examine the influence of the gender of the presented stimulus (chimera) and of depression as an emotional state of participants. The sample consisted of 83 female students, with an average age of 20 years. Participants completed a computerized Task of Recognizing Emotional Facial Expressions and then the Depression subscale of the DASS-21. The results partially confirmed the valence assumption for the dependent variable of response accuracy. Participants recognized sadness more accurately than happiness when the emotion was presented on the left side of the face, which is consistent with the valence hypothesis, according to which the right hemisphere is responsible for recognizing negative emotions. However, for the right side of the face, participants recognized sadness and happiness with equal accuracy, which is not consistent with the valence hypothesis. The main effect of chimera gender on response accuracy was statistically significant: recognition accuracy was higher for male chimeras than for female chimeras. For the dependent variable of reaction time, a statistically significant negative correlation was obtained between both sides of the face (left and right) and scores on the depression subscale: the higher the score on the depression subscale, the slower (longer) the reaction time to the presented chimera, on both the left and the right.


2020 ◽  
Vol 34 (5) ◽  
pp. 677-698 ◽  
Author(s):  
Martin Vestergaard ◽  
Mickey T. Kongerslev ◽  
Marianne S. Thomsen ◽  
Birgit Bork Mathiesen ◽  
Catherine J. Harmer ◽  
...  

Individuals with borderline personality disorder (BPD) frequently display impairments in the identification of emotional facial expressions paralleled by a negativity bias. However, it remains unclear whether misperception of facial expressions is a key psychopathological marker of BPD. To address this question, the authors examined 43 women diagnosed with BPD and 56 healthy female controls using an emotion face identification task and a face dot-probe task together with measures on psychopathology. Compared to controls, women with BPD showed impaired identification of disgusted and angry faces concurrent with a bias to misclassify faces as angry, and a faster preconscious vigilance for fearful relative to happy facial expressions. Increased severity of borderline symptoms and global psychopathology in BPD patients were associated with reduced ability to identify angry facial expressions and a stronger negativity bias to anger. The findings indicate that BPD patients who misperceive face emotions have the greatest mental health issues.


2019 ◽  
Vol 9 (6) ◽  
pp. 142 ◽  
Author(s):  
Joanie Drapeau ◽  
Nathalie Gosselin ◽  
Isabelle Peretz ◽  
Michelle McKerral

The present study aimed to measure neural information processing underlying emotion recognition from facial expressions in adults having sustained a mild traumatic brain injury (mTBI) as compared to healthy individuals. We thus measured early (N1, N170) and later (N2) event-related potential (ERP) components during presentation of fearful, neutral, and happy facial expressions in 10 adults with mTBI and 11 control participants. Findings indicated significant differences between groups, irrespective of emotional expression, in the early attentional stage (N1), which was altered in mTBI. The two groups showed similar perceptual integration of facial features (N170), with greater amplitude for fearful facial expressions in the right hemisphere. At a higher-level emotional discrimination stage (N2), both groups demonstrated preferential processing for fear as compared to happiness and neutrality. These findings suggest reduced early selective attentional processing following mTBI, but no impact on the perceptual and higher-level cognitive processing stages. This study contributes to further improving our understanding of attentional versus emotion-recognition processes following a mild TBI.


Perception ◽  
2020 ◽  
Vol 49 (8) ◽  
pp. 822-834
Author(s):  
Vivian Lee ◽  
Natasha Da Silva ◽  
M. D. Rutherford

Adults perceive a continuum of emotional facial expressions categorically rather than continuously. Categorical perception is thought to be adaptive and functional, allowing for inferences that inform behavior. To date, studies have demonstrated categorical perception of some emotional facial expressions in infants. However, a recent study reported that 12-month-old infants do not perceive facial emotional expressions categorically across a happy–sad continuum. In contrast, toddlers at 3.5 years of age appear to use categorical perception along the happy–sad continuum. Using a novel paradigm that employed a looking-time discrimination task and an explicit identification task, this study measured 26-month-olds’ identification of faces and their ability to discriminate between faces along a happy–sad continuum. Results suggest that 26-month-olds perceive facial expressions categorically along the happy–sad continuum.

