Synchrony and Emotion

2015 ◽  
Vol 3 (1-2) ◽  
pp. 102-115 ◽  
Author(s):  
Sylvie Droit-Volet ◽  
Joëlle Provasi

Anticipating other people’s behavioral intentions and responding to them at the right moment is crucial for efficient social interaction. In the present study, we therefore investigated how adults synchronize with emotional facial expressions. The participants had to synchronize their taps with a rhythmical sequence of faces and then continue tapping at the same rhythm without faces. Three inter-stimulus intervals (500, 700, and 900 ms) and six different facial expressions (disgust, neutrality, sadness, joy, anger, and fear) were tested. In the synchronization phase, no difference was observed between the different facial expressions, suggesting that participants tap in synchrony with external rhythms regardless of the stimuli’s emotional characteristics. However, in the continuation phase, an emotion effect emerged, with individual rhythms being faster for the facial expressions of fear and, to a lesser extent, anger than for the other facial expressions. The motor rhythms were also longer and more variable for the disgusted faces. These findings suggest that the internal clock mechanism underlying the timing of rhythms is accelerated in response to the high-arousal emotions of fear and anger.
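As a purely illustrative sketch (not taken from the paper), the core dependent measures in such a synchronization-continuation tapping paradigm are the mean and variability of the inter-tap intervals in each phase; the Python snippet below shows one hypothetical way to compute them from tap timestamps, with all values invented.

```python
# Illustrative sketch only: computing inter-tap interval (ITI) statistics for a
# synchronization-continuation tapping task. Tap times below are invented and
# do not come from the study.
import numpy as np

def iti_stats(tap_times_ms):
    """Return the mean and standard deviation of inter-tap intervals (ms)."""
    itis = np.diff(np.asarray(tap_times_ms, dtype=float))
    return itis.mean(), itis.std(ddof=1)

# Hypothetical taps paced at a 700 ms inter-stimulus interval (synchronization),
# followed by unpaced taps at roughly the same rate (continuation).
sync_taps = [0, 702, 1398, 2105, 2799, 3502]
cont_taps = [0, 690, 1365, 2050, 2720, 3410]

for phase, taps in [("synchronization", sync_taps), ("continuation", cont_taps)]:
    mean_iti, sd_iti = iti_stats(taps)
    print(f"{phase}: mean ITI = {mean_iti:.1f} ms, SD = {sd_iti:.1f} ms")
```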

2009 ◽  
Vol 21 (7) ◽  
pp. 1321-1331 ◽  
Author(s):  
Martin Lotze ◽  
Matthias Reimold ◽  
Ulrike Heymans ◽  
Arto Laihinen ◽  
Marianne Patt ◽  
...  

Recent findings point to a perceptive impairment of emotional facial expressions in patients diagnosed with Parkinson disease (PD). In these patients, administration of dopamine can modulate emotional facial recognition. We used fMRI to investigate differences in functional activation in response to emotional and nonemotional gestures between PD patients and age-matched healthy controls (HC). In addition, we used PET to evaluate striatal dopamine transporter availability (DAT) with [11C]d-threo-methylphenidate in the patient group. Patients showed an average decrease in DAT to 26% of the values in age-corrected healthy references. Reduction in the DAT of the left putamen correlated not only with motor impairment but also with errors in emotional gesture recognition. In comparison to HC, PD patients showed a specific decrease in activation related to emotional gesture observation in the left ventrolateral prefrontal cortex (VLPFC) and the right superior temporal sulcus. Moreover, the less DAT present in the left putamen, the lower the activation in the left VLPFC. We conclude that a loss of dopaminergic neurotransmission in the putamen results in a reduction of ventrolateral prefrontal access involved in the recognition of emotional gestures.


Author(s):  
Chiara Ferrari ◽  
Lucile Gamond ◽  
Marcello Gallucci ◽  
Tomaso Vecchi ◽  
Zaira Cattaneo

Abstract. Converging neuroimaging and patient data suggest that the dorsolateral prefrontal cortex (DLPFC) is involved in emotional processing. However, it is still not clear whether the DLPFC in the left and right hemisphere is differentially involved in emotion recognition depending on the emotion considered. Here we used transcranial magnetic stimulation (TMS) to shed light on the possible causal role of the left and right DLPFC in encoding valence of positive and negative emotional facial expressions. Participants were required to indicate whether a series of faces displayed a positive or negative expression, while TMS was delivered over the right DLPFC, the left DLPFC, and a control site (vertex). Interfering with activity in both the left and right DLPFC delayed valence categorization (compared to control stimulation) to a similar extent irrespective of emotion type. Overall, we failed to demonstrate any valence-related lateralization in the DLPFC by using TMS. Possible methodological limitations are discussed.


2018 ◽  
Vol 5 (8) ◽  
pp. 180491 ◽  
Author(s):  
Christian Nawroth ◽  
Natalia Albuquerque ◽  
Carine Savalli ◽  
Marie-Sophie Single ◽  
Alan G. McElligott

Domestication has shaped the physiology and the behaviour of animals to better adapt to human environments. Therefore, human facial expressions may be highly informative for animals domesticated for working closely with people, such as dogs and horses. However, it is not known whether other animals, and particularly those domesticated primarily for production, such as goats, are capable of perceiving human emotional cues. In this study, we investigated whether goats can distinguish human facial expressions when simultaneously shown two images of an unfamiliar human with different emotional valences (positive/happy or negative/angry). Both images were vertically attached to a wall on one side of a test arena, 1.3 m apart, and goats were released from the opposite side of the arena (distance of 4.0 m) and were free to explore and interact with the stimuli during the trials. Each of four test trials lasted 30 s. Overall, we found that goats preferred to interact first with happy faces, meaning that they are sensitive to human facial emotional cues. Goats interacted first, more often and for longer duration with positive faces when they were positioned on the right side. However, no preference was found when the positive faces were placed on the left side. We show that animals domesticated for production can discriminate human facial expressions with different emotional valences and prefer to interact with positive ones. Therefore, the impact of domestication on animal cognitive abilities may be more far-reaching than previously assumed.
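For illustration only, a preference of this kind is often summarized as the proportion of subjects whose first interaction is with the positive stimulus, tested against chance; the sketch below shows such a binomial test. The counts are invented and are not the study's data.

```python
# Hypothetical sketch: testing whether goats approach the happy face first more
# often than chance (0.5) in a two-choice test. Counts are invented, not the
# study's data.
from scipy.stats import binomtest

first_choice_happy = 14  # invented: goats whose first interaction was with the happy face
n_goats = 20             # invented sample size

result = binomtest(first_choice_happy, n=n_goats, p=0.5, alternative="greater")
print(f"proportion choosing happy first = {first_choice_happy / n_goats:.2f}, "
      f"one-sided p = {result.pvalue:.3f}")
```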


2011 ◽  
Vol 41 (11) ◽  
pp. 2375-2384 ◽  
Author(s):  
F. Ashworth ◽  
A. Pringle ◽  
R. Norbury ◽  
C. J. Harmer ◽  
P. J. Cowen ◽  
...  

Background: Processing emotional facial expressions is of interest in eating disorders (EDs), as impairments in recognizing and understanding social cues might underlie the interpersonal difficulties experienced by these patients. Disgust and anger are of particular theoretical and clinical interest. The current study investigated the neural response to facial expressions of anger and disgust in bulimia nervosa (BN). Method: Participants were 12 medication-free women with BN in an acute episode (mean age 24 years) and 16 age-, gender- and IQ-matched healthy volunteers (HVs). Functional magnetic resonance imaging (fMRI) was used to examine neural responses to angry and disgusted facial expressions. Results: Compared with HVs, patients with BN had a decreased neural response in the precuneus to facial expressions of both anger and disgust and a decreased neural response to angry facial expressions in the right amygdala. Conclusions: The neural response to emotional facial expressions in BN differs from that found in HVs. The precuneus response may be consistent with the application of mentalization theory to EDs, and the amygdala response with relevant ED theory. The findings are preliminary, but novel, and require replication in a larger sample.


2021 ◽  
Vol 11 (9) ◽  
pp. 1203 ◽  
Author(s):  
Sara Borgomaneri ◽  
Francesca Vitale ◽  
Simone Battaglia ◽  
Alessio Avenanti

The ability to rapidly process others’ emotional signals is crucial for adaptive social interactions. However, to date it is still unclear how observing emotional facial expressions affects the reactivity of the human motor cortex. To provide insights on this issue, we employed single-pulse transcranial magnetic stimulation (TMS) to investigate corticospinal motor excitability. Healthy participants observed happy, fearful and neutral pictures of facial expressions while receiving TMS over the left or right motor cortex at 150 and 300 ms after picture onset. In the early phase (150 ms), we observed an enhancement of corticospinal excitability for the observation of happy and fearful emotional faces compared to neutral expressions, specifically in the right hemisphere. Interindividual differences in the disposition to experience aversive feelings (personal distress) in interpersonal emotional contexts predicted the early increase in corticospinal excitability for emotional faces. No differences in corticospinal excitability were observed at the later time point (300 ms) or in the left M1. These findings support the notion that emotion perception primes the body for action and highlight the role of the right hemisphere in implementing a rapid and transient facilitatory response to emotionally arousing stimuli, such as emotional facial expressions.
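As a hypothetical sketch of how such corticospinal excitability data are commonly summarized (not the authors' analysis), motor-evoked potential (MEP) amplitudes for the emotional conditions can be expressed relative to the neutral baseline for each hemisphere and stimulation time; all amplitude values below are invented.

```python
# Hypothetical sketch: expressing MEP amplitudes relative to the neutral-face
# baseline per hemisphere and TMS timing. All values are invented.
mep_mv = {  # mean MEP peak-to-peak amplitudes (mV): hemisphere -> timing (ms) -> condition
    "right M1": {150: {"neutral": 1.00, "happy": 1.18, "fearful": 1.22},
                 300: {"neutral": 1.05, "happy": 1.04, "fearful": 1.07}},
    "left M1":  {150: {"neutral": 0.98, "happy": 0.99, "fearful": 1.01},
                 300: {"neutral": 1.02, "happy": 1.00, "fearful": 1.03}},
}

for hemisphere, timings in mep_mv.items():
    for timing_ms, conditions in timings.items():
        baseline = conditions["neutral"]
        ratios = {c: round(v / baseline, 2) for c, v in conditions.items() if c != "neutral"}
        print(f"{hemisphere}, {timing_ms} ms: {ratios}")
```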


Author(s):  
Maida Koso-Drljević ◽  
Meri Miličević

The aim of the study was to test two assumptions about the lateralization of the processing of emotional facial expressions, the right-hemisphere dominance hypothesis and the valence hypothesis, and to examine the influence of the gender of the presented stimulus (chimera) and of depression as an emotional state of the participants. The sample consisted of 83 female students with an average age of 20 years. Participants completed a computerized task of recognizing emotional facial expressions and then the Depression subscale of the DASS-21. The results partially confirmed the valence hypothesis for the dependent variable of response accuracy. Participants recognized sadness more accurately than happiness when the emotion was presented on the left side of the face, which is consistent with the valence hypothesis, according to which the right hemisphere is responsible for recognizing negative emotions. However, for the right side of the face, participants recognized sadness and happiness equally accurately, which is not consistent with the valence hypothesis. The main effect of the gender of the chimera was statistically significant for response accuracy: recognition accuracy was higher for male chimeras than for female chimeras. For the dependent variable of reaction time, a statistically significant association was found between scores on the depression subscale and reaction times to chimeras presented on the left and on the right side of the face: the higher the score on the depression subscale, the slower (longer) the reaction time to the presented chimera, on both the left and the right.
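To make the reported depression-reaction time relationship concrete, the sketch below correlates simulated DASS-21 depression scores with simulated reaction times for left- and right-presented chimeras; the data and effect sizes are invented and only illustrate the type of analysis.

```python
# Hypothetical sketch: correlating DASS-21 depression scores with reaction times
# to chimeric faces presented on the left and right. All data are simulated.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
depression = rng.integers(0, 22, size=30).astype(float)       # simulated depression scores
rt_left = 600 + 8 * depression + rng.normal(0, 40, size=30)   # simulated RT (ms), left-side chimeras
rt_right = 610 + 7 * depression + rng.normal(0, 40, size=30)  # simulated RT (ms), right-side chimeras

for side, rt in [("left", rt_left), ("right", rt_right)]:
    r, p = pearsonr(depression, rt)
    print(f"{side}-side chimeras: r = {r:.2f}, p = {p:.3f}")
```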


2019 ◽  
Vol 9 (6) ◽  
pp. 142 ◽  
Author(s):  
Joanie Drapeau ◽  
Nathalie Gosselin ◽  
Isabelle Peretz ◽  
Michelle McKerral

The present study aimed to measure the neural information processing underlying the recognition of emotions from facial expressions in adults who had sustained a mild traumatic brain injury (mTBI), compared with healthy individuals. We thus measured early (N1, N170) and later (N2) event-related potential (ERP) components during presentation of fearful, neutral, and happy facial expressions in 10 adults with mTBI and 11 control participants. Findings indicated significant differences between groups, irrespective of emotional expression, in the early attentional stage (N1), which was altered in mTBI. The two groups showed similar perceptual integration of facial features (N170), with greater amplitude for fearful facial expressions in the right hemisphere. At the higher-level emotional discrimination stage (N2), both groups demonstrated preferential processing of fear compared with happiness and neutrality. These findings suggest reduced early selective attentional processing following mTBI, but no impact on the perceptual and higher-level cognitive processing stages. This study contributes to improving our understanding of attentional versus emotion-recognition processes following a mild TBI.
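As a generic illustration of how ERP component amplitudes are quantified (not the authors' pipeline), mean amplitudes can be extracted from the averaged waveform within conventional time windows for N1, N170, and N2; the waveform and window boundaries below are assumptions for demonstration.

```python
# Hypothetical sketch: extracting mean ERP amplitudes in assumed N1, N170 and N2
# time windows from a single-channel average waveform. The waveform is simulated.
import numpy as np

fs = 500                                     # assumed sampling rate (Hz)
t = np.arange(-0.1, 0.6, 1 / fs)             # time axis (s) relative to face onset
erp_uv = np.random.default_rng(1).normal(0, 0.5, t.size)  # placeholder waveform (microvolts)

windows_s = {"N1": (0.08, 0.12), "N170": (0.15, 0.20), "N2": (0.20, 0.35)}  # assumed windows
for component, (start, end) in windows_s.items():
    mask = (t >= start) & (t < end)
    print(f"{component} ({int(start * 1000)}-{int(end * 1000)} ms): "
          f"mean amplitude = {erp_uv[mask].mean():.2f} microvolts")
```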


2007 ◽  
Vol 98 (1) ◽  
pp. 145-152 ◽  
Author(s):  
Miguel Fernandez Del Olmo ◽  
Binith Cheeran ◽  
Giacomo Koch ◽  
John C. Rothwell

Several studies have suggested that the cerebellum has an important role in the timing of subsecond intervals. Previous studies using transcranial magnetic stimulation (TMS) to test this hypothesis directly have produced inconsistent results. Here we used 1-Hz repetitive TMS (rTMS) for 10 min over the right or left cerebellar hemisphere to interfere transiently with cerebellar processing and assess its effect on the performance of a finger-tapping task. Subjects tapped with their right index finger for 1 min (synchronization phase) with an auditory or visual cue at 0.5, 1, or 2 Hz; they continued for a further 1 min at the same rate with no cues (continuation phase). The blocks of trials were performed in a random order. rTMS of the cerebellum ipsilateral to the movement increased the variability of the intertap interval, but only for movements at 2 Hz that were made while subjects were synchronizing with an auditory cue. There was no effect on the continuation phase of the task when the cues were no longer present, or on synchronization with a visual cue. Similar results were seen after stimulation over the contralateral dorsal premotor cortex but not after rTMS over the supplementary motor area. There was no effect after rTMS over the ipsilateral right cervical nerve roots or over the ipsilateral primary motor cortex. The results support the hypothesis that event-related timing in the subsecond range involves a cerebellar-premotor network.
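For illustration, the key measure in such a paced-tapping experiment is the variability of the inter-tap interval at each pacing rate; the sketch below simulates tap times and computes the coefficient of variation. All parameters are invented and unrelated to the study's data.

```python
# Hypothetical sketch: quantifying inter-tap interval (ITI) variability at the
# three pacing rates used in a paced-tapping task. Tap data are simulated.
import numpy as np

rng = np.random.default_rng(2)

def simulate_taps(rate_hz, n_taps, jitter_ms):
    """Generate cumulative tap times (ms) around a target interval with Gaussian jitter."""
    target_ms = 1000.0 / rate_hz
    return np.cumsum(target_ms + rng.normal(0, jitter_ms, n_taps))

for rate_hz in (0.5, 1.0, 2.0):
    taps = simulate_taps(rate_hz, n_taps=60, jitter_ms=20)
    itis = np.diff(taps)
    cv = itis.std(ddof=1) / itis.mean()
    print(f"{rate_hz} Hz pacing: mean ITI = {itis.mean():.0f} ms, CV = {cv:.3f}")
```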


Neurocase ◽  
1997 ◽  
Vol 3 (4) ◽  
pp. 259-266 ◽  
Author(s):  
Godehard Weniger ◽  
Eva Irle ◽  
Cornelia Exner ◽  
Eckart Ruther
