Music-like stimuli affect the emotional processing of words and facial expressions

2019 ◽  
pp. 030573561987580
Author(s):  
Daniel Andre Ignacio ◽  
David R Gerkens ◽  
Erick Ryan Aguinaldo ◽  
Dina Arch ◽  
Ruben Barajas

The present study used the affective priming paradigm to understand interference and facilitation effects of cross-modal emotional interactions; specifically, the ability of five-chord progressions to affect the processing efficiency of visual targets. Twenty five-chord progressions were selected based on the degree to which they fulfilled participants’ automatically formulated expectations of how each musical sequence should sound. The current study is an extension of previous research that revealed the influence of music-like stimuli on the identification of valence in emotional words. The potential of music-like stimuli to affect emotional processing, as measured by the efficiency of valence categorization, was assessed across two experiments. Experiment 1 presented word-targets, whereas Experiment 2 presented facial expressions. The processing of words and faces primed with affectively congruent chord-progressions was facilitated, whereas the processing of words primed with affectively incongruent chord-progressions was not. Incongruent pairings with faces engendered interference effects, and the second experiment revealed a predictive relationship between behavioral processing speed and self-ratings of anxiety. The processing of word-targets was compared to facial expressions in the presence and absence of music. The results suggest that short musical sequences influence individuals’ emotional processing, which could inform intervention research into how to attenuate potential attention biases.

2019 ◽  
Author(s):  
Brittney O'Neill

The effects of emoticons in textual computer-mediated communication (CMC) remain relatively unexplored. CMC researchers have suggested that emoticons behave much as do facial expressions in face-to-face interaction (e.g. Danet, Ruedenberg-Wright, & Rosenbaum-Tamari, 1997; Rezabek & Cochenour, 1998; Thompson & Foulger, 1996). Some fMRI research suggests, however, that there is not a direct neural correspondence between emoticons and facial expressions, but that emoticons play an important role in determining the positive or negative valence of an utterance (Yuasa, Saito, & Mukawa, 2011). Following the affective priming paradigm developed by Fazio, Sanbonmatsu, Powell, and Kardes (1986), this study explores the priming effects of emoticons vis-à-vis photographs of facial expression and emotional words on valence judgements of emotionally charged words. Significant main effects of age, prime valence, and target valence were found. There were also significant interactions between these three factors. Overall results suggest that younger and older participants have differing experiences of emoticons, with younger participants experiencing an effect of emoticons that is similar to the effect of facial expressions while older adults seem to experience emoticons in ways more like textual information or even just textual nonsense.


2002 ◽  
Vol 19 (1) ◽  
pp. 66-87
Author(s):  
David A. Yeigh

Abstract. This study investigated the effects of perceived controllability on information processing within the attributional model of learning (Weiner, 1985, 1986). Attributional style was used to identify trait patterns of controllability for 37 university students. Task-relevant feedback was then manipulated to test for differences in working memory function between participants with high versus low levels of trait controllability. Trait controllability operated differently for high-trait and low-trait types. Results supported the hypothesis that it exerts a moderating effect on the way task-relevant feedback is processed. This selective encoding of information appeared to involve limitations inherent to the working memory system that affect processing efficiency, marking an important consideration for the way in which information is presented during the learning process.


2019 ◽  
Vol 48 (6) ◽  
pp. 836-845
Author(s):  
Lisa Thorpe ◽  
Margaret Cousins ◽  
Ros Bramwell

The phoneme monitoring task is a musical priming paradigm that demonstrates that both musicians and non-musicians have gained implicit understanding of prevalent harmonic structures. Little research has focused on implicit music learning in musicians and non-musicians. The current study aimed to investigate whether the phoneme monitoring task would identify any implicit memory differences between musicians and non-musicians. It focuses on both implicit knowledge of musical structure and implicit memory for specific musical sequences. Thirty-two musicians and non-musicians (19 female and 13 male) were asked to listen to a seven-chord sequence and decide as quickly as possible whether the final chord ended on the syllable /di/ or /du/. Overall, musicians were faster at the task, though non-musicians made more gains through the blocks of trials. Implicit memory for musical sequence was evident in both musicians and non-musicians. Both groups of participants reacted more quickly to sequences that they had heard more than once but showed no explicit knowledge of the familiar sequences.


2010 ◽  
Vol 41 (4) ◽  
pp. 779-788 ◽  
Author(s):  
G. Lelli-Chiesa ◽  
M. J. Kempton ◽  
J. Jogia ◽  
R. Tatarelli ◽  
P. Girardi ◽  
...  

Background. The Met allele of the catechol-O-methyltransferase (COMT) valine-to-methionine (Val158Met) polymorphism is known to affect dopamine-dependent affective regulation within amygdala–prefrontal cortical (PFC) networks. It is also thought to increase the risk of a number of disorders characterized by affective morbidity including bipolar disorder (BD), major depressive disorder (MDD) and anxiety disorders. The disease risk conferred is small, suggesting that this polymorphism represents a modifier locus. Therefore our aim was to investigate how the COMT Val158Met may contribute to phenotypic variation in clinical diagnosis using sad facial affect processing as a probe for its neural action.
Method. We employed functional magnetic resonance imaging to measure activation in the amygdala, ventromedial PFC (vmPFC) and ventrolateral PFC (vlPFC) during sad facial affect processing in family members with BD (n=40), MDD and anxiety disorders (n=22) or no psychiatric diagnosis (n=25) and 50 healthy controls.
Results. Irrespective of clinical phenotype, the Val158 allele was associated with greater amygdala activation and the Met158 allele with greater signal change in the vmPFC and vlPFC. Signal changes in the amygdala and vmPFC were not associated with disease expression. However, in the right vlPFC the Met158 allele was associated with greater activation in all family members with affective morbidity compared with relatives without a psychiatric diagnosis and healthy controls.
Conclusions. Our results suggest that the COMT Val158Met polymorphism has a pleiotropic effect within the neural networks subserving emotional processing. Furthermore, the Met158 allele further reduces cortical efficiency in the vlPFC in individuals with affective morbidity.


2012 ◽  
Vol 13 (2) ◽  
pp. 284-296 ◽  
Author(s):  
Luis Aguado ◽  
Teresa Dieguez-Risco ◽  
Constantino Méndez-Bértolo ◽  
Miguel A. Pozo ◽  
José A. Hinojosa

2014 ◽  
Vol 26 (4) ◽  
pp. 253-259 ◽  
Author(s):  
Linette Lawlor-Savage ◽  
Scott R. Sponheim ◽  
Vina M. Goghari

Background. The ability to accurately judge facial expressions is important in social interactions. Individuals with bipolar disorder have been found to be impaired in emotion recognition; however, the specifics of the impairment are unclear. This study investigated whether facial emotion recognition difficulties in bipolar disorder reflect general cognitive, or emotion-specific, impairments. Impairment in the recognition of particular emotions and the role of processing speed in facial emotion recognition were also investigated.
Methods. Clinically stable bipolar patients (n = 17) and healthy controls (n = 50) judged five facial expressions in two presentation types, time-limited and self-paced. An age recognition condition was used as an experimental control.
Results. Bipolar patients’ overall facial recognition ability was unimpaired. However, patients’ specific ability to judge happy expressions under time constraints was impaired.
Conclusions. Findings suggest a deficit in happy emotion recognition impacted by processing speed. Given the limited sample size, further investigation with a larger patient sample is warranted.


2018 ◽  
Vol 8 (12) ◽  
pp. 219 ◽  
Author(s):  
Mayra Gutiérrez-Muñoz ◽  
Martha Fajardo-Araujo ◽  
Erika González-Pérez ◽  
Victor Aguirre-Arzola ◽  
Silvia Solís-Ortiz

Polymorphisms of the estrogen receptor ESR1 and ESR2 genes have been linked with cognitive deficits and affective disorders. The effects of these genetic variants on emotional processing in females with low estrogen levels are not well known. The aim was to explore the impact of the ESR1 and ESR2 genes on the responses to the facial emotion recognition task in females. Postmenopausal healthy female volunteers were genotyped for the polymorphisms XbaI and PvuII of ESR1 and the polymorphism rs1256030 of ESR2. The effect of these polymorphisms on the response to the facial emotion recognition of the emotions happiness, sadness, disgust, anger, surprise, and fear was analyzed. Females carrying the P allele of the PvuII polymorphism or the X allele of the XbaI polymorphism of ESR1 easily recognized facial expressions of sadness that were more difficult for the women carrying the p allele or the x allele. They displayed higher accuracy, faster response times, more correct responses, and fewer omissions to complete the task, with a large effect size. Women carrying the C allele of ESR2 showed a faster response time for recognizing facial expressions of anger. These findings link ESR1 and ESR2 polymorphisms to facial emotion recognition of negative emotions.


2010 ◽  
Vol 69 (4) ◽  
pp. 213-219 ◽  
Author(s):  
Inès Skandrani-Marzouki ◽  
Yousri Marzouki

The present study examines the unconscious influence of emotional information on decision making in a simulated hiring situation. We used a subliminal masked priming paradigm with varying faces as primes, which were presented for a duration of 50 ms and had two levels of emotion: positive emotion (happiness) and negative emotion (anger). These primes were followed by emotionally neutral target faces. Primes were congruent (same faces) or incongruent (different faces). Prime Emotion (positive vs. negative) was crossed with Prime Repetition (repeat vs. unrelated) in a 2 × 2 factorial design. Each participant was tested in all four of the experimental conditions, each of which had 5 different trials. The participants were asked to indicate as rapidly as possible whether they were “favorable” or “unfavorable” toward the selection of the candidate (target face). Two dependent measures were analyzed: number of target faces chosen (i.e., number of “favorable” responses to target faces) and reaction time (RT). Results revealed a strong effect of emotional priming. Participants tended to choose more target faces preceded by positive prime faces than by negative prime faces. Moreover, they reacted faster when presented with target faces preceded by negative primes. Despite its exploratory nature, this study provides further evidence for the role of emotional processing in modulating decision processes and extends the experimental manipulation of subliminal emotion to the case of the masked repetition priming technique.


2016 ◽  
Vol 30 (3) ◽  
pp. 114-123 ◽  
Author(s):  
Tokiko Harada ◽  
Akiko Hayashi ◽  
Norihiro Sadato ◽  
Tetsuya Iidaka

Abstract. Facial expressions play a significant role in displaying feelings. A person’s facial expression automatically induces a similar emotional feeling in an observer; this phenomenon is known as emotional contagion. However, little is known about the neural mechanisms underlying such emotional responses. We conducted an event-related functional magnetic resonance imaging (fMRI) study to examine the neural substrates involved in automatic responses and emotional feelings induced by movies of another person’s happy and sad facial expressions. The fMRI data revealed that observing happiness (vs. sadness) evoked activity in the left anterior cingulate gyrus, which is known to be responsible for positive emotional processing and fear inhibition. Conversely, observing sadness (vs. happiness) increased activity in the right superior temporal sulcus and bilateral inferior parietal lobes, which have been reported to be involved in negative emotional processing and the representation of facial movements. In addition, both expressions evoked activity in the right inferior frontal gyrus. These patterns of activity suggest that the observation of dynamic facial expressions automatically elicited dissociable and partially overlapping responses for happy and sad emotions.
