Left Amygdala Regulates the Cerebral Reading Network During Fast Emotion Word Processing

2020, Vol. 11
Author(s): Kimihiro Nakamura, Tomoe Inomata, Akira Uno

2021, Vol. 11 (5), pp. 553
Author(s): Chenggang Wu, Juan Zhang, Zhen Yuan

To explore the affective priming effect of emotion-label words and emotion-laden words, the current study used unmasked (Experiment 1) and masked (Experiment 2) priming paradigms with emotion-label words (e.g., sadness, anger) and emotion-laden words (e.g., death, gift) as primes, and examined how the two kinds of words acted upon the processing of the target words (all emotion-laden words). Participants were instructed to judge the valence of the target words while their electroencephalogram was recorded. The behavioral and event-related potential (ERP) results showed that positive words produced a priming effect, whereas negative words inhibited target word processing (Experiment 1). In Experiment 2, the inhibitory effect of negative emotion-label words on emotion word recognition appeared in both the behavioral and ERP results, suggesting that the modulation of emotion word processing by emotion word type can be observed even in a masked priming paradigm. The two experiments further support the need to define emotion words from an emotion word type perspective. The implications of these findings are discussed. Specifically, a clear understanding of emotion-label words and emotion-laden words can improve the effectiveness of emotional communication in clinical settings. Theoretically, the emotion word type perspective is still in its infancy and awaits further exploration.


2015, Vol. 6
Author(s): Sara C. Sereno, Graham G. Scott, Bo Yao, Elske J. Thaden, Patrick J. O'Donnell

2020, Vol. 63 (3), pp. 896-912
Author(s): Yi Lin, Hongwei Ding, Yang Zhang

Purpose: Emotional speech communication involves multisensory integration of linguistic (e.g., semantic content) and paralinguistic (e.g., prosody and facial expressions) messages. Previous studies of linguistic versus paralinguistic salience effects in emotional speech processing have produced inconsistent findings. In this study, we investigated the relative perceptual saliency of emotion cues in a cross-channel auditory-alone task (a semantics–prosody Stroop task) and a cross-modal audiovisual task (a semantics–prosody–face Stroop task).

Method: Thirty normal Chinese adults participated in two Stroop experiments with spoken emotion adjectives in Mandarin Chinese. Experiment 1 manipulated the auditory pairing of emotional prosody (happy or sad) and lexical semantic content in congruent and incongruent conditions. Experiment 2 extended the protocol to cross-modal integration by introducing a visual facial expression during auditory stimulus presentation. Participants judged the emotional information in each test trial according to a selective-attention instruction.

Results: Accuracy and reaction time data indicated that, despite the increased cognitive demand and task complexity in Experiment 2, prosody was consistently more salient than semantic content for emotion word processing but did not take precedence over facial expression. While congruent stimuli enhanced performance in both experiments, the facilitatory effect was smaller in Experiment 2.

Conclusion: Together, the results demonstrate the salient role of paralinguistic prosodic cues in emotion word processing and a congruence facilitation effect in multisensory integration. Our study contributes tonal language data on how linguistic and paralinguistic messages converge in multisensory speech processing and lays a foundation for further exploring the brain mechanisms of cross-channel/cross-modal emotion integration, with potential clinical applications.


2006
Author(s): Graham G. Scott, Patrick J. O'Donnell, Sara C. Sereno
