An ERP Study on Facial Expression of Emotion: Comparison of Linguistic and Visual Semantic Decoding

2005 ◽  
Vol 100 (1) ◽  
pp. 129-134 ◽  
Author(s):  
Michela Balconi

The present research compared the semantic processing of linguistic stimuli with the semantic elaboration of nonlinguistic facial stimuli. To explore the brain potentials (ERPs, event-related potentials) related to decoding facial expressions and the effect of the semantic valence of the stimulus, we analyzed data for 20 normal subjects (M age = 23.6 yr., SD = 0.2). Faces with three basic emotional expressions (fear, happiness, and sadness, from the 1976 Ekman and Friesen database), three semantically anomalous expressions (with respect to their emotional content), and neutral stimuli (faces without emotional content) were presented in random order. Anomalous expressions elicited later differences in ERP peak amplitude than congruous expressions. Specifically, the emotionally anomalous faces elicited a larger negative peak at about 360 msec., distributed mainly over posterior sites. The observed electrophysiological activity may reflect specific cognitive processing underlying the comprehension of facial expressions during detection of semantic anomaly. The evidence favours the comparability of this negative deflection with the N400 effect elicited by linguistic anomalies.

Author(s):  
Michela Balconi

Neuropsychological studies have underlined the presence of distinct brain correlates dedicated to analyzing facial expressions of emotion. Some cerebral circuits appear specific to emotional face comprehension as a function of conscious vs. unconscious processing of emotional information. Moreover, the emotional content of faces (i.e., positive vs. negative; more or less arousing) may activate specific cortical networks. Among other findings, recent studies have examined the contribution of the two hemispheres to face comprehension as a function of emotion type (mainly the positive vs. negative distinction) and of specific tasks (comprehending vs. producing facial expressions). An overview of ERP (event-related potential) analyses is proposed in order to understand how a face may be processed by an observer and how it can become a meaningful construct even in the absence of awareness. Finally, brain oscillations are considered in order to explain the synchronization of neural populations in response to emotional faces under conscious vs. unconscious processing.


2020 ◽  
Vol 15 (5) ◽  
pp. 551-560
Author(s):  
Katharina M Rischer ◽  
Mattias Savallampi ◽  
Anushka Akwaththage ◽  
Nicole Salinas Thunell ◽  
Carl Lindersson ◽  
...  

Abstract: In this study, we explored how contextual information about threat dynamics affected the electrophysiological correlates of face perception. Forty-six healthy native Swedish speakers read verbal descriptions signaling an immediate vs delayed intent to escalate or deescalate an interpersonal conflict. Each verbal description was followed by a face with an angry or neutral expression, for which participants rated valence and arousal. Affective ratings confirmed that the emotional intent expressed in the descriptions modulated emotional reactivity to the facial stimuli in the expected direction. The electrophysiological data showed that compared to neutral faces, angry faces resulted in enhanced early and late event-related potentials (VPP, P300 and LPP). Additionally, emotional intent and temporal immediacy modulated the VPP and P300 similarly across angry and neutral faces, suggesting that they influence early face perception independently of facial affect. By contrast, the LPP amplitude to faces revealed an interaction between facial expression and emotional intent. Deescalating descriptions eliminated the LPP differences between angry and neutral faces. Together, our results suggest that information about a person’s intentions modulates the processing of facial expressions.


2006 ◽  
Vol 20 (1) ◽  
pp. 21-31 ◽  
Author(s):  
Bruce D. Dick ◽  
John F. Connolly ◽  
Michael E. Houlihan ◽  
Patrick J. McGrath ◽  
G. Allen Finley ◽  
...  

Abstract: Previous research has found that pain can exert a disruptive effect on cognitive processing. This experiment was conducted to extend previous research carried out with participants with chronic pain. It examines pain's effects on early processing of auditory stimulus differences, using the Mismatch Negativity (MMN), in healthy participants experiencing experimentally induced pain. Event-related potentials (ERPs) were recorded using target and standard tones whose pitch differences were easy or difficult to detect, in conditions where participants attended to (active attention) or ignored (passive attention) the stimuli. Both attention manipulations were conducted in no-pain and pain conditions. Experimentally induced ischemic pain did not disrupt the MMN. However, MMN amplitudes to difficult-to-detect deviant tones were larger during painful stimulation when the tones were attended than when they were ignored. Also, MMN amplitudes were larger to the difficult- than to the easy-to-detect tones in the active attention condition, regardless of pain condition. It appears that rather than exerting a disruptive effect, the presence of experimentally induced pain enhanced early processing of small stimulus differences in these healthy participants.


2020 ◽  
Author(s):  
Joshua W Maxwell ◽  
Eric Ruthruff ◽  
Michael Joseph

Are facial expressions of emotion processed automatically? Some authors have not found this to be the case (Tomasik et al., 2009). Here we revisited the question with a novel experimental logic – the backward correspondence effect (BCE). In three dual-task studies, participants first categorized a sound (Task 1) and then indicated the location of a target face (Task 2). In Experiment 1, Task 2 required participants to search for one facial expression of emotion (angry or happy). We observed positive BCEs, indicating that facial expressions of emotion bypassed the central attentional bottleneck and thus were processed in a capacity-free, automatic manner. In Experiment 2, we replicated this effect but found that morphed emotional expressions (as used by Tomasik et al.) were not processed automatically. In Experiment 3, we observed similar BCEs for another type of face processing previously shown to be capacity-free – identification of familiar faces (Jung et al., 2013). We conclude that facial expressions of emotion are identified automatically when sufficiently unambiguous.


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Betina Korka ◽  
Erich Schröger ◽  
Andreas Widmann

Abstract: Our brains continuously build and update predictive models of the world, drawing predictions, for example, from sensory regularities and/or our own actions. Yet, recent results in the auditory system indicate that stochastic regularities may not be easily encoded when a rare medium pitch deviant is presented between frequent high and low pitch standard sounds in random order, as reflected in the lack of sensory prediction error event-related potentials [i.e., mismatch negativity (MMN)]. We wanted to test the implication of predictive coding theory that predictions based on higher-order generative models (here, based on action intention) are fed top-down in the hierarchy to sensory levels. Participants produced random sequences of high and low pitch sounds by button presses in two conditions: in a “specific” condition, one button produced high and the other low pitch sounds; in an “unspecific” condition, both buttons randomly produced high or low pitch sounds. Rare medium pitch deviants elicited larger MMN and N2 responses in the “specific” compared to the “unspecific” condition, despite equal sound probabilities. These results thus demonstrate that action-effect predictions can boost stochastic regularity-based predictions and engage higher-order deviance detection processes, extending previous notions on the role of action predictions at sensory levels.


2009 ◽  
Vol 21 (1) ◽  
pp. 93-104 ◽  
Author(s):  
Redmond G. O'Connell ◽  
Paul M. Dockree ◽  
Mark A. Bellgrove ◽  
Alessandra Turin ◽  
Seamus Ward ◽  
...  

Disentangling the component processes that contribute to human executive control is a key challenge for cognitive neuroscience. Here, we employ event-related potentials to provide electrophysiological evidence that action errors during a go/no-go task can result either from sustained attention failures or from failures of response inhibition, and that these two processes are temporally and physiologically dissociable, although the behavioral error—a nonintended response—is the same. Thirteen right-handed participants performed a version of a go/no-go task in which stimuli were presented in a fixed and predictable order, thus encouraging attentional drift, and a second version in which an identical set of stimuli was presented in a random order, thus placing greater emphasis on response inhibition. Electrocortical markers associated with goal maintenance (late positivity, alpha synchronization) distinguished correct and incorrect performance in the fixed condition, whereas errors in the random condition were linked to a diminished N2–P3 inhibitory complex. In addition, the amplitude of the error-related negativity did not differ between correct and incorrect responses in the fixed condition, consistent with the view that errors in this condition do not arise from a failure to resolve response competition. Our data provide an electrophysiological dissociation of sustained attention and response inhibition.


2021 ◽  
Vol 12 ◽  
Author(s):  
Yutong Liu ◽  
Huini Peng ◽  
Jianhui Wu ◽  
Hongxia Duan

Background: Individuals exposed to childhood maltreatment present with a deficiency in emotional processing in later life. Most studies have focused mainly on childhood physical or sexual abuse; however, childhood emotional abuse, a core issue underlying different forms of childhood maltreatment, has received relatively little attention. The current study explored whether childhood emotional abuse is related to impaired processing of emotional facial expressions in healthy young men. Methods: Emotional facial processing was investigated in a classical gender discrimination task while event-related potential (ERP) data were collected. Childhood emotional abuse was assessed with the Childhood Trauma Questionnaire (CTQ) among 60 healthy young men. The relationship between the emotional abuse score and the behavioral and ERP indices of emotional facial expression processing (angry, disgust, and happy) was explored. Results: Participants with a higher score of childhood emotional abuse responded faster on the behavioral level and had a smaller P2 amplitude on the neural level when processing disgust faces compared to neutral faces. Discussion: Individuals with a higher level of childhood emotional abuse may quickly identify negative faces while consuming fewer cognitive resources, suggesting altered processing of emotional facial expressions in young men with a higher level of childhood emotional abuse.


2020 ◽  
Author(s):  
Mareike J. Hülsemann ◽  
Björn Rasch

Abstract: Our thoughts, plans and intentions can influence physiological sleep, but the underlying mechanisms are unknown. According to the theoretical framework of “embodied cognition”, the semantic content of cognitive processes is represented by multimodal networks in the brain which also include body-related functions. Such multimodal representation could offer a mechanism that explains mutual influences between cognition and sleep. In the current study, we tested whether sleep-related words are represented in multimodal networks by examining the effect of congruent vs. incongruent body positions on word processing during wakefulness. We experimentally manipulated the body position of 66 subjects (50 females, 16 males, 19-40 years old) between standing upright and lying down. Sleep- and activity-related words were presented around the individual speech recognition threshold to increase task difficulty. Our results show that word processing is facilitated in congruent body positions (sleep words: lying down; activity words: standing upright) compared with incongruent body positions, as indicated by a reduced N400 of the event-related potential (ERP) in the congruent condition at the lowest volume. In addition, early sensory components of the ERP (N180 and P280) were enhanced, suggesting that words were also acoustically better understood when the body position was congruent with the semantic meaning of the word. However, the difference in ERPs did not translate into differences on a behavioural level. Our results support the prediction of embodied processing of sleep- and activity-related words. Body position potentially induces a pre-activation of multimodal networks, thereby enhancing access to the semantic concepts of words related to the current body position. The mutual link between semantic meaning and body-related function could be a key element in explaining influences of cognitive processing on sleep.
