Joint Modulation of Facial Expression Processing by Contextual Congruency and Task Demands

2019, Vol 9 (5), pp. 116
Author(s): Luis Aguado, Karisa Parkington, Teresa Dieguez-Risco, José Hinojosa, Roxane Itier

Faces showing expressions of happiness or anger were presented together with sentences that described happiness-inducing or anger-inducing situations. Two main variables were manipulated: (i) congruency between contexts and expressions (congruent/incongruent) and (ii) the task assigned to the participant: discriminating the emotion shown by the target face (emotion task) or judging whether the expression shown by the face was congruent with the context (congruency task). Behavioral and electrophysiological (event-related potential, ERP) results showed that processing of facial expressions was jointly influenced by congruency and task demands. ERP results revealed task effects at frontal sites, with larger positive amplitudes between 250 and 450 ms in the congruency task, reflecting the higher cognitive effort this task required. Congruency effects appeared at latencies and locations corresponding to the early posterior negativity (EPN) and late positive potential (LPP) components, which have previously been found to be sensitive to emotion and affective congruency. The magnitude and spatial distribution of the congruency effects varied with the task and the target expression. These results are discussed in terms of the modulatory role of context on facial expression processing and the different mechanisms underlying the processing of positive and negative emotional expressions.
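ERP component effects such as the frontal positivity above are typically quantified as the mean voltage within a component's time window. The sketch below is purely illustrative and not drawn from any of the studies listed here; the function name, sampling rate, and toy data are invented for the example.

```python
# Illustrative sketch: mean ERP amplitude in a component time window
# (e.g., the 250-450 ms frontal positivity reported above).
# Assumes a single-channel ERP waveform sampled at `srate` Hz, with
# `baseline_ms` of pre-stimulus data and time zero at stimulus onset.

def mean_amplitude(erp, srate, baseline_ms, win_start_ms, win_end_ms):
    """Average amplitude (same units as `erp`) over [win_start_ms, win_end_ms]."""
    offset = int(baseline_ms * srate / 1000)        # samples before stimulus onset
    i0 = offset + int(win_start_ms * srate / 1000)  # first sample of the window
    i1 = offset + int(win_end_ms * srate / 1000)    # one past the last sample
    window = erp[i0:i1]
    return sum(window) / len(window)

# Toy example: 1 s of data at 250 Hz, 200 ms baseline, flat 2 uV signal.
erp = [2.0] * 250
print(mean_amplitude(erp, 250, 200, 250, 450))  # 2.0
```

Condition effects (e.g., congruent vs. incongruent) are then tested on these per-participant window means.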

2021, pp. 095679762199666
Author(s): Sebastian Schindler, Maximilian Bruchmann, Claudia Krasowski, Robert Moeck, Thomas Straube

Our brains respond rapidly to human faces, differentiating between many identities and retrieving rich semantic and emotional knowledge. Studies provide a mixed picture of how such information affects event-related potentials (ERPs). We systematically examined the effect of feature-based attention on ERP modulations to briefly presented faces of individuals associated with a crime. The tasks required participants (N = 40 adults) to discriminate the orientation of lines overlaid onto the face, the age of the face, or the emotional information associated with the face. Negative faces amplified the N170 component during all tasks, whereas the early posterior negativity (EPN) and late positive potential (LPP) components were increased only when the emotional information was attended. These findings suggest that during early configural analyses (N170), evaluative information potentiates face processing regardless of feature-based attention. During intermediate, only partially resource-dependent processing stages (EPN) and late stages of elaborate stimulus processing (LPP), attention to the acquired emotional information is necessary for amplified processing of negatively evaluated faces.


2021
Author(s): Arianna Schiano Lomoriello, Antonio Maffei, Sabrina Brigadoi, Paola Sessa

Simulation models of facial expression processing suggest that posterior visual areas and brain areas underpinning sensorimotor simulation interact to improve facial expression processing. According to these models, facial mimicry, a manifestation of sensorimotor simulation, may contribute to the visual processing of facial expressions by influencing its early stages. The aim of this study was to assess whether and how sensorimotor simulation influences early stages of face processing, and to investigate its relationship with alexithymic traits, given that previous studies suggest individuals with high levels of alexithymic traits tend to use sensorimotor simulation to a lesser extent than individuals with low levels. We monitored the P1 and N170 components of the event-related potential (ERP) while participants performed a fine discrimination task on facial expressions and, as a control condition, on animals. In half of the experiment, participants could freely use their facial mimicry, whereas in the other half their facial mimicry was blocked by a gel. Our results revealed that only individuals with low (vs. high) alexithymic traits showed a larger modulation of P1 amplitude as a function of the mimicry manipulation, selectively for facial expressions (but not for animals), while we did not observe any modulation of the N170. Given the null results at the behavioural level, we interpret the P1 modulation as compensatory visual processing in individuals with low levels of alexithymia under conditions of interference with sensorimotor processing, providing preliminary evidence in favor of sensorimotor simulation models.


2008, Vol 22 (1), pp. 41-57
Author(s): Anja Roye, Lea Höfel, Thomas Jacobsen

Temporal and brain-topographic characteristics of the aesthetic judgment of male and female faces were investigated using event-related potentials and reaction times. The evaluative aesthetic judgment of facial beauty (beautiful vs. not beautiful) was contrasted with a nonevaluative descriptive judgment of head shape (round vs. oval). Analysis showed longer reaction times in the descriptive than in the evaluative task, suggesting that the descriptive judgment demanded more cognitive effort and may entail greater uncertainty. Electrophysiologically, the evaluative judgment elicited a negativity (400 to 480 ms) for not beautiful judgments, maximal over midline leads. A comparable deflection has previously been reported for evaluative judgments of graphic patterns and was interpreted as impression formation that occurs independently of the type of stimulus material whenever an aesthetic entity is judged intentionally. Beyond this effect, which was independent of the gender of the face, the temporal characteristics of aesthetic evaluation differed by face gender: a negativity for male faces only (280–440 ms) and a late positivity (520–1200 ms) that was stronger for female faces, both for not beautiful judgments. Thus, the evaluation of male and female facial beauty was processed in different time windows. The descriptive judgment round elicited a larger posterior positivity than oval (320–620 ms). These results complement investigations of the architecture and time course of evaluative aesthetic and descriptive judgment processes using faces as stimulus material.


2020, Vol 15 (7), pp. 765-774
Author(s): Sebastian Schindler, Maximilian Bruchmann, Anna-Lena Steinweg, Robert Moeck, Thomas Straube

The processing of fearful facial expressions is prioritized by the human brain. This priority is maintained across information processing stages, as evident in early, intermediate and late components of event-related potentials (ERPs). However, emotional modulations are inconsistently reported for these different processing stages. In this pre-registered study, we investigated how feature-based attention differentially affects ERPs to fearful and neutral faces in 40 participants. The tasks required participants to discriminate either the orientation of lines overlaid onto the face, the sex of the face, or the face's emotional expression, the latter increasing attention to emotion-related features. We found main effects of emotion for the N170, early posterior negativity (EPN) and late positive potential (LPP). While N170 emotional modulations were task-independent, interactions of emotion and task were observed for the EPN and LPP: EPN emotion effects were found in the sex and emotion tasks, whereas the LPP emotion effect was driven mainly by the emotion task. This study shows that early responses to fearful faces are task-independent (N170) and likely based on low-level and configural information, while during later processing stages, attention to the face (EPN) or, more specifically, to the face's emotional expression (LPP) is crucial for reliably amplified processing of emotional faces.


2020, Vol 10 (8), pp. 537
Author(s): David Schubring, Matthias Kraus, Christopher Stolz, Niklas Weiler, Daniel A. Keim, ...

The progress of technology has increased research on neuropsychological emotion and attention with virtual reality (VR). However, direct comparisons between conventional two-dimensional (2D) and VR stimulations are lacking. Thus, the present study compared electroencephalography (EEG) correlates of explicit task and implicit emotional attention between 2D and VR stimulation. Participants (n = 16) viewed angry and neutral faces with equal size and distance in both 2D and VR, while they were asked to count one of the two facial expressions. For the main effects of emotion (angry vs. neutral) and task (target vs. nontarget), established event related potentials (ERP), namely the late positive potential (LPP) and the target P300, were replicated. VR stimulation compared to 2D led to overall bigger ERPs but did not interact with emotion or task effects. In the frequency domain, alpha/beta-activity was larger in VR compared to 2D stimulation already in the baseline period. Of note, while alpha/beta event related desynchronization (ERD) for emotion and task conditions were seen in both VR and 2D stimulation, these effects were significantly stronger in VR than in 2D. These results suggest that enhanced immersion with the stimulus materials enabled by VR technology can potentiate induced brain oscillation effects to implicit emotion and explicit task effects.
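Event-related desynchronization (ERD) of the kind reported here is conventionally expressed as the percent change in band power relative to a pre-stimulus baseline. The sketch below is illustrative only and not the authors' analysis pipeline; the function name and values are invented.

```python
# Illustrative sketch: ERD as percent power change from baseline,
# following the common convention that negative values indicate
# desynchronization (a power decrease relative to baseline).

def erd_percent(power_event, power_baseline):
    """Percent change in band power; negative = desynchronization."""
    return (power_event - power_baseline) / power_baseline * 100.0

# Toy example: alpha power drops from 8.0 to 6.0 (arbitrary units)
# after stimulus onset.
print(erd_percent(6.0, 8.0))  # -25.0
```

Stronger ERD in VR than in 2D would correspond to more negative values of this measure for the same conditions.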


2020
Author(s): Gillian M Clark, Claire McNeel, Felicity J Bigelow, Peter Gregory Enticott

The investigation of emotional face processing has largely used faces devoid of context and has not accounted for within-perceiver differences in empathy. The importance of context in face perception has become apparent in recent years. This study examined the interaction of facial expression, knowledge of a person's character, and within-perceiver empathy levels on face processing event-related potentials (ERPs). Forty-two adult participants learned background information about the character of six individuals, each described as deliberately causing harm to others, accidentally causing harm to others, or undertaking neutral actions. Subsequently, EEG was recorded while participants viewed the characters' faces displaying neutral or emotional expressions. Participants' empathy was assessed using the Empathy Quotient survey. Results showed a significant interaction of character type and empathy on the early posterior negativity (EPN) ERP component, suggesting that perceivers with either low or high empathy paid more attention to the face stimuli and distinguished more between the different characters, whereas those in the middle range of empathy tended to produce a smaller EPN with less distinction between character types. These findings highlight the importance of trait empathy in accounting for how faces in context are perceived.

