The effect of empathy and context on face-processing ERPs

2020
Author(s): Gillian M Clark, Claire McNeel, Felicity J Bigelow, Peter Gregory Enticott

The investigation of emotional face processing has largely used faces devoid of context and has not accounted for within-perceiver differences in empathy. The importance of context in face perception has become apparent in recent years. This study examined how the contextual factors of facial expression and knowledge of a person’s character interact with within-perceiver empathy levels to shape face-processing event-related potentials (ERPs). Forty-two adult participants learned background information about six individuals’ characters. Three character types were described, in which the character was depicted as deliberately causing harm to others, accidentally causing harm to others, or undertaking neutral actions. Subsequently, EEG was recorded while participants viewed the characters’ faces displaying neutral or emotional expressions. Participants’ empathy was assessed using the Empathy Quotient survey. Results showed a significant interaction of character type and empathy on the early posterior negativity (EPN) ERP component. Participants with either low or high empathy paid more attention to the face stimuli and distinguished more sharply between the different characters, whereas those in the middle range of empathy tended to produce smaller EPNs with less distinction between character types. These findings highlight the importance of trait empathy in accounting for how faces in context are perceived.

2021
pp. 095679762199666
Author(s): Sebastian Schindler, Maximilian Bruchmann, Claudia Krasowski, Robert Moeck, Thomas Straube

Our brains respond rapidly to human faces and can differentiate between many identities, retrieving rich semantic and emotional knowledge about them. Studies provide a mixed picture of how such information affects event-related potentials (ERPs). We systematically examined the effect of feature-based attention on ERP modulations to briefly presented faces of individuals associated with a crime. The tasks required participants (N = 40 adults) to discriminate the orientation of lines overlaid onto the face, the age of the face, or emotional information associated with the face. Negative faces amplified the N170 ERP component during all tasks, whereas the early posterior negativity (EPN) and late positive potential (LPP) components were increased only when the emotional information was attended to. These findings suggest that during early configural analyses (N170), evaluative information potentiates face processing regardless of feature-based attention. During intermediate, only partially resource-dependent processing stages (EPN) and late stages of elaborate stimulus processing (LPP), attention to the acquired emotional information is necessary for amplified processing of negatively evaluated faces.


2019
Vol 9 (5), pp. 116
Author(s): Luis Aguado, Karisa Parkington, Teresa Dieguez-Risco, José Hinojosa, Roxane Itier

Faces showing expressions of happiness or anger were presented together with sentences that described happiness-inducing or anger-inducing situations. Two main variables were manipulated: (i) congruency between contexts and expressions (congruent/incongruent) and (ii) the task assigned to the participant, either discriminating the emotion shown by the target face (emotion task) or judging whether the expression shown by the face was congruent with the context (congruency task). Behavioral and electrophysiological results (event-related potentials, ERPs) showed that processing of facial expressions was jointly influenced by congruency and task demands. ERP results revealed task effects at frontal sites, with larger positive amplitudes between 250 and 450 ms in the congruency task, reflecting the higher cognitive effort required by this task. Effects of congruency appeared at latencies and locations corresponding to the early posterior negativity (EPN) and late positive potential (LPP) components, which have previously been found to be sensitive to emotion and affective congruency. The magnitude and spatial distribution of the congruency effects varied depending on the task and the target expression. These results are discussed in terms of the modulatory role of context on facial expression processing and the different mechanisms underlying the processing of expressions of positive and negative emotions.


2015
Vol 77 (2), pp. 116-126
Author(s): Amanda McCleery, Junghee Lee, Aditi Joshi, Jonathan K. Wynn, Gerhard S. Hellemann, ...

2020
Vol 15 (7), pp. 765-774
Author(s): Sebastian Schindler, Maximilian Bruchmann, Anna-Lena Steinweg, Robert Moeck, Thomas Straube

The processing of fearful facial expressions is prioritized by the human brain. This priority is maintained across various information-processing stages, as evident in early, intermediate and late components of event-related potentials (ERPs). However, emotional modulations are inconsistently reported for these different processing stages. In this pre-registered study, we investigated how feature-based attention differentially affects ERPs to fearful and neutral faces in 40 participants. The tasks required the participants to discriminate either the orientation of lines overlaid onto the face, the sex of the face or the face’s emotional expression, increasing attention to emotion-related features. We found main effects of emotion for the N170, early posterior negativity (EPN) and late positive potential (LPP). While N170 emotional modulations were task-independent, interactions of emotion and task were observed for the EPN and LPP. While EPN emotion effects were found in the sex and emotion tasks, the LPP emotion effect was mainly driven by the emotion task. This study shows that early responses to fearful faces are task-independent (N170) and likely based on low-level and configural information, while during later processing stages, attention to the face (EPN) or, more specifically, to the face’s emotional expression (LPP) is crucial for reliable amplified processing of emotional faces.


2021
Vol 11 (1)
Author(s): Francesca Pesciarelli, Irene Leo, Luana Serafini

The study aimed to examine the neural mechanisms underlying implicit other-race face processing using masked and unmasked priming manipulations. Two types of prime-target pairs were presented while event-related potentials (ERPs) were recorded: same-face pairs (prime and target were identical faces) and different-face pairs (prime and target were different faces). Half of the prime-target pairs were Asian (other-race) and half were Caucasian (own-race) faces; the faces in each pair were of the same gender and race. Participants (all Caucasian) had to decide whether the target was a male or a female face (gender task). The prime face was either unmasked or masked. Behaviorally, our findings showed a race effect, that is, slower reaction times (RTs) for other-race than own-race face stimuli, regardless of masking. In the ERPs, our data showed a race effect across all components analyzed (P100, N100, N200, P300) under both the unmasked and masked manipulations. In the unmasked condition, we additionally found a priming effect as a function of race on the N100, N200, and P300 components; interestingly, in the masked condition, this effect appeared only on the P300. Overall, our findings provide evidence that race information is available to the brain very early and can influence people’s behavior even without conscious awareness.


2021
Vol 15
Author(s): Teresa Sollfrank, Oona Kohnen, Peter Hilfiker, Lorena C. Kegel, Hennric Jokeit, ...

This study aimed to examine whether the cortical processing of emotional faces is modulated by the computerization of face stimuli ("avatars") in a group of 25 healthy participants. Participants passively viewed 128 static and dynamic facial expressions of female and male actors and their respective avatars in neutral or fearful conditions. Event-related potentials (ERPs), as well as alpha and theta event-related synchronization and desynchronization (ERS/ERD), were derived from the EEG recorded during the task. All ERP features, except for the very early N100, differed in their response to avatar and actor faces. Whereas the N170 showed differences only for the neutral avatar condition, later potentials (N300 and LPP) differed in both emotional conditions (neutral and fear) and between the presented agents (actor and avatar). In addition, we found that the avatar faces elicited significantly stronger theta and alpha oscillatory responses than the actor faces. Theta frequencies in particular responded specifically to visual emotional stimulation and were sensitive to the emotional content of the face, whereas the alpha frequency was modulated by all stimulus types. We conclude that computerized avatar faces affect both ERP components and ERD/ERS, evoking neural effects that differ from those elicited by real faces. This was true even though the avatars were replicas of the human faces and shared similar expressive characteristics.


2007
Vol 19 (3), pp. 543-555
Author(s): Bruno Rossion, Daniel Collins, Valérie Goffaux, Tim Curran

The degree of commonality between the perceptual mechanisms involved in processing faces and objects of expertise is intensely debated. To clarify this issue, we recorded occipito-temporal event-related potentials in response to faces when concurrently processing visual objects of expertise. In car experts fixating pictures of cars, we observed a large decrease of an evoked potential elicited by face stimuli between 130 and 200 msec, the N170. This sensory suppression was much lower when the car and face stimuli were separated by a 200-msec blank interval. With and without this delay, there was a strong correlation between the face-evoked N170 amplitude decrease and the subject's level of car expertise as measured in an independent behavioral task. Together, these results show that neural representations of faces and nonface objects in a domain of expertise compete for visual processes in the occipito-temporal cortex as early as 130–200 msec following stimulus onset.


Author(s): Galit Yovel

As social primates, one of the most important cognitive tasks we perform, dozens of times a day, is to look at a face and extract the person's identity. During the last decade, the neural basis of face processing has been extensively investigated in humans with event-related potentials (ERPs) and functional MRI (fMRI). These two methods provide complementary information about the temporal and spatial aspects of the neural response: ERPs offer high temporal resolution of milliseconds but low spatial resolution of the neural generator, whereas fMRI reflects a slow hemodynamic response but provides better spatial localization of the activated regions. Despite the extensive fMRI and ERP research on faces, only a few studies have assessed the relationship between the two methods, and no study to date has collected simultaneous ERP and fMRI responses to face stimuli. In the current paper we assess the spatial and temporal aspects of the neural response to faces by simultaneously collecting fMRI and ERP responses to face stimuli. Our goals are twofold. 1) ERP and fMRI show a robust selective response to faces; in particular, two well-established face-specific phenomena, the right-hemisphere (RH) superiority and the inversion effect, are robustly found with both methods. Despite the extensive research on these effects with ERP and fMRI, it is still unknown to what extent their spatial (fMRI) and temporal (ERP) aspects are associated. In Study 1 we employ an individual-differences approach to assess the relationship between these ERP and fMRI face-specific responses. 2) Face processing involves several stages, starting from structural encoding of the face image through identity processing to storage for later retrieval. This representation undergoes several manipulations that take place at different time points and in different brain regions before the final percept is generated. By simultaneously recording ERP and fMRI we hope to gain a more comprehensive understanding of the time course over which different brain areas participate in generating the face representation.


Author(s): Johanna Kissler, Janine Strehlow

Language can serve to constrain cognitive and emotional representations. Here, we investigate to what extent linguistic emotional information alters the processing of faces with neutral expressions. Previous studies have shown that cortical processing of emotional faces differs from that of neutral faces. Electroencephalography (EEG) has revealed emotion effects for early and late event-related potentials (ERPs) such as the N1, the early posterior negativity (EPN) and the late positive potential (LPP). In order to study the effect of language information on face processing, 30 negative and 30 neutral descriptive phrases were presented, each followed by a neutral-expression face. Participants were instructed to remember the association. We analyzed the immediate effect of information type on face processing during encoding as well as delayed effects during subsequent recognition. During encoding, faces following negative language information elicited a larger left frontal positivity between 500 and 700 ms after stimulus onset. During recognition, a left centro-parietal LPP was likewise increased for faces previously associated with a negative description. In addition, the parietal old/new effect was significantly increased for faces with negative information compared to new ones, while no significant effect was observed for faces with neutral information. No information effects on early perceptual ERPs (N1, EPN) were found. Reaction times (RTs) for source memory decisions (negative versus neutral) were significantly shorter for faces with negative versus neutral information. In sum, the ERP results indicate that emotional significance can be linguistically induced in faces on a cortical level and, at least in an explicit memory task, this information modulates later stages of face processing and memory. Implications for cognitive effects of public media design are discussed.

