Are event-related potentials to dynamic facial expressions of emotion related to individual differences in the accuracy of processing facial expressions and identity?

2017 ◽  
Vol 17 (2) ◽  
pp. 364-380 ◽  
Author(s):  
Guillermo Recio ◽  
Oliver Wilhelm ◽  
Werner Sommer ◽  
Andrea Hildebrandt


2021 ◽  
Author(s):  
Chris L. Porter ◽  
Courtney A. Evans Stout ◽  
Peter Joseph Reschke ◽  
Larry J. Nelson ◽  
Daniel C. Hyde

The ability to decode and accurately identify information from facial emotions may advantage young children socially. This capacity to decode emotional information may likewise be influenced by individual differences in children’s temperament. This study investigated whether sensory reactivity and perceptual awareness, two dimensions of temperament, as well as children’s ability to accurately label emotions, relate to the neural processing of emotional content in faces. Event-related potentials (ERPs) of 4- to 6-year-old children (N = 119) were elicited by static displays of angry, happy, fearful, sad, and neutral faces. Children, as a group, exhibited differential early (N290) and mid-latency (P400) ERPs in response to facial expressions of emotion. Individual differences in children’s sensory reactivity were associated with enhanced P400 amplitudes to neutral, sad, and fearful faces. In a separate task, children were asked to provide an emotional label for the same images. Interestingly, children labeled these same neutral, sad, and fearful faces less accurately, suggesting that, contrary to previous work showing enhanced attentional processing of threatening cues (i.e., fear), children higher in sensory reactivity may deploy more attentional resources when decoding ambiguous emotional cues.
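Component measures such as the N290 and P400 are typically quantified as mean amplitudes within fixed time windows over posterior electrodes. Below is a minimal sketch of such an extraction using MNE-Python; the file name, event codes, channel picks, and time windows are illustrative assumptions, not the authors' pipeline.

```python
# Sketch: per-emotion mean amplitudes for assumed N290/P400 windows (MNE-Python).
import mne

raw = mne.io.read_raw_fif("child_faces_raw.fif", preload=True)  # hypothetical file
raw.filter(0.1, 30.0)                                            # typical ERP band-pass

events = mne.find_events(raw)
event_id = {"angry": 1, "happy": 2, "fearful": 3, "sad": 4, "neutral": 5}  # assumed codes
epochs = mne.Epochs(raw, events, event_id, tmin=-0.2, tmax=0.8,
                    baseline=(None, 0), preload=True)

windows = {"N290": (0.250, 0.350), "P400": (0.350, 0.550)}  # approximate latencies
picks = ["P7", "P8", "O1", "O2"]                             # assumed posterior sites

for emotion in event_id:
    evoked = epochs[emotion].average().pick(picks)
    for comp, (t0, t1) in windows.items():
        data = evoked.copy().crop(t0, t1).data               # volts, channels x times
        print(emotion, comp, f"{data.mean() * 1e6:.2f} µV")  # mean over picks and window
```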


2018 ◽  
Author(s):  
Wiebke Hammerschmidt ◽  
Louisa Kulke ◽  
Christina Bröring ◽  
Annekathrin Schacht

In comparison to neutral faces, facial expressions of emotion are known to gain attentional prioritization, as demonstrated mainly by means of event-related potentials (ERPs). Recent evidence indicated that such preferential processing can also be elicited by neutral faces when they are associated with increased motivational salience via reward. It remains an open question, however, whether the impacts of inherent emotional salience and associated motivational salience are integrated. To this end, expressions and outcomes were orthogonally combined. Participants (N = 42) learned to explicitly categorize happy and neutral faces as either reward- or zero-outcome-related via an associative learning paradigm. ERP components (P1, N170, EPN, and LPC) were measured throughout the experiment and analyzed separately before (learning phase) and after (consolidation phase) reaching a pre-defined learning criterion. Happy facial expressions boosted early processing stages, as reflected in enhanced amplitudes of the N170 and EPN, both during learning and consolidation. In contrast, effects of monetary reward became evident only after successful learning, in the form of enlarged amplitudes of the LPC, a component linked to higher-order evaluations. Interactions between expression and associated outcome were absent in all ERP components of interest. The present study provides novel evidence that acquired salience impacts stimulus processing, but independently of the effects driven by happy facial expressions.
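The split into a learning and a consolidation phase hinges on when each participant reaches the pre-defined learning criterion. The sketch below shows one way such a trial-wise phase assignment could be coded; the criterion value, column names, and data are assumptions for illustration, not taken from the study.

```python
# Sketch: label each trial "learning" until a run of correct responses
# reaches an assumed criterion, then "consolidation" thereafter.
import pandas as pd

trials = pd.DataFrame({
    "face_id": [1, 1, 1, 1, 2, 2, 2, 2],
    "correct": [0, 1, 1, 1, 1, 1, 1, 1],
})

CRITERION = 3  # consecutive correct responses required (assumed value)

def phase_labels(correct, criterion=CRITERION):
    """Return a phase label per trial based on a running streak of correct responses."""
    labels, streak, reached = [], 0, False
    for c in correct:
        labels.append("consolidation" if reached else "learning")
        streak = streak + 1 if c else 0
        if streak >= criterion:
            reached = True
    return labels

trials["phase"] = (trials.groupby("face_id")["correct"]
                         .transform(lambda s: pd.Series(phase_labels(s.tolist()),
                                                        index=s.index)))
print(trials)
```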


2021 ◽  
Vol 15 ◽  
Author(s):  
Teresa Sollfrank ◽  
Oona Kohnen ◽  
Peter Hilfiker ◽  
Lorena C. Kegel ◽  
Hennric Jokeit ◽  
...  

This study aimed to examine whether the cortical processing of emotional faces is modulated by the computerization of face stimuli ("avatars") in a group of 25 healthy participants. Subjects passively viewed 128 static and dynamic facial expressions of female and male actors and their respective avatars in neutral or fearful conditions. Event-related potentials (ERPs), as well as alpha and theta event-related synchronization and desynchronization (ERD/ERS), were derived from the EEG recorded during the task. All ERP features, except for the very early N100, differed in their response to avatar and actor faces. Whereas the N170 showed differences only in the neutral avatar condition, later potentials (N300 and LPP) differed in both emotional conditions (neutral and fear) and between the presented agents (actor and avatar). In addition, avatar faces elicited significantly stronger theta and alpha oscillatory responses than actor faces. Theta frequencies in particular responded specifically to visual emotional stimulation and were sensitive to the emotional content of the face, whereas alpha frequencies were modulated by all stimulus types. We conclude that computerized avatar faces affect both ERP components and ERD/ERS and evoke neural effects that differ from those elicited by real faces, even though the avatars were replicas of the human faces and contained similar characteristics in their expression.
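Event-related desynchronization/synchronization (ERD/ERS) is conventionally expressed as the percentage change of band power relative to a pre-stimulus reference interval. The sketch below illustrates that computation on a synthetic single-channel epoch; the sampling rate, band limits, time windows, and signal are assumptions, not the parameters used in the study.

```python
# Sketch: classic ERD/ERS measure, i.e. percent band-power change vs. baseline.
import numpy as np
from scipy.signal import butter, filtfilt

fs = 250                                   # Hz, assumed sampling rate
t = np.arange(-0.5, 1.0, 1 / fs)           # epoch from -500 ms to 1000 ms
rng = np.random.default_rng(0)
eeg = rng.normal(size=t.size)              # stand-in for a single-channel epoch

def band_power(x, lo, hi, fs):
    """Instantaneous power in a frequency band (band-pass filter, then square)."""
    b, a = butter(4, [lo, hi], btype="band", fs=fs)
    return filtfilt(b, a, x) ** 2

theta_power = band_power(eeg, 4, 7, fs)

ref = theta_power[(t >= -0.5) & (t < 0.0)].mean()   # reference (baseline) power
post = theta_power[(t >= 0.2) & (t < 0.6)].mean()   # event-related power

erd_ers = (post - ref) / ref * 100
print(f"theta ERD/ERS: {erd_ers:.1f} %  (negative = ERD, positive = ERS)")
```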


2013 ◽  
Author(s):  
Evonne M. Edwards ◽  
John K. Williams ◽  
Todd W. Hall ◽  
Keith J. Edwards

2017 ◽  
Author(s):  
Emily S Nichols ◽  
Marc F Joanisse

We investigated the extent to which second-language (L2) learning is influenced by the similarity of grammatical features in one’s first language (L1). We used event-related potentials to identify neural signatures of a novel grammatical rule, grammatical gender, in L1 English speakers. Of interest was whether individual differences in L2 proficiency and age of acquisition (AoA) influenced these effects. L2 and native speakers of French read French sentences that were grammatically correct or contained either a grammatical gender violation or a word order violation. Proficiency and AoA predicted Left Anterior Negativity (LAN) amplitude, with structure violations driving the proficiency effect and gender violations driving the AoA effect. Proficiency, group, and AoA predicted P600 amplitude for gender violations but not structure violations. The different effects of grammatical gender and structure violations indicate that L2 speakers engage novel grammatical processes differently from L1 speakers, and that this varies appreciably with both AoA and proficiency.
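Statements such as "proficiency and AoA predicted P600 amplitude" typically come from regressing per-subject component amplitudes on those predictors. A minimal sketch with simulated data is shown below; the variable names, units, and values are assumptions, not the authors' data.

```python
# Sketch: regress a P600-like amplitude on L2 proficiency and age of acquisition.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 40
df = pd.DataFrame({
    "p600_uv":     rng.normal(3.0, 1.5, n),   # mean P600 amplitude per subject (µV)
    "proficiency": rng.uniform(0, 100, n),    # e.g., a proficiency test score
    "aoa":         rng.integers(5, 20, n),    # age of acquisition in years
})

model = smf.ols("p600_uv ~ proficiency + aoa", data=df).fit()
print(model.summary().tables[1])              # coefficients for proficiency and AoA
```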


2005 ◽  
Vol 100 (1) ◽  
pp. 129-134 ◽  
Author(s):  
Michela Balconi

The present research compared the semantic information processing of linguistic stimuli with the semantic elaboration of nonlinguistic facial stimuli. To explore brain potentials (event-related potentials, ERPs) related to decoding facial expressions and the effect of the semantic valence of the stimulus, we analyzed data for 20 normal subjects (M age = 23.6 yr., SD = 0.2). Faces with three basic emotional expressions (fear, happiness, and sadness from the 1976 Ekman and Friesen database), three semantically anomalous expressions (with respect to their emotional content), and neutral stimuli (faces without emotional content) were presented in random order. Differences in ERP peak amplitude between anomalous and congruous expressions were observed at later latencies. Specifically, the emotionally anomalous faces elicited a larger negative peak at about 360 msec., distributed mainly over posterior sites. The observed electrophysiological activity may represent specific cognitive processing underlying the comprehension of facial expressions and the detection of semantic anomaly. The evidence favours the comparability of this negative deflection with the N400 ERP effect elicited by linguistic anomalies.
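A negative deflection of this kind is usually quantified as the most negative point of a difference wave (anomalous minus congruous) within a search window over posterior sites. The sketch below illustrates that peak measurement on a synthetic waveform; the sampling rate, window, and signal are illustrative assumptions.

```python
# Sketch: locate a posterior negative peak around 360 ms in a difference wave.
import numpy as np

fs = 500                                  # Hz, assumed sampling rate
t = np.arange(-0.1, 0.8, 1 / fs)          # epoch time axis in seconds

# Synthetic difference wave (anomalous minus congruous) with a dip near 360 ms.
diff_wave = (-2.0 * np.exp(-((t - 0.36) ** 2) / (2 * 0.03 ** 2))
             + 0.2 * np.random.default_rng(2).normal(size=t.size))

win = (t >= 0.30) & (t <= 0.45)           # search window around 360 ms
peak_idx = np.argmin(diff_wave[win])      # most negative point in the window
peak_time = t[win][peak_idx]
peak_amp = diff_wave[win][peak_idx]
print(f"negative peak: {peak_amp:.2f} (a.u.) at {peak_time * 1000:.0f} ms")
```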

