Encoding of Emotional Facial Expressions in Direct and Incidental Tasks: An Event-Related Potentials N200 Effect

2012 ◽  
Vol 16 (2) ◽  
pp. 92-109 ◽  
Author(s):  
Michela Balconi ◽  
Uberto Pozzoli
2021 ◽  
Vol 12 ◽  
Author(s):  
Yutong Liu ◽  
Huini Peng ◽  
Jianhui Wu ◽  
Hongxia Duan

Background: Individuals exposed to childhood maltreatment often show deficits in emotional processing later in life. Most studies have focused on childhood physical or sexual abuse; childhood emotional abuse, a core component underlying different forms of childhood maltreatment, has received relatively little attention. The current study explored whether childhood emotional abuse is related to impaired processing of emotional facial expressions in healthy young men. Methods: Emotional facial processing was investigated in a classical gender discrimination task while event-related potential (ERP) data were collected. Childhood emotional abuse was assessed with the Childhood Trauma Questionnaire (CTQ) in 60 healthy young men. The relationships between the emotional abuse score and the behavioral and ERP indices of emotional facial expressions (angry, disgust, and happy) were explored. Results: Participants with higher childhood emotional abuse scores responded faster at the behavioral level and showed a smaller P2 amplitude at the neural level when processing disgust faces compared to neutral faces. Discussion: Individuals with higher levels of childhood emotional abuse may identify negative faces quickly while consuming fewer cognitive resources, suggesting altered processing of emotional facial expressions in young men with a higher level of childhood emotional abuse.
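The P2 measure reported above can be illustrated with a minimal sketch: average stimulus-locked epochs into an ERP, baseline-correct, and take the mean amplitude in an early time window. This is a generic illustration on synthetic data, not the authors' pipeline; the sampling rate, window boundaries, and signal model are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
fs = 250                            # assumed sampling rate (Hz)
t = np.arange(-0.2, 0.6, 1 / fs)    # epoch from -200 to +600 ms

def simulate_epochs(p2_gain, n_epochs=40):
    """Synthesize epochs with a positive peak near 200 ms plus noise."""
    p2 = p2_gain * np.exp(-((t - 0.2) ** 2) / (2 * 0.03 ** 2))
    return p2 + rng.normal(0, 1.0, size=(n_epochs, t.size))

def p2_amplitude(epochs):
    """Mean ERP amplitude in a 150-250 ms window, baseline-corrected."""
    erp = epochs.mean(axis=0)
    erp -= erp[t < 0].mean()        # subtract pre-stimulus baseline
    win = (t >= 0.15) & (t <= 0.25)
    return erp[win].mean()

# Mimic the reported direction of the effect: smaller P2 for disgust faces.
disgust = p2_amplitude(simulate_epochs(p2_gain=2.0))
neutral = p2_amplitude(simulate_epochs(p2_gain=4.0))
print(disgust < neutral)
```

The same window-averaging logic applies to any ERP component; only the latency window and electrode selection change.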


2021 ◽  
Vol 15 ◽  
Author(s):  
Teresa Sollfrank ◽  
Oona Kohnen ◽  
Peter Hilfiker ◽  
Lorena C. Kegel ◽  
Hennric Jokeit ◽  
...  

This study aimed to examine whether the cortical processing of emotional faces is modulated by the computerization of face stimuli ("avatars") in a group of 25 healthy participants. Subjects passively viewed 128 static and dynamic facial expressions of female and male actors and their respective avatars in neutral or fearful conditions. Event-related potentials (ERPs), as well as alpha and theta event-related synchronization and desynchronization (ERD/ERS), were derived from the EEG recorded during the task. All ERP features, except for the very early N100, differed in their response to avatar and actor faces. Whereas the N170 showed differences only in the neutral avatar condition, later potentials (N300 and LPP) differed across both emotional conditions (neutral and fearful) and the presented agents (actor and avatar). In addition, avatar faces elicited significantly stronger theta and alpha oscillatory responses than actor faces. Theta EEG frequencies in particular responded specifically to visual emotional stimulation and were sensitive to the emotional content of the face, whereas the alpha frequency was modulated by all stimulus types. We conclude that computerized avatar faces affect both ERP components and ERD/ERS, evoking neural effects different from those elicited by real faces, even though the avatars were replicas of the human faces and contained similar expressive characteristics.
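ERD/ERS as used above is conventionally computed as the percent change of band power relative to a pre-stimulus baseline. The sketch below illustrates that formula on a synthetic signal; the band edges, windows, and FFT-mask filter are illustrative assumptions, not the study's actual analysis.

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 250
t = np.arange(-0.5, 1.0, 1 / fs)

def bandpass(x, lo, hi):
    """Crude zero-phase band-pass via FFT masking (illustrative only)."""
    X = np.fft.rfft(x)
    f = np.fft.rfftfreq(x.size, 1 / fs)
    X[(f < lo) | (f > hi)] = 0
    return np.fft.irfft(X, n=x.size)

def erd_ers(x, lo, hi):
    """Percent band-power change vs. baseline: ERS > 0, ERD < 0."""
    power = bandpass(x, lo, hi) ** 2
    base = power[t < 0].mean()                  # pre-stimulus baseline
    post = power[(t >= 0.2) & (t <= 0.8)].mean()
    return 100 * (post - base) / base

# Synthetic trial: a 6 Hz theta burst appears only after stimulus onset,
# so the theta band (4-8 Hz) shows a strong ERS.
x = rng.normal(0, 0.2, t.size)
x[t >= 0] += np.sin(2 * np.pi * 6 * t[t >= 0])
print(erd_ers(x, 4, 8))
```

In practice, power is averaged across trials before the baseline ratio is taken, which stabilizes the estimate considerably.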


2021 ◽  
pp. 1-10
Author(s):  
Lisa K. Chinn ◽  
Irina Ovchinnikova ◽  
Anastasia A. Sukmanova ◽  
Aleksandra O. Davydova ◽  
Elena L. Grigorenko

Millions of children worldwide are raised in institutionalized settings. Unfortunately, institutionalized rearing is often characterized by psychosocial deprivation, leading to difficulties in numerous social, emotional, physical, and cognitive skills. One such skill is the ability to recognize emotional facial expressions. Children with a history of institutional rearing tend to be worse at recognizing emotions in facial expressions than their peers, and this deficit likely affects social interactions. However, emotional information is also conveyed vocally, and neither prosodic information processing nor the cross-modal integration of facial and prosodic emotional expressions have been investigated in these children to date. We recorded electroencephalograms (EEG) while 47 children under institutionalized care (IC) (n = 24) or biological family care (BFC) (n = 23) viewed angry, happy, or neutral facial expressions while listening to pseudowords with angry, happy, or neutral prosody. The results indicate that 20- to 40-month-olds living in IC have event-related potentials (ERPs) over midfrontal brain regions that are less sensitive to incongruent facial and prosodic emotions relative to children under BFC, and that their brain responses to prosody are less lateralized. Children under IC also showed midfrontal ERP differences in processing of angry prosody, indicating that institutionalized rearing may specifically affect the processing of anger.


2005 ◽  
Vol 100 (1) ◽  
pp. 129-134 ◽  
Author(s):  
Michela Balconi

The present research compared the semantic information processing of linguistic stimuli with the semantic elaboration of nonlinguistic facial stimuli. To explore brain potentials (ERPs, event-related potentials) related to decoding facial expressions and the effect of the semantic valence of the stimulus, we analyzed data for 20 normal subjects (M age = 23.6 yr., SD = 0.2). Faces with three basic emotional expressions (fear, happiness, and sadness from the 1976 Ekman and Friesen database), three semantically anomalous expressions (with respect to their emotional content), and neutral stimuli (faces without emotional content) were presented in random order. Differences in ERP peak amplitude were observed later for anomalous expressions than for congruous expressions. Specifically, the emotionally anomalous faces elicited a higher negative peak at about 360 msec., distributed mainly over the posterior sites. The observed electrophysiological activity may represent specific cognitive processing underlying the comprehension of facial expressions during the detection of semantic anomaly. The evidence favours the comparability of this negative deflection with the N400 ERP effect elicited by linguistic anomalies.


Author(s):  
Michela Balconi

Neuropsychological studies have highlighted distinct brain correlates dedicated to analyzing facial expressions of emotion. Some cerebral circuits appear specific to emotional face comprehension as a function of conscious vs. unconscious processing of emotional information. Moreover, the emotional content of faces (i.e., positive vs. negative; more or less arousing) may activate specific cortical networks. Among others, recent studies have examined the contribution of the hemispheres to face comprehension as a function of emotion type (mainly the positive vs. negative distinction) and of specific tasks (comprehending vs. producing facial expressions). Specifically, an overview of ERP (event-related potential) analyses is proposed in order to understand how a face may be processed by an observer and how the face becomes a meaningful construct even in the absence of awareness. Finally, brain oscillations are considered in order to explain the synchronization of neural populations in response to emotional faces under conscious vs. unconscious processing.


Author(s):  
Shozo Tobimatsu

There are two major parallel visual pathways in humans: the parvocellular (P) and magnocellular (M) pathways. The former has excellent spatial resolution with color selectivity, while the latter shows excellent temporal resolution with high contrast sensitivity. Visual stimuli should be tailored to answer specific clinical and/or research questions. This chapter examines the neural mechanisms of face perception using event-related potentials (ERPs). Face stimuli of different spatial frequencies were used to investigate how low-spatial-frequency (LSF) and high-spatial-frequency (HSF) components of the face contribute to the identification and recognition of the face and facial expressions. The P100 component in the occipital area (Oz), the N170 in the posterior temporal region (T5/T6), and late components peaking at 270-390 ms (T5/T6) were analyzed. LSF enhanced the P100, while the N170 was augmented by HSF irrespective of facial expressions. This suggests that LSF is important for global processing of facial expressions, whereas HSF handles featural processing. There were significant amplitude differences between positive and negative LSF facial expressions in the early time windows of 270-310 ms. Subsequently, the amplitudes among negative HSF facial expressions differed significantly in the later time windows of 330-390 ms. Thus, discrimination between positive and negative facial expressions precedes discrimination among different negative expressions, in a sequential manner based on parallel visual channels. Interestingly, patients with schizophrenia showed decreased spatial-frequency sensitivities for face processing. Taken together, spatially filtered face images are useful for exploring face perception and recognition.
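The LSF/HSF decomposition described above can be sketched with a Gaussian low-pass filter: the smoothed image carries the low spatial frequencies, and the residual carries the high ones. This is a generic illustration on a random stand-in image; the Gaussian cutoff is an arbitrary assumption, not the chapter's filter settings.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

rng = np.random.default_rng(2)
face = rng.random((128, 128))          # stand-in for a grayscale face image

lsf = gaussian_filter(face, sigma=8)   # low-pass: global facial configuration
hsf = face - lsf                       # high-pass residual: fine features

# By construction, the two components sum back to the original image.
print(np.allclose(lsf + hsf, face))    # True
```

Real stimulus pipelines typically specify the cutoff in cycles per face width rather than pixels, so the sigma would be chosen from the image size and viewing distance.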

