Recognition Characteristics of Facial and Bodily Expressions: Evidence From ERPs

2021 ◽  
Vol 12 ◽  
Author(s):  
Xiaoxiao Li

In the natural environment, facial and bodily expressions influence each other. Previous research has shown that bodily expressions significantly influence the perception of facial expressions, but little is known about the cognitive processing of facial and bodily emotional expressions and its temporal characteristics. This study therefore presented facial and bodily expressions, both separately and together, to examine the electrophysiological mechanism of emotion recognition using event-related potentials (ERPs). Participants assessed the emotions of facial and bodily expressions that varied by valence (positive/negative) and consistency (matching/non-matching emotions). The results showed that bodily expressions induced a more positive P1 component with a shorter latency, whereas facial expressions triggered a more negative N170 with a longer latency. Of the N2 and P3 components, N2 was more sensitive to inconsistent emotional information and P3 to consistent emotional information. The cognitive processing of facial and bodily expressions had distinctive integrating features, with the interaction occurring at an early stage (N170). These results highlight the importance of both facial and bodily expressions in the cognitive processing of emotion recognition.
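Component effects like these (a larger, earlier P1; a larger, later N170) are conventionally scored by searching each component's time window in the trial-averaged waveform for a peak. A minimal sketch of that scoring step, with an assumed sampling rate, illustrative window bounds, and a simulated waveform (none of these values are taken from the study):

```python
import numpy as np

def peak_in_window(erp, times, t_min, t_max, polarity=+1):
    """Return (amplitude, latency) of the most positive (polarity=+1)
    or most negative (polarity=-1) sample within [t_min, t_max]."""
    mask = (times >= t_min) & (times <= t_max)
    segment = erp[mask]
    idx = np.argmax(polarity * segment)
    return segment[idx], times[mask][idx]

fs = 500.0                            # sampling rate in Hz (assumed)
times = np.arange(-0.1, 0.5, 1 / fs)  # epoch from -100 ms to 500 ms
# toy waveform: a positive bump near 100 ms, a negative one near 170 ms
erp = (2.0 * np.exp(-((times - 0.10) / 0.02) ** 2)
       - 3.0 * np.exp(-((times - 0.17) / 0.02) ** 2))

# P1 scored as the positive peak in ~80-130 ms, N170 as the negative
# peak in ~130-200 ms (window bounds are illustrative conventions)
p1_amp, p1_lat = peak_in_window(erp, times, 0.08, 0.13, polarity=+1)
n170_amp, n170_lat = peak_in_window(erp, times, 0.13, 0.20, polarity=-1)
```

Group or condition effects on "amplitude" and "latency" then reduce to comparing these two numbers per participant and condition.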

Author(s):  
Izabela Krejtz ◽  
Krzysztof Krejtz ◽  
Katarzyna Wisiecka ◽  
Marta Abramczyk ◽  
Michał Olszanowski ◽  
...  

Abstract The enhancement hypothesis suggests that deaf individuals are more vigilant to visual emotional cues than hearing individuals. The present eye-tracking study examined ambient–focal visual attention when encoding affect from dynamically changing emotional facial expressions. Deaf (n = 17) and hearing (n = 17) individuals watched emotional facial expressions that in 10-s animations morphed from a neutral expression to one of happiness, sadness, or anger. The task was to recognize emotion as quickly as possible. Deaf participants tended to be faster than hearing participants in affect recognition, but the groups did not differ in accuracy. In general, happy faces were more accurately and more quickly recognized than faces expressing anger or sadness. Both groups demonstrated longer average fixation duration when recognizing happiness in comparison to anger and sadness. Deaf individuals directed their first fixations less often to the mouth region than the hearing group. During the last stages of emotion recognition, deaf participants exhibited more focal viewing of happy faces than negative faces. This pattern was not observed among hearing individuals. The analysis of visual gaze dynamics, switching between ambient and focal attention, was useful in studying the depth of cognitive processing of emotional information among deaf and hearing individuals.


2021 ◽  
Vol 12 ◽  
Author(s):  
Shu Zhang ◽  
Xinge Liu ◽  
Xuan Yang ◽  
Yezhi Shu ◽  
Niqi Liu ◽  
...  

Cartoon faces are widely used in social media, animation production, and social robots because of their appealing ability to convey emotional information. Despite these popular applications, the mechanisms of recognizing emotional expressions in cartoon faces remain unclear. Three experiments were therefore conducted to systematically explore the recognition process for emotional cartoon expressions (happy, sad, and neutral) and to examine the influence of key facial features (mouth, eyes, and eyebrows) on emotion recognition. Across the experiments, three presentation conditions were employed: (1) the full face; (2) a single feature only (with the two other features concealed); and (3) one feature concealed with the two other features presented. The cartoon face images used in this study were converted from a set of real faces acted by Chinese posers, and the observers were Chinese. The results showed that happy cartoon expressions were recognized more accurately than neutral and sad expressions, consistent with the happiness recognition advantage reported in studies of real faces. Compared with real facial expressions, sad cartoon expressions were perceived as sadder and happy cartoon expressions as less happy, regardless of whether the full face or single features were viewed. For cartoon faces, the mouth proved to be a sufficient and necessary feature for recognizing happiness, and the eyebrows sufficient and necessary for recognizing sadness. This study helps to clarify the perceptual mechanism underlying emotion recognition in cartoon faces and sheds light on directions for future research on intelligent human-computer interaction.


2019 ◽  
Author(s):  
Jacob Israelashvili

Previous research has found that individuals vary greatly in emotion differentiation, that is, the extent to which they distinguish between different emotions when reporting on their own feelings. Building on work showing that emotion differentiation is associated with individual differences in intrapersonal functioning, the current study asks whether emotion differentiation is also related to interpersonal skills. Specifically, we examined whether individuals who are high in emotion differentiation are more accurate in recognizing others’ emotional expressions. We report two studies in which we used an established paradigm tapping negative emotion differentiation and several emotion recognition tasks. In Study 1 (N = 363), we found that individuals high in emotion differentiation were more accurate in recognizing others’ emotional facial expressions. Study 2 (N = 217) replicated this finding using emotion recognition tasks with varying amounts of emotional information. These findings suggest that the knowledge we use to understand our own emotional experience also helps us understand the emotions of others.
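In paradigms like the one described, differentiation is scored from repeated self-reports: each situation is rated on several negative emotions, and low correlations among those ratings indicate that the emotions are used distinctly. The standard index is an intraclass correlation; the simpler average inter-item correlation below is an illustrative stand-in on simulated ratings, not the study's actual scoring:

```python
import numpy as np

def differentiation(ratings):
    """Score emotion differentiation from a situations-by-emotions
    rating matrix: higher = emotions rated more independently."""
    r = np.corrcoef(ratings, rowvar=False)      # emotion-by-emotion correlations
    off_diag = r[~np.eye(len(r), dtype=bool)]   # drop the diagonal of 1s
    return 1.0 - off_diag.mean()                # 0 = emotions move in lockstep

rng = np.random.default_rng(1)
# 20 situations rated on 3 negative emotions (e.g., anger, fear, sadness)
undifferentiated = np.tile(rng.random((20, 1)), (1, 3)) + 0.01 * rng.random((20, 3))
differentiated = rng.random((20, 3))

low = differentiation(undifferentiated)   # near 0: one undifferentiated "bad" feeling
high = differentiation(differentiated)    # higher: emotions reported independently
```

A participant's score on such an index is then correlated with accuracy on the emotion recognition tasks.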


2014 ◽  
Vol 2014 ◽  
pp. 1-11 ◽  
Author(s):  
Dina Tell ◽  
Denise Davidson ◽  
Linda A. Camras

The effects of eye gaze direction and expression intensity on emotion recognition were investigated in children with autism disorder and typically developing children. The two groups identified happy and angry expressions equally well, but children with autism disorder were less accurate in identifying fear expressions across intensities and eye gaze directions. Children with autism disorder also rated expressions with direct gaze, and expressions at 50% intensity, as more intense than typically developing children did. A trend was found for sad expressions as well: children with autism disorder were less accurate than typically developing children in recognizing sadness at 100% intensity with direct gaze. Although the present research showed that children with autism disorder are sensitive to eye gaze direction, impairments exist in the recognition of fear, and possibly sadness. Furthermore, children with autism disorder and typically developing children perceive the intensity of emotional expressions differently.


2021 ◽  
Vol 2021 ◽  
pp. 1-9 ◽  
Author(s):  
Chunli Chen ◽  
Huan Yang ◽  
Yasong Du ◽  
Guangzhi Zhai ◽  
Hesheng Xiong ◽  
...  

Attention deficit hyperactivity disorder (ADHD) is one of the most common neurodevelopmental disorders of childhood. Despite extensive research, the neurobiological mechanism underlying ADHD remains unclear. Because deficits in functions such as attention have been demonstrated in ADHD, the present study first collected electroencephalogram (EEG) recordings from both healthy controls (HCs) and children with ADHD during an oddball P3 task; we then examined not only the event-related potentials (ERPs) evoked during the task but also the related brain networks. Although no significant behavioral difference was found between the HCs and the children with ADHD, significant electrophysiological differences were found in both ERPs and brain networks. Specifically, dysfunctional attention occurred during the early stage of the task: compared with HCs, children with ADHD showed reduced P2 and N2 amplitudes, and atypical information interaction might further underpin this deficit. When cortical activity was investigated, HCs recruited much stronger brain activity, mainly in the temporal and frontal regions, than children with ADHD; in turn, the brain network of children with ADHD showed atypically enhanced long-range connectivity between the frontal and occipital lobes but attenuated connectivity among the frontal, parietal, and temporal lobes. We hope these findings will be instructive for understanding cognitive processing in children with ADHD.
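The network comparison described here (enhanced fronto-occipital but attenuated fronto-parieto-temporal connectivity) presupposes a channel-by-channel connectivity matrix built from the EEG. A hedged sketch of that construction step, where Pearson correlation stands in for whatever coupling metric the authors actually used, and channel count, threshold, and data are simulated:

```python
import numpy as np

rng = np.random.default_rng(0)
n_channels, n_samples = 8, 1000
source = rng.standard_normal(n_samples)
eeg = rng.standard_normal((n_channels, n_samples))
eeg[:4] += 0.8 * source   # first four channels share a common signal

# connectivity matrix: |r| between every pair of channels
conn = np.abs(np.corrcoef(eeg))
np.fill_diagonal(conn, 0.0)

# binarize into an adjacency matrix, keeping only stronger links
adjacency = conn > 0.2
degree = adjacency.sum(axis=1)   # number of connections per channel
```

Group differences (e.g., ADHD vs. HC) are then tested on edge strengths or graph measures such as this degree, grouped by lobe.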


2018 ◽  
Author(s):  
Damien Dupré ◽  
Nicole Andelic ◽  
Anna Zajac ◽  
Gawain Morrison ◽  
Gary John McKeown

Sharing personal information is an important way of communicating on social media. Among the information that can be shared, new sensors and tools allow people to share emotion information via facial emotion recognition. This paper asks whether people are prepared to share personal information such as their own emotions on social media. In the current study we examined how factors such as felt emotion, motivation for sharing on social media, and personality affected participants’ willingness to share self-reported emotion or facial expressions online. A generalized linear mixed model (GLMM) analysis found that participants’ willingness to share self-reported emotion and facial expressions was influenced by their personality traits and by the motivation for sharing their emotion information that they were given. From these results we conclude that the estimated level of privacy for certain emotional information, such as facial expression, is influenced by the motivation for sharing that information online.


2021 ◽  
Vol 12 (1) ◽  
Author(s):  
Hanna Drimalla ◽  
Irina Baskow ◽  
Behnoush Behnia ◽  
Stefan Roepke ◽  
Isabel Dziobek

Abstract Background Imitation of facial expressions plays an important role in social functioning. However, little is known about the quality of facial imitation in individuals with autism and its relationship with defining difficulties in emotion recognition. Methods We investigated imitation and recognition of facial expressions in 37 individuals with autism spectrum conditions and 43 neurotypical controls. Using a novel computer-based face analysis, we measured instructed imitation of facial emotional expressions and related it to emotion recognition abilities. Results Individuals with autism imitated facial expressions if instructed to do so, but their imitation was both slower and less precise than that of neurotypical individuals. In both groups, a more precise imitation scaled positively with participants’ accuracy of emotion recognition. Limitations Given the study’s focus on adults with autism without intellectual impairment, it is unclear whether the results generalize to children with autism or individuals with intellectual disability. Further, the new automated facial analysis, despite being less intrusive than electromyography, might be less sensitive. Conclusions Group differences in emotion recognition, imitation and their interrelationships highlight potential for treatment of social interaction problems in individuals with autism.


2021 ◽  
Author(s):  
Evrim Gulbetekin

Abstract This investigation used three experiments to test the effects of mask use and the other-race effect (ORE) on face perception in three contexts: (a) face recognition, (b) recognition of facial expressions, and (c) social distance. The first experiment, which involved a matching-to-sample paradigm, tested Caucasian subjects with either masked or unmasked Caucasian and Asian faces. Participants performed best when recognizing an unmasked face and worst when asked to recognize a masked face that they had seen earlier without a mask. Accuracy was also poorer for Asian faces than for Caucasian faces. The second experiment presented Asian or Caucasian faces with different emotional expressions, with and without masks. In this task, which required identifying which emotional expression had appeared on the presented face, recognition performance decreased for masked faces. The emotional expressions ranged from most to least accurately recognized as follows: happy, neutral, disgusted, and fearful. Emotion recognition was also poorer for Asian than for Caucasian stimuli. Experiment 3 used the same participants and stimuli and asked participants to indicate the social distance they would prefer to keep from each pictured person. Participants preferred a wider social distance with unmasked faces than with masked faces. Social distance also varied by the portrayed emotion, from farthest to closest: disgusted, fearful, neutral, and happy. Race was also a factor; participants preferred a wider social distance for Asian than for Caucasian faces. Altogether, our findings indicate that during the COVID-19 pandemic, face perception and social distance were affected by both mask use and the ORE.


F1000Research ◽  
2017 ◽  
Vol 6 ◽  
pp. 853 ◽  
Author(s):  
Madoka Yamazaki ◽  
Kyoko Tamura

Background: Several studies have investigated the relationship between behavioral changes and the menstrual cycle in women of reproductive age. The present study investigated the relationship between the menstrual cycle and emotional face recognition by measuring the N170 component of the ERP. Methods: We measured the N170 of twelve women, in both the follicular and late luteal phases, who were presented with happy and angry human facial expressions as stimuli. Results: In the follicular phase, participants showed a significantly larger response to happy male facial expressions. In the late luteal phase, participants had longer reaction times to all emotional stimuli and a significantly reduced response to happy faces, especially happy male facial expressions (P < 0.001). Conclusions: Our findings suggest that the menstrual cycle modulates early visual cognitive processing and highlight the importance of considering menstrual cycle phase in studies of emotion and cognition.

