Imitation and recognition of facial emotions in autism: a computer vision approach

2021 ◽  
Vol 12 (1) ◽  
Author(s):  
Hanna Drimalla ◽  
Irina Baskow ◽  
Behnoush Behnia ◽  
Stefan Roepke ◽  
Isabel Dziobek

Abstract
Background: Imitation of facial expressions plays an important role in social functioning. However, little is known about the quality of facial imitation in individuals with autism and its relationship with defining difficulties in emotion recognition.
Methods: We investigated imitation and recognition of facial expressions in 37 individuals with autism spectrum conditions and 43 neurotypical controls. Using a novel computer-based face analysis, we measured instructed imitation of facial emotional expressions and related it to emotion recognition abilities.
Results: Individuals with autism imitated facial expressions when instructed to do so, but their imitation was both slower and less precise than that of neurotypical individuals. In both groups, more precise imitation scaled positively with participants’ accuracy of emotion recognition.
Limitations: Given the study’s focus on adults with autism without intellectual impairment, it is unclear whether the results generalize to children with autism or individuals with intellectual disability. Further, the new automated facial analysis, despite being less intrusive than electromyography, might be less sensitive.
Conclusions: Group differences in emotion recognition, imitation and their interrelationships highlight potential for treatment of social interaction problems in individuals with autism.

2021 ◽  
Author(s):  
Evrim Gulbetekin

Abstract This investigation used three experiments to test the effects of mask use and the other-race effect (ORE) on face perception in three contexts: (a) face recognition, (b) recognition of facial expressions, and (c) social distance. The first experiment, which involved a matching-to-sample paradigm, tested Caucasian subjects with masked and unmasked Caucasian and Asian faces. Participants performed best when recognizing an unmasked face and poorest when asked to recognize a masked face that they had seen earlier without a mask. Accuracy was also poorer for Asian faces than for Caucasian faces. The second experiment presented Asian or Caucasian faces having different emotional expressions, with and without masks. In this task, which involved identifying the emotional expression shown on the presented face, emotion recognition performance decreased for faces portrayed with masks. The emotional expressions ranked from most to least accurately recognized as follows: happy, neutral, disgusted, and fearful. Emotion recognition performance was also poorer for Asian stimuli than for Caucasian stimuli. Experiment 3 used the same participants and stimuli and asked participants to indicate the social distance they would prefer to keep from each pictured person. Participants preferred a wider social distance with unmasked faces than with masked faces. Preferred social distance also varied with the portrayed emotion, from farther to closer: disgusted, fearful, neutral, and happy. Race was also a factor; participants preferred a wider social distance for Asian than for Caucasian faces. Altogether, our findings indicate that during the COVID-19 pandemic, face perception and social distance were affected by both mask use and the ORE.


2018 ◽  
Author(s):  
Helio Clemente Cuve ◽  
Yu Gao ◽  
Akiko Fuse

A systematic review was conducted of studies exploring the link between gaze patterns, autonomic arousal and emotion recognition deficits (ERD) in young adults with Autism Spectrum Conditions (ASC), in the context of the eye-avoidance/hyperarousal and the orientation/hypoarousal hypotheses. These hypotheses suggest that ERD in ASC can be explained either by exacerbated physiological arousal to eye contact interfering with emotion recognition, or by blunted arousal failing to engage the attention and awareness mechanisms necessary to process emotionally salient cues. Most studies have suggested that individuals with ASC display an overall reduced attention to the eyes; however, this was not always associated with ERD, and some studies reported ERD with no evidence of atypical gaze patterns. The evidence from psychophysiological studies is also mixed: while some studies support the view that individuals with ASC are hypoaroused during emotion processing, others reported hyperarousal or even partially supported both hypotheses. Overall, these results suggest that the current autonomic arousal and gaze hypotheses cannot fully account for ERD in ASC. A new integrative model is proposed, suggesting a two-pathway mechanism in which avoidance and orientation processes might independently lead to ERD in ASC. Current methodological limitations, the influence of alexithymia, and implications are discussed.


2007 ◽  
Vol 18 (1) ◽  
pp. 31-36 ◽  
Author(s):  
Roy P. C. Kessels ◽  
Lotte Gerritsen ◽  
Barbara Montagne ◽  
Nibal Ackl ◽  
Janine Diehl ◽  
...  

Behavioural problems are a key feature of frontotemporal lobar degeneration (FTLD). FTLD patients also show impairments in emotion processing; specifically, the perception of negative emotional facial expressions is affected. Generally, however, negative emotional expressions are regarded as more difficult to recognize than positive ones, which may have been a confounding factor in previous studies. In addition, ceiling effects are often present on emotion recognition tasks that use full-blown emotional facial expressions. In the present study, we examined FTLD patients’ perception of sadness, anger, fear, happiness, surprise and disgust at different emotional intensities on morphed facial expressions, to take task difficulty into account. Results showed that the FTLD patients were specifically impaired in recognizing anger. The patients also performed worse than controls on the recognition of surprise, but performed at control levels on disgust, happiness, sadness and fear. These findings corroborate and extend previous results showing deficits in emotion perception in FTLD.


2019 ◽  
Vol 67 ◽  
pp. 101421 ◽  
Author(s):  
Sergio Sánchez-Reales ◽  
Carmen Caballero-Peláez ◽  
Javier Prado-Abril ◽  
Félix Inchausti ◽  
María Lado-Codesido ◽  
...  

2021 ◽  
pp. 003329412110184
Author(s):  
Paola Surcinelli ◽  
Federica Andrei ◽  
Ornella Montebarocci ◽  
Silvana Grandi

Aim of the research: The literature on emotion recognition from facial expressions shows significant differences in recognition ability depending on the stimulus presented. Indeed, affective information is not distributed uniformly across the face, and recent studies have shown the importance of the mouth and eye regions for correct recognition. However, previous studies mainly used facial expressions presented frontally, and studies that used facial expressions in profile view employed a between-subjects design or children’s faces as stimuli. The present research aims to investigate differences in emotion recognition between faces presented in frontal and in profile views using a within-subjects experimental design.
Method: The sample comprised 132 Italian university students (88 female; mean age = 24.27 years, SD = 5.89). Face stimuli displayed both frontally and in profile were selected from the KDEF set. Two emotion-specific recognition accuracy scores, one for frontal and one for profile presentation, were computed from the average of correct responses for each emotional expression. In addition, viewing times and response times (RTs) were recorded.
Results: Frontally presented facial expressions of fear, anger, and sadness were recognized significantly better than the same emotions shown in profile, while no differences were found in the recognition of the other emotions. Longer viewing times were also found when faces expressing fear and anger were presented in profile. In the present study, an impairment in recognition accuracy was thus observed only for those emotions that rely mostly on the eye regions.
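The scoring described, an average of correct responses per emotion and per view, is straightforward to compute. A minimal sketch, with a hypothetical trial table standing in for the study's actual response data:

```python
from collections import defaultdict

# Hypothetical trial records: (emotion, view, correct) tuples; the real
# study scored responses to KDEF faces shown frontally and in profile.
trials = [
    ("fear", "frontal", True), ("fear", "frontal", True),
    ("fear", "profile", False), ("fear", "profile", True),
    ("anger", "frontal", True), ("anger", "profile", False),
]

def accuracy_by_emotion_and_view(trials):
    """Mean proportion of correct responses per (emotion, view) cell."""
    counts = defaultdict(lambda: [0, 0])  # (emotion, view) -> [n_correct, n_total]
    for emotion, view, correct in trials:
        counts[(emotion, view)][0] += int(correct)
        counts[(emotion, view)][1] += 1
    return {key: n_correct / n_total
            for key, (n_correct, n_total) in counts.items()}

scores = accuracy_by_emotion_and_view(trials)
```

With the toy data above, fear scores 1.0 frontally and 0.5 in profile, mirroring the kind of frontal-over-profile advantage the study reports for eye-region emotions.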


2014 ◽  
Vol 2014 ◽  
pp. 1-11 ◽  
Author(s):  
Dina Tell ◽  
Denise Davidson ◽  
Linda A. Camras

Eye gaze direction and expression intensity effects on emotion recognition in children with autism disorder and typically developing children were investigated. Children with autism disorder and typically developing children identified happy and angry expressions equally well. Children with autism disorder, however, were less accurate in identifying fear expressions across intensities and eye gaze directions. Children with autism disorder rated expressions with direct eyes, and expressions at 50% intensity, as more intense than typically developing children did. A trend was also found for sad expressions, as children with autism disorder were less accurate than typically developing children in recognizing sadness at 100% intensity with direct eyes. Although the present research showed that children with autism disorder are sensitive to eye gaze direction, impairments in the recognition of fear, and possibly sadness, exist. Furthermore, children with autism disorder and typically developing children perceive the intensity of emotional expressions differently.


2021 ◽  
Vol 12 ◽  
Author(s):  
Paula J. Webster ◽  
Shuo Wang ◽  
Xin Li

Different styles of social interaction are one of the core characteristics of autism spectrum disorder (ASD). Social differences among individuals with ASD often include difficulty in discerning the emotions of neurotypical people from their facial expressions. This review first covers the rich body of literature studying differences in facial emotion recognition (FER) in those with ASD, including behavioral studies and neurological findings. In particular, we highlight subtle emotion recognition and various factors related to inconsistent findings in behavioral studies of FER in ASD. Then, we discuss the dual problem of FER, namely facial emotion expression (FEE), the production of facial expressions of emotion. Although FEE is less studied, social interaction involves both the ability to recognize emotions and the ability to produce appropriate facial expressions, and how others perceive facial expressions of emotion in those with ASD has remained an under-researched area. Finally, we propose a method for teaching FER [the FER teaching hierarchy (FERTH)] based on recent research investigating FER in ASD, considering the use of posed vs. genuine emotions and static vs. dynamic stimuli. We also propose two possible teaching approaches: (1) a standard method of teaching progressively from simple drawings and cartoon characters to more complex audio-visual video clips of genuine human expressions of emotion with context clues, or (2) teaching with a field of images that includes posed and genuine emotions to improve generalizability before progressing to more complex audio-visual stimuli. Lastly, we advocate for autism interventionists to use FER stimuli developed primarily for research purposes, to facilitate the incorporation of well-controlled stimuli in teaching FER and to bridge the gap between intervention and research in this area.


2021 ◽  
Vol 5 (10) ◽  
pp. 57
Author(s):  
Vinícius Silva ◽  
Filomena Soares ◽  
João Sena Esteves ◽  
Cristina P. Santos ◽  
Ana Paula Pereira

Facial expressions are of utmost importance in social interactions, providing communicative prompts for speaking turns and feedback. Nevertheless, not everyone has the ability to express themselves socially and emotionally through verbal and non-verbal communication. In particular, individuals with Autism Spectrum Disorder (ASD) are characterized by impairments in social communication, repetitive patterns of behaviour, and restricted activities or interests. In the literature, the use of robotic tools is reported to promote social interaction with children with ASD. The main goal of this work is to develop a system capable of automatically detecting emotions from facial expressions and interfacing with a robotic platform (the Zeno R50 Robokind® robotic platform, named ZECA) in order to allow social interaction with children with ASD. ZECA was used as a mediator in social communication activities. The experimental setup and methodology for a real-time facial expression (happiness, sadness, anger, surprise, fear, and neutral) recognition system were based on the Intel® RealSense™ 3D sensor, facial feature extraction, and a multiclass Support Vector Machine classifier. The results obtained allow us to infer that the proposed system is adequate for support sessions with children with ASD, giving a strong indication that it may be used to foster emotion recognition and imitation skills.
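The recognition pipeline described, facial features fed to a multiclass Support Vector Machine, can be sketched in a few lines. This is only an illustrative skeleton: the random vectors below are stand-ins for the geometric facial features the actual system extracts from the RealSense sensor, and the class separation is artificial.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
emotions = ["happiness", "sadness", "anger", "surprise", "fear", "neutral"]

# Synthetic stand-ins for facial feature vectors (e.g. normalized landmark
# distances): 40 well-separated 10-dimensional samples per emotion class.
X = np.vstack([rng.normal(loc=i, scale=0.3, size=(40, 10)) for i in range(6)])
y = np.repeat(emotions, 40)

# Standardize features, then fit a multiclass SVM (RBF kernel).
clf = make_pipeline(StandardScaler(), SVC(kernel="rbf"))
clf.fit(X, y)

# Classify one new feature vector drawn near the "surprise" cluster.
pred = clf.predict(rng.normal(loc=3, scale=0.3, size=(1, 10)))
```

In a real-time system the `predict` call would run once per frame on features extracted from the live sensor stream, and the predicted label would drive ZECA's response.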


2021 ◽  
Vol 12 ◽  
Author(s):  
Xiaoxiao Li

In the natural environment, facial and bodily expressions influence each other. Previous research has shown that bodily expressions significantly influence the perception of facial expressions. However, little is known about the cognitive processing of facial and bodily emotional expressions and its temporal characteristics. Therefore, this study presented facial and bodily expressions, both separately and together, to examine the electrophysiological mechanism of emotion recognition using event-related potentials (ERPs). Participants assessed the emotions of facial and bodily expressions that varied by valence (positive/negative) and consistency (matching/non-matching emotions). The results showed that bodily expressions induced a more positive P1 component with a shortened latency, whereas facial expressions triggered a more negative N170 with a prolonged latency. Of the N2 and P3 components, N2 was more sensitive to inconsistent emotional information and P3 to consistent emotional information. The cognitive processing of facial and bodily expressions had distinctive integrating features, with the interaction occurring at an early stage (N170). The results of the study highlight the importance of facial and bodily expressions in the cognitive processing of emotion recognition.

