Social orienting of children with autism to facial expressions and speech: a study with a wearable eye-tracker in naturalistic settings

2013, Vol. 4
Author(s): Silvia Magrelli, Patrick Jermann, Basilio Noris, François Ansermet, François Hentsch, ...

2014, Vol. 2014, pp. 1-11
Author(s): Dina Tell, Denise Davidson, Linda A. Camras

The effects of eye gaze direction and expression intensity on emotion recognition were investigated in children with autism disorder and typically developing children. Children with autism disorder and typically developing children identified happy and angry expressions equally well. Children with autism disorder, however, were less accurate in identifying fear expressions across intensities and eye gaze directions. Children with autism disorder rated expressions with direct eye gaze, and expressions at 50% intensity, as more intense than typically developing children did. A trend was also found for sad expressions: children with autism disorder were less accurate than typically developing children in recognizing sadness at 100% intensity with direct eye gaze. Although the present research showed that children with autism disorder are sensitive to eye gaze direction, impairments in the recognition of fear, and possibly sadness, exist. Furthermore, children with autism disorder and typically developing children perceive the intensity of emotional expressions differently.


Author(s): Nanako KAJITA, Kozue SAWADA, Yukari HASHIMOTO, Masaharu MARUISHI, Hiroshi YOSHIDA

2021, pp. 1-21
Author(s): Michael Vesker, Daniela Bahn, Christina Kauschke, Gudrun Schwarzer

Social interactions often require the simultaneous processing of emotions from facial expressions and speech. However, the development of the gaze behavior used for emotion recognition, and the effects of speech perception on the visual encoding of facial expressions, are less well understood. We therefore conducted a word-primed face categorization experiment, in which participants from multiple age groups (six-year-olds, 12-year-olds, and adults) categorized target facial expressions as positive or negative after priming with valence-congruent or valence-incongruent auditory emotion words, or no words at all. We recorded participants' gaze behavior during this task using an eye-tracker, and analyzed the data with respect to the fixation time toward the eyes and mouth regions of faces, as well as the time until participants made their first fixation within those regions (time to first fixation, TTFF). We found that the six-year-olds showed significantly higher accuracy in categorizing congruently primed faces compared to the other conditions. The six-year-olds also showed faster response times, shorter total fixation durations, and faster TTFF measures in all primed trials, regardless of congruency, as compared to unprimed trials. We also found that while adults looked first, and longer, at the eyes than at the mouth regions of target faces, children did not exhibit this gaze behavior. Our results thus indicate that young children are more sensitive than adults or older children to auditory emotion-word primes during the perception of emotional faces, and that the distribution of gaze across the regions of the face changes significantly from childhood to adulthood.
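As a rough illustration of the eye-tracking measures named in this abstract (total fixation duration toward a region and time to first fixation, TTFF), the following minimal Python sketch computes both from a list of fixations. The fixation record fields and the rectangular eye/mouth areas of interest are illustrative assumptions, not the study's actual coordinates or analysis pipeline.

```python
# Minimal sketch: total fixation duration and time to first fixation (TTFF)
# per area of interest (AOI). Record format and AOI rectangles are assumed
# for illustration only.
from dataclasses import dataclass
from typing import Dict, List, Optional, Tuple


@dataclass
class Fixation:
    onset_ms: float      # fixation start, relative to target face onset
    duration_ms: float   # fixation duration
    x: float             # gaze position in screen pixels
    y: float


# Hypothetical rectangular AOIs: (x_min, y_min, x_max, y_max) in pixels.
AOIS: Dict[str, Tuple[float, float, float, float]] = {
    "eyes": (300, 200, 600, 300),
    "mouth": (350, 420, 550, 500),
}


def in_aoi(fix: Fixation, box: Tuple[float, float, float, float]) -> bool:
    x_min, y_min, x_max, y_max = box
    return x_min <= fix.x <= x_max and y_min <= fix.y <= y_max


def aoi_measures(fixations: List[Fixation]) -> Dict[str, Dict[str, Optional[float]]]:
    """Return total fixation duration (ms) and TTFF (ms) for each AOI."""
    results: Dict[str, Dict[str, Optional[float]]] = {}
    for name, box in AOIS.items():
        hits = [f for f in fixations if in_aoi(f, box)]
        results[name] = {
            "total_fixation_ms": sum(f.duration_ms for f in hits),
            "ttff_ms": min((f.onset_ms for f in hits), default=None),
        }
    return results
```

In practice the AOI boundaries would be defined per stimulus (e.g., from facial-landmark annotation) rather than fixed screen rectangles as in this sketch.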


2018, Vol. 8 (2), pp. 10
Author(s): Alev Girli, Sıla Doğmaz

In this study, children with learning disability (LD) were compared with children with autism spectrum disorder (ASD) in terms of identifying emotions from photographs with certain face and body expressions. The sample consisted of a total of 82 children aged 7-19 years living in Izmir, Turkey. A total of 6 separate sets of slides, consisting of black and white photographs, were used to assess participants' ability to identify feelings: 3 sets for facial expressions and 3 sets for body language. There were 20 photographs on the face slides and 38 photographs on the body-language slides. The results of the nonparametric Mann-Whitney U test showed no significant difference between the total scores that children received from each of the face and body-language slide sets. It was observed that the children with LD usually looked at the whole photo, while the children with ASD focused especially around the mouth to describe feelings. The results were discussed in the context of the literature, and suggestions were presented.
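For readers unfamiliar with the test mentioned above, a group comparison of this kind can be sketched with SciPy's mannwhitneyu; the score vectors below are made-up placeholders, not the study's data.

```python
# Illustrative Mann-Whitney U comparison of two independent groups' total
# scores (e.g., LD vs. ASD on one slide set). Numbers are invented placeholders.
from scipy.stats import mannwhitneyu

ld_scores = [14, 16, 15, 13, 17, 12, 16, 15]    # hypothetical totals, LD group
asd_scores = [13, 15, 14, 16, 12, 15, 14, 13]   # hypothetical totals, ASD group

# Two-sided test; a large p-value is consistent with "no significant difference".
u_stat, p_value = mannwhitneyu(ld_scores, asd_scores, alternative="two-sided")
print(f"U = {u_stat:.1f}, p = {p_value:.3f}")
```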


2019, Vol. 2019, pp. 1-16
Author(s): Leandro M. Almeida, Diego P. da Silva, Daieny P. Theodório, Wolley W. Silva, Silvia Cristina M. Rodrigues, ...

This paper presents a computer game developed to assist children with Autism Spectrum Disorder (ASD) in recognizing facial expressions associated with the four basic emotions: joy, sadness, anger, and surprise. The game, named ALTRIRAS, is a role-playing game (RPG), a genre identified in the literature as the most suitable for these children because it is more social than competitive. It has recreational settings built with a 2D graphical interface to hold the children's attention, as well as access control and a registration mechanism to allow monitoring of each child's progress. The collection of the functional, nonfunctional, psychological, and educational requirements, as well as the evaluation of the game's consistency and usability, was carried out by a multidisciplinary team of five experts in each of the following areas of expertise: pedagogy, psychology, psychopedagogy, and game development. The effectiveness test of the game was performed by 10 children with ASD and 28 children with neurotypical development, who were separated into control and experimental groups, respectively. All experts and children with neurotypical development answered the System Usability Scale (SUS) questionnaire after playing the game. The results regarding acceptance were positive among both experts and volunteers. However, children with ASD would need longer exposure to the game for it to provide effective assistance in the recognition of facial expressions.
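The SUS questionnaire mentioned above is scored with a standard formula: ten items rated on a 1-5 scale, where odd-numbered items contribute (rating - 1), even-numbered items contribute (5 - rating), and the summed contributions are multiplied by 2.5 to yield a 0-100 score. A minimal sketch follows; the example responses are invented, not data reported in the paper.

```python
# Standard SUS scoring; example ratings are hypothetical.
from typing import Sequence


def sus_score(ratings: Sequence[int]) -> float:
    """Compute the 0-100 SUS score from ten item ratings on a 1-5 scale."""
    if len(ratings) != 10 or not all(1 <= r <= 5 for r in ratings):
        raise ValueError("SUS needs ten ratings on a 1-5 scale")
    contributions = [
        (r - 1) if i % 2 == 0 else (5 - r)  # items 1, 3, 5, ... sit at even indices
        for i, r in enumerate(ratings)
    ]
    return sum(contributions) * 2.5


print(sus_score([4, 2, 5, 1, 4, 2, 5, 2, 4, 1]))  # prints 85.0
```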

