Dissociation of Early and Late Face-Related Processes in Autism Spectrum Disorder and Williams Syndrome

Author(s):  
Alice Gomez ◽  
Guillaume Lio ◽  
Manuela Costa ◽  
Angela Sirigu ◽  
Caroline Demily

Abstract Background: Williams syndrome (WS) and Autism Spectrum Disorder (ASD) are psychiatric conditions associated with atypical but opposite face-to-face interaction patterns: WS patients stare excessively at others, whereas ASD individuals avoid eye contact. Whether these behaviors result from dissociable visual processes within the occipito-temporal pathways is unknown. Using high-density electroencephalography, multivariate signal-processing algorithms, and a protocol designed to identify and extract evoked activities sensitive to facial cues, we investigated how WS (N=14), ASD (N=14), and neurotypical subjects (N=14) decode the information content of a face stimulus. Results: We found two neural components in neurotypical participants, both strongest when the eye region was projected onto the subject's fovea, simulating a direct eye contact situation, and weakest over more distant regions, reaching a minimum when the fixated region was outside the stimulus face. The first component peaks at 170 ms, an early signal known to be implicated in low-level face features. The second is identified later, 260 ms post-stimulus onset, and is implicated in decoding salient facial social cues. Remarkably, the two components were differentially impaired and preserved in WS and ASD. In WS, we could only weakly decode the 170 ms signal with our regressor for facial features, probably owing to these patients' relatively poor ability to process face morphology, while the late 260 ms component was highly significant. The reverse pattern was observed in ASD participants, who showed neurotypical-like early 170 ms evoked activity but an impaired late 260 ms evoked signal. Conclusions: Our study reveals a dissociation between WS and ASD patients and points to different neural origins for their social impairments.

2021 ◽  
Author(s):  
Alice Gomez ◽  
Guillaume Lio ◽  
Manuela Costa ◽  
Angela Sirigu ◽  
Caroline Demily

Abstract Williams syndrome (WS) and Autism Spectrum Disorder (ASD) are psychiatric conditions associated with atypical but opposite face-to-face interaction patterns: WS patients stare excessively at others, whereas ASD individuals avoid eye contact. Whether these behaviors result from dissociable visual processes within the occipito-temporal pathways is unknown. Using high-density electroencephalography, multivariate pattern classification, and group blind source separation, we searched for face-related neural signals that could best discriminate WS (N = 14), ASD (N = 14), and neurotypical (N = 14) populations. We found two peaks in neurotypical participants: the first at 170 ms, an early signal known to be implicated in low-level face features; the second at 260 ms, a late component implicated in decoding salient facial social cues. The late 260 ms signal varied as a function of the distance of the eyes in the face stimulus from the viewer's fovea: it was strongest when the eyes were projected onto the fovea and weakest when they fell in the retinal periphery. Remarkably, the two components were differentially impaired and preserved in WS and ASD. In WS, we could only weakly decode the 170 ms signal, probably owing to these patients' relatively poor ability to process face morphology, while the late, eye-sensitive 260 ms component was highly significant. The reverse pattern was observed in ASD participants, who showed neurotypical-like early 170 ms evoked activity but an impaired late 260 ms evoked signal. Our study reveals a dissociation between WS and ASD patients and points to different neural origins for their social impairments.
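The abstract's "multivariate pattern classification" can be illustrated in spirit with a toy nearest-class-mean decoder over per-trial evoked amplitudes. All numbers, names, and the decoder itself are invented for illustration; the study's actual pipeline (high-density EEG, group blind source separation) is far more involved.

```python
# Toy sketch: nearest-class-mean decoding of trials described by two
# features (amplitude at ~170 ms, amplitude at ~260 ms). Hypothetical data.

def class_means(trials, labels):
    """Mean feature vector per label."""
    means = {}
    for lab in set(labels):
        rows = [t for t, l in zip(trials, labels) if l == lab]
        means[lab] = [sum(col) / len(rows) for col in zip(*rows)]
    return means

def decode(trial, means):
    """Assign the trial to the nearest class mean (squared Euclidean distance)."""
    return min(means, key=lambda lab: sum((a - b) ** 2
                                          for a, b in zip(trial, means[lab])))

trials = [(1.0, 0.2), (1.1, 0.1), (0.2, 1.0), (0.1, 1.2)]
labels = ["early", "early", "late", "late"]
m = class_means(trials, labels)
print(decode((1.05, 0.15), m))   # "early"
```

Real EEG decoding would operate on hundreds of sensors and time points with cross-validated classifiers, but the assign-to-nearest-template logic shown here is the conceptual core.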


2009 ◽  
Vol 39 (11) ◽  
pp. 1598-1602 ◽  
Author(s):  
Atsushi Senju ◽  
Yukiko Kikuchi ◽  
Hironori Akechi ◽  
Toshikazu Hasegawa ◽  
Yoshikuni Tojo ◽  
...  

2014 ◽  
Vol 61 (1) ◽  
pp. 49-55 ◽  
Author(s):  
Ana Osório ◽  
Adriana Sampaio ◽  
Rocío Martínez Regueiro ◽  
Elena Garayzábal Heinze ◽  
Ángel Carracedo ◽  
...  

2021 ◽  
Author(s):  
Kinga Farkas ◽  
Orsolya Pesthy ◽  
Anna Guttengeber ◽  
Anna Szonja Weigl ◽  
Andras Veres ◽  
...  

Interpersonal distance regulation is an essential element of social communication. Its impairment in autism spectrum disorder (ASD) is widely acknowledged among practitioners, but only a handful of studies have reported empirical research in real-life settings, and those focused only on children. Moreover, these studies did not measure the alterations of vegetative functions related to interpersonal distance. Here, we introduced a new experimental design to systematically measure interpersonal distance along with heart rate variability (HRV) in adults with ASD, and we tested the modulatory effects of intentionality, eye contact, moving activity, and attribution. Twenty-two adults diagnosed with ASD and 21 matched neurotypical controls participated in our study from October 2019 to February 2020. Our design combined a modified version of the stop-distance paradigm with HRV measurement, controlling for eye contact between the experimenter and the participant, to measure interpersonal distance in incidental and intentional conditions. Our results showed a greater preferred distance in ASD in the intentional but not in the incidental condition. These results were modulated by eye contact and by the participant's role (active vs. passive) in the stop-distance task. Moreover, we found lower baseline HRV and reduced HRV reactivity in ASD; however, these vegetative measurements could not predict preferred interpersonal distance. Our study highlights the importance of interpersonal space regulation in ASD and the need for sophisticated experimental designs to grasp the complexity and underlying factors of distance regulation in typical and atypical populations.
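The abstract does not say which HRV metric was used; as a minimal sketch, one common time-domain metric is RMSSD, the root mean square of successive differences between inter-beat (RR) intervals. The interval values below are invented for illustration.

```python
# Minimal sketch of computing RMSSD from RR intervals in milliseconds.
# A "baseline HRV" measure like the one contrasted between groups in the
# abstract could be derived from a resting-state recording this way.
import math

def rmssd(rr_ms):
    """Root mean square of successive RR-interval differences (ms)."""
    diffs = [b - a for a, b in zip(rr_ms, rr_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

baseline = [812, 830, 805, 845, 820, 838]   # hypothetical resting RR series
print(round(rmssd(baseline), 1))
```

In practice, RR series come from ECG or photoplethysmography after artifact correction, and "HRV reactivity" would compare such a metric between baseline and task segments.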


2020 ◽  
Author(s):  
Eunji Chong ◽  
Elysha Clark-Whitney ◽  
Audrey Southerland ◽  
Elizabeth Stubbs ◽  
Chanel Miller ◽  
...  

Eye contact is among the most basic means of social communication that humans use from the first months of life. Quantifying eye contact is valuable in many scenarios as part of the analysis of social roles, communication skills, and medical screening. Estimating a subject's looking direction from video is a challenging task, but eye contact can be captured effectively by a wearable point-of-view camera, which provides a unique viewpoint as a result of its configuration. While moments of eye contact from this viewpoint can be hand-coded, such a process tends to be laborious and subjective. In this work, we developed the first deep neural network model to automatically detect eye contact in egocentric video with accuracy equivalent to that of human experts. We trained a deep convolutional neural network on a dataset of 4,339,879 annotated images from 103 subjects with diverse demographic backgrounds, 57 of whom have a diagnosis of Autism Spectrum Disorder. The network achieves an overall precision of 0.936 and recall of 0.943 on 18 set-aside validation subjects, on par with 10 trained human coders with a mean precision of 0.918 and recall of 0.946. This result passes class equivalence tests on Cohen's kappa scores (equivalence boundary of 0.025, p < .005), demonstrating that a deep learning model can produce automated coding with a level of reliability comparable to human coders. The presented method will be instrumental in analyzing gaze behavior in naturalistic social settings by serving as a scalable, objective, and accessible tool for clinicians and researchers.
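The evaluation metrics named in the abstract (precision, recall, Cohen's kappa) can be sketched on toy frame labels. The functions, variable names, and data below are illustrative, not taken from the paper's codebase.

```python
# Toy sketch: scoring frame-level eye-contact predictions (1 = eye contact)
# against a human coder's annotations, with the three metrics from the abstract.

def precision_recall(pred, truth):
    """Precision and recall for binary frame labels."""
    tp = sum(1 for p, t in zip(pred, truth) if p == 1 and t == 1)
    fp = sum(1 for p, t in zip(pred, truth) if p == 1 and t == 0)
    fn = sum(1 for p, t in zip(pred, truth) if p == 0 and t == 1)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    return precision, recall

def cohens_kappa(a, b):
    """Chance-corrected agreement between two binary raters."""
    n = len(a)
    po = sum(1 for x, y in zip(a, b) if x == y) / n   # observed agreement
    pa1, pb1 = sum(a) / n, sum(b) / n                 # marginal positive rates
    pe = pa1 * pb1 + (1 - pa1) * (1 - pb1)            # chance agreement
    return (po - pe) / (1 - pe)

model = [1, 1, 0, 0, 1, 0, 1, 0]
coder = [1, 0, 0, 0, 1, 0, 1, 1]
p, r = precision_recall(model, coder)
print(p, r)                          # 0.75 0.75
print(cohens_kappa(model, coder))    # 0.5
```

The paper's equivalence test then asks whether the model-vs-coder kappa falls within a small boundary (0.025) of the coder-vs-coder kappa, rather than merely whether the two differ.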


2021 ◽  
Vol 11 (12) ◽  
pp. 1555
Author(s):  
Gianpaolo Alvari ◽  
Luca Coviello ◽  
Cesare Furlanello

The high level of heterogeneity in Autism Spectrum Disorder (ASD) and the lack of systematic measurements complicate predicting the outcomes of early intervention and identifying better-tailored treatment programs. Computational phenotyping may assist therapists in monitoring child behavior through quantitative measures and in personalizing the intervention based on individual characteristics; still, real-world behavioral analysis remains an ongoing challenge. For this purpose, we designed EYE-C, a system based on OpenPose and Gaze360 for fine-grained analysis of eye-contact episodes in unconstrained therapist-child interactions via a single video camera. The model was validated on video data varying in resolution and setting, achieving promising performance. We further tested EYE-C on a clinical sample of 62 preschoolers with ASD for spectrum stratification based on eye-contact features and age. By unsupervised clustering, three distinct sub-groups were identified, differentiated by eye-contact dynamics and a specific clinical phenotype. Overall, this study highlights the potential of artificial intelligence in categorizing atypical behavior and providing translational solutions that might assist clinical practice.
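The stratification step can be sketched with a plain k-means over per-child eye-contact features; the abstract does not name the exact clustering algorithm, and the feature vectors below (eye-contact rate per minute, mean episode duration in seconds) are invented for illustration.

```python
# Rough sketch: unsupervised clustering of children into sub-groups from
# eye-contact features, using a minimal k-means with fixed initial centers
# so the toy run is deterministic. Hypothetical data and names throughout.

def kmeans(points, centers, iters=20):
    """Lloyd's algorithm: alternate nearest-center assignment and mean update."""
    for _ in range(iters):
        groups = [[] for _ in centers]
        for p in points:
            j = min(range(len(centers)),
                    key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centers[i])))
            groups[j].append(p)
        centers = [tuple(sum(c) / len(g) for c in zip(*g)) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers, groups

# (eye-contact rate/min, mean episode duration in s) per child - invented
children = [(2.0, 0.5), (2.2, 0.6), (8.0, 1.5), (7.8, 1.4), (15.0, 3.0), (14.5, 2.8)]
init = [children[0], children[2], children[4]]   # one seed per apparent cluster
centers, groups = kmeans(children, init)
print(sorted(len(g) for g in groups))   # [2, 2, 2]
```

A real analysis would standardize features, choose k with a validity criterion (e.g. silhouette), and then profile each sub-group clinically, as the paper's phenotype comparison does.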

