BACKGROUND
Abnormal gaze behavior is a prominent feature of autism spectrum disorder (ASD). Previous eye-tracking studies had participants view screen-based stimuli (e.g., pictures, videos, and webpages), and applying machine learning (ML) to these data showed promising results in identifying individuals with ASD. However, although gaze behavior in face-to-face interaction differs from that in image-viewing tasks, no study has investigated whether natural social gaze behavior could accurately identify ASD.
OBJECTIVE
The objective of this study was to examine whether, and which, area-of-interest (AOI)-based features extracted from natural social gaze behavior could identify ASD.
METHODS
Children with ASD and typically developing (TD) children were eye-tracked while engaged in a face-to-face conversation with an interviewer. Four ML classifiers (support vector machine, SVM; linear discriminant analysis, LDA; decision tree, DT; and random forest, RF) were used to determine the maximum classification accuracy and the corresponding features.
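The classification pipeline described above can be sketched as follows. This is a minimal illustration only: the study's actual data, features, and sample sizes are not given here, so the fixation-proportion features, group effect sizes, and sample size below are invented for demonstration, using scikit-learn implementations of the four named classifiers.

```python
# Hypothetical sketch of the four-classifier comparison; all data below are
# simulated for illustration and do NOT reproduce the study's dataset.
import numpy as np
from sklearn.svm import SVC
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.tree import DecisionTreeClassifier
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_per_group = 20  # assumed sample size, for illustration only

# Simulated AOI fixation-proportion features (eyes, mouth, rest-of-face).
# The ASD group is given a reduced mouth fixation proportion, loosely
# mirroring the direction of the reported result; values are arbitrary.
td = rng.normal(loc=[0.35, 0.30, 0.10], scale=0.05, size=(n_per_group, 3))
asd = rng.normal(loc=[0.33, 0.18, 0.10], scale=0.05, size=(n_per_group, 3))
X = np.vstack([td, asd])
y = np.array([0] * n_per_group + [1] * n_per_group)  # 0 = TD, 1 = ASD

classifiers = {
    "SVM": SVC(kernel="linear"),
    "LDA": LinearDiscriminantAnalysis(),
    "DT": DecisionTreeClassifier(random_state=0),
    "RF": RandomForestClassifier(random_state=0),
}

# Cross-validated accuracy for each classifier on the simulated features.
accuracies = {}
for name, clf in classifiers.items():
    scores = cross_val_score(clf, X, y, cv=5)
    accuracies[name] = scores.mean()
    print(f"{name}: {scores.mean():.2%}")
```

In practice, one would also inspect which features drive the classification (e.g., feature weights for LDA/SVM or feature importances for RF) to see whether a particular AOI, such as the mouth, carries most of the discriminative signal.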
RESULTS
A maximum classification accuracy of 84.62% was achieved with three classifiers (LDA, DT, and RF). Results showed that the mouth AOI, but not the eyes AOI, was a powerful feature for detecting ASD.
CONCLUSIONS
Natural gaze behavior could be leveraged to identify ASD, suggesting that ASD might be objectively screened with eye-tracking technology in everyday social interaction. In addition, the comparison between our findings and previous ones suggests that the eye-tracking features capable of identifying ASD might be culture dependent and context sensitive.