Real-time lexical comprehension in young children learning American Sign Language
When children interpret spoken language in real time, linguistic information drives rapid shifts in visual attention to objects in the visual world. This language-vision interaction can provide insights into children's developing efficiency in language comprehension. But how does language influence visual attention when the linguistic signal and the visual world are both processed via the visual channel? Here, we measured eye movements during real-time comprehension of a visual-manual language, American Sign Language (ASL), by 29 native ASL-learning children (16-53 mos, 16 deaf, 13 hearing) and 16 fluent deaf adult signers. All signers showed evidence of rapid, incremental language comprehension, tending to initiate an eye movement before sign offset. Deaf and hearing ASL-learners showed similar gaze patterns, suggesting that the in-the-moment dynamics of eye movements during ASL processing are shaped by the constraints of processing a visual language in real time and not by differential access to auditory information in day-to-day life. Finally, variation in children's ASL processing was positively correlated with age and vocabulary size. Thus, despite competition for attention within a single modality, the timing and accuracy of visual fixations during ASL comprehension reflect information processing skills that are fundamental for language acquisition regardless of language modality.