Using Eye Movements to Assess Language Comprehension in Toddlers Born Preterm and Full Term

2017 ◽  
Vol 180 ◽  
pp. 124-129 ◽  
Author(s):  
Elizabeth C. Loi ◽  
Virginia A. Marchman ◽  
Anne Fernald ◽  
Heidi M. Feldman


Author(s):  
Michael K. Tanenhaus

Recently, eye movements have become a widely used response measure for studying spoken language processing in both adults and children, in situations where participants comprehend and generate utterances about a circumscribed “Visual World” while fixation is monitored, typically using a free-view eye-tracker. Psycholinguists now use the Visual World eye-movement method to study both language production and language comprehension, in studies that run the gamut of current topics in language processing. Eye movements are a response measure of choice for addressing many classic questions about spoken language processing in psycholinguistics. This article reviews the burgeoning Visual World literature on language comprehension, highlighting some of the seminal studies and examining how the Visual World approach has contributed new insights to our understanding of spoken word recognition, parsing, reference resolution, and interactive conversation. It considers some of the methodological issues that come to the fore when psycholinguists use eye movements to examine spoken language comprehension.


1995 ◽  
Vol 24 (6) ◽  
pp. 409-436 ◽  
Author(s):  
Kathleen M. Eberhard ◽  
Michael J. Spivey-Knowlton ◽  
Julie C. Sedivy ◽  
Michael K. Tanenhaus

1989 ◽  
Vol 4 (3-4) ◽  
pp. SI21-SI49 ◽  
Author(s):  
Keith Rayner ◽  
Sara C. Sereno ◽  
Robin K. Morris ◽  
A. René Schmauder ◽  
Charles Clifton

2021 ◽  
pp. 095679762199114
Author(s):  
Jason C. Coronel ◽  
Olivia M. Bullock ◽  
Hillary C. Shulman ◽  
Matthew D. Sweitzer ◽  
Robert M. Bond ◽  
...  

More than 100 countries allow people to vote directly on policies in direct democracy elections (e.g., 2016 Brexit referendum). Politicians are often responsible for writing ballot language, and voters frequently encounter ballot measures that are difficult to understand. We examined whether eye movements from a small group of individuals can predict the consequences of ballot language on large-scale voting decisions. Across two preregistered studies (Study 1: N = 120 registered voters, Study 2: N = 120 registered voters), we monitored laboratory participants’ eye movements as they read real ballot measures. We found that eye-movement responses associated with difficulties in language comprehension predicted aggregate voting decisions to abstain from voting and vote against ballot measures in U.S. elections (total number of votes cast = 137,661,232). Eye movements predicted voting decisions beyond what was accounted for by widely used measures of language difficulty. This finding demonstrates a new way of linking eye movements to out-of-sample aggregate-level behaviors.


2018 ◽  
Author(s):  
Kyle Earl MacDonald ◽  
Virginia Marchman ◽  
Anne Fernald ◽  
Michael C. Frank

During grounded language comprehension, listeners must link the incoming linguistic signal to the visual world despite noise in the input. Information gathered through visual fixations can facilitate understanding. But do listeners flexibly seek supportive visual information? Here, we propose that even young children can adapt their gaze and actively gather information that supports their language understanding. We present two case studies of eye movements during real-time language processing where the value of fixating on a social partner varies across different contexts. First, compared to children learning spoken English (n=80), young American Sign Language (ASL) learners (n=30) delayed gaze shifts away from a language source and produced a higher proportion of language-consistent eye movements. This result suggests that ASL learners adapt to dividing attention between language and referents, which both compete for processing via the same channel: vision. Second, English-speaking preschoolers (n=39) and adults (n=31) delayed the timing of gaze shifts away from a speaker’s face while processing language in a noisy auditory environment. This delay resulted in a higher proportion of language-consistent gaze shifts. These results suggest that young listeners can adapt their gaze to seek supportive visual information from social partners during real-time language comprehension.


2021 ◽  
Author(s):  
Isabelle Dautriche ◽  
Louise Goupil ◽  
Kenny Smith ◽  
Hugh Rabagliati

We study the fundamental issue of whether children evaluate the reliability of their language interpretation, i.e., their confidence in understanding words. In two experiments, two-year-olds (n1 = 50; n2 = 60) saw two objects and heard one of them being named; both objects were then hidden behind screens and children were asked to look towards the named object, which was eventually revealed. When children knew the label used, they showed increased post-decision persistence after a correct compared to an incorrect anticipatory look, a marker of decision confidence in word comprehension (Experiment 1). When interacting with an unreliable speaker, children showed accurate word comprehension but reduced confidence in the accuracy of their own choice, indicating that children’s confidence estimates are influenced by social information (Experiment 2). Thus, by 2 years, children can estimate their confidence during language comprehension, long before they can reflect upon and talk about their linguistic skills.


1994 ◽  
Author(s):  
Michael K. Tanenhaus ◽  
Michael J. Spivey-Knowlton ◽  
Kathleen M. Eberhard ◽  
Julie C. Sedivy
