Uncertainty modulates visual maps during non-instrumental information demand

2021 ◽  
Author(s):  
Yvonne Li ◽  
Nabil Daddaoua ◽  
Mattias Horan ◽  
Jacqueline Gottlieb

Animals are intrinsically motivated to resolve uncertainty and predict future events. This motivation is encoded in cortical and subcortical structures, but a key open question is how it generates concrete policies for attending to informative stimuli. We examined this question using neural recordings in the monkey lateral intraparietal area (LIP), a visual area implicated in attention and gaze, during non-instrumental information demand. We show that the uncertainty resolved by a visual cue enhanced the visuo-spatial responses of LIP cells independently of reward probability. This enhancement was independent of immediate saccade plans but correlated with the sensitivity to uncertainty in eye movement behavior on longer time scales (across sessions/days). The findings suggest that topographic visual maps receive motivational signals of uncertainty, which enhance the priority of informative stimuli and the likelihood that animals will orient to those stimuli to reduce uncertainty.

2009 ◽  
Author(s):  
Polina M. Vanyukov ◽  
Erik D. Reichle ◽  
Tessa Warren

2019 ◽  
Vol 59 ◽  
pp. 254-258 ◽  
Author(s):  
Hui Liu ◽  
Ruwei Ou ◽  
Qianqian Wei ◽  
Yanbing Hou ◽  
Bei Cao ◽  
...  

2017 ◽  
Vol 13 (7S_Part_14) ◽  
pp. P709-P710
Author(s):  
Marta Luisa Goncalves de Freitas Pereira ◽  
Marina von Zuben de Arruda Camargo ◽  
Jéssica dos Santos ◽  
Fátima L.S. Nunes ◽  
Orestes Vicente Forlenza

2020 ◽  
Author(s):  
Šimon Kucharský ◽  
Daan Roelof van Renswoude ◽  
Maartje Eusebia Josefa Raijmakers ◽  
Ingmar Visser

Describing, analyzing, and explaining patterns in eye movement behavior is crucial for understanding visual perception, and eye movements are increasingly used to inform cognitive process models. In this article, we begin by reviewing basic characteristics of, and desiderata for, models of eye movements. Specifically, we argue that models are needed that combine the spatial and temporal aspects of eye-tracking data (i.e., fixation locations and fixation durations), that formal models derived from concrete theoretical assumptions are needed to inform empirical research, and that custom statistical models are useful for detecting the specific empirical phenomena that such theory must explain. We then develop a conceptual model of eye movements, specifically of fixation durations and fixation locations, and derive from it a formal statistical model, meeting our goal of crafting a model useful in both the theoretical and the empirical research cycle. We demonstrate the model on an example of infant natural scene viewing, showing that it can explain different features of the eye movement data, and how to identify that the model needs to be adapted when it does not agree with the data. We conclude by discussing potential future avenues for formal eye movement models.
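The abstract does not spell out the authors' statistical model, but the idea of jointly modeling the temporal and spatial components of a scanpath can be sketched generatively. In the toy sketch below (all parameter names and values are illustrative assumptions, not the article's model), fixation durations are drawn from a gamma distribution, a common choice for duration data, and fixation locations from a 2D Gaussian centered on the display:

```python
import random


def simulate_scanpath(n_fixations, shape=2.0, scale=120.0,
                      width=800, height=600, sigma=150.0):
    """Toy joint spatial-temporal scanpath model: durations (ms) from a
    gamma distribution, locations from a center-biased 2D Gaussian
    clipped to the display bounds. Returns a list of (x, y, duration)."""
    cx, cy = width / 2, height / 2
    scanpath = []
    for _ in range(n_fixations):
        duration = random.gammavariate(shape, scale)      # temporal component
        x = min(max(random.gauss(cx, sigma), 0), width)   # spatial component
        y = min(max(random.gauss(cy, sigma), 0), height)
        scanpath.append((x, y, duration))
    return scanpath
```

Fitting such a model to data (rather than simulating from it) and letting location and duration interact is where the real modeling work lies; the sketch only illustrates the two components the abstract argues should be combined.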


2021 ◽  
Author(s):  
Zezhong Lv ◽  
Qing Xu ◽  
Klaus Schoeffmann ◽  
Simon Parkinson

Eye movement behavior, which supports the acquisition and processing of visual information, plays an important role when humans perform everyday sensorimotor tasks such as driving. During such tasks, gaze changes arise from a specific coordination of head and eye, with head motions preceding eye movements. Notably, we believe that this coordination in essence reflects a kind of causality. In this paper, we investigate transfer entropy as a quantity for measuring the unidirectional causal influence of head motion on eye movement. A normalized version of the proposed measure, evaluated in virtual-reality-based psychophysical studies, behaves very well as a proxy of driving performance, suggesting that quantitative exploitation of head-eye coordination may be an effective behaviometric of sensorimotor activity.


Author(s):  
Hayward J. Godwin ◽  
Michael C. Hout ◽  
Katrín J. Alexdóttir ◽  
Stephen C. Walenchok ◽  
Anthony S. Barnhart

Examining eye-movement behavior during visual search is an increasingly popular approach for gaining insights into the moment-to-moment processing that takes place when we look for targets in our environment. In this tutorial review, we describe a set of pitfalls and considerations that are important for researchers, both experienced and new to the field, when conducting eye-movement and visual search experiments. We walk the reader through the research cycle of a visual search and eye-movement experiment, from choosing the right predictions, through data collection, reporting of methodology, and analytic approaches, to the different dependent variables to analyze and drawing conclusions from patterns of results. Overall, our hope is that this review can serve as a guide, a talking point, a reflection on the practices and potential problems in the current literature on this topic, and ultimately a first step toward standardizing research practices in the field.


2013 ◽  
Vol 89 ◽  
pp. 32-38 ◽  
Author(s):  
William Poynter ◽  
Megan Barber ◽  
Jason Inman ◽  
Coral Wiggins

1979 ◽  
Vol 11 (4) ◽  
pp. 319-328 ◽  
Author(s):  
Lester A. Lefton ◽  
Richard J. Nagle ◽  
Gwendolyn Johnson ◽  
Dennis F. Fisher

Eye movements were assessed while good and poor fifth-grade readers, third graders, and adults read text. Subjects were tested in two sessions one year apart. Dependent variables included the duration and frequency of forward fixations and regressions; individual differences were also analyzed. Results showed that poor fifth-grade readers have relatively unsystematic eye movement behavior, with many more fixations of longer duration than other fifth graders and adults. The eye movements of poor readers are quantitatively and qualitatively different from those of normal readers.


Author(s):  
Zhongqi Liu ◽  
Zhaofang Xu ◽  
Qianxiang Zhou ◽  
Fang Xie ◽  
Shihua Zhou
