Are tracking eye movements driven by an internal model of target motion?

2021, Vol. 21 (9), pp. 1839
Author(s): Julie Quinet, Laurent Goffart
2009, Vol. 102 (4), pp. 2013-2025
Author(s): Leslie C. Osborne, Stephen G. Lisberger

To probe how the brain integrates visual motion signals to guide behavior, we analyzed the smooth pursuit eye movements evoked by target motion with a stochastic component. When each dot of a texture executed an independent random walk such that speed or direction varied across the spatial extent of the target, pursuit variance increased as a function of the variance of visual pattern motion. Noise in either target direction or speed increased the variance of both eye speed and direction, implying a common neural noise source for estimating target speed and direction. Spatial averaging was inefficient for targets with >20 dots. Together these data suggest that pursuit performance is limited by the properties of spatial averaging across a noisy population of sensory neurons rather than across the physical stimulus. When targets executed a spatially uniform random walk in time around a central direction of motion, an optimized linear filter that describes the transformation of target motion into eye motion accounted for ∼50% of the variance in pursuit. Filters had widths of ∼25 ms, much longer than the impulse response of the eye, and filter shape depended on both the range and correlation time of motion signals, suggesting that filters were products of sensory processing. By quantifying the effects of different levels of stimulus noise on pursuit, we have provided rigorous constraints for understanding sensory population decoding. We have shown how temporal and spatial integration of sensory signals converts noisy population responses into precise motor responses.
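As a rough illustration of the linear-filter analysis described in this abstract, the sketch below simulates a random-walk target-velocity signal, generates a noisy "eye-velocity" response through an assumed temporal filter, and then recovers the stimulus-to-response filter by least squares. Every quantity in it (time step, noise levels, the exponential "true" filter, the variance-explained readout) is a placeholder assumption for illustration, not the authors' stimulus or analysis code.

```python
import numpy as np

# Hypothetical illustration: estimate a linear temporal filter that maps a
# random-walk target-velocity perturbation onto a simulated eye-velocity
# response, in the spirit of the filter analysis described in the abstract.

rng = np.random.default_rng(0)
dt = 0.001                      # 1 ms time step (assumed)
n = 20000                       # 20 s of simulated tracking
filt_len = 50                   # estimate a 50 ms filter

# Target speed: a random walk in time around a central value (deg/s).
target_vel = 15.0 + np.cumsum(rng.normal(0.0, 0.2, n))

# Simulated eye velocity: the target signal smoothed by an assumed ~10 ms
# exponential filter plus motor noise (stands in for measured pursuit data).
true_filt = np.exp(-np.arange(filt_len) * dt / 0.01)
true_filt /= true_filt.sum()
eye_vel = np.convolve(target_vel, true_filt, mode="full")[:n]
eye_vel += rng.normal(0.0, 1.0, n)

# Least-squares (Wiener-style) estimate of the stimulus-to-response filter:
# regress eye velocity on a matrix of lagged target velocities.
X = np.column_stack([np.roll(target_vel, k) for k in range(filt_len)])
X[:filt_len, :] = 0.0           # discard wrapped-around samples
est_filt, *_ = np.linalg.lstsq(X, eye_vel, rcond=None)

# Fraction of eye-velocity variance captured by the linear prediction;
# this plays the role of the ~50% figure quoted in the abstract.
pred = X @ est_filt
explained = 1.0 - np.var(eye_vel - pred) / np.var(eye_vel)
print(f"variance explained by the fitted linear filter: {explained:.2f}")
```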


2006, Vol. 16 (1-2), pp. 1-22
Author(s): Junko Fukushima, Teppei Akao, Sergei Kurkin, Chris R.S. Kaneko, Kikuro Fukushima

In order to see clearly when a target is moving slowly, primates with high-acuity foveae use smooth-pursuit and vergence eye movements. The former rotates both eyes in the same direction to track target motion in frontal planes, while the latter rotates the left and right eyes in opposite directions to track target motion in depth. Together, these two systems pursue targets precisely and maintain their images on the foveae of both eyes. During head movements, both systems must interact with the vestibular system to minimize slip of the retinal images. The primate frontal cortex contains two pursuit-related areas: the caudal part of the frontal eye fields (FEF) and the supplementary eye fields (SEF). Evoked-potential studies have demonstrated vestibular projections to both areas, and pursuit neurons in both areas respond to vestibular stimulation. The majority of FEF pursuit neurons code parameters of pursuit such as pursuit and vergence eye velocity, gaze velocity, and retinal image motion for target velocity in frontal and depth planes. Moreover, vestibular inputs contribute to the predictive pursuit responses of FEF neurons. In contrast, the majority of SEF pursuit neurons do not code pursuit metrics, and many SEF neurons are reported to be active in more complex tasks. These results suggest that FEF and SEF pursuit neurons are involved in different aspects of vestibular-pursuit interactions and that the eye-velocity coding of SEF pursuit neurons is specialized for the task conditions.
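The same-direction versus opposite-direction decomposition described above can be made concrete with a minimal sketch. Under the standard convention that both horizontal eye angles are signed positive rightward, the conjugate (version/pursuit) component is the average of the two eyes' angles and the disconjugate (vergence) component is their difference. The function name and example numbers below are illustrative assumptions, not anything taken from the article.

```python
import numpy as np

# Illustrative sketch (assumed conventions, not from the article): split
# binocular horizontal eye positions into a conjugate (version/pursuit)
# component and a disconjugate (vergence) component. Angles are in degrees,
# signed positive for rightward rotation of either eye.

def version_vergence(left_deg: np.ndarray, right_deg: np.ndarray):
    """Return (version, vergence) from left- and right-eye horizontal angles."""
    version = 0.5 * (left_deg + right_deg)   # same-direction rotation (frontal-plane tracking)
    vergence = left_deg - right_deg          # opposite-direction rotation (tracking in depth)
    return version, vergence

# Example: a target drifting rightward while approaching the observer should
# yield a growing version signal and a growing convergence angle.
t = np.linspace(0.0, 1.0, 5)
pursuit_cmd = 2.0 * t                  # deg of rightward conjugate rotation
converge_cmd = 3.0 * t                 # deg of total convergence
left = pursuit_cmd + 0.5 * converge_cmd
right = pursuit_cmd - 0.5 * converge_cmd
version, vergence = version_vergence(left, right)
print(version)    # recovers pursuit_cmd
print(vergence)   # recovers converge_cmd
```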


Author(s): Ryan E. B. Mruczek, Christopher D. Blair, Lars Strother, Gideon P. Caplovitz

Static size contrast and assimilation illusions, such as the Ebbinghaus and Delboeuf illusions, show that the size of nearby objects in a scene can influence the perceived size of a central target. This chapter describes a dynamic variant of these classic size illusions, called the Dynamic Illusory Size-Contrast (DISC) effect. In the DISC effect, a surrounding stimulus that continuously changes size causes an illusory size change in a central target. The effect is dramatically enhanced in the presence of additional stimulus dynamics arising from eye movements or target motion. The chapter proposes that this surprisingly powerful effect of motion on perceived size depends on the degree of uncertainty inherent in the size of the retinal image of a moving object.

