Smooth Pursuit Eye Movements in Patients with Impaired Visual Motion Perception

1995 ◽  
pp. 325-329 ◽  
Author(s):  
H. Kimmig ◽  
C. Pinnow ◽  
T. Mergner ◽  
M. Greenlee


2020 ◽  
Author(s):  
Xiuyun Wu ◽  
Austin C. Rothwell ◽  
Miriam Spering ◽  
Anna Montagnini

Smooth pursuit eye movements and visual motion perception rely on the integration of current sensory signals with past experience. Experience shapes our expectation of current visual events and can drive eye movement responses made in anticipation of a target, such as anticipatory pursuit. Previous research revealed consistent effects of expectation on anticipatory pursuit (eye movements follow the expected target direction or speed) and contrasting effects on motion perception, but most studies considered either eye movement or perceptual responses. The current study directly compared effects of direction expectation on perception and anticipatory pursuit within the same direction discrimination task to investigate whether both types of responses are affected similarly or differently. Observers (n = 10) viewed high-coherence random-dot kinematograms (RDKs) moving rightward or leftward with a probability of 50, 70, or 90% in a given block of trials to build up an expectation of motion direction. They were asked to judge the motion direction of interleaved low-coherence RDKs (0-15%). Perceptual judgments were compared to changes in anticipatory pursuit eye movements as a function of probability. Results show that anticipatory pursuit velocity scaled with probability and followed the expected direction (attraction bias), whereas perceptual judgments were biased opposite to the expected direction (repulsion bias). Control experiments suggest that the repulsion bias in perception was caused neither by retinal slip induced by anticipatory pursuit nor by motion adaptation. We conclude that direction expectation can be processed differently for perception and anticipatory pursuit.
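The two opposing probability effects reported in this abstract can be illustrated with a toy linear model. The function names and the `max_velocity` and `repulsion_gain` values below are purely illustrative assumptions, not parameters estimated in the study.

```python
# Toy sketch: attraction bias in anticipatory pursuit vs. repulsion bias
# in perception, as a function of the block's rightward-motion probability.
# Gains are hypothetical placeholders, not fitted values from the paper.

def anticipatory_velocity(p_right, max_velocity=2.0):
    """Signed anticipatory pursuit velocity (deg/s, rightward positive).

    Scales linearly with the probability bias toward rightward motion,
    so it follows the expected direction (attraction bias)."""
    return max_velocity * (2 * p_right - 1)

def perceived_direction_bias(p_right, repulsion_gain=0.1):
    """Shift in perceptual reports, opposite in sign to the expected
    direction (repulsion bias)."""
    return -repulsion_gain * (2 * p_right - 1)

for p in (0.5, 0.7, 0.9):
    v = anticipatory_velocity(p)
    b = perceived_direction_bias(p)
    print(f"P(right)={p:.1f}: pursuit {v:+.2f} deg/s, perceptual bias {b:+.3f}")
```

With this toy model, a 90% rightward block produces the strongest rightward anticipatory drift and the strongest leftward perceptual shift, matching the direction of the dissociation the study reports.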


2007 ◽  
Vol 98 (3) ◽  
pp. 1355-1363 ◽  
Author(s):  
Miriam Spering ◽  
Karl R. Gegenfurtner

The analysis of visual motion serves many different functions, ranging from object motion perception to the control of self-motion. The perception of visual motion and the oculomotor tracking of a moving object are known to be closely related and are assumed to be controlled by shared brain areas. We compared perceived velocity and the velocity of smooth pursuit eye movements in human observers in a paradigm that required the segmentation of target object motion from context motion. In each trial, a pursuit target and a visual context were simultaneously but independently perturbed to briefly increase or decrease in speed. Observers had to accurately track the target and estimate target speed during the perturbation interval. Here we show that the same motion signals are processed in fundamentally different ways for perception and steady-state smooth pursuit eye movements. For the computation of perceived velocity, motion of the context was subtracted from target motion (motion contrast), whereas pursuit velocity was determined by the motion average (motion assimilation). We conclude that the human motion system uses these computations to optimally accomplish different functions: image segmentation for object motion perception and velocity estimation for the control of smooth pursuit eye movements.
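The two computations named in this abstract, motion contrast for perception and motion assimilation for pursuit, can be written as a minimal sketch. The contrast gain and the averaging weight are hypothetical placeholders, not values fitted in the study.

```python
def perceived_velocity(target_v, context_v, contrast_gain=1.0):
    """Perception: motion contrast. Context motion is subtracted
    from target motion (gain is an illustrative assumption)."""
    return target_v - contrast_gain * context_v

def pursuit_velocity(target_v, context_v, weight=0.5):
    """Pursuit: motion assimilation. A weighted average pulls the
    pursuit estimate toward the context (weight is illustrative)."""
    return (1 - weight) * target_v + weight * context_v

# Example: target briefly speeds up to 12 deg/s while the context
# moves at 3 deg/s in the same direction.
target, context = 12.0, 3.0
print(perceived_velocity(target, context))  # 9.0: context subtracted
print(pursuit_velocity(target, context))    # 7.5: pulled toward context
```

The sketch makes the dissociation concrete: the same context motion pushes the perceptual estimate away from the context velocity while pulling the pursuit estimate toward it.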


2009 ◽  
Vol 102 (4) ◽  
pp. 2013-2025 ◽  
Author(s):  
Leslie C. Osborne ◽  
Stephen G. Lisberger

To probe how the brain integrates visual motion signals to guide behavior, we analyzed the smooth pursuit eye movements evoked by target motion with a stochastic component. When each dot of a texture executed an independent random walk such that speed or direction varied across the spatial extent of the target, pursuit variance increased as a function of the variance of visual pattern motion. Noise in either target direction or speed increased the variance of both eye speed and direction, implying a common neural noise source for estimating target speed and direction. Spatial averaging was inefficient for targets with >20 dots. Together these data suggest that pursuit performance is limited by the properties of spatial averaging across a noisy population of sensory neurons rather than across the physical stimulus. When targets executed a spatially uniform random walk in time around a central direction of motion, an optimized linear filter that describes the transformation of target motion into eye motion accounted for ∼50% of the variance in pursuit. Filters had widths of ∼25 ms, much longer than the impulse response of the eye, and filter shape depended on both the range and correlation time of motion signals, suggesting that filters were products of sensory processing. By quantifying the effects of different levels of stimulus noise on pursuit, we have provided rigorous constraints for understanding sensory population decoding. We have shown how temporal and spatial integration of sensory signals converts noisy population responses into precise motor responses.
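The filter-estimation approach described above can be illustrated with a toy simulation: a synthetic random-walk target, a hypothetical exponential filter standing in for the real target-to-eye transformation, and ordinary least squares to recover that filter from the simulated responses. All parameters below are assumptions for illustration; none come from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulate a target velocity that executes a random walk in time and an
# "eye" that responds through a short temporal filter plus sensor noise.
n, taps = 2000, 25                       # samples; filter length in samples
target = np.cumsum(rng.normal(0, 0.1, n))
true_filter = np.exp(-np.arange(taps) / 8.0)   # hypothetical exponential kernel
true_filter /= true_filter.sum()
eye = np.convolve(target, true_filter)[:n] + rng.normal(0, 0.05, n)

# Build a lagged design matrix and recover the filter by least squares.
X = np.column_stack([np.roll(target, k) for k in range(taps)])
X[:taps] = 0                             # discard rows with wrapped-around lags
est_filter, *_ = np.linalg.lstsq(X, eye, rcond=None)

# Fraction of eye-velocity variance explained by the linear prediction
pred = X @ est_filter
r2 = 1 - np.var(eye - pred) / np.var(eye)
print(f"variance explained: {r2:.3f}")
```

In this noiseless-by-construction toy the linear filter explains nearly all of the variance; the study's ~50% figure reflects the additional, unmodeled variability of real pursuit.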


1999 ◽  
Vol 81 (2) ◽  
pp. 596-610 ◽  
Author(s):  
William K. Page ◽  
Charles J. Duffy

MST neuronal responses to heading direction during pursuit eye movements. As you move through the environment, you see a radial pattern of visual motion with a focus of expansion (FOE) that indicates your heading direction. When self-movement is combined with smooth pursuit eye movements, the turning of the eye distorts the retinal image of the FOE, yet you can still perceive heading. We studied neurons in the medial superior temporal area (MST) of monkey visual cortex, recording responses to FOE stimuli presented during fixation and during smooth pursuit eye movements. Almost all neurons showed significant changes in their FOE-selective responses during pursuit eye movements. However, the vector average of all the neuronal responses indicated the direction of the FOE during both fixation and pursuit. Furthermore, the amplitude of the net vector increased with increasing FOE eccentricity. We conclude that neuronal population encoding in MST might contribute to pursuit-tolerant heading perception.
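The vector-average readout described in this abstract can be sketched under standard simplifying assumptions: evenly spaced preferred FOE directions and cosine tuning, which are textbook idealizations rather than the recorded MST tuning properties.

```python
import numpy as np

# Population-vector sketch: each model neuron has a preferred FOE angle;
# the response-weighted sum of preferred-direction unit vectors points
# toward the stimulus FOE. Tuning and noise levels are hypothetical.
rng = np.random.default_rng(1)
n_neurons = 100
preferred = np.linspace(0, 2 * np.pi, n_neurons, endpoint=False)

def decode_foe(stimulus_angle, noise=0.2):
    # Cosine tuning with a baseline, plus independent response noise.
    rates = np.cos(preferred - stimulus_angle) + 1.0
    rates += rng.normal(0, noise, n_neurons)
    # Net population vector: preferred directions weighted by firing rate.
    x = np.sum(rates * np.cos(preferred))
    y = np.sum(rates * np.sin(preferred))
    return np.arctan2(y, x) % (2 * np.pi)

true_angle = np.deg2rad(40)
print(np.rad2deg(decode_foe(true_angle)))  # decoded heading, close to 40
```

Even though individual model responses are noisy, mirroring the pursuit-induced changes in single-neuron FOE selectivity, the population vector still recovers the stimulus direction, which is the logic behind the study's conclusion.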


2019 ◽  
Vol 7 (14) ◽  
Author(s):  
Seiji Ono ◽  
Kenichiro Miura ◽  
Takashi Kawamura ◽  
Tomohiro Kizuka

2010 ◽  
Vol 50 (24) ◽  
pp. 2729-2739 ◽  
Author(s):  
Kurt Debono ◽  
Alexander C. Schütz ◽  
Miriam Spering ◽  
Karl R. Gegenfurtner
