Smooth pursuit eye movements and perception share target selection, but only some central resources

2009 ◽ Vol 201 (1) ◽ pp. 66-73 ◽ Author(s): Dirk Kerzel, Sabine Born, David Souto

2004 ◽ Vol 155 (1) ◽ pp. 129-133 ◽ Author(s): E. Poliakoff, C. J. S. Collins, G. R. Barnes

1997 ◽ Vol 14 (2) ◽ pp. 323-338 ◽ Author(s): Vincent P. Ferrera, Stephen G. Lisberger

Abstract: As a step toward understanding the mechanism by which targets are selected for smooth-pursuit eye movements, we examined the behavior of the pursuit system when monkeys were presented with two discrete moving visual targets. Two rhesus monkeys were trained to select a small moving target identified by its color in the presence of a moving distractor of another color. Smooth-pursuit eye movements were quantified in terms of the latency of the eye movement and the initial eye acceleration profile. We have previously shown that the latency of smooth pursuit, which is normally around 100 ms, can be extended to 150 ms or shortened to 85 ms depending on whether there is a distractor moving in the opposite or same direction, respectively, relative to the direction of the target. We have now measured this effect for a 360 deg range of distractor directions, and distractor speeds of 5–45 deg/s. We have also examined the effect of varying the spatial separation and temporal asynchrony between target and distractor. The results indicate that the effect of the distractor on the latency of pursuit depends on its direction of motion, and its spatial and temporal proximity to the target, but depends very little on the speed of the distractor. Furthermore, under the conditions of these experiments, the direction of the eye movement that is emitted in response to two competing moving stimuli is not a vectorial combination of the stimulus motions, but is solely determined by the direction of the target. The results are consistent with a competitive model for smooth-pursuit target selection and suggest that the competition takes place at a stage of the pursuit pathway that is between visual-motion processing and motor-response preparation.
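
The key contrast in this abstract, between a vectorial combination of the two stimulus motions and a purely competitive readout of the selected target's motion, can be made concrete with a small sketch. This is an illustration under assumed values, not the authors' analysis: the function names and the 15 deg/s example velocities below are invented for the example, and the distractor's effect on latency is deliberately left out.

```python
import numpy as np

# Illustrative contrast (assumed values, not data from the paper):
# how would the initial pursuit command differ if the two stimulus
# motions were averaged versus if the selected target won outright?

def vector_average(target_vel, distractor_vel):
    """Vector-averaging scheme: pursuit follows the mean of both motions."""
    return (np.asarray(target_vel, float) + np.asarray(distractor_vel, float)) / 2.0

def winner_take_all(target_vel, distractor_vel):
    """Competitive scheme: the selected target's motion alone drives pursuit;
    per the abstract, the distractor mainly shifts latency, not direction."""
    return np.asarray(target_vel, float)

# Hypothetical example: target moves rightward at 15 deg/s,
# distractor moves upward at 15 deg/s.
target = (15.0, 0.0)
distractor = (0.0, 15.0)

print("vector average :", vector_average(target, distractor))   # oblique direction
print("winner-take-all:", winner_take_all(target, distractor))  # target direction only
```

The reported behavior matches the second scheme: pursuit direction is determined by the target alone, while the distractor's direction and its spatial and temporal proximity modulate when pursuit starts.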


2009 ◽ Vol 21 (8) ◽ pp. 1611-1627 ◽ Author(s): Krishna Srihasam, Daniel Bullock, Stephen Grossberg

Oculomotor tracking of moving objects is an important component of visually based cognition and planning. Such tracking is achieved by a combination of saccades and smooth-pursuit eye movements. In particular, the saccadic and smooth-pursuit systems interact so that they often choose the same target and maximize its visibility through time. How do multiple brain regions, including frontal cortical areas, interact to choose a target among several competing moving stimuli? How is target selection information that is created by a bias (e.g., electrical stimulation) transferred from one movement system to another? These saccade–pursuit interactions are clarified by a new computational neural model, which describes interactions between motion processing areas: the middle temporal area, the middle superior temporal area, the frontal pursuit area, and the dorsal lateral pontine nucleus; saccade specification, selection, and planning areas: the lateral intraparietal area, the frontal eye fields, the substantia nigra pars reticulata, and the superior colliculus; the saccadic generator in the brain stem; and the cerebellum. Model simulations explain a broad range of neuroanatomical and neurophysiological data. These results are in contrast with the simplest parallel model, in which saccades and pursuit share only common-target selection and the recruitment of shared motoneurons. Actual tracking episodes in primates reveal multiple systematic deviations from the predictions of the simplest parallel model, which are explained by the current model.
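
For contrast, the "simplest parallel model" that serves as the baseline here can be reduced to a few lines: a single target-selection stage passes its choice to saccade and pursuit controllers that otherwise do not interact. This is a deliberately minimal sketch of that baseline, not the authors' neural model; every class, function, and number below is an assumption made for illustration, and the brain areas named in the abstract are exactly what this reduction leaves out.

```python
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Stimulus:
    position: Tuple[float, float]   # deg relative to the fovea (assumed units)
    velocity: Tuple[float, float]   # deg/s (assumed units)
    salience: float                 # arbitrary selection bias, e.g., a stimulation effect

def select_target(stimuli: List[Stimulus]) -> Stimulus:
    """Common target selection: both movement systems receive the same choice."""
    return max(stimuli, key=lambda s: s.salience)

def saccade_command(target: Stimulus) -> Tuple[float, float]:
    """Independent saccade controller: drive gaze toward the target's position."""
    return target.position

def pursuit_command(target: Stimulus) -> Tuple[float, float]:
    """Independent pursuit controller: match the target's velocity."""
    return target.velocity

stimuli = [
    Stimulus(position=(5.0, 0.0), velocity=(10.0, 0.0), salience=0.8),
    Stimulus(position=(-5.0, 0.0), velocity=(0.0, 10.0), salience=0.3),
]
chosen = select_target(stimuli)
print("saccade to", saccade_command(chosen), "then pursue at", pursuit_command(chosen))
```

The abstract's point is that real primate tracking deviates systematically from this kind of non-interacting baseline, which is why the model adds explicit saccade–pursuit interactions across the listed cortical, collicular, pontine, and cerebellar stages.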


2011 ◽ Vol 70 ◽ pp. 352-352 ◽ Author(s): K. Strand Brodd, K. Rosander, H. Grönqvist, G. Holmström, B. Strömberg, ...
