Rhesus Monkeys Behave As If They Perceive the Duncker Illusion

2005 ◽  
Vol 17 (7) ◽  
pp. 1011-1017 ◽  
Author(s):  
A. Z. Zivotofsky ◽  
M. E. Goldberg ◽  
K. D. Powell

The visual system uses the pattern of motion on the retina to analyze the motion of objects in the world, and the motion of the observer him/herself. Distinguishing between retinal motion evoked by movement of the retina in space and retinal motion evoked by movement of objects in the environment is computationally difficult, and the human visual system frequently misinterprets the meaning of retinal motion. In this study, we demonstrate that the visual system of the Rhesus monkey also misinterprets retinal motion. We show that monkeys erroneously report the trajectories of pursuit targets or their own pursuit eye movements during an epoch of smooth pursuit across an orthogonally moving background. Furthermore, when they make saccades to the spatial location of stimuli that flashed early in an epoch of smooth pursuit or fixation, they make large errors that appear to take into account the erroneous smooth eye movement that they report in the first experiment, and not the eye movement that they actually make.

2019 ◽  
Vol 121 (5) ◽  
pp. 1787-1797
Author(s):  
David Souto ◽  
Jayesha Chudasama ◽  
Dirk Kerzel ◽  
Alan Johnston

Smooth pursuit eye movements (pursuit) are used to minimize the retinal motion of moving objects. During pursuit, the pattern of motion on the retina carries not only information about the object movement but also reafferent information about the eye movement itself. The latter arises from the retinal flow of the stationary world in the direction opposite to the eye movement. To extract the global direction of motion of the tracked object and stationary world, the visual system needs to integrate ambiguous local motion measurements (i.e., the aperture problem). Unlike the tracked object, the stationary world’s global motion is entirely determined by the eye movement and thus can be approximately derived from motor commands sent to the eye (i.e., from an efference copy). Because retinal motion opposite to the eye movement is dominant during pursuit, different motion integration mechanisms might be used for retinal motion in the same direction and opposite to pursuit. To investigate motion integration during pursuit, we tested direction discrimination of a brief change in global object motion. The global motion stimulus was a circular array of small static apertures within which one-dimensional gratings moved. We found increased coherence thresholds and a qualitatively different reflexive ocular tracking for global motion opposite to pursuit. Both effects suggest reduced sampling of motion opposite to pursuit, which results in an impaired ability to extract coherence in motion signals in the reafferent direction. We suggest that anisotropic motion integration is an adaptation to asymmetric retinal motion patterns experienced during pursuit eye movements. NEW & NOTEWORTHY This study provides a new understanding of how the visual system achieves coherent perception of an object’s motion while the eyes themselves are moving. The visual system integrates local motion measurements to create a coherent percept of object motion. 
An analysis of perceptual judgments and reflexive eye movements to a brief change in an object’s global motion confirms that the visual and oculomotor systems pick fewer samples to extract global motion opposite to the eye movement.
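The global-motion display described above can be sketched as follows. This is an illustrative reconstruction, not the authors' stimulus code: the function name, aperture count, and the convention of drawing incoherent directions uniformly at random are all assumptions, with only the general design (a fraction of apertures carrying the global direction, the rest random) taken from the abstract.

```python
import random

def make_global_motion_stimulus(n_apertures=24, coherence=0.5,
                                global_direction_deg=0.0, seed=0):
    """Assign a 1D grating drift direction to each static aperture.

    A fraction `coherence` of the apertures drifts in the global
    direction; the remainder drift in random directions. Each local
    measurement is ambiguous (the aperture problem), so the global
    direction is only recoverable by integrating across apertures.
    """
    rng = random.Random(seed)
    n_coherent = round(n_apertures * coherence)
    directions = [global_direction_deg] * n_coherent
    directions += [rng.uniform(0.0, 360.0)
                   for _ in range(n_apertures - n_coherent)]
    rng.shuffle(directions)  # coherent apertures appear at random positions
    return directions
```

Raising `coherence` until direction discrimination succeeds gives a coherence threshold; the study's finding is that this threshold is higher when `global_direction_deg` opposes the pursuit direction.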


2003 ◽  
Vol 90 (3) ◽  
pp. 1489-1502 ◽  
Author(s):  
Uwe J. Ilg ◽  
Peter Thier

Because smooth-pursuit eye movements (SPEM) can be executed only in the presence of a moving target, it has been difficult to attribute the neuronal activity observed during the execution of these eye movements to either sensory processing or to motor preparation or execution. Previously, we showed that rhesus monkeys can be trained to perform SPEM directed toward an “imaginary” target defined by visual cues confined to the periphery of the visual field. The pursuit of an “imaginary” target provides the opportunity to elicit SPEM without stimulating visual receptive fields confined to the center of the visual field. Here, we report that a subset of neurons [85 “imaginary” visual tracking (iVT) neurons] in area MST of 3 rhesus monkeys were identically activated during pursuit of a conventional, foveal dot target and the “imaginary” target. Because iVT-neurons did not respond to the presentation of a moving “imaginary” target during fixation of a stationary dot, we are able to exclude that responses to pursuit of the “imaginary” target were artifacts of stimulation of the visual field periphery. Neurons recorded from the representation of the central parts of the visual field in neighboring area MT, usually vigorously discharging during pursuit of foveal targets, in no case responded to pursuit of the “imaginary” target. This dissociation between MT and MST neurons supports the view that pursuit responses of MT neurons are the result of target image motion, whereas those of iVT-neurons in area MST reflect an eye movement–related signal that is nonretinal in origin. iVT-neurons fell into two groups, depending on the properties of the eye movement–related signal. Whereas most of them (71%) encoded eye velocity, a minority showed responses determined by eye position, irrespective of whether eye position was changed by smooth pursuit or by saccades. 
Only the former group exhibited responses that led the eye movement, which is a prerequisite for a causal role in the generation of SPEM.


2009 ◽  
Vol 101 (2) ◽  
pp. 934-947 ◽  
Author(s):  
Masafumi Ohki ◽  
Hiromasa Kitazawa ◽  
Takahito Hiramatsu ◽  
Kimitake Kaga ◽  
Taiko Kitamura ◽  
...  

The anatomical connection between the frontal eye field and the cerebellar hemispheric lobule VII (H-VII) suggests a potential role of the hemisphere in voluntary eye movement control. To reveal the involvement of the hemisphere in smooth pursuit and saccade control, we made a unilateral lesion around H-VII and examined its effects in three Macaca fuscata that were trained to visually pursue a small target. In response to the step (3°)-ramp (5–20°/s) target motion, the monkeys usually showed an initial pursuit eye movement at a latency of 80–140 ms and a small catch-up saccade at 140–220 ms that was followed by a postsaccadic pursuit eye movement that roughly matched the ramp target velocity. After unilateral cerebellar hemispheric lesioning, the initial pursuit eye movements were impaired, and the velocities of the postsaccadic pursuit eye movements decreased. The onsets of 5° visually guided saccades to the stationary target were delayed, and their amplitudes showed a tendency toward increased trial-to-trial variability but never became hypo- or hypermetric. Similar tendencies were observed in the onsets and amplitudes of catch-up saccades. The adaptation of open-loop smooth pursuit velocity, tested by a step increase in target velocity for a brief period, was impaired. These lesion effects were recognized in all directions, particularly in the ipsiversive direction. A recovery was observed at 4 wk postlesion for some of these lesion effects. These results suggest that the cerebellar hemispheric region around lobule VII is involved in the control of smooth pursuit and saccadic eye movements.
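The step-ramp (Rashbass-style) target trajectory used above can be sketched as a simple position function. The function name and sign convention are illustrative assumptions; only the parameters (3° step, 5–20°/s ramp) come from the abstract.

```python
def step_ramp_position(t_ms, step_deg=3.0, ramp_deg_per_s=10.0):
    """Target position (deg) in a step-ramp trial.

    At t = 0 the target steps `step_deg` to one side and then ramps
    back through the fixation point in the opposite direction at
    `ramp_deg_per_s`, so that a well-timed pursuit response needs no
    initial saccade. Before onset the target sits at fixation (0 deg).
    """
    if t_ms < 0:
        return 0.0  # fixation period before target onset
    return step_deg - ramp_deg_per_s * (t_ms / 1000.0)
```

With a 3° step and a 10°/s ramp, the target recrosses the fixation point at 300 ms, which is why this paradigm tends to elicit smooth pursuit with only a small catch-up saccade.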


1999 ◽  
Vol 88 (3) ◽  
pp. 209-219 ◽  
Author(s):  
Gunvant K. Thaker ◽  
David E. Ross ◽  
Robert W. Buchanan ◽  
Helene M. Adami ◽  
Deborah R. Medoff

2017 ◽  
Vol 118 (2) ◽  
pp. 986-1001 ◽  
Author(s):  
Ramanujan T. Raghavan ◽  
Stephen G. Lisberger

The midline oculomotor cerebellum plays a different role in smooth pursuit eye movements compared with the lateral, floccular complex and appears to be much less involved in direction learning in pursuit. The output from the oculomotor vermis during pursuit lies along a null-axis for saccades and vice versa. Thus the vermis can play independent roles in the two kinds of eye movement.


1997 ◽  
Vol 14 (2) ◽  
pp. 323-338 ◽  
Author(s):  
Vincent P. Ferrera ◽  
Stephen G. Lisberger

As a step toward understanding the mechanism by which targets are selected for smooth-pursuit eye movements, we examined the behavior of the pursuit system when monkeys were presented with two discrete moving visual targets. Two rhesus monkeys were trained to select a small moving target identified by its color in the presence of a moving distractor of another color. Smooth-pursuit eye movements were quantified in terms of the latency of the eye movement and the initial eye acceleration profile. We have previously shown that the latency of smooth pursuit, which is normally around 100 ms, can be extended to 150 ms or shortened to 85 ms depending on whether there is a distractor moving in the opposite or same direction, respectively, relative to the direction of the target. We have now measured this effect for a 360 deg range of distractor directions, and distractor speeds of 5–45 deg/s. We have also examined the effect of varying the spatial separation and temporal asynchrony between target and distractor. The results indicate that the effect of the distractor on the latency of pursuit depends on its direction of motion, and its spatial and temporal proximity to the target, but depends very little on the speed of the distractor. Furthermore, under the conditions of these experiments, the direction of the eye movement that is emitted in response to two competing moving stimuli is not a vectorial combination of the stimulus motions, but is solely determined by the direction of the target. The results are consistent with a competitive model for smooth-pursuit target selection and suggest that the competition takes place at a stage of the pursuit pathway that is between visual-motion processing and motor-response preparation.
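The two predictions the abstract contrasts (vector averaging of the two stimulus motions versus winner-take-all target selection) can be made concrete with a small sketch. These helper functions are illustrative assumptions, not the authors' analysis code.

```python
import math

def vector_average(dir_a_deg, dir_b_deg):
    """Pursuit direction predicted if the two stimulus motions were
    combined vectorially (equal weights assumed for simplicity)."""
    ax, ay = math.cos(math.radians(dir_a_deg)), math.sin(math.radians(dir_a_deg))
    bx, by = math.cos(math.radians(dir_b_deg)), math.sin(math.radians(dir_b_deg))
    return math.degrees(math.atan2(ay + by, ax + bx)) % 360.0

def winner_take_all(target_deg, distractor_deg):
    """Pursuit direction predicted by competitive target selection:
    the distractor may delay pursuit onset, but the selected target's
    direction wins outright, so the distractor direction is ignored."""
    return target_deg % 360.0
```

For a target moving at 0° and a distractor at 90°, averaging predicts pursuit at 45°, whereas selection predicts 0°; the data reported above match the selection prediction.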


2015 ◽  
Vol 113 (5) ◽  
pp. 1377-1399 ◽  
Author(s):  
T. Scott Murdison ◽  
Guillaume Leclercq ◽  
Philippe Lefèvre ◽  
Gunnar Blohm

Smooth pursuit eye movements are driven by retinal motion and enable us to view moving targets with high acuity. Complicating the generation of these movements is the fact that different eye and head rotations can produce different retinal stimuli while giving rise to identical smooth pursuit trajectories. However, because our eyes accurately pursue targets regardless of eye and head orientation (Blohm G, Lefèvre P. J Neurophysiol 104: 2103–2115, 2010), the brain must somehow take these signals into account. To learn about the neural mechanisms potentially underlying this visual-to-motor transformation, we trained a physiologically inspired neural network model to combine two-dimensional (2D) retinal motion signals with three-dimensional (3D) eye and head orientation and velocity signals to generate a spatially correct 3D pursuit command. We then simulated conditions of 1) head roll-induced ocular counterroll, 2) oblique gaze-induced retinal rotations, 3) eccentric gazes (invoking the half-angle rule), and 4) optokinetic nystagmus to investigate how units in the intermediate layers of the network accounted for different 3D constraints. Simultaneously, we simulated electrophysiological recordings (visual and motor tunings) and microstimulation experiments to quantify the reference frames of signals at each processing stage. We found a gradual retinal-to-intermediate-to-spatial feedforward transformation through the hidden layers. Our model is the first to describe the general 3D transformation for smooth pursuit mediated by eye- and head-dependent gain modulation. Based on several testable experimental predictions, our model provides a mechanism by which the brain could perform the 3D visuomotor transformation for smooth pursuit.


2010 ◽  
Vol 104 (4) ◽  
pp. 2103-2115 ◽  
Author(s):  
Gunnar Blohm ◽  
Philippe Lefèvre

Smooth pursuit eye movements are driven by retinal motion signals. These retinal motion signals are converted into motor commands that obey Listing's law (i.e., no accumulation of ocular torsion). The fact that smooth pursuit follows Listing's law is often taken as evidence that no explicit reference frame transformation between the retinal velocity input and the head-centered motor command is required. Such eye-position-dependent reference frame transformations between eye- and head-centered coordinates have been well-described for saccades to static targets. Here we suggest that such an eye (and head)-position-dependent reference frame transformation is also required for target motion (i.e., velocity) driving smooth pursuit eye movements. Therefore we tested smooth pursuit initiation under different three-dimensional eye positions and compared human performance to model simulations. We specifically tested if the ocular rotation axis changed with vertical eye position, if the misalignment of the spatial and retinal axes during oblique fixations was taken into account, and if ocular torsion (due to head roll) was compensated for. If no eye-position-dependent velocity transformation was used, the pursuit initiation should follow the retinal direction, independently of eye position; in contrast, a correct visuomotor velocity transformation would result in spatially correct pursuit initiation. Overall subjects accounted for all three components of the visuomotor velocity transformation, but we did observe differences in the compensatory gains between individual subjects. We concluded that the brain does perform a visuomotor velocity transformation but that this transformation was prone to noise and inaccuracies of the internal model.
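The core of the eye-position-dependent velocity transformation tested above can be illustrated for the torsion component alone. This is a minimal sketch under strong simplifying assumptions (a pure 2D rotation by the torsion angle; the function name is hypothetical); the full transformation studied in the paper also involves vertical and oblique eye positions, Listing's law, and the half-angle rule.

```python
import math

def retinal_to_spatial_velocity(vx_ret, vy_ret, torsion_deg):
    """Rotate a 2D retinal velocity (deg/s) by ocular torsion, e.g., the
    counterroll induced by head roll, to recover a spatially correct
    pursuit drive.

    Without this compensation, pursuit initiation would follow the
    retinal direction regardless of eye orientation; with it, the
    initiation direction is correct in space.
    """
    t = math.radians(torsion_deg)
    vx = vx_ret * math.cos(t) - vy_ret * math.sin(t)
    vy = vx_ret * math.sin(t) + vy_ret * math.cos(t)
    return vx, vy
```

A compensatory gain below 1 on `torsion_deg` would reproduce the partial, noisy compensation observed across individual subjects.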


2005 ◽  
Vol 93 (6) ◽  
pp. 3418-3433 ◽  
Author(s):  
Hui Meng ◽  
Andrea M. Green ◽  
J. David Dickman ◽  
Dora E. Angelaki

Under natural conditions, the vestibular and pursuit systems work synergistically to stabilize the visual scene during movement. How translational vestibular signals [translational vestibuloocular reflex (TVOR)] are processed in the premotor pathways for slow eye movements continues to remain a challenging question. To further our understanding of how premotor neurons contribute to this processing, we recorded neural activities from the prepositus and rostral medial vestibular nuclei in macaque monkeys. Vestibular neurons were tested during 0.5-Hz rotation and lateral translation (both with gaze stable and during VOR cancellation tasks), as well as during smooth pursuit eye movements. Data were collected at two different viewing distances, 80 and 20 cm. Based on their responses to rotation and pursuit, eye-movement–sensitive neurons were classified into position–vestibular–pause (PVP) neurons, eye–head (EH) neurons, and burst–tonic (BT) cells. We found that approximately half of the type II PVP and EH neurons with ipsilateral eye movement preference were modulated during TVOR cancellation. In contrast, few of the EH and none of the type I PVP cells with contralateral eye movement preference modulated during translation in the absence of eye movements; nor did any of the BT neurons change their firing rates during TVOR cancellation. Of the type II PVP and EH neurons that modulated during TVOR cancellation, cell firing rates increased for either ipsilateral or contralateral displacement, a property that could not be predicted on the basis of their rotational or pursuit responses. In contrast, under stable gaze conditions, all neuron types, including EH cells, were modulated during translation according to their ipsilateral/contralateral preference for pursuit eye movements. Differences in translational response sensitivities for far versus near targets were seen only in type II PVP and EH cells. 
There was no effect of viewing distance on response phase for any cell type. When expressed relative to motor output, neural sensitivities during translation (although not during rotation) and pursuit were equivalent, particularly for the 20-cm viewing distance. These results suggest that neural activities during the TVOR were more motorlike compared with cell responses during the rotational vestibuloocular reflex (RVOR). We also found that neural responses under stable gaze conditions could not always be predicted by a linear vectorial addition of the cell activities during pursuit and VOR cancellation. The departure from linearity was more pronounced for the TVOR under near-viewing conditions. These results extend previous observations for the neural processing of otolith signals within the premotor circuitry that generates the RVOR and smooth pursuit eye movements.


1986 ◽  
Vol 25 (01) ◽  
pp. 31-34 ◽  
Author(s):  
M. Juhola ◽  
K. Virtanen ◽  
M. Helin ◽  
V. Jäntti ◽  
P. Nurkkanen ◽  
...  

A visual stimulator system for studies of eye movements has been developed. The system is controlled by an inexpensive microcomputer. It is employed for otoneurological studies both in clinical work and in research, but can also be applied for studies of eye movements in other medical areas. Three types of eye movements are produced, viz. saccadic and smooth pursuit eye movements and optokinetic nystagmus. The stimulator system can be connected to another computer for an analysis of eye movements.

