Smooth pursuit eye movements in psychiatric inpatients

2017 ◽  
Vol 41 (S1) ◽  
pp. S764-S765
Author(s):  
L. Mandolesi ◽  
G. Piraccini ◽  
F. Ambrosini ◽  
F.L. Vetere ◽  
R.P. Sant’Angelo ◽  
...  

Introduction: Eye movements are used in several studies as a biomarker to evaluate cortical alterations in psychiatric disorders. Deficits in pursuit eye movements have been found in both schizophrenia and affective disorder patients. Nevertheless, these findings are still controversial.
Objectives: To set up a system to record and evaluate eye movements in psychiatric patients.
Aims: To verify the applicability of a smooth pursuit task in a sample of psychiatric inpatients and to demonstrate its efficiency in discriminating patient and control group performance.
Methods: A sample of psychiatric inpatients was tested at the psychiatric service of diagnosis and care of AUSL Romagna-Cesena. Eye movement measures were collected at a sampling rate of 60 Hz using the Eye Tribe tracker, a bar plugged into a PC and placed below the screen, containing both a webcam and infrared illumination. Subjects underwent a smooth pursuit eye movement task: they had to visually follow a white dot target moving horizontally on a black background with a sinusoidal velocity. At the end of the task, a chart of the recorded eye movements is shown on the screen. The data are analyzed off-line to calculate several eye movement parameters: gain, eye movement delay with respect to the target, maximum speed, and the number of saccades exhibited during pursuit.
Results: Compared to controls, patients showed higher delay and lower gain values.
Conclusions: The findings confirm the adequacy of this method for detecting eye movement differences between psychiatric patients and controls in a smooth pursuit task.
Disclosure of interest: The authors have not supplied their declaration of competing interest.
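For readers who want to reproduce the off-line analysis, the sketch below shows one plausible way to compute the reported parameters (gain, delay, peak speed, saccade count) from a 60 Hz horizontal eye trace and the sinusoidal target trace. The function name and the 80°/s saccade threshold are illustrative assumptions, not part of the authors' system.

```python
import numpy as np

FS = 60.0  # sampling rate of the tracker (Hz), as reported in the abstract

def pursuit_metrics(eye_x, target_x, fs=FS, saccade_thresh=80.0):
    """Illustrative pursuit parameters from horizontal eye/target position (deg).

    Returns (gain, delay in s, peak smooth eye speed in deg/s, saccade count).
    Assumes a sinusoidally moving target, as in the task described above.
    """
    t = np.arange(len(eye_x)) / fs
    eye_v = np.gradient(np.asarray(eye_x, float), t)     # eye velocity (deg/s)
    tgt_v = np.gradient(np.asarray(target_x, float), t)  # target velocity (deg/s)

    # Delay: the lag that maximises the cross-correlation of the two velocity
    # traces (positive values mean the eye lags the target).
    lags = np.arange(-len(t) + 1, len(t))
    xcorr = np.correlate(eye_v - eye_v.mean(), tgt_v - tgt_v.mean(), mode="full")
    delay = lags[np.argmax(xcorr)] / fs

    # Crude saccade detection: samples where eye speed exceeds a fixed threshold.
    is_saccade = np.abs(eye_v) > saccade_thresh
    n_saccades = int(np.count_nonzero(np.diff(is_saccade.astype(int)) == 1))

    # Gain: smooth (saccade-free) eye speed relative to target speed,
    # ignoring samples where the target is nearly stationary.
    smooth = ~is_saccade & (np.abs(tgt_v) > 1.0)
    gain = float(np.median(np.abs(eye_v[smooth]) / np.abs(tgt_v[smooth])))

    peak_speed = float(np.max(np.abs(eye_v[~is_saccade])))
    return gain, delay, peak_speed, n_saccades
```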

2009 ◽  
Vol 101 (2) ◽  
pp. 934-947 ◽  
Author(s):  
Masafumi Ohki ◽  
Hiromasa Kitazawa ◽  
Takahito Hiramatsu ◽  
Kimitake Kaga ◽  
Taiko Kitamura ◽  
...  

The anatomical connection between the frontal eye field and the cerebellar hemispheric lobule VII (H-VII) suggests a potential role of the hemisphere in voluntary eye movement control. To reveal the involvement of the hemisphere in smooth pursuit and saccade control, we made a unilateral lesion around H-VII and examined its effects in three Macaca fuscata trained to visually pursue a small target. In response to step (3°)-ramp (5–20°/s) target motion, the monkeys usually showed an initial pursuit eye movement at a latency of 80–140 ms and a small catch-up saccade at 140–220 ms, followed by a postsaccadic pursuit eye movement that roughly matched the ramp target velocity. After unilateral cerebellar hemispheric lesioning, the initial pursuit eye movements were impaired, and the velocities of the postsaccadic pursuit eye movements decreased. The onsets of 5° visually guided saccades to the stationary target were delayed, and their amplitudes showed a tendency toward increased trial-to-trial variability but never became hypo- or hypermetric. Similar tendencies were observed in the onsets and amplitudes of catch-up saccades. The adaptation of open-loop smooth pursuit velocity, tested by a step increase in target velocity for a brief period, was impaired. These lesion effects were observed in all directions, but particularly in the ipsiversive direction. Some of these effects had recovered by 4 wk postlesion. These results suggest that the cerebellar hemispheric region around lobule VII is involved in the control of smooth pursuit and saccadic eye movements.
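For orientation, the step-ramp target motion described above can be written out as a short trajectory generator. The opposite-direction step (the classic Rashbass arrangement) and the timing resolution are assumptions for illustration; the 3° step and 5–20°/s ramp speeds come from the abstract.

```python
import numpy as np

def step_ramp_target(step_deg=3.0, ramp_deg_per_s=15.0, fs=1000.0, duration=1.0):
    """Horizontal target position (deg) for one step-ramp trial.

    At t = 0 the target steps by `step_deg` opposite to the ramp direction and
    then moves at constant `ramp_deg_per_s`, so it re-crosses the starting
    position after step_deg / ramp_deg_per_s seconds (200 ms for 3 deg, 15 deg/s).
    """
    t = np.arange(0.0, duration, 1.0 / fs)
    return -step_deg + ramp_deg_per_s * t
```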


2017 ◽  
Vol 118 (2) ◽  
pp. 986-1001 ◽  
Author(s):  
Ramanujan T. Raghavan ◽  
Stephen G. Lisberger

The midline oculomotor cerebellum plays a different role in smooth pursuit eye movements compared with the lateral, floccular complex and appears to be much less involved in direction learning in pursuit. The output from the oculomotor vermis during pursuit lies along a null-axis for saccades and vice versa. Thus the vermis can play independent roles in the two kinds of eye movement.
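The "null-axis" idea, that vermal output modulation during pursuit projects onto a direction of population activity that a saccade read-out ignores, can be illustrated with a toy linear read-out. The vectors and numbers below are arbitrary assumptions, not fitted to the recorded data.

```python
import numpy as np

# Toy population of three neurons with two orthogonal linear read-out directions.
w_pursuit = np.array([1.0, 1.0, 0.0]) / np.sqrt(2)   # assumed pursuit read-out
w_saccade = np.array([1.0, -1.0, 0.0]) / np.sqrt(2)  # assumed saccade read-out (orthogonal)

baseline = np.array([5.0, 2.0, 7.0])                 # arbitrary baseline rates
pursuit_modulation = 3.0 * w_pursuit                 # activity change during pursuit

# Because the pursuit-related change lies along a null axis of the saccade
# read-out, the saccade command is unaffected by pursuit-related modulation.
before = w_saccade @ baseline
during = w_saccade @ (baseline + pursuit_modulation)
print(np.isclose(before, during))  # True
```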


1997 ◽  
Vol 14 (2) ◽  
pp. 323-338 ◽  
Author(s):  
Vincent P. Ferrera ◽  
Stephen G. Lisberger

As a step toward understanding the mechanism by which targets are selected for smooth-pursuit eye movements, we examined the behavior of the pursuit system when monkeys were presented with two discrete moving visual targets. Two rhesus monkeys were trained to select a small moving target identified by its color in the presence of a moving distractor of another color. Smooth-pursuit eye movements were quantified in terms of the latency of the eye movement and the initial eye acceleration profile. We have previously shown that the latency of smooth pursuit, which is normally around 100 ms, can be extended to 150 ms or shortened to 85 ms depending on whether there is a distractor moving in the opposite or same direction, respectively, relative to the direction of the target. We have now measured this effect for a 360 deg range of distractor directions, and distractor speeds of 5–45 deg/s. We have also examined the effect of varying the spatial separation and temporal asynchrony between target and distractor. The results indicate that the effect of the distractor on the latency of pursuit depends on its direction of motion, and its spatial and temporal proximity to the target, but depends very little on the speed of the distractor. Furthermore, under the conditions of these experiments, the direction of the eye movement that is emitted in response to two competing moving stimuli is not a vectorial combination of the stimulus motions, but is solely determined by the direction of the target. The results are consistent with a competitive model for smooth-pursuit target selection and suggest that the competition takes place at a stage of the pursuit pathway that is between visual-motion processing and motor-response preparation.
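The contrast between a vector-averaging read-out and the competitive outcome reported here can be made concrete with a small sketch; it is a schematic of the two candidate predictions, not the authors' model.

```python
import numpy as np

def vector_average(target_vel, distractor_vel):
    """Prediction if the two stimulus motions were pooled vectorially."""
    return 0.5 * (np.asarray(target_vel, float) + np.asarray(distractor_vel, float))

def winner_take_all(target_vel, distractor_vel):
    """Prediction if selection is competitive and the target wins outright,
    consistent with the finding that pursuit direction follows the target."""
    return np.asarray(target_vel, float)

# Example: target moves rightward at 15 deg/s, distractor upward at 15 deg/s.
target, distractor = np.array([15.0, 0.0]), np.array([0.0, 15.0])
print(vector_average(target, distractor))   # [7.5 7.5]  -> oblique, 45 deg off target
print(winner_take_all(target, distractor))  # [15.  0.]  -> target direction, as observed
```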


2005 ◽  
Vol 93 (6) ◽  
pp. 3418-3433 ◽  
Author(s):  
Hui Meng ◽  
Andrea M. Green ◽  
J. David Dickman ◽  
Dora E. Angelaki

Under natural conditions, the vestibular and pursuit systems work synergistically to stabilize the visual scene during movement. How translational vestibular signals [translational vestibuloocular reflex (TVOR)] are processed in the premotor pathways for slow eye movements continues to remain a challenging question. To further our understanding of how premotor neurons contribute to this processing, we recorded neural activities from the prepositus and rostral medial vestibular nuclei in macaque monkeys. Vestibular neurons were tested during 0.5-Hz rotation and lateral translation (both with gaze stable and during VOR cancellation tasks), as well as during smooth pursuit eye movements. Data were collected at two different viewing distances, 80 and 20 cm. Based on their responses to rotation and pursuit, eye-movement–sensitive neurons were classified into position–vestibular–pause (PVP) neurons, eye–head (EH) neurons, and burst–tonic (BT) cells. We found that approximately half of the type II PVP and EH neurons with ipsilateral eye movement preference were modulated during TVOR cancellation. In contrast, few of the EH and none of the type I PVP cells with contralateral eye movement preference modulated during translation in the absence of eye movements; nor did any of the BT neurons change their firing rates during TVOR cancellation. Of the type II PVP and EH neurons that modulated during TVOR cancellation, cell firing rates increased for either ipsilateral or contralateral displacement, a property that could not be predicted on the basis of their rotational or pursuit responses. In contrast, under stable gaze conditions, all neuron types, including EH cells, were modulated during translation according to their ipsilateral/contralateral preference for pursuit eye movements. Differences in translational response sensitivities for far versus near targets were seen only in type II PVP and EH cells. There was no effect of viewing distance on response phase for any cell type. When expressed relative to motor output, neural sensitivities during translation (although not during rotation) and pursuit were equivalent, particularly for the 20-cm viewing distance. These results suggest that neural activities during the TVOR were more motorlike compared with cell responses during the rotational vestibuloocular reflex (RVOR). We also found that neural responses under stable gaze conditions could not always be predicted by a linear vectorial addition of the cell activities during pursuit and VOR cancellation. The departure from linearity was more pronounced for the TVOR under near-viewing conditions. These results extend previous observations for the neural processing of otolith signals within the premotor circuitry that generates the RVOR and smooth pursuit eye movements.
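The test of "linear vectorial addition" mentioned at the end of the abstract can be expressed compactly if each 0.5-Hz sinusoidal modulation is summarized as a complex gain (amplitude and phase). The specific numbers below are placeholders for a hypothetical neuron, not recorded values.

```python
import numpy as np

def complex_response(gain, phase_deg):
    """A sinusoidal firing-rate modulation as a single complex number
    (gain in spikes/s per unit stimulus, phase in degrees)."""
    return gain * np.exp(1j * np.deg2rad(phase_deg))

# Placeholder responses of one hypothetical neuron at 0.5 Hz:
pursuit    = complex_response(40.0, 10.0)   # during smooth pursuit
vor_cancel = complex_response(25.0, -30.0)  # during TVOR cancellation
measured   = complex_response(50.0, -5.0)   # during translation with gaze stable

# Linear prediction: the stable-gaze response should equal the sum of the
# eye-movement-related and translation-related components.
predicted = pursuit + vor_cancel
departure_from_linearity = abs(measured - predicted)
print(abs(predicted), np.rad2deg(np.angle(predicted)), departure_from_linearity)
```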


1986 ◽  
Vol 25 (01) ◽  
pp. 31-34 ◽  
Author(s):  
M. Juhola ◽  
K. Virtanen ◽  
M. Helin ◽  
V. Jäntti ◽  
P. Nurkkanen ◽  
...  

A visual stimulator system for studies of eye movements has been developed. The system is controlled by an inexpensive microcomputer. It is employed for otoneurological studies both in clinical work and in research, but can also be applied to studies of eye movements in other medical areas. Three types of eye movements are elicited: saccadic and smooth pursuit eye movements, and optokinetic nystagmus. The stimulator system can be connected to another computer for analysis of the eye movements.
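A modern re-implementation of the three stimulus types would look roughly like the following; the waveform parameters are arbitrary examples rather than the settings of the original microcomputer-controlled system.

```python
import numpy as np

def saccadic_stimulus(fs=1000.0, duration=10.0, amplitude_deg=20.0, seed=0):
    """Target steps to random horizontal positions at random intervals."""
    rng = np.random.default_rng(seed)
    n = int(fs * duration)
    pos = np.zeros(n)
    i = 0
    while i < n:
        hold = int(rng.uniform(0.5, 2.0) * fs)               # dwell time per position
        pos[i:i + hold] = rng.uniform(-amplitude_deg, amplitude_deg)
        i += hold
    return pos

def pursuit_stimulus(fs=1000.0, duration=10.0, amplitude_deg=15.0, freq_hz=0.4):
    """Sinusoidal smooth-pursuit target."""
    t = np.arange(0.0, duration, 1.0 / fs)
    return amplitude_deg * np.sin(2.0 * np.pi * freq_hz * t)

def optokinetic_stimulus(fs=1000.0, duration=10.0, speed_deg_per_s=30.0):
    """Full-field pattern drifting at constant velocity (position ramp)."""
    t = np.arange(0.0, duration, 1.0 / fs)
    return speed_deg_per_s * t
```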


Perception ◽  
10.1068/p3411 ◽  
2002 ◽  
Vol 31 (10) ◽  
pp. 1195-1203 ◽  
Author(s):  
Gerben Rotman ◽  
Eli Brenner ◽  
Jeroen B J Smeets

Human subjects misjudge the position of a target that is flashed during a pursuit eye movement. Their judgments are biased in the direction in which the eyes are moving. We investigated whether this bias can be reduced by making the appearance of the flash more predictable. In the normal condition, subjects pursued a moving target that flashed somewhere along its trajectory. After the presentation, they indicated where they had seen the flash. The mislocalisations in this condition were compared to mislocalisations in conditions in which the subjects were given information about when or where the flash would come. This information consisted of giving two warning flashes spaced at equal intervals before the target flash, of giving two warning beeps spaced at equal intervals before the target flash, or of showing the same stimulus twice. Showing the same stimulus twice significantly reduced the mislocalisation. The other conditions did not. We interpret this as indicating that it is not predictability as such that influences the performance, but the fact that the target appears at a spatially cued position. This was supported by a second experiment, in which we examined whether subjects make smaller mis-judgments when they have to determine the distance between a target flashed during pursuit and a reference seen previously, than when they have to determine the distance between the flashed target and a reference seen afterwards. This was indeed the case, presumably because the reference provided a spatial cue for the flash when it was presented first. We conclude that a spatial cue reduces the mislocalisation of targets that are flashed during pursuit eye movements. The cue does not have to be exactly at the same position as the flash.


1989 ◽  
Vol 61 (6) ◽  
pp. 1207-1220 ◽  
Author(s):  
M. J. Mustari ◽  
A. F. Fuchs

1. To determine the potential role of the primate accessory optic system (AOS) in optokinetic and smooth-pursuit eye movements, we recorded the activity of 110 single units in a subdivision of the AOS, the lateral terminal nucleus (LTN), in five alert rhesus macaques. All monkeys were trained to fixate a stationary target spot during visual testing and to track a small spot moving in a variety of visual environments. 2. LTN units formed a continuum of types ranging from purely visual to purely oculomotor. Visual units (50%) responded best for large-field (70 x 50 degrees), moving visual stimuli and had no response associated with smooth-pursuit eye movement; some responded during smooth pursuit in the dark, but the response disappeared if the target was briefly extinguished, indicating that their smooth-pursuit-related response reflected activation of a parafoveal receptive field. Eye movement and visual units (36%) responded both for large, moving visual stimuli and during smooth-pursuit eye movements made in the dark. Eye movement units (14%) discharged during smooth-pursuit or other eye movements but showed no evidence of visual sensitivity. 3. Essentially all (98%) LTN units were direction selective, responding preferentially during vertical background and/or smooth-pursuit movement. The vast majority (88%) preferred upward background and/or eye movement. During periodic movement of the large-field visual background while the animal fixated, their firing rates were modulated above and below rather high resting rates. Although LTN units typically responded best to movement of large-field stimuli, some also responded well to small moving stimuli (0.25 degrees diam). 4. LTN units could be separated into two populations according to their dependence on visual stimulus velocity. For periodic triangle wave stimuli, both types had velocity thresholds less than 3 degrees/s. As stimulus velocity increased above threshold, the activity of one type reached peak firing rates over a very narrow velocity range and remained nearly at peak firing for velocities from approximately 4-80 degrees/s. The firing rates of the other type exhibited velocity tuning in which the firing rate peaked at an average preferred velocity of 13 degrees/s and decreased for higher velocities. 5. A close examination of firing rates to sinusoidal background stimuli revealed that both unit types exhibited unusual behaviors at the extremes of stimulus velocity.(ABSTRACT TRUNCATED AT 400 WORDS)
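Point 4 describes two velocity dependences; an assumed pair of tuning functions (one saturating above roughly 4°/s, one peaking near 13°/s) illustrates the distinction. The functional forms and rates are illustrative guesses, not fits to the recorded units.

```python
import numpy as np

def saturating_unit(velocity, threshold=3.0, saturation=4.0, peak_rate=80.0):
    """Rate rises steeply above ~3 deg/s and stays near peak from ~4-80 deg/s."""
    v = np.asarray(velocity, float)
    return peak_rate * np.clip((v - threshold) / (saturation - threshold), 0.0, 1.0)

def tuned_unit(velocity, preferred=13.0, width_octaves=1.0, peak_rate=80.0):
    """Rate peaks near 13 deg/s and falls off for higher velocities
    (a log-Gaussian shape, assumed purely for illustration)."""
    v = np.maximum(np.asarray(velocity, float), 1e-6)
    return peak_rate * np.exp(-0.5 * (np.log2(v / preferred) / width_octaves) ** 2)

velocities = np.array([1.0, 3.0, 13.0, 40.0, 80.0])
print(saturating_unit(velocities))  # plateaus once above ~4 deg/s
print(tuned_unit(velocities))       # peaks at 13 deg/s, declines at higher speeds
```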


2001 ◽  
Vol 85 (5) ◽  
pp. 1914-1922 ◽  
Author(s):  
Robert J. van Beers ◽  
Daniel M. Wolpert ◽  
Patrick Haggard

To localize a seen object, the CNS has to integrate the object's retinal location with the direction of gaze. Here we investigate this process by examining the localization of static objects during smooth pursuit eye movements. The normally experienced stability of the visual world during smooth pursuit suggests that the CNS essentially compensates for the eye movement when judging target locations. However, certain systematic localization errors are made, and we use these to study the process of sensorimotor integration. During an eye movement, a static object's image moves across the retina. Objects that produce retinal slip are known to be mislocalized: objects moving toward the fovea are seen too far along their trajectory, whereas errors are much smaller for objects moving away from the fovea. These effects are usually studied by localizing the moving object relative to a briefly flashed one during fixation: moving objects are then mislocalized, but flashes are not. In our first experiment, we found that a similar differential mislocalization occurs for static objects relative to flashes during pursuit. This effect is not specific to horizontal pursuit but was also found in other directions. In a second experiment, we examined how this effect generalizes to positions outside the line of eye movement. Large localization errors were found in the entire hemifield ahead of the pursuit target and were predominantly aligned with the direction of eye movement. In a third experiment, we determined whether it is the flash or the static object that is mislocalized ahead of the pursuit target. In contrast to fixation conditions, we found that during pursuit it is the flash, not the static object, that is mislocalized. In a fourth experiment, we used egocentric localization to confirm this result. Our results suggest that the CNS compensates for the retinal localization errors to maintain position constancy for static objects during pursuit. This compensation is achieved in the process of sensorimotor integration of retinal and gaze signals: different retinal areas are integrated with different gaze signals to guarantee the stability of the visual world.
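The integration step in the first sentence can be written as a one-line rule, perceived position = gaze direction + (compensated) retinal position. The sketch below is a deliberately simplified 1-D illustration with made-up numbers.

```python
def perceived_position(retinal_pos_deg, gaze_dir_deg, compensation=1.0):
    """1-D position constancy: combine the retinal location of an object with
    the direction of gaze. compensation = 1.0 keeps the visual world stable;
    values below 1.0 yield systematic errors of the kind analysed above."""
    return gaze_dir_deg + compensation * retinal_pos_deg

# A static object at 5 deg while gaze points at 12 deg during pursuit:
# the object's retinal position is 5 - 12 = -7 deg.
print(perceived_position(-7.0, 12.0))                    # 5.0 deg: veridical
print(perceived_position(-7.0, 12.0, compensation=0.9))  # 5.7 deg: mislocalized
```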


2020 ◽  
Author(s):  
Xiuyun Wu ◽  
Austin C. Rothwell ◽  
Miriam Spering ◽  
Anna Montagnini

Smooth pursuit eye movements and visual motion perception rely on the integration of current sensory signals with past experience. Experience shapes our expectation of current visual events and can drive eye movement responses made in anticipation of a target, such as anticipatory pursuit. Previous research revealed consistent effects of expectation on anticipatory pursuit (eye movements follow the expected target direction or speed) and contrasting effects on motion perception, but most studies considered either eye movement or perceptual responses. The current study directly compared effects of direction expectation on perception and anticipatory pursuit within the same direction discrimination task to investigate whether both types of responses are affected similarly or differently. Observers (n = 10) viewed high-coherence random-dot kinematograms (RDKs) moving rightward and leftward with a probability of 50, 70, or 90% in a given block of trials to build up an expectation of motion direction. They were asked to judge the motion direction of interleaved low-coherence RDKs (0–15%). Perceptual judgments were compared to changes in anticipatory pursuit eye movements as a function of probability. Results show that anticipatory pursuit velocity scaled with probability and followed direction expectation (attraction bias), whereas perceptual judgments were biased opposite to direction expectation (repulsion bias). Control experiments suggest that the repulsion bias in perception was not caused by retinal slip induced by anticipatory pursuit, or by motion adaptation. We conclude that direction expectation can be processed differently for perception and anticipatory pursuit.
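Two simple summary measures capture the reported attraction and repulsion biases: mean anticipatory pursuit velocity per block probability, and the proportion of "rightward" reports on zero-coherence trials per block probability. The code below is a minimal sketch operating on hypothetical trial arrays; the variable names are not from the study.

```python
import numpy as np

def anticipatory_velocity_by_probability(velocities, probabilities):
    """Mean anticipatory pursuit velocity (deg/s, rightward positive) for each
    block probability of rightward motion; an attraction bias shows up as
    velocity increasing with probability."""
    velocities = np.asarray(velocities, float)
    probabilities = np.asarray(probabilities)
    return {p: velocities[probabilities == p].mean() for p in np.unique(probabilities)}

def perceptual_bias_at_zero_coherence(chose_right, probabilities, signed_coherence):
    """Proportion of 'rightward' reports on 0%-coherence trials per block
    probability; values below 0.5 in a rightward-biased block correspond to
    the repulsion bias described above."""
    chose_right = np.asarray(chose_right, float)
    probabilities = np.asarray(probabilities)
    zero = np.asarray(signed_coherence) == 0
    return {p: chose_right[zero & (probabilities == p)].mean()
            for p in np.unique(probabilities)}
```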

