How the Brain Moves the Eyes

Author(s):  
Shirley H. Wray

This chapter discusses the brain's visual architecture for directing and controlling eye movements: the striate, frontal, and parietal cortical areas, and the eye movements themselves (saccades, smooth pursuit, and vergence). The susceptibility of these systems to disorders is illustrated in four detailed cases that follow disease progression from initial symptoms and signs to diagnosis and treatment. The case studies and video displays include a patient with Pick's disease (frontotemporal dementia), another with Alzheimer's dementia, and two examples of rare saccadic syndromes: one a patient with the slow saccade syndrome due to progressive supranuclear palsy and one with selective saccadic palsy following cardiac surgery.

2004, Vol. 91 (2), pp. 591–603
Author(s):  
Richard J. Krauzlis

Primates use a combination of smooth pursuit and saccadic eye movements to stabilize the retinal image of selected objects within the high-acuity region near the fovea. Pursuit has traditionally been viewed as a relatively automatic behavior, driven by visual motion signals and mediated by pathways that connect visual areas in the cerebral cortex to motor regions in the cerebellum. However, recent findings indicate that this view needs to be reconsidered. Rather than being controlled primarily by areas in extrastriate cortex specialized for processing visual motion, pursuit involves an extended network of cortical areas, and, of these, the pursuit-related region in the frontal eye fields appears to exert the most direct influence. The traditional pathways through the cerebellum are important, but there are also newly identified routes involving structures previously associated with the control of saccades, including the basal ganglia, the superior colliculus, and nuclei in the brain stem reticular formation. These recent findings suggest that the pursuit system has a functional architecture very similar to that of the saccadic system. This viewpoint provides a new perspective on the processing steps that occur as descending control signals interact with circuits in the brain stem and cerebellum responsible for gating and executing voluntary eye movements. Although the traditional view describes pursuit and saccades as two distinct neural systems, it may be more accurate to consider the two movements as different outcomes from a shared cascade of sensory–motor functions.
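The closing idea, that pursuit and saccades are two outcomes of a shared cascade of sensory–motor functions, can be sketched with a toy model. The Python below is purely illustrative and not taken from the review: the function name shared_cascade, the pursuit gain, and the saccade threshold are assumptions chosen only to show how one target-selection stage can feed both a velocity (pursuit) controller and a position (saccade) controller, with a downstream gating stage deciding which command is expressed.

```python
import numpy as np

def shared_cascade(target_pos, target_vel, eye_pos,
                   pursuit_gain=0.9, saccade_threshold=1.0):
    """Toy sketch of pursuit and saccades as two outcomes of one shared
    sensory-motor cascade. All parameter names and values are illustrative
    assumptions, not quantities from the review."""
    # Shared selection stage: retinal error signals for the chosen target.
    position_error = target_pos - eye_pos   # drives a catch-up saccade (deg)
    retinal_slip = target_vel                # drives pursuit (deg/s), eye assumed still

    # Gating/execution stage: descending commands routed to brain stem circuits.
    pursuit_command = pursuit_gain * retinal_slip
    saccade_command = position_error if abs(position_error) > saccade_threshold else 0.0
    return pursuit_command, saccade_command

# Example: target 3 deg to the right of the fovea, moving rightward at 10 deg/s.
pursuit, saccade = shared_cascade(target_pos=3.0, target_vel=10.0, eye_pos=0.0)
print(f"pursuit command: {pursuit:.1f} deg/s, catch-up saccade: {saccade:.1f} deg")
```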


2008, Vol. 29 (3), pp. 300–311
Author(s):  
Maja U. Trenner ◽  
Manfred Fahle ◽  
Oliver Fasold ◽  
Hauke R. Heekeren ◽  
Arno Villringer ◽  
...  

2015, Vol. 113 (5), pp. 1377–1399
Author(s):  
T. Scott Murdison ◽  
Guillaume Leclercq ◽  
Philippe Lefèvre ◽  
Gunnar Blohm

Smooth pursuit eye movements are driven by retinal motion and enable us to view moving targets with high acuity. Complicating the generation of these movements is the fact that different eye and head rotations can produce different retinal stimuli yet give rise to identical smooth pursuit trajectories. However, because our eyes accurately pursue targets regardless of eye and head orientation (Blohm G, Lefèvre P. J Neurophysiol 104: 2103–2115, 2010), the brain must somehow take these signals into account. To learn about the neural mechanisms potentially underlying this visual-to-motor transformation, we trained a physiologically inspired neural network model to combine two-dimensional (2D) retinal motion signals with three-dimensional (3D) eye and head orientation and velocity signals to generate a spatially correct 3D pursuit command. We then simulated conditions of 1) head roll-induced ocular counterroll, 2) oblique gaze-induced retinal rotations, 3) eccentric gazes (invoking the half-angle rule), and 4) optokinetic nystagmus to investigate how units in the intermediate layers of the network accounted for different 3D constraints. Simultaneously, we simulated electrophysiological recordings (visual and motor tunings) and microstimulation experiments to quantify the reference frames of signals at each processing stage. We found a gradual retinal-to-intermediate-to-spatial feedforward transformation through the hidden layers. Our model is the first to describe the general 3D transformation for smooth pursuit mediated by eye- and head-dependent gain modulation. Based on several testable experimental predictions, our model provides a mechanism by which the brain could perform the 3D visuomotor transformation for smooth pursuit.
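For a concrete picture of "eye- and head-dependent gain modulation," the following Python sketch shows one minimal way such a feedforward transformation could be written. It is not the authors' model: the layer sizes, the random untrained weights, the tanh nonlinearity, and the toy encoding of eye/head signals are all assumptions, included only to illustrate how hidden units can multiply a retinal-motion drive by an extraretinal gain term before a readout produces a 3D pursuit command.

```python
import numpy as np

rng = np.random.default_rng(0)

n_hidden = 50
retinal_dim = 2          # 2D retinal slip (deg/s)
extraretinal_dim = 12    # toy encoding of 3D eye/head orientation and velocity

W_ret = rng.normal(size=(n_hidden, retinal_dim))        # retinal tuning of hidden units
W_ext = rng.normal(size=(n_hidden, extraretinal_dim))   # gain-field weights
W_out = rng.normal(size=(3, n_hidden)) / n_hidden       # readout to a 3D velocity command

def pursuit_command(retinal_slip, eye_head_signals):
    """Hidden units: retinal drive multiplicatively scaled by an extraretinal gain.
    Untrained random weights; purely illustrative of the architecture, not the
    trained network described in the abstract."""
    retinal_drive = W_ret @ retinal_slip
    gain = 1.0 + np.tanh(W_ext @ eye_head_signals)   # eye/head-dependent gain term
    hidden = np.tanh(gain * retinal_drive)
    return W_out @ hidden                             # 3D eye velocity command

slip = np.array([8.0, 0.0])                    # rightward retinal motion
eye_head = rng.normal(size=extraretinal_dim)   # e.g., a rolled head posture (arbitrary)
print(pursuit_command(slip, eye_head))
```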


2002, Vol. 87 (6), pp. 2700–2714
Author(s):  
Masaki Tanaka ◽  
Stephen G. Lisberger

When monkeys view two targets moving in different directions and are given no cues about which to track, the initiation of smooth pursuit is a vector average of the response evoked by each target singly. In the present experiments, double-target stimuli consisted of two identical targets moving in opposite directions along the preferred axis of pursuit for the neuron under study for 200 ms, followed by the continued motion for 800 ms of one target chosen randomly. Among the neurons that showed directional modulation during pursuit, recordings revealed three groups. The majority (32/60) showed responses that were intermediate to, and statistically different from, the responses to either target presented alone. Another large group (22/60) showed activity that was statistically indistinguishable from the response to the target moving in the preferred (n = 15) or opposite (n = 7) direction of the neuron under study. The minority (6/60) showed statistically higher firing during averaging pursuit than for either target presented singly. We conclude that many pursuit-related neurons in the frontal pursuit area (FPA) carry signals related to the motor output during averaging pursuit, while others encode the motion of one target or the other. Microstimulation with 200-ms trains of pulses at 50 μA while monkeys performed the same double-target tasks biased the averaging eye velocity in the direction of evoked eye movements during fixation. The effect of stimulation was compared with the predictions of three different models that placed the site of vector averaging upstream from, at, or downstream from the sites where the FPA regulates the gain of pursuit. The data were most consistent with a site for pursuit averaging downstream from the gain control, both for double-target stimuli that presented motion in opposite directions and in orthogonal directions. Thus the recording and stimulation data suggest that the FPA is both downstream and upstream from the sites of vector averaging. We resolve this paradox by suggesting that the site of averaging is really downstream from the site of gain control, while feedback of the eye velocity command from the brain stem and/or cerebellum is responsible for the firing of FPA neurons in relation to the averaged eye velocity. We suggest that eye velocity feedback allows FPA neurons to continue firing during accurate tracking, when image motion is small, and that the persistent output from the FPA is necessary to keep the internal gain of pursuit high and permit accurate pursuit.
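The vector-averaging behavior described in the opening sentence is easy to state numerically. The short Python example below is a worked illustration only; the target velocities are invented values, not measurements from the recordings. It covers the two double-target configurations mentioned above: opposite-direction motion along the preferred axis, where averaging predicts little net eye velocity at pursuit initiation, and orthogonal motion, where it predicts an oblique initial trajectory.

```python
import numpy as np

def vector_average(response_a, response_b):
    """Predicted pursuit initiation for a double-target trial: the vector
    average of the responses each target would evoke when presented alone."""
    return 0.5 * (np.asarray(response_a) + np.asarray(response_b))

# Opposite directions along the preferred axis: average is near zero.
print(vector_average([10.0, 0.0], [-10.0, 0.0]))   # -> [0. 0.]

# Orthogonal directions: average predicts an oblique initiation of pursuit.
print(vector_average([10.0, 0.0], [0.0, 10.0]))    # -> [5. 5.]
```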


Author(s):  
Joshua May

Empirical research apparently suggests that emotions play an integral role in moral judgment. The evidence for sentimentalism is diverse, but it is rather weak and has generally been overblown. There is no evidence that our moral concepts themselves are partly composed of or necessarily dependent on emotions. While the moral/conventional distinction may partly characterize the essence of moral judgment, moral norms needn’t be backed by affect in order to transcend convention. Priming people with incidental emotions like disgust doesn’t make them moralize actions. Finally, moral judgment can only be somewhat impaired by damage to areas of the brain that are generally associated with emotional processing (as in acquired sociopathy and frontotemporal dementia). While psychopaths exhibit both emotional and rational deficits, the latter alone can explain any minor defects in moral cognition.

