Dynamic Control of Primary Eye Position as a Function of Head Orientation Relative to Gravity

1999 ◽  
Vol 81 (1) ◽  
pp. 394-398 ◽  
Author(s):  
Bernhard J. M. Hess ◽  
Dora E. Angelaki

Hess, Bernhard J. M. and Dora E. Angelaki. Oculomotor control of primary eye position discriminates between translation and tilt. J. Neurophysiol. 81: 394–398, 1999. We have previously shown that fast phase axis orientation and primary eye position in rhesus monkeys are dynamically controlled by otolith signals during head rotations that involve a reorientation of the head relative to gravity. Because of the inherent ambiguity associated with primary otolith afferent coding of linear accelerations during head translation and tilts, a similar organization might also underlie the vestibulo-ocular reflex (VOR) during translation. The ability of the oculomotor system to correctly distinguish translational accelerations from gravity in the dynamic control of primary eye position has been investigated here by comparing the eye movements elicited by sinusoidal lateral and fore-aft oscillations (0.5 Hz ± 40 cm, equivalent to ± 0.4 g) with those during yaw rotations (180°/s) about an off-vertical axis tilted 23.6° from the earth-vertical. We found a significant modulation of primary eye position as a function of linear acceleration (gravity) during rotation but not during lateral and fore-aft translation. This modulation was enhanced during the initial phase of rotation when there was concomitant semicircular canal input. These findings suggest that control of primary eye position and fast phase axis orientation in the VOR are based on central vestibular mechanisms that discriminate between gravity and translational head acceleration.
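The translation and rotation conditions quoted in this abstract were matched in the peak linear acceleration they deliver to the otoliths. A quick arithmetic check (my own sketch, not taken from the paper) confirms the stated ±0.4 g equivalence:

```python
import math

# Peak linear acceleration of a sinusoidal translation:
# x(t) = A*sin(2*pi*f*t)  =>  a_peak = (2*pi*f)**2 * A
f = 0.5          # oscillation frequency, Hz (from the abstract)
A = 0.40         # displacement amplitude, m (±40 cm)
g = 9.81         # standard gravity, m/s^2

a_peak = (2 * math.pi * f) ** 2 * A
print(f"peak translational acceleration: {a_peak:.2f} m/s^2 = {a_peak / g:.2f} g")
# -> 3.95 m/s^2 = 0.40 g, matching the stated ±0.4 g

# Gravity component orthogonal to a yaw axis tilted 23.6 deg from vertical:
tilt = math.radians(23.6)
print(f"tilt-induced gravitational shear: {math.sin(tilt):.2f} g")
# -> ~0.40 g, i.e. the off-vertical-axis rotation presents the otoliths with
#    a gravito-inertial modulation comparable to the translation condition
```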


1997 ◽  
Vol 78 (4) ◽  
pp. 2203-2216 ◽  
Author(s):  
Bernhard J. M. Hess ◽  
Dora E. Angelaki

Hess, Bernhard J. M. and Dora E. Angelaki. Kinematic principles of primate rotational vestibulo-ocular reflex. II. Gravity-dependent modulation of primary eye position. J. Neurophysiol. 78: 2203–2216, 1997. The kinematic constraints of three-dimensional eye positions were investigated in rhesus monkeys during passive head and body rotations relative to gravity. We studied fast and slow phase components of the vestibulo-ocular reflex (VOR) elicited by constant-velocity yaw rotations and sinusoidal oscillations about an earth-horizontal axis. We found that the spatial orientation of both fast and slow phase eye positions could be described locally by a planar surface with torsional variation of <2.0 ± 0.4° (displacement planes) that systematically rotated and/or shifted relative to Listing's plane. In supine/prone positions, displacement planes pitched forward/backward; in left/right ear-down positions, displacement planes were parallel shifted along the positive/negative torsional axis. Dynamically changing primary eye positions were computed from displacement planes. Torsional and vertical components of primary eye position modulated as a sinusoidal function of head orientation in space. The torsional component was maximal in ear-down positions and approximately zero in supine/prone orientations. The opposite was observed for the vertical component. Modulation of the horizontal component of primary eye position exhibited a more complex dependence. In contrast to the torsional component, which was relatively independent of rotational speed, modulation of the vertical and horizontal components of primary position depended strongly on the speed of head rotation (i.e., on the frequency of oscillation of the gravity vector component): the faster the head rotated relative to gravity, the larger was the modulation. Corresponding results were obtained when a model based on a sinusoidal dependence of instantaneous displacement planes (and primary eye position) on head orientation relative to gravity was fitted to VOR fast phase positions. When VOR fast phase positions were expressed relative to primary eye position estimated from the model fits, they were confined approximately to a single plane with a small torsional standard deviation (∼1.4–2.6°). This reduced torsional variation was in contrast to the large torsional spread (well >10–15°) of fast phase positions when expressed relative to Listing's plane. We conclude that primary eye position depends dynamically on head orientation relative to space rather than being fixed to the head. It defines a gravity-dependent coordinate system relative to which the torsional variability of eye positions is minimized even when the head is moved passively and vestibulo-ocular reflexes are evoked. In this general sense, Listing's law is preserved with respect to an otolith-controlled reference system that is defined dynamically by gravity.
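For a concrete picture of the dependence described above, the sketch below encodes the reported sinusoidal modulation of the torsional and vertical components of primary eye position with head orientation relative to gravity. The amplitudes and the angle convention (rho = 0 taken as supine) are illustrative assumptions of mine, not the paper's fitted parameters:

```python
import numpy as np

def primary_eye_position(rho_deg, torsion_amp=5.0, vertical_amp=3.0):
    """Return (torsion, vertical) components of primary eye position, in deg.

    rho_deg: head rotation angle about the earth-horizontal yaw axis,
    with 0 deg assumed to be supine (hypothetical convention).
    """
    rho = np.radians(np.asarray(rho_deg, dtype=float))
    torsion = torsion_amp * np.sin(rho)    # maximal ear-down, ~0 supine/prone
    vertical = vertical_amp * np.cos(rho)  # opposite dependence on orientation
    return torsion, vertical

# Per the abstract, the torsional amplitude is roughly independent of rotation
# speed, whereas the vertical (and horizontal) amplitudes grow with the speed
# of head rotation, i.e. with the frequency of the gravity-vector oscillation.
print(primary_eye_position([0, 90, 180, 270]))
```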


i-Perception ◽  
10.1068/if719 ◽  
2012 ◽  
Vol 3 (9) ◽  
pp. 719-719 ◽  
Author(s):  
Ryoichi Nakashima ◽  
Yu Fang ◽  
Kazumichi Matsumiya ◽  
Rumi Tokunaga ◽  
Ichiro Kuriki ◽  
...  

2015 ◽  
Vol 3 (2) ◽  
pp. 149-154 ◽  
Author(s):  
Yu Fang ◽  
Masaki Emoto ◽  
Ryoichi Nakashima ◽  
Kazumichi Matsumiya ◽  
Ichiro Kuriki ◽  
...  

Perception ◽  
1997 ◽  
Vol 26 (1_suppl) ◽  
pp. 237-237 ◽  
Author(s):  
J Li ◽  
M M Cohen ◽  
C W DeRoshia ◽  
L T Guzy

Perceived eye position and/or the perceived location of visual targets are altered when the orientation of the surrounding visual environment (Cohen et al., 1995 Perception & Psychophysics 57 433) or that of the observer (Cohen and Guzy, 1995 Aviation, Space, and Environmental Medicine 66 505) is changed. Fourteen subjects used biteboards as they lay on a rotary bed that was oriented head-down −15°, −7.5°, supine, head-up +7.5°, and +15°. In the dark, subjects directed their gaze and set a target to the apparent zenith (exocentric location); they also gazed at a subjective ‘straight ahead’ position with respect to their head (egocentric location). Angular deviations of target settings and changes in vertical eye position were recorded with an ISCAN infrared tracking system. Results indicated that, for exocentric locations, the eyes deviated systematically from the true zenith. The gain for compensating changes in head orientation was 0.69 for gaze direction and 0.73 for target settings. In contrast, ‘straight ahead’ eye positions were not significantly affected by changes in the subject's orientation. We conclude that subjects make systematic errors when directing their gaze to an exocentric location in near-supine positions. This suggests a systematic bias in the integration of extra-ocular signals with information regarding head orientation. The bias may result from underestimating changes in the orientation of the head in space. In contrast, for egocentric locations, where head orientation information can potentially be discarded, gaze directions were unaffected by head orientation near supine.
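The compensation gains reported above (0.69 for gaze direction, 0.73 for target settings) can be read as the slope relating the compensatory change in eye position (or target setting) to the imposed change in head orientation. The snippet below shows that computation on made-up numbers, purely for illustration; the data are not from the study:

```python
import numpy as np

# Hypothetical illustration of estimating a zenith-compensation gain:
# regress the compensatory gaze shift against the imposed head tilt.
head_tilt = np.array([-15.0, -7.5, 0.0, 7.5, 15.0])   # deg, head-down to head-up
gaze_shift = np.array([10.4, 5.2, 0.0, -5.1, -10.3])  # deg, compensatory eye rotation

gain = -np.polyfit(head_tilt, gaze_shift, 1)[0]        # slope, sign-flipped
print(f"compensation gain ≈ {gain:.2f} (1.0 = perfect zenith compensation)")
```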


2003 ◽  
Vol 12 (5-6) ◽  
pp. 211-221 ◽  
Author(s):  
Mark Shelhamer ◽  
Richard A. Clendaniel ◽  
Dale C. Roberts

Previous studies established that vestibular reflexes can have two adapted states (e.g., gains) simultaneously, and that a context cue (e.g., vertical eye position) can switch between the two states. Our earlier work demonstrated this phenomenon of context-specific adaptation for saccadic eye movements: we asked for gain decrease in one context state and gain increase in another context state, and then determined if a change in the context state would invoke switching between the adapted states. Horizontal and vertical eye position and head orientation could serve, to varying degrees, as cues for switching between two different saccade gains. In the present study, we asked whether gravity magnitude could serve as a context cue: saccade adaptation was performed during parabolic flight, which provides alternating levels of gravitoinertial force (0 g and 1.8 g). Results were less robust than those from ground experiments, but established that different saccade magnitudes could be associated with different gravity levels.
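As a minimal illustration of context-specific adaptation (my own sketch, not the authors' experimental code), two adapted saccade gains can be stored and selected by the prevailing gravitoinertial force level acting as the context cue:

```python
# Hypothetical adapted states: gain-down trained in 0 g, gain-up in 1.8 g.
adapted_gains = {"0g": 0.85, "1.8g": 1.15}

def saccade_amplitude(target_eccentricity_deg, gravity_context):
    """Scale the commanded saccade by the gain associated with the context cue."""
    return adapted_gains[gravity_context] * target_eccentricity_deg

print(saccade_amplitude(10.0, "0g"))    # -> 8.5 deg (gain-down state)
print(saccade_amplitude(10.0, "1.8g"))  # -> 11.5 deg (gain-up state)
```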


2012 ◽  
Vol 12 (9) ◽  
pp. 1248-1248 ◽  
Author(s):  
Y. Fang ◽  
R. Nakashima ◽  
K. Matsumiya ◽  
R. Tokunaga ◽  
I. Kuriki ◽  
...  

2020 ◽  
Author(s):  
Benjamin Cellini ◽  
Wael Salem ◽  
Jean-Michel Mongeau

To guide locomotion, animals control their gaze via movements of their eyes, head, and/or body, but how the nervous system controls gaze during complex motor tasks remains elusive. Notably, eye movements are constrained by anatomical limits, which requires resetting eye position. By studying tethered, flying flies (Drosophila) in a virtual reality flight simulator, we show that ballistic head movements (saccades) reset eye position, are stereotyped and leverage elastic recoil of the neck joint, enabling mechanically assisted redirection of gaze. Head reset saccades were of proprioceptive origin and interrupted smooth movements for as little as 50 ms, enabling punctuated, near-continuous gaze stabilization. Wing saccades were modulated by head orientation, establishing a causal link between neck signals and execution of body saccades. Furthermore, we demonstrate that head movements are gated by behavioral state. We propose a control architecture for biological and bio-inspired active vision systems with limits in sensor range of motion.
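A speculative sketch of the control principle described above, with parameters and dynamics of my own choosing rather than the authors' model: smooth compensatory gaze movements accumulate toward the sensor's range limit and are periodically interrupted by fast resetting saccades, yielding punctuated, near-continuous stabilization.

```python
def stabilize_gaze(scene_motion, limit=30.0, gain=0.9, reset_duration=5):
    """Track scene motion with a range-limited sensor, resetting near the limit."""
    position, reset_timer, trace = 0.0, 0, []
    for motion in scene_motion:
        if reset_timer > 0:                  # ongoing reset "saccade":
            position *= 0.5                  # rapid (elastically assisted) recentering
            reset_timer -= 1
        elif abs(position) > 0.9 * limit:    # anatomical range limit approached
            reset_timer = reset_duration     # trigger a reset saccade
        else:
            position += gain * motion        # smooth compensatory tracking
        trace.append(position)
    return trace

# Example: constant scene drift forces periodic resets of sensor position.
print([round(p, 1) for p in stabilize_gaze([1.0] * 60)][:40])
```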


2018 ◽  
Author(s):  
Arne F. Meyer ◽  
Jasper Poort ◽  
John O’Keefe ◽  
Maneesh Sahani ◽  
Jennifer F. Linden

Breakthroughs in understanding the neural basis of natural behavior require neural recording and intervention to be paired with high-fidelity multimodal behavioral monitoring. An extensive genetic toolkit for neural circuit dissection, and well-developed neural recording technology, make the mouse a powerful model organism for systems neuroscience. However, methods for high-bandwidth acquisition of behavioral signals in mice remain limited to fixed-position cameras and other off-animal devices, complicating the monitoring of animals freely engaged in natural behaviors. Here, we report the development of an ultralight head-mounted camera system combined with head-movement sensors to simultaneously monitor eye position, pupil dilation, whisking, and pinna movements along with head motion in unrestrained, freely behaving mice. The power of the combined technology is demonstrated by observations linking eye position to head orientation; whisking to non-tactile stimulation; and, in electrophysiological experiments, visual cortical activity to volitional head movements.

