Torsional eye movements are facilitated during perception of self-motion

1999 ◽  
Vol 126 (4) ◽  
pp. 495-500 ◽  
Author(s):  
K. V. Thilo ◽  
Thomas Probst ◽  
Adolfo M. Bronstein ◽  
Yatsuji Ito ◽  
Michael A. Gresty

2005 ◽  
Vol 6 (12) ◽  
pp. 966-976 ◽  
Author(s):  
Dora E. Angelaki ◽  
Bernhard J. M. Hess

1998 ◽  
Vol 79 (3) ◽  
pp. 1461-1480 ◽  
Author(s):  
Markus Lappe ◽  
Martin Pekel ◽  
Klaus-Peter Hoffmann

Lappe, Markus, Martin Pekel, and Klaus-Peter Hoffmann. Optokinetic eye movements elicited by radial optic flow in the macaque monkey. J. Neurophysiol. 79: 1461–1480, 1998. We recorded spontaneous eye movements elicited by radial optic flow in three macaque monkeys using the scleral search coil technique. Computer-generated stimuli simulated forward or backward motion of the monkey with respect to a number of small illuminated dots arranged on a virtual ground plane. We wanted to determine whether optokinetic eye movements are induced by radial optic flow stimuli that simulate self-movement, to quantify their parameters, and to consider their effects on the processing of optic flow. A regular pattern of alternating fast and slow eye movements with a frequency of 2 Hz was observed. When we shifted the horizontal position of the focus of expansion (FOE) during simulated forward motion (expansional optic flow), median horizontal eye position also shifted in the same direction, but by a smaller amount; for simulated backward motion (contractional optic flow), median eye position shifted in the opposite direction. We relate this to a change in Schlagfeld typically observed in optokinetic nystagmus. Direction and speed of slow-phase eye movements were compared with the local flow-field motion in gaze direction (the foveal flow). Eye movement direction matched the foveal motion well. Small systematic deviations could be attributed to an integration of the global motion pattern. Eye speed, on average, did not match foveal stimulus speed: the median gain was only ∼0.5–0.6. The gain was always lower for expanding than for contracting stimuli. We analyzed the time course of the eye movement immediately after each saccade and found remarkable differences in the initial development of gain and directional following for expansion and contraction. For expansion, directional following and gain were initially poor and strongly influenced by the eye movement ongoing before the saccade; this was not the case for contraction. These differences can also be linked to properties of the optokinetic system. We conclude that optokinetic eye movements can be elicited by radial optic flow fields simulating self-motion. These eye movements are linked to the parafoveal flow field, i.e., the motion in the direction of gaze. In the retinal projection of the optic flow, such eye movements superimpose additional retinal slip. This results in complex retinal motion patterns, especially because the gain of the eye movement is small and variable. This observation has special relevance for mechanisms that determine self-motion from retinal flow fields. It is necessary to consider the influence of eye movements in optic flow analysis, but our results suggest that the direction and speed of an eye movement should be treated differently.
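The stimulus geometry described above follows the standard pinhole model of radial flow: dots on a ground plane below the eye, translated toward the observer, produce image motion that expands away from the FOE, and the reported gain of ∼0.5 relates slow-phase eye speed to the flow speed at the fovea. A minimal sketch of that geometry (illustrative only, not the authors' stimulus code; all parameter values such as eye height, dot counts, and speeds are assumptions):

```python
import numpy as np

# Illustrative sketch: image motion of ground-plane dots under pure forward
# translation, pinhole model with unit focal length. Parameter values
# (1 m eye height, 1 m/s forward speed, 200 dots) are assumptions.

def translational_flow(points, T):
    """Image position (x, y) and velocity (u, v) of 3-D points in camera
    coordinates (Z > 0) for an observer translating with T = (Tx, Ty, Tz)."""
    X, Y, Z = points.T
    x, y = X / Z, Y / Z                          # image-plane coordinates
    u = (-T[0] + x * T[2]) / Z                   # horizontal image motion
    v = (-T[1] + y * T[2]) / Z                   # vertical image motion
    return np.stack([x, y], axis=1), np.stack([u, v], axis=1)

rng = np.random.default_rng(0)
n = 200
pts = np.stack([rng.uniform(-5, 5, n),           # X: spread left-right
                np.full(n, -1.0),                # Y: ground plane 1 m below eye
                rng.uniform(1, 10, n)],          # Z: depth ahead of observer
               axis=1)
T = np.array([0.0, 0.0, 1.0])                    # forward translation, 1 m/s

img, flow = translational_flow(pts, T)
foe = T[:2] / T[2]                               # focus of expansion, here (0, 0)

# A gain of ~0.5, as measured for expanding stimuli, means slow-phase eye
# speed is about half the flow speed at the fovea (dot nearest gaze direction):
foveal_flow_speed = np.linalg.norm(flow[np.argmin(np.sum(img**2, axis=1))])
eye_speed = 0.5 * foveal_flow_speed
```

Because the translational component scales with 1/Z, nearer ground-plane dots move faster in the image, which is why the foveal (rather than global) flow is the natural reference for the measured gain.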


1998 ◽  
Vol 87 (2) ◽  
pp. 667-672 ◽  
Author(s):  
Shinji Nakamura ◽  
Shinsuke Shimojo

We examined the effect of body posture on visually induced perception of self-motion (vection) at various angles of observer tilt. The experiment indicated that a tilted body posture enhanced the perceived strength of vertical vection, whereas body tilt had no effect on horizontal vection. This result suggests an interaction between the effects of visual and vestibular information on the perception of self-motion.


Author(s):  
Luc Tremblay ◽  
Andrew Kennedy ◽  
Dany Paleressompoulle ◽  
Liliane Borel ◽  
Laurence Mouchnino ◽  
...  

2006 ◽  
Vol 9 (2) ◽  
pp. 163-166 ◽  
Author(s):  
E.A. Keshner ◽  
K. Dokka ◽  
R.V. Kenyon

Author(s):  
Kathleen E. Cullen

As we go about our everyday activities, our brain computes accurate estimates of both our motion relative to the world and our orientation relative to gravity. Essential to this computation is the information provided by the vestibular system; it detects the rotational velocity and linear acceleration of our heads relative to space, making a fundamental contribution to our perception of self-motion and spatial orientation. Additionally, in everyday life, our perception of self-motion depends on the integration of both vestibular and nonvestibular cues, including visual and proprioceptive information. Furthermore, the integration of motor-related information is also required for perceptual stability, so that the brain can distinguish whether the experienced sensory inflow resulted from active self-motion through the world or was instead externally generated. To date, understanding how the brain encodes and integrates sensory cues with motor signals for the perception of self-motion during natural behaviors remains a major goal in neuroscience. Recent experiments have (i) provided new insights into the neural code used to represent sensory information in vestibular pathways, (ii) established that vestibular pathways are inherently multimodal at the earliest stages of processing, and (iii) revealed that self-motion information processing is adjusted to meet the needs of specific tasks. This review summarizes our current understanding of how the brain integrates sensory information and motor-related signals to encode self-motion and ensure perceptual stability during everyday activities.


2014 ◽  
Vol 112 (10) ◽  
pp. 2470-2480 ◽  
Author(s):  
Andre Kaminiarz ◽  
Anja Schlack ◽  
Klaus-Peter Hoffmann ◽  
Markus Lappe ◽  
Frank Bremmer

The patterns of optic flow seen during self-motion can be used to determine the direction of one's own heading. Tracking eye movements, which typically occur during everyday life, complicate this task, since they add further retinal image motion and (predictably) distort the retinal flow pattern. Humans employ both visual and nonvisual (extraretinal) information to solve a heading task in this case. Likewise, it has been shown that neurons in the monkey medial superior temporal area (area MST) use both signals during the processing of self-motion information. In this article we report that neurons in the macaque ventral intraparietal area (area VIP) use visual information derived from the distorted flow patterns to encode heading during (simulated) eye movements. We recorded responses of VIP neurons to simple radial flow fields and to distorted flow fields that simulated self-motion plus eye movements. In 59% of the cases, cell responses compensated for the distortion and kept the same heading selectivity irrespective of the different simulated eye movements. In addition, response modulations during real, compared with simulated, eye movements were smaller, consistent with reafferent signaling involved in processing the visual consequences of eye movements in area VIP. We conclude that the motion selectivities found in area VIP, like those in area MST, provide a way to successfully analyze and use flow fields during self-motion and simultaneous tracking movements.
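The distortion these stimuli exploit has a simple form: eye rotation adds a depth-independent component to the retinal flow, which shifts the singularity of the combined field away from the true heading, so a cell (or algorithm) must discount it. A small sketch of this geometry under assumed parameter values (not the recording or stimulus code; the frontoparallel dot plane and pursuit speed are illustrative choices):

```python
import numpy as np

# Sketch of how a (simulated) pursuit eye movement distorts radial flow.
# Pinhole model, unit focal length; all parameter values are assumptions.

def retinal_flow(x, y, Z, T, w):
    """Retinal velocity at image point (x, y) with depth Z, for observer
    translation T = (Tx, Ty, Tz) and eye rotation w = (wx, wy, wz)."""
    u_t = (-T[0] + x * T[2]) / Z                          # translational part,
    v_t = (-T[1] + y * T[2]) / Z                          # scales with 1/Z
    u_r = w[0] * x * y - w[1] * (1 + x**2) + w[2] * y     # rotational part,
    v_r = w[0] * (1 + y**2) - w[1] * x * y - w[2] * x     # depth-independent
    return u_t + u_r, v_t + v_r

# Frontoparallel dot plane at 2 m, forward translation at 1 m/s
xs = np.linspace(-0.5, 0.5, 101)
x, y = np.meshgrid(xs, xs)
Z, T = 2.0, np.array([0.0, 0.0, 1.0])

def singularity(w):
    """Image point of minimal retinal speed (the apparent heading)."""
    u, v = retinal_flow(x, y, Z, T, w)
    i, j = np.unravel_index(np.argmin(u**2 + v**2), x.shape)
    return x[i, j], y[i, j]

heading_pure = singularity(np.zeros(3))                    # no eye movement
heading_pursuit = singularity(np.array([0.0, 0.05, 0.0]))  # rightward pursuit
```

With no rotation the singularity sits at the true heading (the FOE); adding even a slow horizontal pursuit displaces it, which is exactly the heading error a purely retinal mechanism would make without compensation.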

