Three-dimensional Hall effect accelerometer for recording head movements of freely moving laboratory animals

1991, Vol 49 (3), pp. 651–652
Author(s): Tapani Korhonen

2018, Vol 20 (8), pp. 083034
Author(s): Christian Kern, Graeme W Milton, Muamer Kadic, Martin Wegener

2018, Vol 382 (44), pp. 3205–3210
Author(s): Zhi-Peng Gao, Zhi Li, Dan-Wei Zhang

Author(s): Angie M. Michaiel, Elliott T.T. Abe, Cristopher M. Niell

ABSTRACT: Many studies of visual processing are conducted in unnatural conditions, such as head- and gaze-fixation. As this radically limits natural exploration of the visual environment, much less is known about how animals actively use their sensory systems to acquire visual information in natural, goal-directed contexts. Recently, prey capture has emerged as an ethologically relevant behavior that mice perform without training, and that engages vision for accurate orienting and pursuit. However, it is unclear how mice target their gaze during such natural behaviors, particularly since, in contrast to many predatory species, mice have a narrow binocular field and lack the foveate vision that would entail fixing their gaze on a specific point in the visual field. Here we measured head and bilateral eye movements in freely moving mice performing prey capture. We find that the majority of eye movements are compensatory for head movements, thereby acting to stabilize the visual scene. During head turns, however, these periods of stabilization are interspersed with non-compensatory saccades that abruptly shift gaze position. Analysis of eye movements relative to the cricket position shows that the saccades do not preferentially select a specific point in the visual scene. Rather, orienting movements are driven by the head, with the eyes following in coordination to sequentially stabilize and recenter the gaze. These findings help relate eye movements in the mouse to other species, and provide a foundation for studying active vision during ethological behaviors in the mouse.
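As a rough, hypothetical illustration of the kind of analysis this abstract describes, the Python sketch below classifies samples of horizontal head and eye-in-head velocity as gaze-stabilizing (the eye counter-rotates against the head, so gaze velocity stays low) or saccadic (gaze shifts abruptly). The signal names and the velocity threshold are assumptions for illustration, not values or code from the study.

```python
import numpy as np

def classify_gaze_samples(head_vel, eye_vel, saccade_thresh=150.0):
    """Label each sample as 'stabilizing' or 'saccade'.

    head_vel, eye_vel : horizontal angular velocities in deg/s
                        (head-in-world and eye-in-head).
    saccade_thresh    : gaze-velocity cutoff in deg/s; an assumed,
                        illustrative value, not taken from the paper.
    """
    head_vel = np.asarray(head_vel, dtype=float)
    eye_vel = np.asarray(eye_vel, dtype=float)

    # Gaze (eye-in-world) velocity is the sum of head and eye-in-head velocity,
    # so compensation shows up as the two signals cancelling.
    gaze_vel = head_vel + eye_vel

    labels = np.where(np.abs(gaze_vel) < saccade_thresh, "stabilizing", "saccade")
    return gaze_vel, labels

# Minimal usage example with made-up numbers (deg/s):
head = [200.0, 210.0, 190.0, 205.0]
eye = [-195.0, -205.0, 150.0, -200.0]   # third sample: eye moves with the head
gaze, labels = classify_gaze_samples(head, eye)
print(list(zip(gaze.round(1), labels)))
# -> mostly 'stabilizing', with one 'saccade' where gaze shifts abruptly
```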


1997, Vol 77 (2), pp. 654–666
Author(s): Douglas Tweed

Tweed, Douglas. Three-dimensional model of the human eye-head saccadic system. J. Neurophysiol. 77: 654–666, 1997. Current theories of eye-head gaze shifts deal only with one-dimensional motion, and do not touch on three-dimensional (3-D) issues such as curvature and Donders' laws. I show that recent 3-D data can be explained by a model based on ideas that are well established from one-dimensional studies, with just one new assumption: that the eye is driven toward a 3-D orientation in space that has been chosen so that Listing's law of the eye in head will hold when the eye-head movement is complete. As in previous, one-dimensional models, the eye and head are feedback-guided and the commands specifying desired eye position pass through a neural "saturation" so as to stay within the effective oculomotor range. The model correctly predicts the complex, 3-D trajectories of the head, eye in space, and eye in head in a variety of saccade tasks. And when it moves repeatedly to the same target, varying the contributions of eye and head, the model lands in different eye-in-space positions, but these positions differ only in their cyclotorsion about the line of sight, so they all point that line at the target, a behavior also seen in real eye-head saccades. Between movements the model obeys Listing's law of the eye in head and Donders' law of the head on torso, but during certain gaze shifts involving large torsional head movements, it shows marked, 8° deviations from Listing's law. These deviations are the most important untested predictions of the theory. Their experimental refutation would sink the model, whereas confirmation would strongly support its central claim that the eye moves toward a 3-D position in space chosen to obey Listing's law and, therefore, that a Listing operator exists upstream from the eye pulse generator.
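For readers unfamiliar with the constraint at the core of this model, the following is a standard textbook statement of Listing's law in rotation-vector form; it is background notation, not an excerpt from the paper.

```latex
% Standard rotation-vector statement of Listing's law (background, not from the paper).
% An eye orientation relative to the primary position is written as a rotation
% vector r = tan(theta/2) * n, where n is the unit rotation axis and theta the angle.
\[
  \mathbf{r} \;=\; \tan\!\left(\tfrac{\theta}{2}\right)\hat{\mathbf{n}},
  \qquad
  \text{Listing's law:}\quad r_{x} = 0,
\]
% with the x-axis chosen along the line of sight at primary position (the
% torsional axis). All admissible eye-in-head orientations then have rotation
% axes confined to a single plane (Listing's plane). Donders' law is the weaker
% requirement that torsion be a fixed function of gaze direction; Listing's law
% fixes that function to zero in these coordinates.
```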


JETP Letters, 2019, Vol 109 (11), pp. 715–721
Author(s): O. O. Shvetsov, V. D. Esin, A. V. Timonina, N. N. Kolesnikov, E. V. Deviatov

2017, Vol 7 (1), pp. 20160093
Author(s): Ivo G. Ros, Partha S. Bhagavatula, Huai-Ti Lin, Andrew A. Biewener

Flying animals must successfully contend with obstacles in their natural environments. Inspired by the robust manoeuvring abilities of flying animals, unmanned aerial systems are being developed and tested to improve flight control through cluttered environments. We previously examined steering strategies that pigeons adopt to fly through an array of vertical obstacles (VOs). Modelling VO flight guidance revealed that pigeons steer towards larger visual gaps when making fast steering decisions. In the present experiments, we recorded three-dimensional flight kinematics of pigeons as they flew through randomized arrays of horizontal obstacles (HOs). We found that pigeons still decelerated upon approach but flew faster through a denser array of HOs compared with the VO array previously tested. Pigeons exhibited limited steering and chose gaps between obstacles most aligned to their immediate flight direction, in contrast to VO navigation that favoured widest gap steering. In addition, pigeons navigated past the HOs with more variable and decreased wing stroke span and adjusted their wing stroke plane to reduce contact with the obstacles. Variability in wing extension, stroke plane and wing stroke path was greater during HO flight. Pigeons also exhibited pronounced head movements when negotiating HOs, which potentially serve a visual function. These head-bobbing-like movements were most pronounced in the horizontal (flight direction) and vertical directions, consistent with engaging motion vision mechanisms for obstacle detection. These results show that pigeons exhibit a keen kinesthetic sense of their body and wings in relation to obstacles. Together with aerodynamic flapping flight mechanics that favours vertical manoeuvring, pigeons are able to navigate HOs using simple rules, with remarkable success.
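To make the contrast between the two steering rules concrete, here is a small, hypothetical Python sketch (the gap representation, numbers, and function names are ours, not the authors') comparing widest-gap steering, as reported for vertical obstacles, with the most-aligned-gap choice reported here for horizontal obstacles.

```python
# Each gap is (bearing_deg, width_m): bearing relative to the bird's current
# flight direction, and gap size. The values below are made up for illustration.
gaps = [(-25.0, 0.60), (5.0, 0.35), (30.0, 0.80)]

def widest_gap(gaps):
    """VO-style rule from the earlier study: steer towards the largest visual gap."""
    return max(gaps, key=lambda g: g[1])

def most_aligned_gap(gaps):
    """HO-style rule reported here: take the gap closest to the current flight direction."""
    return min(gaps, key=lambda g: abs(g[0]))

print("widest gap      :", widest_gap(gaps))        # -> (30.0, 0.8)
print("most aligned gap:", most_aligned_gap(gaps))  # -> (5.0, 0.35)
```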

