Voluntary Head Movement and Allocentric Perception of Space

2003, Vol 14 (4), pp. 340-346
Author(s): Mark Wexler

Although visual input is egocentric, at least some visual perceptions and representations are allocentric, that is, independent of the observer's vantage point or motion. Three experiments investigated the visual perception of three-dimensional object motion during voluntary and involuntary motion in human subjects. The results show that the motor command contributes to the objective perception of space: Observers are more likely to apply, consciously and unconsciously, spatial criteria relative to an allocentric frame of reference when they are executing voluntary head movements than while they are undergoing similar involuntary displacements (which lead to a more egocentric bias). Furthermore, details of the motor command are crucial to spatial vision, as allocentric bias decreases or disappears when self-motion and motor command do not match.
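The distinction between the two reference frames can be made concrete with a minimal sketch (illustrative only, not part of the study; the function names and the yaw-only head pose are assumptions made here): a point sensed in head-centred, egocentric coordinates has a stable allocentric description only once the observer's own pose is taken into account, which is precisely the information a voluntary motor command can supply.

import numpy as np

def yaw_rotation(theta):
    # Rotation matrix for a head yaw of theta radians about the vertical (z) axis.
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0],
                     [s,  c, 0.0],
                     [0.0, 0.0, 1.0]])

def egocentric_to_allocentric(point_ego, head_position, head_yaw):
    # Re-express a point given in head-centred (egocentric) coordinates in a
    # world-fixed (allocentric) frame, using the observer's own pose.
    return head_position + yaw_rotation(head_yaw) @ point_ego

# A point sensed 1 m straight ahead keeps a stable allocentric location only
# because the observer's displacement and yaw are factored in.
point_world = egocentric_to_allocentric(np.array([1.0, 0.0, 0.0]),
                                        head_position=np.array([0.2, 0.0, 0.0]),
                                        head_yaw=np.radians(15.0))
print(point_world)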

1995, Vol 73 (2), pp. 766-779
Author(s): D. Tweed, B. Glenn, T. Vilis

1. Three-dimensional (3D) eye and head rotations were measured with the use of the magnetic search coil technique in six healthy human subjects as they made large gaze shifts. The aims of this study were 1) to see whether the kinematic rules that constrain eye and head orientations to two degrees of freedom between saccades also hold during movements; 2) to chart the curvature and looping in eye and head trajectories; and 3) to assess whether the timing and paths of eye and head movements are more compatible with a single gaze error command driving both movements, or with two different feedback loops.

2. Static orientations of the eye and head relative to space are known to resemble the distribution that would be generated by a Fick gimbal (a horizontal axis moving on a fixed vertical axis). We show that gaze point trajectories during eye-head gaze shifts fit the Fick gimbal pattern, with horizontal movements following straight "line of latitude" paths and vertical movements curving like lines of longitude. However, horizontal (and to a lesser extent vertical) movements showed direction-dependent looping, with rightward and leftward (and up and down) saccades tracing slightly different paths. Plots of facing direction (the analogue of gaze direction for the head) also showed the latitude/longitude pattern, without looping. In radial saccades, the gaze point initially moved more vertically than the target direction and then curved; head trajectories were straight.

3. The eye and head components of randomly sequenced gaze shifts were not time locked to one another. The head could start moving at any time from slightly before the eye until 200 ms after, and the standard deviation of this interval could be as large as 80 ms. The head continued moving for a long (up to 400 ms) and highly variable time after the gaze error had fallen to zero. For repeated saccades between the same targets, peak eye and head velocities were directly, but very weakly, correlated; fast eye movements could accompany slow head movements and vice versa. Peak head acceleration and deceleration were also very weakly correlated with eye velocity. Further, the head rotated about an essentially fixed axis, with a smooth bell-shaped velocity profile, whereas the axis of eye rotation relative to the head varied throughout the movement and the velocity profiles were more ragged.

4. Plots of 3D eye orientation revealed strong and consistent looping in eye trajectories relative to space. (ABSTRACT TRUNCATED AT 400 WORDS)
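The Fick-gimbal pattern can be illustrated with a short sketch (conventions and function names chosen here, not the authors' analysis code; a right-handed frame with +x along the line of sight and +z vertical is assumed): the orientation is a yaw about a space-fixed vertical axis followed by a pitch about the carried horizontal axis, with torsion held at zero as the two-degree-of-freedom constraint.

import numpy as np

def rot_x(a):  # torsion about the line of sight (taken as the +x axis)
    c, s = np.cos(a), np.sin(a)
    return np.array([[1.0, 0.0, 0.0], [0.0, c, -s], [0.0, s, c]])

def rot_y(a):  # pitch about the horizontal axis carried along by the yaw rotation
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, 0.0, s], [0.0, 1.0, 0.0], [-s, 0.0, c]])

def rot_z(a):  # yaw about the space-fixed vertical axis
    c, s = np.cos(a), np.sin(a)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

def fick_orientation(yaw, pitch, torsion=0.0):
    # Fick gimbal: the pitch axis rides on a fixed vertical yaw axis; holding
    # torsion at zero expresses the two-degree-of-freedom constraint.
    return rot_z(yaw) @ rot_y(pitch) @ rot_x(torsion)

def gaze_direction(yaw, pitch):
    # Direction of the line of sight (+x of the head/eye frame) in space coordinates.
    return fick_orientation(yaw, pitch) @ np.array([1.0, 0.0, 0.0])

# Sweeping yaw at a fixed pitch traces a "line of latitude" on the gaze sphere;
# sweeping pitch at a fixed yaw traces a "line of longitude".
latitude  = [gaze_direction(y, np.radians(20.0)) for y in np.linspace(-0.5, 0.5, 5)]
longitude = [gaze_direction(np.radians(20.0), p) for p in np.linspace(-0.5, 0.5, 5)]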


Sensors
2015, Vol 15 (1), pp. 995-1007
Author(s): Seungwon Lee, Kyungwon Jeong, Jinho Park, Joonki Paik

Author(s): Elmar Zeitler

Considering any finite three-dimensional object, a “projection” is here defined as a two-dimensional representation of the object's mass per unit area on a plane normal to a given projection axis, here taken as the y-axis. Since the object can be seen as being built from parallel, thin slices, the relation between object structure and its projection can be reduced by one dimension. It is assumed that an electron microscope equipped with a tilting stage records the projection, where the object has a spatial density distribution p(r, ϕ) within a limiting radius taken to be unity, and the stage is tilted by an angle θ with respect to the x-axis of the recording plane.
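As a point of reference, the relation described above can be written in the standard parallel-projection (Radon-transform) form; the expression below is a generic sketch in notation chosen here, not necessarily the paper's own equation. For a single slice with density p(r, ϕ) confined to the unit disk and a tilt angle θ, the recorded one-dimensional projection along the x-axis of the recording plane is the line integral

p_\theta(x) = \int_0^{2\pi}\!\int_0^{1} p(r,\varphi)\,\delta\bigl(x - r\cos(\varphi-\theta)\bigr)\, r\,\mathrm{d}r\,\mathrm{d}\varphi ,

so each tilt of the stage contributes one such profile, and the set of profiles over all tilt angles is what a reconstruction must invert.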

