Effects of Observer Orientation on Perception of Ego- and Exocentric Spatial Locations

Perception ◽  
1997 ◽  
Vol 26 (1_suppl) ◽  
pp. 237-237
Author(s):  
J Li ◽  
M M Cohen ◽  
C W DeRoshia ◽  
L T Guzy

Perceived eye position and/or the perceived location of visual targets are altered when the orientation of the surrounding visual environment (Cohen et al, 1995 Perception & Psychophysics 57 433) or that of the observer (Cohen and Guzy, 1995 Aviation, Space, and Environmental Medicine 66 505) is changed. Fourteen subjects used biteboards as they lay on a rotary bed that was oriented head-down −15°, −7.5°, supine, head-up +7.5°, and +15°. In the dark, subjects directed their gaze and set a target to the apparent zenith (exocentric location); they also gazed at a subjective ‘straight ahead’ position with respect to their head (egocentric location). Angular deviations of target settings and changes in vertical eye position were recorded with an ISCAN infrared tracking system. Results indicated that, for exocentric locations, the eyes deviate systematically from the true zenith. The gain for compensating changes in head orientation was 0.69 and 0.73 for gaze direction and target settings, respectively. In contrast, ‘straight ahead’ eye positions were not significantly affected by changes in the subject's orientation. We conclude that subjects make systematic errors when directing their gaze to an exocentric location in near-supine positions. This suggests a systematic bias in the integration of extra-ocular signals with information regarding head orientation. The bias may result from underestimating changes in the orientation of the head in space. In contrast, for egocentric locations, where head orientation information can potentially be discarded, gaze directions were unaffected by head orientation near supine.
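The compensation gain quoted above can be illustrated with a short sketch: it is the least-squares slope relating compensatory gaze rotation to imposed head tilt across the five bed orientations. All data and variable names below are synthetic illustrations, not the study's.

```python
# Synthetic illustration: the compensation gain is the least-squares slope
# of compensatory gaze rotation against imposed head tilt.
head_tilt = [-15.0, -7.5, 0.0, 7.5, 15.0]      # bed orientations, degrees
gaze_shift = [0.69 * t for t in head_tilt]     # idealized observer with gain 0.69

mean_t = sum(head_tilt) / len(head_tilt)
mean_g = sum(gaze_shift) / len(gaze_shift)
gain = (sum((t - mean_t) * (g - mean_g) for t, g in zip(head_tilt, gaze_shift))
        / sum((t - mean_t) ** 2 for t in head_tilt))
```

A gain of 1.0 would mean the eyes fully compensate for head tilt; the reported values of 0.69–0.73 correspond to systematic undercompensation.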

Perception ◽  
10.1068/p3440 ◽  
2002 ◽  
Vol 31 (11) ◽  
pp. 1323-1333 ◽  
Author(s):  
Ellen M Berends ◽  
Raymond van Ee ◽  
Casper J Erkelens

It has been well established that vertical disparity is involved in perception of the three-dimensional layout of a visual scene. The goal of this paper was to examine whether vertical disparities can alter perceived direction. We dissociated the common relationship between vertical disparity and the stimulus direction by applying a vertical magnification to the image presented to one eye. We used a staircase paradigm to measure whether perceived straight-ahead depended on the amount of vertical magnification in the stimulus. Subjects judged whether a test dot was flashed to the left or the right of straight-ahead. We found that perceived straight-ahead did indeed depend on the amount of vertical magnification, but only after subjects adapted (for 5 min) to vertical scale, and only in five out of nine subjects. We argue that vertical disparity is a factor in the calibration of the relationship between eye-position signals and perceived direction.
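A minimal sketch of a staircase of the kind described, here a simple 1-up/1-down rule (the function names, step size, and stopping criterion are hypothetical, not taken from the paper): the probe position steps toward the point where left/right judgements reverse, and the reversal points are averaged to estimate the subjective straight-ahead.

```python
def staircase(respond_left, start=5.0, step=1.0, n_reversals=8):
    """Run a 1-up/1-down staircase; return the mean probe position at reversals."""
    pos, last, reversal_points = start, None, []
    while len(reversal_points) < n_reversals:
        resp = respond_left(pos)          # True: subject reports "left of straight ahead"
        if last is not None and resp != last:
            reversal_points.append(pos)   # response flipped: record a reversal point
        pos += step if resp else -step    # step the probe toward the flip point
        last = resp
    return sum(reversal_points) / len(reversal_points)

# Simulated observer whose subjective straight-ahead sits 1.3 deg to the right
est = staircase(lambda p: p < 1.3)
```

With a fixed step the estimate converges to within half a step of the true point of subjective equality; real staircases typically shrink the step after each reversal.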


1976 ◽  
Vol 43 (2) ◽  
pp. 487-493 ◽  
Author(s):  
Robert I. Bermant ◽  
Robert B. Welch

Subjects were exposed to a visual and to an auditory stimulus that differed spatially in laterality of origin. The subjects were observed for visual biasing of auditory localization (the momentary influence of a light on the spatially perceived location of a simultaneously presented sound) and for auditory aftereffect (a change in perceived location of a sound that persists over time and is measured after termination of the visual stimulus). A significant effect of visual stimulation on auditory localization was found only with the measure of bias. Bias was tested as a function of degree of visual-auditory separation (10/20/30°), eye position (straight-ahead/visual stimulus fixation), and position of visual stimulus relative to auditory stimulus (left/right). Only eye position proved statistically significant; straight-ahead eye position induced more bias than did fixation of the visual stimulus.


Perception ◽  
1996 ◽  
Vol 25 (1_suppl) ◽  
pp. 41-41
Author(s):  
J T Enright

Perception of visual direction was investigated by requiring subjects repeatedly to adjust a single small light, in an otherwise darkened room, to perceived ‘straight ahead’. This task presumably requires comparing concurrent extra-retinal information (either proprioception or an efference copy) with an internally stored ‘standard’ of comparison. Moment-to-moment precision in that performance is remarkably good, with median threshold (standard deviation) of 47 arc min. Nevertheless, the responses often involved a monotonic shift of direction over a few minutes during a test session in this reduced visual environment. These trends led to final settings that were immediately recognised as grossly erroneous when the room was relit, implying that the presumptive internal standard of comparison, while unstable, can be rapidly updated in a full visual environment. There are clear similarities between this phenomenon and the sudden ‘visual capture’ that occurs in a re-illuminated room, following distortions of visual direction that arose in a similarly reduced setting for subjects whose extraocular muscles were partially paralysed (Matin et al, 1982 Science 216 198–201). In both cases, the visual stimuli that underlie rapid recalibration are unknown. Among the several possibilities that can be imagined, the strongest candidate hypothesis for this calibration of the straight-ahead direction is that, during fixation in a lit room, one utilises the directional distribution of image motion that arises because of microscale drift of the eye, as it moves toward its equilibrium orientation, much as a moving observer can use optic flow to evaluate ‘heading’ (the dynamic analogue of ‘straight ahead’).


2019 ◽  
Vol 11 (6) ◽  
Author(s):  
John Papayanopoulos ◽  
Kevin Webb ◽  
Jonathan Rogers

Abstract Unmanned aerial vehicles are increasingly being tasked to connect to payload objects or docking stations for the purposes of package transport or recharging. However, autonomous docking creates challenges in that the air vehicle must precisely position itself with respect to the dock, oftentimes in the presence of uncertain winds and measurement errors. This paper describes an autonomous docking mechanism comprising a static ring and actuated legs, coupled with an infrared tracking device for closed-loop docking maneuvers. The dock’s unique mechanical design enables precise passive positioning such that the air vehicle slides into a precise location and orientation in the dock from a wide range of entry conditions. This leads to successful docking in the presence of winds and sensor measurement errors. A closed-loop infrared tracking system is also described in which the vehicle tracks an infrared beacon located on the dock during the descent to landing. A detailed analysis is presented describing the interaction dynamics between the aircraft and the dock, and system parameters are optimized through the use of trade studies and Monte Carlo analysis with a three degree-of-freedom simulation model. Experimental results are presented demonstrating successful docking maneuvers of an autonomous air vehicle in both indoor and outdoor environments. These repeatable docking experiments verify the robustness and practical utility of the dock design for a variety of emerging applications.
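The Monte Carlo trade studies mentioned above can be sketched at a toy level, assuming Gaussian touchdown error from wind and sensing; the capture radius, error magnitude, and function names here are illustrative assumptions, not values from the paper.

```python
import random

def capture_rate(capture_radius_m=0.30, sigma_m=0.12, trials=20_000, seed=1):
    """Fraction of simulated touchdowns landing inside the passive capture ring."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        # lateral touchdown offset from dock center, metres
        x, y = rng.gauss(0.0, sigma_m), rng.gauss(0.0, sigma_m)
        if (x * x + y * y) ** 0.5 <= capture_radius_m:
            hits += 1
    return hits / trials

rate = capture_rate()
```

Sweeping the ring radius against the expected position error in this way is the kind of trade study that informs sizing the passive capture geometry.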


1997 ◽  
Vol 78 (4) ◽  
pp. 2203-2216 ◽  
Author(s):  
Bernhard J. M. Hess ◽  
Dora E. Angelaki

Hess, Bernhard J. M. and Dora E. Angelaki. Kinematic principles of primate rotational vestibulo-ocular reflex. II. Gravity-dependent modulation of primary eye position. J. Neurophysiol. 78: 2203–2216, 1997. The kinematic constraints of three-dimensional eye positions were investigated in rhesus monkeys during passive head and body rotations relative to gravity. We studied fast and slow phase components of the vestibulo-ocular reflex (VOR) elicited by constant-velocity yaw rotations and sinusoidal oscillations about an earth-horizontal axis. We found that the spatial orientation of both fast and slow phase eye positions could be described locally by a planar surface with torsional variation of <2.0 ± 0.4° (displacement planes) that systematically rotated and/or shifted relative to Listing's plane. In supine/prone positions, displacement planes pitched forward/backward; in left/right ear-down positions, displacement planes were parallel shifted along the positive/negative torsional axis. Dynamically changing primary eye positions were computed from displacement planes. Torsional and vertical components of primary eye position modulated as a sinusoidal function of head orientation in space. The torsional component was maximal in ear-down positions and approximately zero in supine/prone orientations. The opposite was observed for the vertical component. Modulation of the horizontal component of primary eye position exhibited a more complex dependence. In contrast to the torsional component, which was relatively independent of rotational speed, modulation of the vertical and horizontal components of primary position depended strongly on the speed of head rotation (i.e., on the frequency of oscillation of the gravity vector component): the faster the head rotated relative to gravity, the larger was the modulation. 
Corresponding results were obtained when a model based on a sinusoidal dependence of instantaneous displacement planes (and primary eye position) on head orientation relative to gravity was fitted to VOR fast phase positions. When VOR fast phase positions were expressed relative to primary eye position estimated from the model fits, they were confined approximately to a single plane with a small torsional standard deviation (∼1.4–2.6°). This reduced torsional variation was in contrast to the large torsional spread (well >10–15°) of fast phase positions when expressed relative to Listing's plane. We conclude that primary eye position depends dynamically on head orientation relative to space rather than being fixed to the head. It defines a gravity-dependent coordinate system relative to which the torsional variability of eye positions is minimized even when the head is moved passively and vestibulo-ocular reflexes are evoked. In this general sense, Listing's law is preserved with respect to an otolith-controlled reference system that is defined dynamically by gravity.
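The plane-fitting idea can be sketched with synthetic data: express torsion as a planar function of vertical and horizontal eye position, solve the least-squares normal equations, and take the standard deviation of the torsional residuals as the "torsional spread". All numbers and names below are illustrative, not the authors' procedure.

```python
def det3(m):
    """Determinant of a 3x3 matrix by cofactor expansion."""
    return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
            - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
            + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

def fit_displacement_plane(v, h, t):
    """Least-squares fit t ~ a*v + b*h + c via the 3x3 normal equations."""
    n = len(t)
    svh = sum(x * y for x, y in zip(v, h))
    A = [[sum(x * x for x in v), svh, sum(v)],
         [svh, sum(y * y for y in h), sum(h)],
         [sum(v), sum(h), float(n)]]
    rhs = [sum(x * z for x, z in zip(v, t)),
           sum(y * z for y, z in zip(h, t)),
           sum(t)]
    d = det3(A)
    coeffs = []
    for col in range(3):          # Cramer's rule: swap each column with rhs
        Ac = [row[:] for row in A]
        for r in range(3):
            Ac[r][col] = rhs[r]
        coeffs.append(det3(Ac) / d)
    a, b, c = coeffs
    resid = [z - (a * x + b * y + c) for x, y, z in zip(v, h, t)]
    sd = (sum(r * r for r in resid) / n) ** 0.5
    return (a, b, c), sd

# Synthetic eye positions lying exactly on a slightly pitched plane
v = [-10, -5, 0, 5, 10, -10, 10]
h = [-8, 4, 0, -4, 8, 8, -8]
t = [0.2 * x - 0.1 * y + 1.0 for x, y in zip(v, h)]
(a, b, c), sd = fit_displacement_plane(v, h, t)
```

Expressing eye positions relative to such a fitted plane, rather than to a fixed Listing's plane, is what reduces the torsional spread in the analysis described above.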


2018 ◽  
Vol 2018 ◽  
pp. 1-7
Author(s):  
Yong Huang ◽  
Shengqi Chang ◽  
Songhe Qin ◽  
Peijia Li ◽  
Xiaogong Hu ◽  
...  

To improve the accuracy of the lunar DEM derived from CE-1 altimeter data, CE-1 laser altimeter data are calibrated in this paper. Orbit accuracy and ranging accuracy are the two most important factors affecting the application of altimeter data to lunar topography. An empirical method is proposed to calibrate CE-1 altimeter data, using the gridded LOLA DEM to correct systematic errors of the CE-1 altimeter data; the systematic bias is about −139.52 m. A new lunar DEM grid model based on the calibrated CE-1 altimeter data, with a spatial resolution of 0.0625° × 0.0625°, is obtained, as well as a spherical harmonic model to degree 1400. Furthermore, the DEM accuracy is assessed through comparison with nearside landmarks of the Moon, and the results show that the DEM accuracy is improved from 127.3 m to 48.7 m after the calibration of the laser altimeter data.
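A hedged sketch of the bias-correction step, assuming the calibration reduces to subtracting the mean point-wise difference between altimeter heights and the reference DEM (the real procedure is more involved; the function name and values below are synthetic):

```python
def calibrate_heights(raw, reference):
    """Estimate the systematic bias as the mean height difference and remove it."""
    diffs = [r - ref for r, ref in zip(raw, reference)]
    bias = sum(diffs) / len(diffs)
    corrected = [r - bias for r in raw]
    return bias, corrected

# Synthetic check: reference heights plus a constant -139.52 m offset, metres
reference = [0.0, 120.5, -340.2, 55.0]
raw = [height - 139.52 for height in reference]
bias, corrected = calibrate_heights(raw, reference)
```

In practice the differencing is done against the gridded reference at matched footprints, and residuals after bias removal feed the accuracy assessment.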


2021 ◽  
Vol 21 (3) ◽  
Author(s):  
Alban Lemasson ◽  
Daria Lippi ◽  
Laura Hamelin ◽  
Stéphane Louazon ◽  
Martine Hausberger

Abstract Human emotions guide verbal and non-verbal behaviour during social encounters. During public performances, performers’ emotions can be affected directly by an audience’s attitude. In a broad range of animal species, the valence of the emotional state (positive or negative) is known to be associated with a laterality bias in body and visual orientation. Here, we evaluated the influence of an audience’s attitude on professional actors’ head orientation and gaze direction during two theatrical performances with controlled observers’ reactions (Hostile vs Friendly audience). First, our speech fluency analysis confirmed that the audience’s attitude influenced the actors’ emotions. Second, we found that actors oriented their heads more to the left (i.e. a right-hemisphere bias) when the audience was hostile, whereas they gazed more straight ahead at Friendly spectators. These results are in accordance with the Valence-Specific Hypothesis, which proposes that processing stimuli with negative valence involves the right hemisphere (i.e. the left eye) more than the left hemisphere.

