Vertical linear self-motion perception during visual and inertial motion: More than weighted summation of sensory inputs

2005 ◽  
Vol 15 (4) ◽  
pp. 185-195 ◽  
Author(s):  
W.G. Wright ◽  
P. DiZio ◽  
J.R. Lackner

We evaluated visual and vestibular contributions to vertical self-motion perception by exposing subjects to various combinations of 0.2 Hz vertical linear oscillation and visual scene motion. The visual stimuli, presented via a head-mounted display, consisted of video recordings of the test chamber from the perspective of the subject seated in the oscillator. In the dark, subjects accurately reported the amplitude of vertical linear oscillation, with only a slight tendency to underestimate it. In the absence of inertial motion, even low-amplitude oscillatory visual motion induced the perception of vertical self-oscillation. When visual and vestibular stimulation were combined, self-motion perception persisted in the presence of large visual-vestibular discordances. A dynamic visual input with magnitude discrepancies tended to dominate the resulting apparent self-motion, but vestibular effects were also evident. With visual and vestibular stimulation either spatially or temporally out of phase with one another, the input that dominated depended on their amplitudes. High-amplitude visual scene motion was almost completely dominant at the levels tested. These findings are inconsistent with self-motion perception being determined by simple weighted summation of visual and vestibular inputs and constitute evidence against sensory conflict models. They indicate that when the presented visual scene is an accurate representation of the physical test environment, it dominates over vestibular inputs in determining apparent spatial position relative to external space.
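
To make the rejected model concrete, here is a minimal Python sketch of a simple weighted-summation (inverse-variance-weighted) combination of visual and vestibular amplitude estimates; the function name and all numerical values are illustrative assumptions, not the authors' model or data.

    # A minimal sketch of the fixed-weight (inverse-variance) cue-combination
    # model that these results argue against. All numbers are illustrative
    # assumptions, not data from the study.

    def weighted_summation(visual_est, vestibular_est, visual_var, vestibular_var):
        """Combine visual and vestibular amplitude estimates with fixed,
        reliability-based weights (a simple weighted-summation model)."""
        w_vis = (1.0 / visual_var) / (1.0 / visual_var + 1.0 / vestibular_var)
        return w_vis * visual_est + (1.0 - w_vis) * vestibular_est

    # Example: a 20 cm visual scene oscillation paired with a 5 cm inertial
    # oscillation (hypothetical values).
    print(weighted_summation(visual_est=20.0, vestibular_est=5.0,
                             visual_var=4.0, vestibular_var=9.0))
    # A fixed-weight model always predicts a percept between the two cue
    # values, so it cannot reproduce the near-complete visual dominance
    # reported here for high-amplitude visual scene motion.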

2006 ◽  
Vol 16 (1-2) ◽  
pp. 23-28 ◽  
Author(s):  
W. Geoffrey Wright ◽  
Paul DiZio ◽  
James R. Lackner

We evaluated the influence of moving visual scenes and knowledge of spatial and physical context on visually induced self-motion perception in an immersive virtual environment. A sinusoidal, vertically oscillating visual stimulus induced perceptions of self-motion that matched changes in visual acceleration. Subjects reported peaks of perceived self-motion in synchrony with peaks of visual acceleration and opposite in direction to the visual scene motion. Spatial context was manipulated by testing subjects either in the environment that matched the room in the visual scene or in a separate chamber. Physical context was manipulated by testing subjects while seated in a stable, earth-fixed desk chair or in an apparatus capable of large linear motions; in both conditions no actual motion occurred. The compellingness of perceived self-motion increased significantly when the spatial context matched the visual input and actual body displacement was possible; however, the latency and amplitude of perceived self-motion were unaffected by the spatial or physical context. We propose that two dissociable processes are involved in self-motion perception: one process, primarily driven by visual input, affects vection latency and path integration; the other, receiving cognitive input, drives the compellingness of perceived self-motion.


2017 ◽  
Vol 30 (1) ◽  
pp. 65-90 ◽  
Author(s):  
Séamas Weech ◽  
Nikolaus F. Troje

Studies of the illusory sense of self-motion elicited by a moving visual surround ('vection') have revealed key insights about how sensory information is integrated. Vection usually occurs after a delay of several seconds following visual motion onset, whereas self-motion in the natural environment is perceived immediately. It has been suggested that this latency relates to the sensory mismatch between visual and vestibular signals at motion onset. Here, we tested three techniques with the potential to reduce sensory mismatch and thereby shorten vection onset latency: noisy galvanic vestibular stimulation (GVS) and bone-conducted vibration (BCV) at the mastoid processes, and body vibration applied to the lower back. In Experiment 1, we examined vection latency for wide-field visual rotations about the roll axis and applied a burst of stimulation at the start of visual motion. Both GVS and BCV reduced vection latency by two seconds compared with the control condition, whereas body vibration had no effect on latency. In Experiment 2, the visual stimulus rotated about the pitch, roll, or yaw axis, and we found a similar facilitation of vection by both BCV and GVS in each case. In a control experiment, we confirmed that air-conducted sound administered through headphones was not sufficient to reduce vection onset latency. Together, the results suggest that noisy vestibular stimulation facilitates vection, likely through an upweighting of visual information caused by a reduction in vestibular sensory reliability.
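
As a rough illustration of the proposed reweighting, the short Python sketch below shows how, under a standard inverse-variance weighting scheme, adding noise to the vestibular channel increases the weight assigned to vision; the variance values are arbitrary assumptions, not estimates from these experiments.

    # Illustrative only: a noisier vestibular input yields a larger visual
    # weight under inverse-variance weighting. Variances are assumptions.
    visual_var = 1.0
    for vestibular_var in (1.0, 2.0, 4.0, 8.0):
        w_vis = (1.0 / visual_var) / (1.0 / visual_var + 1.0 / vestibular_var)
        print(f"vestibular variance {vestibular_var:3.1f} -> visual weight {w_vis:.2f}")
    # Prints weights rising from 0.50 to 0.89, the direction of the visual
    # upweighting proposed to follow from GVS/BCV-induced vestibular noise.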


2014 ◽  
Vol 112 (10) ◽  
pp. 2481-2491 ◽  
Author(s):  
Sebastian M. Frank ◽  
Oliver Baumann ◽  
Jason B. Mattingley ◽  
Mark W. Greenlee

The central hub of the cortical vestibular network in humans is likely localized in the region of the posterior lateral sulcus. An area characterized by responsiveness to visual motion has previously been described at a similar location and named the posterior insular cortex (PIC). It is currently not known whether PIC processes vestibular information as well. We localized PIC using visual motion stimulation in functional magnetic resonance imaging (fMRI) and investigated whether PIC also responds to vestibular stimuli. To this end, we designed an MRI-compatible caloric stimulation device that allowed us to stimulate bithermally, with a hot temperature in one ear and simultaneously a cold temperature in the other, or with warm temperatures in both ears as a baseline. During each trial, participants indicated the presence or absence of self-motion sensations. We found activation in PIC during periods of self-motion when vestibular stimulation was carried out with minimal visual input. During combined visual-vestibular stimulation, area PIC was activated in a similar fashion under congruent and incongruent stimulation conditions. Our results show that PIC responds not only to visual motion but also to vestibular stimuli related to the sensation of self-motion. We suggest that PIC is part of the cortical vestibular network and plays a role in the integration of visual and vestibular stimuli for the perception of self-motion.


2013 ◽  
Vol 26 (3) ◽  
pp. 277-285 ◽  
Author(s):  
Shinji Nakamura

It has been repeatedly reported that visual stimuli containing a jittering or oscillating motion component induce self-motion perception more strongly than a pure radial expansion pattern. A psychophysical experiment with 11 observers revealed that the additional accelerating components of the visual motion have to be combined with the main-axis motion to facilitate self-motion perception; additional motion presented in an isolated fashion impairs the perception of self-motion. These results are inconsistent with a simple hypothesis about the perceptual mechanism underlying the advantage of jitter/oscillation, which assumes that, at a first stage, the accelerating component induces an additional self-motion independently of the main motion, and that the two self-motions induced by the main motion and the additional component are then integrated.
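
For concreteness, the Python sketch below constructs the two stimulus arrangements contrasted above: jitter added to the same dots that carry the main radial flow versus jitter presented as an isolated, separate component. Dot counts, speeds, and amplitudes are illustrative assumptions, not the stimulus parameters of the study.

    import numpy as np

    # Illustrative construction of one frame-to-frame displacement field.
    rng = np.random.default_rng(0)
    n_dots = 200
    xy = rng.uniform(-1.0, 1.0, size=(n_dots, 2))        # dot positions

    def radial_expansion(points, speed=0.02):
        """Main-axis component: each dot moves away from the display center."""
        norms = np.linalg.norm(points, axis=1, keepdims=True) + 1e-6
        return speed * points / norms

    def jitter(n, amplitude=0.01):
        """Additional accelerating component: random frame-to-frame jitter."""
        return amplitude * rng.standard_normal((n, 2))

    # (a) Combined: jitter is superimposed on the radial motion of the same
    #     dots, the arrangement that facilitated self-motion perception.
    combined_step = radial_expansion(xy) + jitter(n_dots)

    # (b) Isolated: one subset of dots carries only radial motion and another
    #     carries only jitter, the arrangement that impaired it.
    isolated_step = np.vstack([radial_expansion(xy[: n_dots // 2]),
                               jitter(n_dots - n_dots // 2)])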


Perception ◽  
10.1068/p5037 ◽  
2003 ◽  
Vol 32 (4) ◽  
pp. 475-484 ◽  
Author(s):  
Michiteru Kitazaki ◽  
Takao Sato

Attentional effects on self-motion perception (vection) were examined using a large display in which vertical stripes containing upward- or downward-moving dots were interleaved to balance the total motion energy for the two directions. The dots moving in the same direction shared the same colour, and subjects were asked to attend to one of the two colours. Vection was perceived in the direction opposite to that of the non-attended motion, indicating that non-attended visual motion dominates vection. The attentional effect was then compared with the effect of relative depth. Clear attentional effects were again found when there was no relative depth between dots moving in opposite directions, but the effect of depth was much stronger for stimuli with a relative depth. Vection was determined mainly by motion in the far depth plane, although some attentional effects were evident even in this case. These results indicate that attentional modulation of vection exists, but that it is overridden when there is a relative depth between the two motion components.


Author(s):  
Tyler S. Manning ◽  
Kenneth H. Britten

The ability to see motion is critical to survival in a dynamic world. Decades of physiological research have established that motion perception is a distinct sub-modality of vision supported by a network of specialized structures in the nervous system. These structures are arranged hierarchically according to the spatial scale of the calculations they perform, with more local operations preceding those that are more global. The different operations serve distinct purposes, from the interception of small moving objects to the calculation of self-motion from image motion spanning the entire visual field. Each cortical area in the hierarchy has an independent representation of visual motion. These representations, together with computational accounts of their roles, provide clues to the functions of each area. Comparisons between neural activity in these areas and psychophysical performance can identify which representations are sufficient to support motion perception. Experimental manipulation of this activity can also define which areas are necessary for motion-dependent behaviors like self-motion guidance.


Perception ◽  
10.1068/p7184 ◽  
2012 ◽  
Vol 41 (5) ◽  
pp. 577-593 ◽  
Author(s):  
Takeharu Seno ◽  
Emi Hasuo ◽  
Hiroyuki Ito ◽  
Yoshitaka Nakajima

We examined whether and how sounds influence visually induced illusory self-motion (vection). Visual stimuli were presented for 40 s and consisted of radially expanding or contracting motion fields and of luminance-defined gratings drifting in a vertical or horizontal direction. Auditory stimuli were presented together with the visual stimuli in most conditions; we employed sounds that increased or decreased in intensity, or that ascended or descended in frequency. The sound that increased in intensity facilitated forward vection, and the sounds that ascended or descended in frequency facilitated upward or downward vection, respectively. The perceptual plausibility of the sound for the corresponding self-motion thus appears to be an important factor in enhancing vection.


2020 ◽  
Vol 1 ◽  
Author(s):  
Kanon Fujimoto ◽  
Hiroshi Ashida

Humans perceive self-motion using multisensory information, with vision playing a dominant role, as is exploited in virtual reality (VR) technologies. Previous studies reported that visual motion presented in the lower visual field (LoVF) induces a stronger illusion of self-motion (vection) than motion in the upper visual field (UVF). However, it was still unknown whether the LoVF superiority in vection is based on the retinotopic frame or rather on the environmental frame of reference. Here, we investigated the influences of retinotopic and environmental frames on the LoVF superiority of vection. We presented a planar surface along the depth axis in one of four visual fields (upper, lower, right, or left). The texture on the surface moved forward or backward. Participants reported vection while observing the visual stimulus through a VR head-mounted display (HMD), either in a sitting posture or in a lateral recumbent position. Results showed that the visual motion induced stronger vection when presented in the LoVF than in the UVF in both postures. Notably, the vection rating in the LoVF was stronger in the sitting posture than in the recumbent posture. Moreover, recumbent participants reported stronger vection when the stimulus was presented in the gravitationally lower field than in the gravitationally upper field. These results demonstrate the contribution of multiple spatial frames to self-motion perception and imply the importance of the ground surface.
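
The dissociation between frames can be made explicit with a small Python sketch that maps a retinotopic field label into the gravitational frame given head roll; the 90-degree roll for the recumbent posture and the angle convention are assumptions for illustration, not parameters reported in the study.

    # Illustrative only: in a lateral recumbent posture (head rolled ~90 deg),
    # the retinotopically lower field no longer coincides with the
    # gravitationally lower field, which is what lets the two frames be
    # dissociated. Angle convention (0 = up, clockwise positive) is assumed.

    def to_environmental(retinotopic_deg, head_roll_deg):
        """Rotate a retinotopic direction into the gravitational frame."""
        return (retinotopic_deg + head_roll_deg) % 360

    labels = {0: "up", 90: "side", 180: "down", 270: "side"}
    for posture, roll in (("sitting", 0), ("lateral recumbent", 90)):
        env = to_environmental(retinotopic_deg=180, head_roll_deg=roll)
        print(f"{posture}: retinotopic lower field -> gravitationally "
              f"{labels[env]} ({env} deg)")
    # sitting: retinotopic lower field -> gravitationally down (180 deg)
    # lateral recumbent: retinotopic lower field -> gravitationally side (270 deg)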

