Interpretation of Depth from Scaled Motion Parallax in Virtual Reality

2021 ◽  
Vol 21 (9) ◽  
pp. 2035
Author(s):  
Xue Teng ◽  
Laurie Wilcox ◽  
Robert Allison
2017 ◽  
Author(s):  
Tobias Navarro Schröder ◽  
Benjamin W. Towse ◽  
Matthias Nau ◽  
Neil Burgess ◽  
Caswell Barry ◽  
...  

Summary: Minimizing spatial uncertainty is essential for navigation, but the neural mechanisms remain elusive. Here we combine predictions of a simulated grid cell system with behavioural and fMRI measures in humans during virtual navigation. First, we showed that polarising cues produce anisotropy in motion parallax. Second, we simulated entorhinal grid cells in an environment with anisotropic information and found that self-location is decoded best when grid patterns are aligned with the axis of greatest information. Third, when exposing human participants to polarised virtual reality environments, we found that navigation performance is anisotropic, in line with the use of parallax. Eye movements showed that participants preferentially viewed polarising cues, and this viewing behaviour correlated with navigation performance. Finally, using fMRI we found that the orientation of grid-cell-like representations in entorhinal cortex was anchored to the environmental axis of greatest parallax information, orthogonal to the polarisation axis. In sum, we demonstrate a crucial role of the entorhinal grid system in reducing uncertainty in representations of self-location and find evidence for adaptive spatial computations underlying entorhinal representations in the service of optimal navigation.
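The "simulated grid cell system" above can be illustrated with the standard idealized model, in which a grid cell's firing rate is a sum of three cosine gratings whose wave vectors are 60° apart; the `orientation` parameter below corresponds to the grid axis that the study reports anchoring to the axis of greatest parallax information. This is a minimal sketch of the textbook model, not the authors' actual simulation; the function name and normalization are our own.

```python
import numpy as np

def grid_rate(pos, orientation=0.0, scale=1.0, phase=(0.0, 0.0)):
    """Firing rate of an idealized grid cell at 2D position(s) `pos`:
    the sum of three cosine gratings 60 degrees apart, normalized to [0, 1]."""
    pos = np.asarray(pos, dtype=float) - np.asarray(phase, dtype=float)
    angles = orientation + np.array([0.0, np.pi / 3, 2 * np.pi / 3])
    # wave vectors of the three gratings; spacing set by `scale`
    k = (4 * np.pi / (np.sqrt(3) * scale)) * np.stack(
        [np.cos(angles), np.sin(angles)], axis=1)
    g = np.cos(pos @ k.T).sum(axis=-1)  # raw sum lies in [-1.5, 3]
    return (g + 1.5) / 4.5              # map to [0, 1]; 1 at a grid vertex
```

Rotating `orientation` rotates the whole hexagonal firing pattern, which is the degree of freedom the study finds aligned with the most informative environmental axis.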


2012 ◽  
Vol 25 (0) ◽  
pp. 31
Author(s):  
Michiteru Kitazaki

Since the speed of sound is much slower than that of light, we sometimes hear a sound later than its accompanying light event (e.g., distant thunder and lightning). However, Sugita and Suzuki (2003) reported that the brain coordinates a sound and its accompanying light so that they are perceived simultaneously at distances up to 20 m. Thus, a light paired with a physically delayed sound is perceived as simultaneous with the sound in the near field. We aimed to test whether this sound–light coordination occurs in a virtual-reality environment and to investigate the effects of binocular disparity and motion parallax. Six naive participants observed visual stimuli on a 120-inch screen in a darkroom and heard auditory stimuli through headphones. A ball was presented in a textured corridor, and its distance from the participant was varied from 3 to 20 m. The ball changed to red before or after a short (10 ms) white-noise burst (time difference: −120, −60, −30, 0, +30, +60, +120 ms), and participants judged the temporal order of the color change and the sound. We varied the visual depth cues (binocular disparity and motion parallax) in the virtual-reality environment and measured the physical delay at which the visual and auditory events were perceived simultaneously. We did not find sound–light coordination without binocular disparity or motion parallax, but found it when both cues were present. These results suggest that binocular disparity and motion parallax are effective for sound–light coordination in virtual-reality environments, and that the richness of depth cues is important for this coordination.
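The physical delays at stake are easy to work out: light arrives essentially instantaneously, while sound travels at roughly 343 m/s, so over the 3–20 m range used here the sound lags by about 9–58 ms, squarely within the ±120 ms range of tested offsets. A quick sketch (the constant and function name are ours):

```python
SPEED_OF_SOUND = 343.0  # m/s in air at roughly 20 degrees C

def sound_delay_ms(distance_m):
    """Arrival delay of a sound relative to a co-located light event,
    treating light travel time as negligible."""
    return distance_m / SPEED_OF_SOUND * 1000.0

for d in (3, 10, 20):
    # delays grow linearly with distance: ~8.7 ms at 3 m, ~58.3 ms at 20 m
    print(f"{d} m -> {sound_delay_ms(d):.1f} ms")
```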


2020 ◽  
Vol 27 (2) ◽  
pp. 206-225 ◽  
Author(s):  
Sirisilp Kongsilp ◽  
Matthew N. Dailey

Since one of the most important aspects of a Fish Tank Virtual Reality (FTVR) system is how well it conveys the illusion of depth, we present a study that evaluates users' depth perception in FTVR systems using three tasks. The tasks are based on psychological research on human vision and on depth judgments common in VR applications. We find that participants do not perform well with motion parallax cues alone, compared with stereo alone or a combination of both kinds of cues. Measurements of participants' head movement during each task proved valuable in explaining the experimental findings. We conclude that FTVR users rely on stereopsis more than on motion parallax for depth perception in FTVR environments, especially for tasks requiring depth acuity.
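The depth signal that motion parallax provides in such a system follows from simple geometry: when the viewer's head translates laterally by T, a point at depth d shifts by atan(T/d), so nearer points sweep through larger angles than the fixated point. A small sketch of that geometry, for points initially straight ahead (our own illustration, not the authors' task code):

```python
import math

def parallax_deg(head_shift_m, fixation_depth_m, target_depth_m):
    """Relative angular displacement between a fixated point and a target
    after a lateral head translation, for points initially straight ahead."""
    return math.degrees(math.atan2(head_shift_m, target_depth_m)
                        - math.atan2(head_shift_m, fixation_depth_m))

# A target nearer than fixation yields positive parallax; one at the
# fixation depth yields zero, which is why depth-acuity tasks are hard
# when parallax is the only available cue and head motion is small.
```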


2020 ◽  
Vol 3 (2) ◽  
pp. 20502-1-20502-10 ◽  
Author(s):  
Siavash Eftekharifar ◽  
Anne Thaler ◽  
Nikolaus F. Troje

Abstract: The sense of presence is defined as a subjective feeling of being situated in an environment and occupying a location therein; it is a defining feature of virtual environments. In two experiments, we investigated the relative contributions of motion parallax and stereopsis to the sense of presence, using two versions of the classic pit-room paradigm in virtual reality. In Experiment 1, participants were asked to cross a deep abyss between two platforms on a narrow plank. Participants completed the task under three experimental conditions: (1) with the lateral component of motion parallax disabled, (2) with stereopsis disabled, and (3) with both stereopsis and motion parallax available. As a subjective measure of presence, participants completed a presence questionnaire after each condition. Additionally, electrodermal activity (EDA) was recorded as a measure of anxiety. In Experiment 1, EDA responses were significantly higher with restricted motion parallax than in the other two conditions. However, no difference was observed in subjective presence scores across the three conditions. To test whether these results were due to the nature of the environment, participants in Experiment 2 experienced a slightly less stressful environment, in which they were asked to stand on a ledge and drop virtual balls onto specified targets in the abyss. The same experimental manipulations were used as in Experiment 1. Again, EDA responses were significantly higher when motion parallax was impaired than when stereopsis was disabled. The results of the presence questionnaire revealed a reduced sense of presence with impaired motion parallax compared to the normal viewing condition. Across the two experiments, our results unexpectedly demonstrate that presence in virtual environments is not necessarily linked to the EDA responses elicited by affective situations, as earlier studies have implied.

