Visual influence on human locomotion: Modulation to changes in optic flow

1997 ◽  
Vol 114 (1) ◽  
pp. 63-70 ◽  
Author(s):  
T. Prokop ◽  
M. Schubert ◽  
W. Berger
2012 ◽  
Vol 24 (7) ◽  
pp. 1781-1805 ◽  
Author(s):  
Szonya Durant ◽  
Johannes M. Zanker

Optic flow motion patterns can be a rich source of information about our own movement and about the structure of the environment we are moving in. We investigate the information available to the brain under real operating conditions by analyzing video sequences generated by physically moving a camera through various typical human environments. We consider to what extent the motion signal maps generated by a biologically plausible, two-dimensional array of correlation-based motion detectors (2DMD) not only depend on egomotion but also reflect the spatial setup of such environments. We analyzed the local motion outputs by extracting the relative amounts of detected directions and comparing the spatial distribution of the motion signals to that of idealized optic flow. Using a simple template-matching estimation technique, we are able to extract the focus of expansion and find relatively small errors that are distributed in characteristic patterns in different scenes. This shows that all types of scenes provide suitable motion information for extracting egomotion, despite the substantial levels of noise affecting the motion signal distributions, which we attribute to the sparse nature of optic flow and the presence of camera jitter. However, there are large differences in the shape of the direction distributions between different types of scenes; in particular, man-made office scenes are heavily dominated by directions along the cardinal axes, which is much less apparent in outdoor forest scenes. Further examination of motion magnitudes at different scales, and of where motion information is located in a scene, revealed distinct patterns across scene categories. This suggests that self-motion patterns are not only relevant for deducing heading direction and speed but also provide a rich source of information about scene structure, and could be important for the rapid formation of the gist of a scene during normal human locomotion.
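To make the two processing stages concrete, the sketch below pairs a minimal Reichardt-style opponent correlator (standing in for one layer of a 2DMD-like array) with a brute-force template match for the focus of expansion. This is not the authors' implementation: the function names, the grayscale numpy-array input format, and the coarse candidate grid are illustrative assumptions.

    import numpy as np

    def correlation_motion_detector(frame_a, frame_b, shift=1):
        """Minimal Reichardt-style correlator: multiply luminance at time t
        with a spatially shifted copy at time t+1; subtracting the opponent
        pair yields a signed response along each image axis."""
        def half_detector(a, b, axis, s):
            shifted = np.roll(b, -s, axis=axis)   # shifted[x] == b[x + s]
            return a * shifted
        # Horizontal opponent pair: rightward minus leftward correlation.
        dx = (half_detector(frame_a, frame_b, axis=1, s=shift)
              - half_detector(frame_b, frame_a, axis=1, s=shift))
        # Vertical opponent pair: downward minus upward correlation.
        dy = (half_detector(frame_a, frame_b, axis=0, s=shift)
              - half_detector(frame_b, frame_a, axis=0, s=shift))
        return dx, dy

    def foe_by_template_matching(dx, dy, grid_step=8):
        """Score candidate foci of expansion: each template is a radial
        field of unit vectors pointing away from the candidate; the best
        candidate maximizes agreement with the measured motion field."""
        h, w = dx.shape
        ys, xs = np.mgrid[0:h, 0:w]
        best_score, best_foe = -np.inf, None
        for fy in range(0, h, grid_step):
            for fx in range(0, w, grid_step):
                ty, tx = ys - fy, xs - fx
                norm = np.hypot(tx, ty) + 1e-9
                score = np.sum((dx * tx + dy * ty) / norm)
                if score > best_score:
                    best_score, best_foe = score, (fx, fy)
        return best_foe

    # Example: a synthetic pattern drifting one pixel to the right.
    rng = np.random.default_rng(0)
    f0 = rng.random((64, 64))
    f1 = np.roll(f0, 1, axis=1)
    dx, dy = correlation_motion_detector(f0, f1)

In this sketch the template score is just the summed dot product between the measured motion vectors and each candidate's radial field; a real estimator would normalize the measured vectors and down-weight low-confidence detector outputs before matching.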


2003 ◽  
Author(s):  
Matthew J. Ahlert

2020 ◽  
Vol 7 ◽  
Author(s):  
Brock Laschowski ◽  
William McNally ◽  
Alexander Wong ◽  
John McPhee

2021 ◽  
Vol 11 (15) ◽  
pp. 6881 ◽ 
Author(s):  
Calvin Chung Wai Keung ◽  
Jung In Kim ◽  
Qiao Min Ong

Virtual reality (VR) is quickly becoming the medium of choice for various architecture, engineering, and construction applications, such as design visualization, construction planning, and safety training. In particular, this technology offers an immersive experience that enhances the way architects review designs with team members. Traditionally, VR has used a desktop PC or workstation setup inside a room, which risks two users bumping into each other when running multiuser VR (MUVR) applications. MUVR offers shared experiences that move beyond the conventional single-user VR setup: multiple users can communicate and interact in the same virtual space, providing more realistic scenarios for architects in the design stage. However, this shared virtual environment introduces challenges of limited human locomotion and interaction due to the physical constraints of ordinary rooms. This study therefore presents a system framework that integrates MUVR applications with omnidirectional treadmills. The treadmills give users an immersive walking experience in the simulated environment, free of space constraints and the risk of collision or injury. A prototype was set up and tested in several scenarios by practitioners and students. The validated MUVR treadmill system aims to promote high-level immersion in architectural design review and collaboration.
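The abstract does not detail how treadmill motion is shared between users, but the core loop of such a framework can be sketched: read belt speed and heading from the treadmill, package them as an avatar pose, and broadcast that pose to the other participants. Everything below is a hypothetical illustration in Python; the TreadmillStub class, the UDP peer list, and the 30 Hz update rate are assumptions, not part of the published system (a real omnidirectional treadmill would be read through its vendor SDK, and a VR engine would handle rendering).

    import json
    import socket
    import time

    class TreadmillStub:
        """Hypothetical treadmill interface; a real device would expose
        belt speed and heading through its vendor SDK."""
        def read(self):
            return {"speed_mps": 1.2, "heading_deg": 90.0}

    def broadcast_pose(sock, peers, user_id, treadmill):
        """Map belt speed and heading to a pose update and send it to
        every peer so all avatars stay in sync."""
        state = treadmill.read()
        pose = {
            "user": user_id,
            "speed": state["speed_mps"],
            "heading": state["heading_deg"],
            "t": time.time(),
        }
        payload = json.dumps(pose).encode()
        for host, port in peers:
            sock.sendto(payload, (host, port))

    if __name__ == "__main__":
        sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
        peers = [("192.0.2.10", 9000)]   # placeholder peer address
        treadmill = TreadmillStub()
        while True:
            broadcast_pose(sock, peers, "architect_1", treadmill)
            time.sleep(1 / 30)           # 30 Hz pose updates

UDP keeps per-update latency low, which matters for avatar co-presence; a production system would add interpolation and drop-out handling on the receiving side.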

