Inertial Cues
Recently Published Documents

TOTAL DOCUMENTS: 26 (last five years: 2)
H-INDEX: 10 (last five years: 0)

Sensors, 2021, Vol. 21 (23), pp. 8079
Author(s): Jose V. Riera, Sergio Casas, Marcos Fernández, Francisco Alonso, Sergio A. Useche

Motion platforms have been widely used in Virtual Reality (VR) systems for decades to simulate motion in virtual environments, and they have several applications in emerging fields such as driving assistance systems, vehicle automation and road risk management. Currently, the development of new immersive VR systems faces unique challenges in meeting user requirements, such as the introduction of high-resolution 360° panoramic images and videos. With this type of visual information, applying the traditional methods of generating motion cues is much more complicated, since it is generally not possible to calculate the corresponding motion properties needed to feed the motion cueing algorithms. For this reason, this paper presents a new method for generating non-real-time gravito-inertial cues with motion platforms in a system fed both with computer-generated (simulation-based) images and with recorded video imagery. It is a hybrid method: the gravito-inertial cues for which acceleration information is available were generated with a classical approach, applying physical modeling in a VR scene together with washout filters, whereas the cues derived from recorded images and video, which lack acceleration information, were generated ad hoc in a semi-manual way. The resulting motion cues were then refined according to the contributions of different experts using a successive-approximation method inspired by Wideband Delphi. A subjective evaluation showed that the motion signals refined with this method were perceived as significantly better than the original, non-refined ones. The final system, developed as part of an international road safety education campaign, could be useful for developing further VR-based applications in key fields such as driving assistance, vehicle automation and road crash prevention.
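For context on the classical channel mentioned in this abstract, the sketch below shows a minimal washout filter in Python. The single longitudinal axis, the Butterworth filter orders, the cutoff frequencies and the tilt limit are illustrative assumptions, not the authors' implementation or tuning.

```python
# Minimal sketch of a classical washout filter for motion cueing.
# Assumptions (not from the paper): one longitudinal axis, a second-order
# high-pass for the translational channel, a first-order low-pass for tilt
# coordination, and placeholder cutoff frequencies.
import numpy as np
from scipy.signal import butter, lfilter

FS = 100.0   # sample rate of the simulated acceleration signal [Hz]
G = 9.81     # gravity [m/s^2]

def classical_washout(acc_x, fs=FS, hp_cut=0.8, lp_cut=0.3, max_tilt_deg=10.0):
    """Turn a longitudinal acceleration trace into platform commands.

    Returns (surge_position, tilt_angle_deg): the high-pass channel is
    double-integrated into a bounded surge displacement, while the low-pass
    channel is rendered as a pitch tilt that borrows gravity to sustain the
    low-frequency part of the cue.
    """
    # Onset cue: high-pass the acceleration, then integrate twice.
    b_hp, a_hp = butter(2, hp_cut, btype="high", fs=fs)
    acc_hp = lfilter(b_hp, a_hp, acc_x)
    vel = np.cumsum(acc_hp) / fs
    pos = np.cumsum(vel) / fs

    # Sustained cue via tilt coordination: low-pass, then map to a tilt angle.
    b_lp, a_lp = butter(1, lp_cut, btype="low", fs=fs)
    acc_lp = lfilter(b_lp, a_lp, acc_x)
    tilt = np.degrees(np.arcsin(np.clip(acc_lp / G, -1.0, 1.0)))
    tilt = np.clip(tilt, -max_tilt_deg, max_tilt_deg)
    return pos, tilt

# Example: a 2 s acceleration pulse, standing in for simulated vehicle output.
t = np.arange(0, 10, 1.0 / FS)
acc = np.where((t > 1.0) & (t < 3.0), 2.0, 0.0)   # 2 m/s^2 step
surge, pitch = classical_washout(acc)
```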


Author(s): Raul Rodriguez, Benjamin Thomas Crane

Heading direction is perceived based on visual and inertial cues. The current study examined the effect of their relative timing on the ability of offset visual headings to influence inertial heading perception. Seven healthy human subjects experienced 2 s of translation along a heading of 0°, ±35°, ±70°, ±105°, or ±140°. These inertial headings were paired with 2 s visual headings presented at relative offsets of 0°, ±30°, ±60°, ±90°, or ±120°. The visual stimuli were also presented at 17 temporal delays ranging from -500 ms (visual lead) to 2,000 ms (visual delay) relative to the inertial stimulus. After each stimulus, subjects reported the direction of the inertial stimulus using a dial. Across subjects, the bias of the inertial heading towards the visual heading was robust within ±250 ms: 8.0 ± 0.5° with a 30° offset, 12.2 ± 0.5° with a 60° offset, 11.7 ± 0.6° with a 90° offset, and 9.8 ± 0.7° with a 120° offset (mean bias towards visual ± SE). The mean bias was much diminished with temporal misalignments of ±500 ms, and there was no longer any visual influence on the inertial heading when the visual stimulus was delayed by 1,000 ms or more. Although the amount of bias varied between subjects, the effect of delay was similar.
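The bias measure reported in this abstract can be illustrated with a short Python sketch: the signed shift of each reported heading away from the true inertial heading, taken in the direction of the visual heading, averaged with mean and standard error. The trial values below are hypothetical stand-ins, not the study's data.

```python
# Hypothetical illustration of "bias toward the visual heading".
import numpy as np

def wrap_deg(a):
    """Wrap an angle difference into (-180, 180] degrees."""
    return (np.asarray(a, dtype=float) + 180.0) % 360.0 - 180.0

def visual_bias(reported, inertial, visual):
    """Signed bias of each report toward the visual heading (degrees).

    Positive values mean the report moved from the true inertial heading
    in the direction of the visual heading.
    """
    error = wrap_deg(np.asarray(reported) - np.asarray(inertial))
    offset_sign = np.sign(wrap_deg(np.asarray(visual) - np.asarray(inertial)))
    return error * offset_sign

# Hypothetical trials: inertial heading 35 deg, visual offset +60 deg.
inertial = np.full(7, 35.0)
visual = inertial + 60.0
reported = np.array([44.0, 50.0, 47.0, 41.0, 52.0, 46.0, 49.0])

bias = visual_bias(reported, inertial, visual)
mean = bias.mean()
sem = bias.std(ddof=1) / np.sqrt(bias.size)
print(f"mean bias toward visual: {mean:.1f} ± {sem:.1f} deg (SE)")
```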


Author(s): Kristen L. Macuga

Objective: The effects of inertial (vestibular and somatosensory) information on driver steering during curve navigation were investigated, using an electric four-wheel mobility vehicle outfitted with a steering wheel and a portable virtual reality system. Background: When driving, multiple sources of perceptual information are available. Researchers have focused on visual information, which plays a critical role in steering control. However, it is not yet well established how inertial information might contribute. Methods: I biased inertial cues by varying visual/inertial gains (doubled, halved, reversed) as drivers negotiated curving paths, and measured steering accuracy and efficiency. I also assessed whether being exposed to inertial biases had an impact on postbias steering by comparing pre- and posttest session performance measures. Results: Doubling or halving inertial cues had little effect on steering performance. Inertial information only disrupted steering when it was reversed with respect to visual information. Over time, the influence of this extreme inertial bias was reduced though not eliminated. Postbias curve navigation performance was not impacted, likely because participants had learned to disregard, rather than integrate, biased inertial cues. Conclusion: Results suggest that biased inertial information has little influence on curve navigation performance when visual information is available. Application: Though inertial cues may be important for open-loop steering (when visual cues are unavailable), their role in closed-loop steering seems less influential. This has implications for driving simulation and suggests that inertial discrepancies due to limitations in motion-cuing capabilities may not be all that problematic for the simulation of closed-loop curve steering tasks.
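The gain manipulation described in the Methods can be sketched as a simple configuration: the inertial (vehicle) rotation is the visually specified rotation scaled by a condition gain of 2.0, 0.5, or -1.0. The condition names and update function below are illustrative assumptions, not the study's implementation.

```python
# Hypothetical sketch of the visual/inertial gain bias described above.
from dataclasses import dataclass

GAIN_CONDITIONS = {
    "veridical": 1.0,   # pre/post test sessions
    "doubled":   2.0,
    "halved":    0.5,
    "reversed": -1.0,
}

@dataclass
class CurveState:
    visual_heading_deg: float = 0.0    # heading shown in the VR scene
    inertial_heading_deg: float = 0.0  # heading actually driven by the vehicle

def step(state: CurveState, visual_yaw_rate_dps: float, gain: float, dt: float = 0.01):
    """Advance one control step: vision follows the steering input directly,
    while the inertial cue is the same rotation scaled by the bias gain."""
    state.visual_heading_deg += visual_yaw_rate_dps * dt
    state.inertial_heading_deg += gain * visual_yaw_rate_dps * dt
    return state

# Example: a 1 s constant left turn at 10 deg/s under the reversed condition.
state = CurveState()
for _ in range(100):
    step(state, visual_yaw_rate_dps=10.0, gain=GAIN_CONDITIONS["reversed"])
print(state)  # visual heading ≈ +10 deg, inertial heading ≈ -10 deg
```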


2013, Vol. 232 (2), pp. 637-646
Author(s): B. J. Correia Grácio, J. E. Bos, M. M. van Paassen, M. Mulder

2013, Vol. 231 (2), pp. 209-218
Author(s): K. N. de Winkel, F. Soyka, M. Barnett-Cowan, H. H. Bülthoff, E. L. Groen, ...
