Perceptual scaling of visual and inertial cues

2013 ◽  
Vol 232 (2) ◽  
pp. 637-646 ◽  
Author(s):  
B. J. Correia Grácio ◽  
J. E. Bos ◽  
M. M. van Paassen ◽  
M. Mulder
2010 ◽  
Vol 10 (12) ◽  
pp. 1-1 ◽  
Author(s):  
K. N. de Winkel ◽  
J. Weesie ◽  
P. J. Werkhoven ◽  
E. L. Groen

Author(s):  
Florent Colombet ◽  
Andras Kemeny ◽  
Frédéric Mérienne ◽  
Christian Père

By studying drivers’ behavior, driving simulation is used in the automotive industry for designing and testing new driving aid systems. In order for behavior to be as similar as possible to that observed in real conditions, the driver has to be provided with visual, audio, and kinesthetic cues as well as inertial cues. A “one-to-one” motion rendering is usually not possible because of the physical limitations of dynamic driving simulators, so a so-called “motion cueing algorithm” is used to transform the virtual vehicle trajectory into an admissible simulator trajectory. Our knowledge of human motion perception is currently incomplete. One way to improve motion rendering is to increase the physical capabilities of driving simulators. The “high-performance” driving simulators thus obtained can provide inertial cues closer to those experienced in real conditions, but they require large simulation rooms and complex operational facilities. The second way to improve motion rendering is to develop new motion cueing algorithms, which is what this paper proposes. In the framework of a partnership between Arts & Métiers ParisTech and Renault, a new dynamic simulator called SAM has been built. This simulator is equipped with a traditional hexapod motion platform but uses an innovative motion cueing algorithm. The paper presents an overview of existing motion cueing algorithms, in particular their limitations and the relevance of a predictive algorithm. Finally, an experiment comparing the different cueing algorithms is also presented.
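To make the workspace problem concrete, the following is a minimal sketch (in Python) of a classical washout-style motion cueing filter: transient accelerations are reproduced by platform translation, while sustained accelerations are rendered through tilt coordination. The cutoff frequencies, scaling gain, and tilt limit are illustrative assumptions and do not describe the SAM simulator's predictive algorithm.

```python
import math

class ClassicalWashout:
    """Minimal 1-DOF classical washout sketch (longitudinal axis only).

    Parameters are illustrative; a real motion cueing algorithm runs in
    6 DOF with higher-order filters and platform-specific limits.
    """

    def __init__(self, dt=0.01, scale=0.5, f_hp=0.5, f_lp=0.2, max_tilt_deg=10.0):
        self.dt = dt
        self.scale = scale                                   # down-scale vehicle acceleration
        self.a_hp = 1.0 / (1.0 + 2 * math.pi * f_hp * dt)    # first-order high-pass coefficient
        self.a_lp = (2 * math.pi * f_lp * dt) / (1.0 + 2 * math.pi * f_lp * dt)  # low-pass coefficient
        self.max_tilt = math.radians(max_tilt_deg)
        self.prev_in = 0.0
        self.hp_out = 0.0
        self.lp_out = 0.0

    def step(self, vehicle_accel):
        """Return (platform_accel, tilt_angle_rad) for one time step."""
        a = self.scale * vehicle_accel
        # High-pass: keep only the transient (onset) part -> platform translation.
        self.hp_out = self.a_hp * (self.hp_out + a - self.prev_in)
        self.prev_in = a
        # Low-pass: keep the sustained part -> tilt coordination, i.e. lean the
        # cabin so the gravity component mimics a sustained acceleration.
        self.lp_out += self.a_lp * (a - self.lp_out)
        ratio = max(-1.0, min(1.0, self.lp_out / 9.81))
        tilt = max(-self.max_tilt, min(self.max_tilt, math.asin(ratio)))
        return self.hp_out, tilt

if __name__ == "__main__":
    mca = ClassicalWashout()
    # Step input: a constant 2 m/s^2 acceleration demand from the vehicle model.
    for _ in range(300):
        accel_cmd, tilt_cmd = mca.step(2.0)
    print(f"after 3 s: platform accel {accel_cmd:.3f} m/s^2, tilt {math.degrees(tilt_cmd):.2f} deg")
```

Because the high-pass output washes out over time, the platform drifts back toward its neutral position, which is what keeps the commanded trajectory inside the hexapod's limited workspace.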


2003 ◽  
Vol 10 (4) ◽  
pp. 987-993 ◽  
Author(s):  
David Waller ◽  
Jack M. Loomis ◽  
Sibylle D. Steck

2009 ◽  
Vol 198 (2-3) ◽  
pp. 287-300 ◽  
Author(s):  
Daniel R. Berger ◽  
Heinrich H. Bülthoff

1996 ◽  
Vol 199 (1) ◽  
pp. 201-209 ◽  
Author(s):  
A S Etienne ◽  
R Maurer ◽  
V Séguinot

During locomotion, mammals update their position with respect to a fixed point of reference, such as their point of departure, by processing inertial cues, proprioceptive feedback and stored motor commands generated during locomotion. This so-called path integration system (dead reckoning) allows the animal to return to its home, or to a familiar feeding place, even when external cues are absent or novel. However, without the use of external cues, the path integration process leads to rapid accumulation of errors involving both the direction and distance of the goal. Therefore, even nocturnal species such as hamsters and mice rely more on previously learned visual references than on the path integration system when the two types of information are in conflict. Recent studies have investigated the extent to which path integration and familiar visual cues cooperate to optimize navigational performance.
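As a toy illustration of the path-integration (dead reckoning) process and its error accumulation, the Python sketch below integrates noisy per-step heading and distance signals into a homing vector; the noise magnitudes are assumptions for illustration only, not values fitted to animal data.

```python
import math
import random

def path_integrate(steps, heading_noise_sd=0.05, dist_noise_sd=0.02, seed=0):
    """Dead-reckon a homing vector from noisy self-motion signals.

    Each step is (heading_rad, distance). The Gaussian noise stands in for
    imprecise inertial/proprioceptive cues; its magnitude is illustrative.
    Returns the homing error, i.e. the distance between the true endpoint
    and the internally estimated endpoint.
    """
    rng = random.Random(seed)
    tx = ty = ex = ey = 0.0
    for heading, dist in steps:
        # True displacement of the animal for this step.
        tx += dist * math.cos(heading)
        ty += dist * math.sin(heading)
        # Internally sensed displacement: heading and distance are both noisy.
        h = heading + rng.gauss(0.0, heading_noise_sd)
        d = dist * (1.0 + rng.gauss(0.0, dist_noise_sd))
        ex += d * math.cos(h)
        ey += d * math.sin(h)
    return math.hypot(ex - tx, ey - ty)

if __name__ == "__main__":
    rng = random.Random(42)
    # Without external (e.g. visual) corrections, the error grows with path length.
    for n in (10, 100, 1000):
        walk = [(rng.uniform(0.0, 2.0 * math.pi), 1.0) for _ in range(n)]
        print(f"{n:5d} steps: homing error ~ {path_integrate(walk):.2f} step lengths")
```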


Author(s):  
Raul Rodriguez ◽  
Benjamin Thomas Crane

Heading direction is perceived based on visual and inertial cues. The current study examined the effect of their relative timing on the ability of offset visual headings to influence inertial heading perception. Seven healthy human subjects experienced 2 s of translation along a heading of 0°, ±35°, ±70°, ±105°, or ±140°. These inertial headings were paired with 2 s duration visual headings that were presented at relative offsets of 0°, ±30°, ±60°, ±90°, or ±120°. The visual stimuli were also presented at 17 temporal delays ranging from -500 ms (visual lead) to 2,000 ms (visual delay) relative to the inertial stimulus. After each stimulus, subjects reported the direction of the inertial stimulus using a dial. The bias of the inertial heading towards the visual heading was robust at temporal offsets within ±250 ms; examined across subjects in this range, it was 8.0 ± 0.5° with a 30° offset, 12.2 ± 0.5° with a 60° offset, 11.7 ± 0.6° with a 90° offset, and 9.8 ± 0.7° with a 120° offset (mean bias towards visual ± SE). The bias was much diminished with temporal misalignments of ±500 ms, and there was no longer any visual influence on the inertial heading when the visual stimulus was delayed by 1,000 ms or more. Although the amount of bias varied between subjects, the effect of delay was similar.
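As a rough illustration of how the bias values quoted above can be computed, the Python sketch below derives the signed bias of a reported heading toward the visual heading, relative to the true inertial heading; the angle-wrapping convention and the example numbers are assumptions, not the authors' analysis code.

```python
def wrap_deg(angle):
    """Wrap an angle to the range (-180, 180] degrees."""
    return (angle + 180.0) % 360.0 - 180.0

def visual_bias(inertial_deg, visual_offset_deg, reported_deg):
    """Signed bias of a reported heading toward the visual heading.

    The visual stimulus is presented at inertial_deg + visual_offset_deg.
    Positive values mean the report was pulled toward the visual stimulus.
    """
    error = wrap_deg(reported_deg - inertial_deg)
    # Take the report error as positive when it lies in the direction of the
    # visual offset.
    return error if visual_offset_deg >= 0 else -error

if __name__ == "__main__":
    # Hypothetical trial: 70 deg inertial heading, visual heading offset by
    # +60 deg, subject reports 82 deg -> 12 deg bias toward the visual heading.
    print(visual_bias(inertial_deg=70.0, visual_offset_deg=60.0, reported_deg=82.0))
```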

