Head tracking latency in virtual environments: Psychophysics and a model

Author(s):  
Bernard D. Adelstein ◽  
Thomas G. Lee ◽  
Stephen R. Ellis

1996 ◽  
Vol 5 (3) ◽  
pp. 274-289 ◽  
Author(s):  
Claudia Hendrix ◽  
Woodrow Barfield

This paper reports the results of three studies, each of which investigated the sense of presence within virtual environments as a function of visual display parameters. These factors included the presence or absence of head tracking, the presence or absence of stereoscopic cues, and the geometric field of view used to create the visual image projected on the visual display. In each study, subjects navigated a virtual environment and completed a questionnaire designed to ascertain the level of presence experienced by the participant within the virtual world. Specifically, two aspects of presence were evaluated: (1) the sense of “being there” and (2) the fidelity of the interaction between the virtual environment participant and the virtual world. Not surprisingly, the results of the first and second studies indicated that the reported level of presence was significantly higher when head tracking and stereoscopic cues were provided. The results from the third study showed that the geometric field of view used to design the visual display strongly influenced the reported level of presence, with more presence associated with 50° and 90° geometric fields of view when compared to a narrower 10° geometric field of view. The results also indicated a significant positive correlation between the reported level of presence and the fidelity of the interaction between the virtual environment participant and the virtual world. Finally, it was shown that the survey questions evaluating several aspects of presence produced reliable responses across questions and studies, indicating that the questionnaire is a useful tool when evaluating presence in virtual environments.
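The geometric field of view manipulated in the third study is determined by the assumed viewing geometry used to compute the projection, independent of the physical display size. A minimal sketch of that relationship (the 0.35 m / 1.0 m figures below are illustrative values, not ones from the study):

```python
import math

def geometric_fov_deg(image_width: float, viewing_distance: float) -> float:
    """Geometric field of view: the angle subtended by the projected
    image plane at the assumed viewpoint (both lengths in the same units)."""
    return math.degrees(2 * math.atan(image_width / (2 * viewing_distance)))

# A wider image plane or a shorter assumed viewing distance yields a
# larger geometric FOV; e.g. a 0.35 m wide image assumed to be viewed
# from 1.0 m subtends roughly 20 degrees.
```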


2014 ◽  
Vol 4 (2) ◽  
pp. 1
Author(s):  
Vitor Reus ◽  
Márcio Mello ◽  
Luciana Nedel ◽  
Anderson Maciel

Head-mounted displays (HMDs) allow a personal and immersive view of virtual environments and can be used with almost any desktop computer. Most HMDs have embedded inertial sensors for tracking the user's head rotations. These low-cost sensors offer high quality and availability. However, even though they are very sensitive and precise, inertial sensors work with incremental information and easily introduce errors into the system; most relevantly, head tracking suffers from drift. In this paper we present important limitations that still prevent the wide use of inertial sensors for tracking. For instance, to compensate for drift, users of HMD-based immersive VEs move away from their suitable pose. We also propose a software solution for two problems: preventing drift in incremental sensors, and keeping the user from moving their body relative to another tracking system that uses absolute sensors (e.g., MS Kinect). We analyze and evaluate our solutions experimentally, including user tests. Results show that our comfortable-pose function is effective at eliminating drift, and that it can be inverted and applied to keep the user from moving their body out of the absolute sensor's range. The efficiency and accuracy of this method make it suitable for a number of applications in immersive VR.


2007 ◽  
Vol 16 (1) ◽  
pp. 45-64 ◽  
Author(s):  
Sangyoon Lee ◽  
Tian Chen ◽  
Jongseo Kim ◽  
Gerard Jounghyun Kim ◽  
Sung Ho Han ◽  
...  

Product design is an iterative process that involves, among other things, evaluation. In addition to the intended functionality of the product, its affective properties (or “Kansei”) have emerged as important evaluation criteria for the successful marketing of the product. Affective properties refer to consumers' psychological feelings about a product, and they can be mapped into perceptual design elements for possible design modification toward higher customer satisfaction. Affective properties of products in design can partially be assessed using the near-photorealistic graphic rendering features of desktop computer-aided design tools, or rapid prototyping tools that can produce physical mock-ups. Recently, immersive virtual reality systems have been suggested as an ideal platform for affective analysis of an evolving design because of, among other things, the natural style of interaction they offer when examining the product, such as the use of direct and proprioceptive interaction, head tracking and first-person viewpoint, and multimodality. In this paper, the effects of tactile augmentation and self-body visualization on the evaluation of affective properties are investigated by comparing three types of virtual environments for evaluating the affective properties of mobile phones. Each virtual environment offers different degrees of tactile and self-body realism. The effectiveness of these virtual environments is evaluated against a control condition: the affective assessment of the real product. The experiment has shown that the virtual affective evaluation results from the three systems correlated very highly with those of the real product, and no statistically significant differences could be found among the three systems. This finding indicates that tactile augmentation and high-fidelity self-body visualization had no effect on the evaluation of the affective property.
Nevertheless, the experimental results have indicated the importance of enhanced interaction with tactile augmentation for evaluating the property of texture, and have shown that VR systems have the potential for use as affective evaluation platforms.


2021 ◽  
Vol 12 ◽  
Author(s):  
Chloe Callahan-Flintoft ◽  
Christian Barentine ◽  
Jonathan Touryan ◽  
Anthony J. Ries

Using head-mounted displays (HMDs) in conjunction with virtual reality (VR), vision researchers are able to capture more naturalistic vision in an experimentally controlled setting. Namely, eye movements can be accurately tracked as they occur in concert with head movements as subjects navigate virtual environments. A benefit of this approach is that, unlike other mobile eye tracking (ET) set-ups in unconstrained settings, the experimenter has precise control over the location and timing of stimulus presentation, making it easier to compare findings between HMD studies and those that use monitor displays, which account for the bulk of previous work in eye movement research and vision sciences more generally. Here, a visual discrimination paradigm is presented as a proof of concept to demonstrate the applicability of collecting eye and head tracking data from an HMD in VR for vision research. The current work's contribution is threefold: first, results demonstrating both the strengths and the weaknesses of recording and classifying eye and head tracking data in VR; second, a highly flexible graphical user interface (GUI) used to generate the current experiment, offered to lower the software-development start-up cost for future researchers transitioning to a VR space; and finally, the dataset analyzed here (behavioral, eye, and head tracking data synchronized with environmental variables, from a task specifically designed to elicit a variety of eye and head movements), which could be an asset in testing future eye movement classification algorithms.
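Eye movement classification algorithms of the kind such a dataset could help test often start from a simple velocity threshold (the I-VT scheme): samples whose angular velocity exceeds a threshold are labeled saccades, the rest fixations. A minimal sketch, where the 30°/s threshold is a commonly used illustrative value, not one taken from this study:

```python
def classify_ivt(gaze_deg, timestamps, threshold_deg_per_s=30.0):
    """Label each inter-sample interval as 'fixation' or 'saccade' by
    comparing angular gaze velocity against a fixed threshold (I-VT).
    gaze_deg: 1-D gaze angles in degrees; timestamps in seconds."""
    labels = []
    for i in range(1, len(gaze_deg)):
        dt = timestamps[i] - timestamps[i - 1]
        velocity = abs(gaze_deg[i] - gaze_deg[i - 1]) / dt
        labels.append('saccade' if velocity > threshold_deg_per_s
                      else 'fixation')
    return labels
```

In an HMD setting, the gaze signal fed to such a classifier would typically be gaze-in-world (eye-in-head combined with head orientation), which is exactly the synchronization the dataset described above provides.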


Author(s):  
Michael K. McGee

A study was done to determine the effectiveness of using free modulus magnitude estimation, a psychophysical measurement technique, to assess the experience of negative side effects in users of immersive virtual environments (VEs). Two task environments, a maze and an office, were factorially combined with head tracking on or off to provide multiple levels of side-effect-inducing conditions. Sixteen subjects participated in a four-day experiment. Both head tracking and task environment showed significant main effects. The experiment showed that magnitude estimation is a sensitive, efficient, and effective measure of the negative side effects experienced by users of immersive VEs.
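In free modulus magnitude estimation, each subject assigns numbers proportional to sensation intensity using a modulus of their own choosing, so estimates are typically rescaled per subject before averaging across subjects. A common approach, shown here as a hedged sketch rather than the specific analysis used in this study, divides each subject's estimates by their geometric mean, which preserves ratios while factoring out the arbitrary modulus:

```python
from math import prod

def normalize_free_modulus(estimates):
    """Rescale one subject's free-modulus magnitude estimates by their
    geometric mean: ratios between conditions are preserved, but the
    subject's arbitrary choice of modulus is factored out."""
    gm = prod(estimates) ** (1.0 / len(estimates))
    return [e / gm for e in estimates]
```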


1999 ◽  
Vol 8 (2) ◽  
pp. 237-240 ◽  
Author(s):  
Woodrow Barfield ◽  
Claudia Hendrix ◽  
Karl-Erik Bystrom

This study investigated performance in a desktop virtual environment as a function of stereopsis and head tracking. Ten subjects traced a computer-generated wire using a virtual stylus that was slaved to the position of a real-world stylus tracked with a 6-DOF position sensor. The objective of the task was to keep the virtual stylus centered on the wire. The measures collected as the subjects performed the task were performance time and the number of times the stylus overstepped the virtual wire. The time to complete the wire-tracing task was significantly reduced by the addition of stereopsis, but was not affected by the presence of head tracking. The number of times the virtual stylus overstepped the wire was significantly reduced when head-tracking cues were available, but was not affected by the presence of stereoscopic cues. Implications of the results for performance using desktop virtual environments are discussed.

