Automated calibration of display characteristics (ACDC) for head-mounted displays and arbitrary surfaces

Author(s): J. Adam Jones ◽ Lauren Cairco Dukes ◽ Mark Bolas
2011 ◽ Vol 199 (2) ◽ pp. 328-335
Author(s): Stuart J. Gilson ◽ Andrew W. Fitzgibbon ◽ Andrew Glennerster
2006
Author(s): Frank L. Kooi ◽ Piet Bijl ◽ Pieter Padmos
2021
Author(s): Polona Caserman ◽ Augusto Garcia-Agundez ◽ Alvar Gámez Zerban ◽ Stefan Göbel

Abstract
Cybersickness (CS) refers to symptoms, such as nausea, headache, and dizziness, that users experience during or after virtual reality immersion. Initially discovered in flight simulators, CS also seems to occur with current-generation commercial virtual reality (VR) head-mounted displays (HMDs), albeit with a different character and severity. The goal of this work is to summarize recent literature on CS with modern HMDs, to determine the specific profile of CS caused by immersive VR, and to provide an outlook for future research areas. A systematic review was performed on the IEEE Xplore, PubMed, ACM, and Scopus databases covering 2013 to 2019, and 49 publications were selected. The review summarizes how different VR HMDs impact CS, how the nature of movement in VR HMDs contributes to CS, and how biosensors can be used to detect CS. The results of the meta-analysis show that although current-generation VR HMDs cause significantly less CS (p < 0.001), some symptoms remain just as intense. Further results show that the nature of movement, and in particular sensory mismatch as well as perceived motion, has been the leading cause of CS. We suggest an outlook on future research, including the use of galvanic skin response to evaluate CS in combination with the gold standard (the Simulator Sickness Questionnaire, SSQ), as well as an update to the subjective evaluation scores of the SSQ.
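The SSQ named above as the gold standard scores 16 symptoms (each rated 0–3) on three weighted subscales, Nausea, Oculomotor, and Disorientation (Kennedy et al., 1993). As a minimal sketch of the weighting step only — the symptom-to-subscale mapping is omitted, and the function name is ours, not from the review — the raw subscale sums convert to reported scores like this:

```python
def ssq_scores(nausea_raw, oculomotor_raw, disorientation_raw):
    """Convert raw SSQ subscale sums into weighted scores.

    Uses the standard subscale weights from Kennedy et al. (1993):
    Nausea x 9.54, Oculomotor x 7.58, Disorientation x 13.92,
    Total = (sum of the three raw subscale sums) x 3.74.
    """
    return {
        "nausea": nausea_raw * 9.54,
        "oculomotor": oculomotor_raw * 7.58,
        "disorientation": disorientation_raw * 13.92,
        "total": (nausea_raw + oculomotor_raw + disorientation_raw) * 3.74,
    }
```

Because the three subscales share symptoms, the total is computed from the raw sums and then weighted, not by adding the three weighted subscale scores.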


Sensors ◽ 2021 ◽ Vol 21 (6) ◽ pp. 2234
Author(s): Sebastian Kapp ◽ Michael Barz ◽ Sergey Mukhametov ◽ Daniel Sonntag ◽ Jochen Kuhn

An increasing number of head-mounted displays (HMDs) for virtual and augmented reality (VR/AR) are equipped with integrated eye trackers. Use cases for these integrated eye trackers include rendering optimization and gaze-based user interaction. In addition, visual attention in VR and AR is of interest for applied eye-tracking research in the cognitive or educational sciences, for example. While some eye-tracking research toolkits for VR already exist, only a few target AR scenarios. In this work, we present an open-source eye tracking toolkit for reliable gaze data acquisition in AR, based on Unity 3D and the Microsoft HoloLens 2, as well as an R package for seamless data analysis. Furthermore, we evaluate the spatial accuracy and precision of the integrated eye tracker for fixation targets at different distances and angles to the user (n = 21). On average, we found that gaze estimates are reported with an angular accuracy of 0.83 degrees and a precision of 0.27 degrees while the user is at rest, which is on par with state-of-the-art mobile eye trackers.
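The abstract reports angular accuracy and precision but does not spell out the computation. A common convention in eye-tracking evaluation — assumed here, with hypothetical function names, not necessarily the paper's exact method — takes accuracy as the mean angular offset between gaze directions and the target direction during a fixation, and precision as the RMS of angular distances between successive gaze samples:

```python
import numpy as np

def angular_error_deg(gaze_dirs, target_dir):
    """Angle in degrees between each gaze direction and the target direction."""
    gaze = gaze_dirs / np.linalg.norm(gaze_dirs, axis=1, keepdims=True)
    target = target_dir / np.linalg.norm(target_dir)
    cos = np.clip(gaze @ target, -1.0, 1.0)
    return np.degrees(np.arccos(cos))

def accuracy_deg(gaze_dirs, target_dir):
    """Accuracy: mean angular offset from the fixation target."""
    return angular_error_deg(gaze_dirs, target_dir).mean()

def precision_rms_deg(gaze_dirs):
    """Precision: RMS of angular distances between successive samples."""
    g = gaze_dirs / np.linalg.norm(gaze_dirs, axis=1, keepdims=True)
    cos = np.clip(np.sum(g[:-1] * g[1:], axis=1), -1.0, 1.0)
    theta = np.degrees(np.arccos(cos))
    return np.sqrt(np.mean(theta ** 2))
```

With this convention, accuracy captures systematic offset (calibration error) while precision captures sample-to-sample dispersion (tracker noise), which is why the two values reported in the abstract can differ.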

