Development and Calibration of an Eye-Tracking Fixation Identification Algorithm for Immersive Virtual Reality

Sensors ◽  
2020 ◽  
Vol 20 (17) ◽  
pp. 4956
Author(s):  
Jose Llanes-Jurado ◽  
Javier Marín-Morales ◽  
Jaime Guixeres ◽  
Mariano Alcañiz

Fixation identification is an essential task in the extraction of relevant information from gaze patterns, and various algorithms are used in the identification process. However, the thresholds used in these algorithms greatly affect their sensitivity. Moreover, the application of these algorithms to eye-tracking technologies integrated into head-mounted displays, where the subject's head position is unrestricted, is still an open issue. Therefore, the adaptation of eye-tracking algorithms and their thresholds to immersive virtual reality frameworks needs to be validated. This study presents the development of a dispersion-threshold identification algorithm applied to data obtained from an eye-tracking system integrated into a head-mounted display. Rule-based criteria are proposed to calibrate the thresholds of the algorithm through different features, such as the number of fixations and the percentage of points that belong to a fixation. The results show that distance-dispersion thresholds between 1° and 1.6° and time windows between 0.25 and 0.4 s are the acceptable parameter ranges, with 1° and 0.25 s being the optimum. The work presents a calibrated algorithm to be applied in future experiments with eye tracking integrated into head-mounted displays, as well as guidelines for calibrating fixation identification algorithms.
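
As a concrete illustration of the dispersion-threshold approach, the sketch below implements a classic I-DT style filter over gaze angles, using the optimum thresholds reported above (1° dispersion, 0.25 s window). The sum-of-ranges dispersion metric and the NumPy-array inputs are assumptions made for illustration; the authors' exact distance-dispersion criterion for head-unrestricted HMD data may differ.

```python
# Minimal I-DT style fixation filter, assuming gaze samples are given as
# NumPy arrays of timestamps (s) and angular coordinates (degrees of visual
# angle). Illustrative sketch, not the authors' exact implementation.
import numpy as np

def idt_fixations(t, az, el, max_dispersion_deg=1.0, min_duration_s=0.25):
    """Return (start_index, end_index) pairs (inclusive) of detected fixations."""
    fixations = []
    i, n = 0, len(t)
    while i < n:
        # Grow the window until it covers at least the minimum duration.
        j = i
        while j < n and t[j] - t[i] < min_duration_s:
            j += 1
        if j >= n:
            break
        w = slice(i, j + 1)
        dispersion = (az[w].max() - az[w].min()) + (el[w].max() - el[w].min())
        if dispersion <= max_dispersion_deg:
            # Extend the window while dispersion stays under the threshold.
            while j + 1 < n:
                w = slice(i, j + 2)
                d = (az[w].max() - az[w].min()) + (el[w].max() - el[w].min())
                if d > max_dispersion_deg:
                    break
                j += 1
            fixations.append((i, j))
            i = j + 1
        else:
            i += 1
    return fixations
```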

Author(s):  
Bin Li ◽  
Yun Zhang ◽  
Xiujuan Zheng ◽  
Xiaoping Huang ◽  
Sheng Zhang ◽  
...  

Author(s):  
Osama Halabi ◽  
Samir Abou El-Seoud ◽  
Jihad Alja'am ◽  
Hena Alpona ◽  
Moza Al-Hemadi ◽  
...  

Individuals with autism spectrum disorder (ASD) regularly experience situations in which they need to give answers but do not know how to respond; for example, questions related to everyday life activities that are asked by strangers. Research aimed at using technology to remedy social and communication impairments in children with autism is actively underway. Immersive virtual reality (VR) is a relatively recent technology that has the potential to be an effective therapeutic tool for developing various skills in autistic children. This paper presents an interactive scenario-based VR system developed to improve the communication skills of autistic children. The system uses speech recognition to provide natural interaction, and role-play and turn-taking to evaluate and verify the effectiveness of the immersive environment on the social performance of autistic children. In the experiments conducted, participants performed better with a computer augmented virtual environment (CAVE) than with a head-mounted display (HMD) or a standard desktop. The results indicate that immersive VR could be more satisfying and motivating than a desktop for children with ASD.


2012 ◽  
Vol 11 (3) ◽  
pp. 9-17 ◽  
Author(s):  
Sébastien Kuntz ◽  
Ján Cíger

Many professionals and hobbyists would like to build their own immersive virtual reality systems at home, cheaply and in little space. We offer two examples of such "home-made" systems that use the cheapest hardware possible while maintaining a good level of immersion: the first is based on a projector (VRKit-Wall) and costs around $1,000, while the second is based on a head-mounted display (VRKit-HMD) and costs between €600 and €1,000. We also propose a standardization of these systems in order to enable simple application sharing. Finally, we describe a method to calibrate the stereoscopy of an NVIDIA 3D Vision system.


2021 ◽  
Author(s):  
Panagiotis Kourtesis ◽  
Simona Collina ◽  
Leonidas A. A. Doumas ◽  
Sarah E. MacPherson

There are major concerns about the suitability of immersive virtual reality (VR) systems (i.e., head-mounted displays; HMDs) for research and clinical settings because of the presence of nausea, dizziness, disorientation, fatigue, and instability (i.e., VR-induced symptoms and effects; VRISE). Research suggests that the duration of a VR session modulates the presence and intensity of VRISE, but there are no recommendations regarding the appropriate maximum duration of VR sessions. The use of high-end VR HMDs in conjunction with ergonomic VR software appears to substantially mitigate the presence of VRISE. However, no brief tool currently exists to quantitatively appraise and report both the quality of software features and VRISE intensity. The Virtual Reality Neuroscience Questionnaire (VRNQ) was developed to assess the quality of VR software in terms of user experience, game mechanics, in-game assistance, and VRISE. Forty participants aged between 28 and 43 years (18 gamers and 22 non-gamers) were recruited for the study. They took part in three different VR sessions until they felt fatigue or discomfort, and subsequently completed the VRNQ. Our results demonstrate that the VRNQ is a valid tool for assessing VR software, with good convergent, discriminant, and construct validity. The maximum duration of VR sessions should be between 55 and 70 min when the VR software meets or exceeds the parsimonious cut-offs of the VRNQ and the users are familiarized with the VR system. Gaming experience does not seem to affect how long VR sessions should last, and while the quality of the VR software substantially modulates the maximum duration of VR sessions, age and education do not. Finally, deeper immersion, better quality of graphics and sound, and more helpful in-game instructions and prompts were found to reduce VRISE intensity. The VRNQ facilitates the brief assessment and reporting of the quality of VR software features and/or the intensity of VRISE, while its minimum and parsimonious cut-offs may appraise the suitability of VR software for implementation in research and clinical settings. The findings of this study contribute to the establishment of rigorous VR methods that are crucial for the viability of immersive VR as a research and clinical tool in cognitive neuroscience and neuropsychology.
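
To illustrate how such per-sub-scale cut-offs might be applied in practice, the minimal sketch below checks a set of sub-scale scores against parsimonious thresholds. The sub-scale names follow the abstract, but the numeric cut-off values are placeholders for illustration only, not the published VRNQ values.

```python
# Illustrative check of VR software suitability against VRNQ-style cut-offs.
# The numeric thresholds below are hypothetical placeholders.
HYPOTHETICAL_PARSIMONIOUS_CUTOFFS = {
    "user_experience": 30,
    "game_mechanics": 30,
    "in_game_assistance": 30,
    "vrise": 30,  # higher score = fewer/weaker symptoms in this sketch
}

def meets_cutoffs(subscale_scores, cutoffs=HYPOTHETICAL_PARSIMONIOUS_CUTOFFS):
    """Return True if every sub-scale score meets or exceeds its cut-off."""
    return all(subscale_scores[name] >= threshold
               for name, threshold in cutoffs.items())

# Example: a session whose scores meet all (hypothetical) parsimonious cut-offs.
print(meets_cutoffs({"user_experience": 32, "game_mechanics": 31,
                     "in_game_assistance": 30, "vrise": 33}))  # True
```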


Author(s):  
Thiago D'Angelo ◽  
Saul Emanuel Delabrida Silva ◽  
Ricardo A. R. Oliveira ◽  
Antonio A. F. Loureiro

Virtual reality (VR) and augmented reality (AR) head-mounted displays (HMDs) have been emerging in recent years, and these technologies are set to remain a major topic in the years ahead. Head-mounted displays have been developed for many different purposes, and users can enjoy them for entertainment, work tasks, and many other daily activities. Despite the recent release of many AR and VR HMDs, two major problems are hindering AR HMDs from reaching the mainstream market: extremely high costs and user experience issues. To minimize these problems, we developed an AR HMD prototype based on a smartphone and other low-cost materials. The prototype is capable of running eye-tracking algorithms, which can be used to improve user interaction and user experience. To assess our AR HMD prototype, we chose a state-of-the-art method for eye center location from the literature and evaluated its real-time performance on different development boards.
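
The sketch below shows one way such a real-time evaluation could be framed: timing a per-frame eye-centre estimate and reporting the mean latency. The detector here is a simple threshold-plus-moments stand-in built on OpenCV, assumed for illustration; it is not the state-of-the-art eye-centre method the authors evaluated.

```python
# Sketch of benchmarking a per-frame eye-centre estimate (mean ms per frame).
import time
import cv2
import numpy as np

def eye_center(gray):
    """Rough pupil-centre estimate: dark-region threshold + image moments."""
    _, mask = cv2.threshold(gray, 40, 255, cv2.THRESH_BINARY_INV)
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return None
    return (m["m10"] / m["m00"], m["m01"] / m["m00"])

def benchmark(frames):
    """Return mean processing time per frame in milliseconds."""
    start = time.perf_counter()
    for frame in frames:
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        eye_center(gray)
    elapsed = time.perf_counter() - start
    return 1000.0 * elapsed / len(frames)

# Example with synthetic frames; on a development board, frames would come
# from the HMD's eye camera instead.
frames = [np.full((240, 320, 3), 128, dtype=np.uint8) for _ in range(100)]
print(f"{benchmark(frames):.2f} ms/frame")
```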


Author(s):  
Nathan D. Darnall ◽  
Vinay Mishra ◽  
Sankar Jayaram ◽  
Uma Jayaram

Virtual reality (VR) technologies and systems have the potential to play a key role in assisting disabled inhabitants of smart home environments with instrumental activities of daily living (IADLs). While immersive environments have useful applications in the fields of gaming, simulation, and manufacturing, their capabilities have been largely untapped in smart home environments. We have developed an integrated CAD and virtual reality system that assists a smart home resident in locating and navigating to objects in the home. Using the methods presented in this paper, a room modeled in a CAD system is imported into a virtual environment, which is linked to an audio query-response interface. The user's head and the room objects are fitted with sensors that are part of a six-DOF motion tracking system. Methods have been created to allow the inhabitant to move objects around the room and later issue an audio query for the location of an object. The system generates an audio response giving the object's position relative to the person's current position and orientation. As the user approaches the object, information derived from the virtual models of both the room and the objects within it provides better guidance. The ability of the VR-SMART system to guide a resident to an object was tested by mounting a head-mounted display (HMD) on a user located in a room. This allowed the user to navigate through a virtual world that simulated the room they occupied, thereby providing a way to test the positional accuracy of the virtual system. Results of testing in the immersive environment showed that although the overall system shows promise, with a 30% success rate, its success depends on the accuracy and calibration of the tracking system. To improve the success of the system, we explored the precision of a second motion capture system, with more accurate results. The results confirmed that the VR-SMART system could significantly improve assistance to disabled people in finding objects in the room when implemented purely as an assistive system, without the head-mounted display.
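
As an illustration of the core computation behind such an audio response, the sketch below turns a tracked object position and the user's head pose into a relative distance-and-direction phrase. A Y-up coordinate frame and yaw measured about the vertical axis are assumptions; the function name and phrasing are illustrative, not the VR-SMART interface.

```python
# Minimal sketch: object position relative to the user's position and heading.
import math

def describe_relative_position(head_pos, head_yaw_deg, obj_pos):
    """head_pos/obj_pos are (x, y, z) in metres; yaw is heading about the Y axis."""
    dx = obj_pos[0] - head_pos[0]
    dz = obj_pos[2] - head_pos[2]
    distance = math.hypot(dx, dz)
    # Bearing of the object relative to where the user is facing (yaw 0 = +Z).
    bearing = math.degrees(math.atan2(dx, dz)) - head_yaw_deg
    bearing = (bearing + 180.0) % 360.0 - 180.0   # wrap to [-180, 180)
    side = "ahead" if abs(bearing) < 30 else ("to your right" if bearing > 0
                                              else "to your left")
    return f"The object is {distance:.1f} metres away, {side}."

print(describe_relative_position((0.0, 1.7, 0.0), 0.0, (2.0, 0.8, 1.0)))
# -> "The object is 2.2 metres away, to your right."
```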


i-Perception ◽  
2017 ◽  
Vol 8 (3) ◽  
pp. 204166951770820 ◽  
Author(s):  
Diederick C. Niehorster ◽  
Li Li ◽  
Markus Lappe

The advent of inexpensive consumer virtual reality equipment enables many more researchers to study perception with naturally moving observers. One such system, the HTC Vive, offers a large field-of-view, high-resolution head-mounted display together with a room-scale tracking system for less than a thousand U.S. dollars. If the position and orientation tracking of this system is of sufficient accuracy and precision, it could be suitable for much research that is currently done with far more expensive systems. Here we present a quantitative test of the HTC Vive's position and orientation tracking as well as its end-to-end system latency. We report that while the precision of the Vive's tracking measurements is high and its system latency (22 ms) is low, its position and orientation measurements are provided in a coordinate system that is tilted with respect to the physical ground plane. Because large changes in offset were found whenever tracking was briefly lost, it cannot be corrected for with a one-time calibration procedure. We conclude that the varying offset between the virtual and the physical tracking space makes the HTC Vive at present unsuitable for scientific experiments that require accurate visual stimulation of self-motion through a virtual world. It may however be suited for other experiments that do not have this requirement.
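
To make the reported tilt concrete, the sketch below shows one way such a misalignment could be quantified: fit a plane to positions sampled from a tracked device resting on the physical floor and measure the angle between the plane normal and the tracker's vertical axis. This is an illustration of the kind of measurement, under a Y-up assumption, not the authors' exact procedure.

```python
# Sketch: estimate the tilt of a tracking coordinate system from floor samples.
import numpy as np

def ground_plane_tilt_deg(floor_points, up=np.array([0.0, 1.0, 0.0])):
    """floor_points: (N, 3) array of tracked positions on the physical floor."""
    centered = floor_points - floor_points.mean(axis=0)
    # The plane normal is the right singular vector with the smallest singular value.
    _, _, vt = np.linalg.svd(centered)
    normal = vt[-1]
    cos_angle = abs(normal @ up)
    return float(np.degrees(np.arccos(np.clip(cos_angle, -1.0, 1.0))))

# Synthetic example: a floor sampled in a frame tilted by ~1 degree about X.
rng = np.random.default_rng(0)
xz = rng.uniform(-2, 2, size=(500, 2))
tilt = np.radians(1.0)
pts = np.column_stack([xz[:, 0],
                       xz[:, 1] * np.sin(tilt),   # height induced by the tilt
                       xz[:, 1] * np.cos(tilt)])
print(f"estimated tilt: {ground_plane_tilt_deg(pts):.2f} degrees")  # ~1.00
```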


2021 ◽  
Vol 33 (6) ◽  
pp. 799-806
Author(s):  
Dario Ogrizović ◽  
Ana Perić Hadžić ◽  
Mladen Jardas

With the increasing development and popularisation of information and communication technology, higher education faces new challenges in modernising teaching so that the education and training of students is as effective as possible. It is therefore very important to develop and experiment with appropriate development tools, explore their benefits and effectiveness, and integrate them into existing learning strategies. The emergence of a computer-generated digital environment that can be experienced directly and acted upon, together with the growing technical capabilities and falling prices of virtual reality hardware, creates a situation that cannot be ignored. This paper investigated users' perceptions of the potential use of fully immersive virtual reality head-mounted displays in a discrete-event simulation of logistics processes. The dynamic nature of virtual environments requires active participation, which fosters greater engagement, motivation, and interest, aided by interaction and challenges.


Author(s):  
Hugo C. Gomez-Tone ◽  
Jorge Martin-Gutierrez ◽  
John Bustamante-Escapa ◽  
Paola Bustamante-Escapa ◽  
Betty K. Valencia-Anci

To design architectural spaces that not only respond to users' basic needs but also seek their emotional well-being, architecture students must have a special sensitivity and be aware of the different sensations that their designs should and can evoke. To achieve this competence without exploring real spaces, immersive virtual reality technology offers an important contribution to the field of architecture. The purpose of this research is to determine whether the sensations students perceive in virtual architectural spaces are similar to those perceived in real ones, and to identify the characteristics of this technology that allow a better perception of sensations. Six architectural modules were designed to be walked through and experienced at real scale using a head-mounted display by 22 first- and fifth-year students of an architecture programme in Peru. An ad hoc questionnaire captured the perceived sensations and the benefits of the tool. The results showed that the fifth-year students' perception of sensations was somewhat closer to that expressed by a group of seven experts than the first-year students' perception. The students considered that accessibility, the real scale of the space, and the possibility of walking through and looking at the space in all directions contributed most to the realism of the experience, and therefore to a better perception of the space, while natural light and shadows, construction materials, and the external environment were valued less in terms of realism. It is concluded that sensory experimentation in architectural spaces modelled realistically in virtual environments allows the perception of sensations very similar to those that the architect initially seeks to convey.

