An eye movement tracking type head mounted display for virtual reality system: evaluation experiments of a prototype system

Author(s): K. Iwamoto ◽ S. Katsumata ◽ K. Tanie

2021 ◽ Vol 1802 (4) ◽ pp. 042066
Author(s): Zhaowei Li ◽ Peiyuan Guo ◽ Chen Song

2003 ◽ Vol 56 (6) ◽ pp. 1053-1077
Author(s): Linden J. Ball ◽ Erica J. Lucas ◽ Jeremy N. V. Miles ◽ Alastair G. Gale

Three experiments are reported that used eye-movement tracking to investigate the inspection-time effect predicted by Evans’ (1996) heuristic-analytic account of the Wason selection task. Evans’ account proposes that card selections are based on the operation of relevance-determining heuristics, whilst analytic processing only rationalizes selections. As such, longer inspection times should be associated with selected cards (which are subjected to rationalization) than with rejected cards. Evidence for this effect has been provided by Evans (1996) using computer-presented selection tasks and instructions for participants to indicate (with a mouse pointer) cards under consideration. Roberts (1998b) has argued that mouse pointing gives rise to artefactual support for Evans’ predictions because of biases associated with the task format and the use of mouse pointing. We eradicated all sources of artefact by combining careful task constructions with eye-movement tracking to measure on-line attentional processing directly. All three experiments produced good evidence for the robustness of the inspection-time effect, supporting the predictions of the heuristic-analytic account.
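To make the inspection-time comparison concrete, the sketch below aggregates fixation durations per card and contrasts mean inspection times for selected versus rejected cards. This is only an illustration of the kind of analysis the effect implies; the fixation records, card labels, and selection set are invented, and the paper does not specify any such code.

```python
# Illustrative only: aggregate hypothetical fixation durations per card and
# compare mean inspection times for selected vs. rejected cards.
from collections import defaultdict

# Each fixation: (card_label, duration_ms); `selected` holds the chosen cards.
fixations = [("p", 420), ("not-q", 180), ("p", 350), ("q", 210), ("not-p", 150)]
selected = {"p", "q"}

inspection_time = defaultdict(int)
for card, duration in fixations:
    inspection_time[card] += duration

selected_times = [t for c, t in inspection_time.items() if c in selected]
rejected_times = [t for c, t in inspection_time.items() if c not in selected]

mean = lambda xs: sum(xs) / len(xs) if xs else 0.0
print(f"mean inspection time, selected cards: {mean(selected_times):.0f} ms")
print(f"mean inspection time, rejected cards: {mean(rejected_times):.0f} ms")
```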


Author(s): Nathan D. Darnall ◽ Vinay Mishra ◽ Sankar Jayaram ◽ Uma Jayaram

Virtual reality (VR) technologies and systems have the potential to play a key role in assisting disabled inhabitants of smart home environments with instrumental activities of daily living (IADLs). While immersive environments have useful applications in the fields of gaming, simulation, and manufacturing, their capabilities have been largely untapped in smart home environments. We have developed an integrated CAD and virtual reality system which assists a smart home resident in locating and navigating to objects in the home. Using the methods presented in this paper, a room modeled in a CAD system is imported into a virtual environment, which is linked to an audio query-response interface. The user’s head and the room objects are fitted with sensors that are part of a six-DOF motion-tracking system. Methods have been created to allow the inhabitant to move objects around the room and later issue an audio query for an object’s location. The system generates an audio response giving the object’s position relative to the person’s current position and orientation. As the user approaches the object, information derived from the virtual models of both the room and the objects within it provides better guidance. The ability of the VR-SMART system to guide a resident to an object was tested by mounting a head-mounted display (HMD) on a user located in a room. This allowed the user to navigate a virtual world that simulated the room he occupied, providing a way to test the positional accuracy of the virtual system. Results of testing in the immersive environment showed that, although the overall system shows promise at a 30% success rate, its success depends on the accuracy and calibration of the tracking system. To improve the success rate, we evaluated a second motion-capture system, which yielded more accurate results. The results confirmed that the VR-SMART system, when implemented purely as an assistive system without the head-mounted display, could significantly improve the assistance of disabled people in finding objects in the room.
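The abstract describes an audio response that reports an object's position relative to the user's current position and orientation. As a minimal sketch of that computation, assuming 2D room coordinates, a tracked user heading, and a hypothetical helper named relative_guidance (none of which come from the paper), one could derive the distance and signed bearing to the object like this:

```python
# Illustrative sketch (not the VR-SMART implementation): given a user's tracked
# pose and an object's position in room coordinates, compute the distance and
# the bearing of the object relative to the user's facing direction, which is
# the kind of information an audio response could convey.
import math

def relative_guidance(user_xy, user_heading_deg, object_xy):
    """Return (distance, signed bearing in degrees; positive = to the user's left)."""
    dx = object_xy[0] - user_xy[0]
    dy = object_xy[1] - user_xy[1]
    distance = math.hypot(dx, dy)
    # Angle to the object in room coordinates, then relative to the heading,
    # normalized into [-180, 180).
    angle_to_object = math.degrees(math.atan2(dy, dx))
    bearing = (angle_to_object - user_heading_deg + 180) % 360 - 180
    return distance, bearing

# Hypothetical example: user at (1.0, 2.0) facing 90 degrees, object at (3.0, 5.0).
dist, bearing = relative_guidance((1.0, 2.0), 90.0, (3.0, 5.0))
print(f"Object is {dist:.1f} m away, {abs(bearing):.0f} deg to the "
      f"{'left' if bearing > 0 else 'right'}.")
```

In an actual system the heading and positions would come from the six-DOF tracker, and the distance/bearing pair would be rendered as a spoken response rather than printed.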


2018 ◽ Vol 18 (6) ◽ pp. 2592-2598
Author(s): Zheng-Nan Zhao ◽ Ju Lin ◽ Jie Zhang ◽ Yang Yu ◽ Bo Yuan ◽ ...
