Presence Effects in Virtual Reality Based on User Characteristics: Attention, Enjoyment, and Memory

Electronics ◽  
2021 ◽  
Vol 10 (9) ◽  
pp. 1051
Author(s):  
Si Jung Kim ◽  
Teemu H. Laine ◽  
Hae Jung Suk

Presence refers to the emotional state of users in which their motivation for thinking and acting arises from the perception of entities in a virtual world. The immersion level of users can vary when they interact with different media content, which may result in different levels of presence, especially in a virtual reality (VR) environment. This study investigates how user characteristics, such as gender, immersion level, and emotional valence toward VR, are related to three elements of presence effects (attention, enjoyment, and memory). A VR story was created and used as an immersive stimulus in an experiment, presented through a head-mounted display (HMD) equipped with an eye tracker that collected the participants’ eye gaze data during the experiment. A total of 53 university students (26 females, 27 males), aged 20 to 29 years (mean 23.8), participated in the experiment. A set of pre- and post-questionnaires served as a subjective measure to support the evidence of relationships among the presence effects and user characteristics. The results showed that user characteristics such as gender, immersion level, and emotional valence affected the level of presence; however, there was no evidence that attention is associated with enjoyment or memory.

Sensors ◽  
2021 ◽  
Vol 21 (14) ◽  
pp. 4663
Author(s):  
Janaina Cavalcanti ◽  
Victor Valls ◽  
Manuel Contero ◽  
David Fonseca

An effective warning attracts attention, elicits knowledge, and enables compliance behavior. Game mechanics, which are directly linked to human desires, stand out as training, evaluation, and improvement tools. Immersive virtual reality (VR) facilitates training without risk to participants, evaluates the impact of an incorrect action or decision, and creates a smart training environment. The present study analyzes the user experience in a gamified virtual environment of risks using the HTC Vive head-mounted display. The game was developed in the Unreal game engine and consisted of a walk-through maze composed of evident dangers and different signaling variables, while user action data were recorded. To demonstrate which aspects provide better interaction, experience, perception, and memory, three different warning configurations (dynamic, static, and smart) and two different levels of danger (low and high) were presented. To properly assess the impact of the experience, we conducted a survey about personality and knowledge before and after using the game. We then took a qualitative approach, using questions in a bipolar laddering assessment that was compared with the data recorded during the game. The findings indicate that when users are engaged in VR, they tend to test the consequences of their actions rather than maintain safety. The results also reveal that textual signal variables are not accessed when users face the stress factor of time. Progress is needed in implementing new technologies for warnings and advance notifications to improve the evaluation of human behavior in high-risk virtual environments.


2020 ◽  
Vol 10 (5) ◽  
pp. 1668 ◽  
Author(s):  
Pavan Kumar B. N. ◽  
Adithya Balasubramanyam ◽  
Ashok Kumar Patil ◽  
Chethana B. ◽  
Young Ho Chai

Over the years, gaze input has been an accessible and sought-after human–computer interaction (HCI) modality for various applications. Research on gaze-based interactive applications has advanced considerably, as HCIs are no longer constrained to traditional input devices. In this paper, we propose a novel immersive eye-gaze-guided camera (called GazeGuide) that can seamlessly control the movements of a camera mounted on an unmanned aerial vehicle (UAV) from the eye gaze of a remote user. The video stream captured by the camera is fed into a head-mounted display (HMD) with a binocular eye tracker. The user’s eye gaze is the sole input modality for maneuvering the camera. A user study was conducted with static and moving targets of interest in three-dimensional (3D) space to evaluate the proposed framework. GazeGuide was compared with a state-of-the-art input modality, a remote controller. The qualitative and quantitative results showed that the proposed GazeGuide performed significantly better than the remote controller.
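The abstract does not detail how GazeGuide translates gaze into camera motion; a minimal sketch of one plausible mapping from a normalized on-screen gaze point to gimbal yaw/pitch commands is shown below. The function name, coordinate convention, and angle limits are assumptions for illustration, not the authors' implementation.

```python
def gaze_to_gimbal(gx: float, gy: float,
                   max_yaw: float = 45.0, max_pitch: float = 30.0) -> tuple:
    """Map a normalized gaze point (gx, gy in [0, 1], origin at the
    top-left of the video frame) to gimbal yaw/pitch angles in degrees,
    with the neutral position at the frame center (0.5, 0.5)."""
    yaw = (gx - 0.5) * 2.0 * max_yaw      # look left/right
    pitch = (0.5 - gy) * 2.0 * max_pitch  # look up/down (screen y grows downward)
    return yaw, pitch
```

With this convention, gazing at the frame center keeps the camera still, and gazing at the right edge commands the full yaw range.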


2020 ◽  
Vol 2020 (9) ◽  
pp. 39-1-39-7
Author(s):  
Mingming Wang ◽  
Anjali Jogeshwar ◽  
Gabriel J. Diaz ◽  
Jeff B. Pelz ◽  
Susan Farnand

A virtual reality (VR) driving simulation platform has been built for use in addressing multiple research interests. This platform is a VR 3D engine (Unity©) that provides an immersive driving experience viewed in an HTC Vive© head-mounted display (HMD). To test this platform, we designed a virtual driving scenario based on a real tunnel used by Törnros to perform on-road tests [1]. Data from the platform, including driving speed and lateral lane position, were compared to the published on-road tests. The correspondence between the driving simulation and on-road tests is assessed to demonstrate the ability of our platform as a research tool. In addition, the drivers’ eye movement data, such as the 3D gaze point of regard (POR), will be collected during the test with a Tobii© eye tracker integrated in the HMD. The data set will be analyzed offline and examined for correlations with driving behaviors in a future study.
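One generic way to quantify the correspondence between simulated and on-road measurements such as driving speed or lateral lane position is a Pearson correlation coefficient. The sketch below is an illustration of that statistic, not the authors' analysis code.

```python
import math

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples,
    e.g. simulated vs. on-road lane positions at matched waypoints."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)
```

Values near +1 indicate that the simulator reproduces the on-road trend closely; values near 0 indicate no linear correspondence.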


Author(s):  
Eunhee Chang ◽  
Hyun Taek Kim ◽  
Byounghyun Yoo

Cybersickness refers to a group of uncomfortable symptoms experienced in virtual reality (VR). Among several theories of cybersickness, the subjective vertical mismatch (SVM) theory focuses on an individual’s internal model, which is created and updated through past experiences. Although previous studies have attempted to provide experimental evidence for the theory, most approaches are limited to subjective measures or body sway. In this study, we aimed to demonstrate the SVM theory on the basis of the participant’s eye movements and investigate whether the subjective level of cybersickness can be predicted using eye-related measures. Twenty-six participants experienced roller coaster VR while wearing a head-mounted display with eye tracking. We designed four experimental conditions by changing the orientation of the VR scene (upright vs. inverted) or the controllability of the participant’s body (unrestrained vs. restrained). The results indicated that participants reported more severe cybersickness when experiencing the upright VR content without controllability. Moreover, distinctive eye movements (e.g. fixation duration and distance between the eye gaze and the object position sequence) were observed according to the experimental conditions. On the basis of these results, we developed a regression model using eye-movement features and found that our model can explain 34.8% of the total variance of cybersickness, indicating a substantial improvement compared to the previous work (4.2%). This study provides empirical data for the SVM theory using both subjective and eye-related measures. In particular, the results suggest that participants’ eye movements can serve as a significant index for predicting cybersickness when considering natural gaze behaviors during a VR experience.
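The abstract reports a regression model whose explained variance is given as R². A minimal sketch of fitting such a model by ordinary least squares and computing R² is shown below; the feature layout and function name are assumptions, and this is not the authors' model.

```python
import numpy as np

def fit_r2(features: np.ndarray, sickness: np.ndarray) -> float:
    """Fit cybersickness scores on eye-movement features (one row per
    participant, one column per feature, e.g. fixation duration and
    gaze-object distance) by ordinary least squares; return R^2."""
    X = np.column_stack([np.ones(len(features)), features])  # add intercept
    coef, *_ = np.linalg.lstsq(X, sickness, rcond=None)
    pred = X @ coef
    ss_res = np.sum((sickness - pred) ** 2)
    ss_tot = np.sum((sickness - sickness.mean()) ** 2)
    return 1.0 - ss_res / ss_tot
```

An R² of 0.348 on held-out data would correspond to the 34.8% of variance explained that the abstract reports.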


Sensors ◽  
2020 ◽  
Vol 20 (12) ◽  
pp. 3565
Author(s):  
Kevin A. Hernandez-Ossa ◽  
Eduardo H. Montenegro-Couto ◽  
Berthil Longo ◽  
Alexandre Bissoli ◽  
Mariana M. Sime ◽  
...  

For some people with severe physical disabilities, the main assistive device to improve their independence and enhance overall well-being is an electric-powered wheelchair (EPW). However, users need to be offered EPW training. In this work, the Simcadrom is introduced: a virtual reality simulator for learning to drive an EPW, testing driving skills and performance, and testing input interfaces. This simulator uses a joystick as the main input interface and a virtual reality head-mounted display, but it can also be used with an eye-tracker device as an alternative input interface and a projector to display the virtual environment (VE). Sense-of-presence and user-experience questionnaires were used to evaluate this version of the Simcadrom, in addition to statistical tests on performance parameters such as total elapsed time, path-following error, and total number of commands. A test protocol was proposed and, considering the overall results, the system proved to simulate the usability, kinematics, and dynamics of a real EPW in a VE very realistically. Most subjects were able to improve their EPW driving performance in the training session. Furthermore, all skills learned are feasible to transfer to a real EPW.
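Among the performance parameters listed, path-following error is the least self-explanatory. One simple sketch of such a metric, assuming the driven trajectory and the reference path are both given as point sequences (the definition below is an illustration, not necessarily the Simcadrom's exact formula):

```python
import math

def path_following_error(driven, reference):
    """Mean distance (in the coordinates' units) from each recorded
    wheelchair position to the nearest point of the reference path,
    with both paths given as sequences of (x, y) points."""
    def nearest(p):
        return min(math.dist(p, q) for q in reference)
    return sum(nearest(p) for p in driven) / len(driven)
```

A lower value means the driver stayed closer to the intended route, so a drop across training sessions would indicate improved driving performance.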


2020 ◽  
Author(s):  
Alexandra Sipatchin ◽  
Siegfried Wahl ◽  
Katharina Rifai

Background: Adding an eye tracker inside a head-mounted display (HMD) can offer a variety of novel functions in virtual reality (VR). Promising results point towards its usability as a flexible and interactive tool for low-vision assessments and research into low-vision functional impairment. Visual field (VF) perimetry performed using VR methodologies has evidenced a correlation between the reliability of visual field testing in VR and the Humphrey test. The simulation of visual loss in VR is a powerful method for investigating the impact of, and adaptation to, visual diseases. The present study presents a preliminary assessment of the HTC Vive Pro Eye for its potential use in these applications. Methods: We investigated data quality over a wide visual field and tested the effect of head motion. An objective, direct end-to-end temporal precision test simulated two different scenarios: the appearance of a pupil inside the eye tracker and a shift in pupil position, serving as an artificial saccade generator. The technique is automatic and low-cost thanks to a Raspberry Pi system. Results: The target position on the screen and head movement limit the HTC Vive Pro Eye’s usability. All simulated scenarios showed a system latency of 58.1 milliseconds (ms). Conclusion: These results point towards limitations of, and improvements to, the HTC Vive Pro Eye’s status quo for visual-loss simulation scenarios and visual perimetry testing.
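End-to-end latency in such a setup is typically computed by pairing each artificial stimulus event (e.g. the Raspberry Pi switching the artificial pupil) with the first eye-tracker detection that follows it. The sketch below illustrates that pairing under the assumption that both event streams carry timestamps on a shared clock; the function name and pairing rule are illustrative, not the authors' measurement code.

```python
def mean_latency_ms(stimulus_ts, detection_ts):
    """Pair each stimulus-onset timestamp (seconds, shared clock) with
    the first detection timestamp at or after it, and return the mean
    stimulus-to-detection delay in milliseconds."""
    latencies = []
    detections = sorted(detection_ts)
    for t0 in sorted(stimulus_ts):
        for t1 in detections:
            if t1 >= t0:
                latencies.append((t1 - t0) * 1000.0)
                break
    return sum(latencies) / len(latencies)
```

Averaged over many repetitions of the two simulated scenarios, this kind of measure yields a single system-latency figure such as the 58.1 ms reported above.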


Author(s):  
Konstantin Ryabinin ◽  
Konstantin Belousov ◽  
Roman Chumakov

This paper is devoted to extending the previously created unified pipeline for conducting eye-tracking-based experiments in a virtual reality environment. In the previous work, we proposed using the SciVi semantic data mining platform, Unreal Engine, and the HTC Vive Pro Eye head-mounted display to study the reading process in immersive virtual reality. The currently proposed extension makes it possible to handle so-called polycode stimuli: compound visual objects that consist of individual parts carrying different semantics for the viewer. To segment polycode stimuli into areas of interest (areas where the informant’s eye gaze is being tracked), we adopt the Creative Maps Studio vector graphics editor. To integrate Creative Maps Studio into the existing pipeline, we created plugins for the SciVi platform to load and handle the segmented stimuli, place them in virtual reality scenes, collect the corresponding eye gaze tracking data, and perform visual analysis of the data collected. To analyze the eye gaze tracks, we utilize a circular graph that allows comprehensive visualization of hierarchical areas of interest (mapping them to color-coded graph nodes grouped into the hierarchy with the help of a multilevel circular scale) and the corresponding eye movements (mapped to the graph edges). We tested our pipeline on two different stimuli: an advertising poster and the painting “The Appearance of Christ Before the People” by A. Ivanov (1857).
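The graph edges described above encode eye movements between areas of interest (AOIs). A minimal sketch of deriving such edge weights from an AOI-labeled gaze sequence is shown below; it is a generic illustration of the idea, not the SciVi plugin code.

```python
from collections import Counter

def aoi_transitions(gaze_sequence):
    """Count transitions between consecutive distinct areas of interest
    in a sequence of AOI labels (one label per gaze sample); the counts
    can then weight the edges of a gaze-transition graph."""
    edges = Counter()
    prev = None
    for aoi in gaze_sequence:
        if prev is not None and aoi != prev:
            edges[(prev, aoi)] += 1
        prev = aoi
    return edges
```

Repeated samples inside one AOI produce no edge, so the result captures only movements between stimulus parts, which is what the circular graph visualizes.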


2021 ◽  
Vol 11 (7) ◽  
pp. 3090
Author(s):  
Sangwook Yoo ◽  
Cheongho Lee ◽  
Seongah Chin

To experience a real soap bubble show, materials and tools are required, as are skilled performers who produce the show. However, in a virtual space where spatial and temporal constraints do not exist, bubble art can be performed without real materials and tools while still giving a sense of immersion. For this reason, the realistic expression of soap bubbles is an interesting topic for virtual reality (VR). However, the current rendering of VR soap bubbles does not satisfy the high expectations of users. Therefore, in this study, we propose a physically based approach that reproduces the shape of a bubble by calculating the measured parameters required for bubble modeling and the physical motion of bubbles. In addition, we applied the change in the flow of the soap bubble’s surface, measured in practice, to the VR rendering. To improve users’ VR experience, we propose presenting the bubble show in a VR HMD (head-mounted display) environment.

