Reducing Cybersickness in 360-Degree Virtual Reality

2021 ◽  
pp. 1-17
Author(s):  
Iqra Arshad ◽  
Paulo De Mello ◽  
Martin Ender ◽  
Jason D. McEwen ◽  
Elisa R. Ferré

Abstract Despite the technological advancements in Virtual Reality (VR), users are constantly combating feelings of nausea and disorientation, the so-called cybersickness. Cybersickness symptoms cause severe discomfort and hinder the immersive VR experience. Here we investigated cybersickness in 360-degree head-mounted display VR. In traditional 360-degree VR experiences, translational movement in the real world is not reflected in the virtual world, and therefore self-motion information is not corroborated by matching visual and vestibular cues, which may trigger symptoms of cybersickness. We evaluated whether a new Artificial Intelligence (AI) software designed to supplement the 360-degree VR experience with artificial six-degrees-of-freedom motion may reduce cybersickness. Explicit (simulator sickness questionnaire and Fast Motion Sickness (FMS) rating) and implicit (heart rate) measurements were used to evaluate cybersickness symptoms during and after 360-degree VR exposure. Simulator sickness scores showed a significant reduction in feelings of nausea during the AI-supplemented six-degrees-of-freedom motion VR compared to traditional 360-degree VR. However, six-degrees-of-freedom motion VR did not reduce oculomotor or disorientation measures of sickness. No changes were observed in FMS and heart rate measures. Improving the congruency between visual and vestibular cues in 360-degree VR, as provided by the AI-supplemented six-degrees-of-freedom motion system considered, is essential for a more engaging, immersive and safe VR experience, which is critical for educational, cultural and entertainment applications.
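The simulator sickness questionnaire scores reported above are conventionally computed from symptom ratings grouped into nausea, oculomotor and disorientation subscales, each multiplied by a fixed weight. A minimal scoring sketch, assuming the widely cited Kennedy et al. subscale weights (not taken from this abstract):

```python
# Sketch of SSQ subscale scoring. The weights are the commonly used
# Kennedy et al. values (an assumption, not stated in this study).
N_WEIGHT, O_WEIGHT, D_WEIGHT, TOTAL_WEIGHT = 9.54, 7.58, 13.92, 3.74

def ssq_scores(nausea_raw: int, oculomotor_raw: int, disorientation_raw: int) -> dict:
    """Each *_raw argument is the sum of the 0-3 symptom ratings for that subscale."""
    return {
        "nausea": nausea_raw * N_WEIGHT,
        "oculomotor": oculomotor_raw * O_WEIGHT,
        "disorientation": disorientation_raw * D_WEIGHT,
        "total": (nausea_raw + oculomotor_raw + disorientation_raw) * TOTAL_WEIGHT,
    }
```

A nausea-specific reduction, as reported here, would appear as a lower `nausea` subscale score while `oculomotor` and `disorientation` stay unchanged.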

2020 ◽  
Vol 33 (6) ◽  
pp. 625-644 ◽  
Author(s):  
Maria Gallagher ◽  
Reno Choi ◽  
Elisa Raffaella Ferrè

Abstract During exposure to Virtual Reality (VR) a sensory conflict may be present, whereby the visual system signals that the user is moving in a certain direction with a certain acceleration, while the vestibular system signals that the user is stationary. In order to reduce this conflict, the brain may down-weight vestibular signals, which may in turn affect vestibular contributions to self-motion perception. Here we investigated whether vestibular perceptual sensitivity is affected by VR exposure. Participants’ ability to detect artificial vestibular inputs was measured during optic flow or random motion stimuli on a VR head-mounted display. Sensitivity to vestibular signals was significantly reduced when optic flow stimuli were presented, but importantly this was only the case when both visual and vestibular cues conveyed information on the same plane of self-motion. Our results suggest that the brain dynamically adjusts the weight given to incoming sensory cues for self-motion in VR; however this is dependent on the congruency of visual and vestibular cues.


Electronics ◽  
2020 ◽  
Vol 9 (9) ◽  
pp. 1448 ◽  
Author(s):  
Youngwon Ryan Kim ◽  
Hyeonah Choi ◽  
Minwook Chang ◽  
Gerard J. Kim

Recently, a new breed of mobile virtual reality (dubbed "EasyVR" in this work) has appeared, in which non-isolating magnifying lenses are conveniently clipped onto a smartphone, still offering a level of immersion comparable to using an isolated headset. Furthermore, such a form factor allows the fingers to touch the screen and select objects quite accurately, despite the finger(s) appearing unfocused through the lenses. Many navigation techniques exist both for casual smartphone 3D applications using the touchscreen and for immersive VR environments using various controllers/sensors. However, no research has focused on a proper navigation interaction technique for a platform like EasyVR, which necessitates using the touchscreen while holding the display device to the head and looking through the magnifying lenses. To design and propose the most fitting navigation method(s) for EasyVR, we mixed and matched conventional touchscreen-based and headset-oriented navigation methods to arrive at six viable navigation techniques, more specifically for selecting the travel direction and invoking the movement itself, including the use of head rotation, on-screen keypads/buttons, one-touch teleport, drag-to-target, and finger gestures. These methods were experimentally compared for their basic usability and level of immersion when navigating 3D space with six degrees of freedom. The results provide a valuable guideline for designing/choosing the proper navigation method under the different navigational needs of a given VR application.
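The head-rotation variant described above can be reduced to a simple rule: while the screen is touched, advance the viewpoint along the current gaze direction. A minimal sketch, where the axis conventions and function names are illustrative assumptions rather than the paper's implementation:

```python
import math

def gaze_forward(yaw: float, pitch: float) -> tuple:
    """Unit gaze vector; yaw=0, pitch=0 looks along +z, positive pitch looks up (+y)."""
    return (math.cos(pitch) * math.sin(yaw),
            math.sin(pitch),
            math.cos(pitch) * math.cos(yaw))

def step(pos, yaw, pitch, speed, dt, touching):
    """Advance the viewpoint along the gaze direction only while the screen is touched."""
    if not touching:
        return pos
    fx, fy, fz = gaze_forward(yaw, pitch)
    return (pos[0] + fx * speed * dt,
            pos[1] + fy * speed * dt,
            pos[2] + fz * speed * dt)
```

The one-touch teleport and drag-to-target variants would instead set `pos` directly to a touched target point rather than integrating motion over time.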


Leonardo ◽  
2019 ◽  
Vol 52 (4) ◽  
pp. 349-356 ◽  
Author(s):  
Kris Layng ◽  
Ken Perlin ◽  
Sebastian Herscher ◽  
Corinne Brenner ◽  
Thomas Meduri

CAVE is a shared narrative six degrees of freedom (6DoF) virtual reality experience. In 3.5 days, 1,927 people attended its premiere at SIGGRAPH 2018. Thirty participants at a time each saw and heard the same narrative from their own individual location in the room, as they would when attending live theater. CAVE set out to disruptively change how audiences collectively experience immersive art and entertainment. Inspired by the social gatherings of theater and cinema, CAVE resonated with viewers in powerful and meaningful ways. Its specific pairing of colocated audiences and physically shared immersive narrative suggests a possible future path for shared cinematic experiences.


Author(s):  
Monica Bordegoni ◽  
Mario Covarrubias ◽  
Giandomenico Caruso ◽  
Umberto Cugini

This paper presents a novel system that allows product designers to design, experience, and modify new shapes of objects, starting from existing ones. The system allows designers to acquire and reconstruct the 3D model of a real object and to visualize and physically interact with this model. In addition, the system allows designers to modify the shape through physical manipulation of the 3D model and to eventually print it using 3D printing technology. The system is developed by integrating state-of-the-art technologies from the sectors of reverse engineering, virtual reality, and haptics. The 3D model of an object is reconstructed by scanning its shape with a 3D scanning device. The 3D model is then imported into the virtual reality environment, which renders it through an immersive head-mounted display (HMD). The user can physically interact with the 3D model by using the desktop haptic strip for shape design (DHSSD), a six-degrees-of-freedom servo-actuated developable metallic strip that reproduces cross-sectional curves of 3D virtual objects. The DHSSD device is controlled by means of hand gestures recognized by a Leap Motion sensor.


2021 ◽  
Vol 10 (5) ◽  
pp. 3546-3551
Author(s):  
Tamanna Nurai

Cybersickness remains a negative consequence that degrades the experience for users of virtual worlds created for Virtual Reality (VR). Various abnormalities can cause quantifiable changes in body awareness when donning a Head-Mounted Display (HMD) in a Virtual Environment (VE). VR headsets provide a VE that matches the actual world and allows users a range of experiences. Motion sickness and simulator sickness instruments provide self-report assessments of cybersickness within VEs. In this study, a simulator sickness questionnaire is used to measure the after-effects of the virtual environment. This research aims to answer whether immersive VR induces cybersickness and impacts equilibrium coordination. The present research is designed as a cross-sectional observational analysis. According to the selection criteria, a total of 40 subjects would be recruited from AVBRH, Sawangi Meghe for the research. With the intervention in use, the experiment lasted 6 months. The simulator sickness questionnaire is administered in a single session to measure motion sickness, while the equilibrium tests are evaluated twice, at exit and again after 10 minutes. Virtual reality as used in video games is still in its development. Integrating gameplay action into the VR experience will necessitate a significant amount of research and development. The study evaluated whether immersive VR induces cybersickness and impacts equilibrium coordination. Numerous scales have been developed to measure cybersickness, and the nature of cybersickness has been revealed owing to work on motion sickness in simulated systems.


PLoS ONE ◽  
2021 ◽  
Vol 16 (7) ◽  
pp. e0254098
Author(s):  
Javier Marín-Morales ◽  
Juan Luis Higuera-Trujillo ◽  
Jaime Guixeres ◽  
Carmen Llinares ◽  
Mariano Alcañiz ◽  
...  

Many affective computing studies have developed automatic emotion recognition models, mostly using emotional images, audio and videos. In recent years, virtual reality (VR) has also been used as a method to elicit emotions in laboratory environments. However, there is still a need to analyse the validity of VR in order to extrapolate the results it produces and to assess the similarities and differences in physiological responses provoked by real and virtual environments. We investigated the cardiovascular oscillations of 60 participants during a free exploration of a real museum and its virtualisation viewed through a head-mounted display. The differences between the heart rate variability features in the high and low arousal stimuli conditions were analysed through statistical hypothesis testing, and automatic arousal recognition models were developed across the real and the virtual conditions using a support vector machine algorithm with recursive feature selection. The subjects' self-assessments suggested that both museums elicited low and high arousal levels. In addition, the real museum showed differences in cardiovascular responses, in particular in vagal activity, and arousal recognition reached 72.92% accuracy. However, we did not find the same arousal-based autonomic nervous system change pattern during the virtual museum exploration. The results showed that, while the direct virtualisation of a real environment might be self-reported as evoking psychological arousal, it does not necessarily evoke the same cardiovascular changes as a real arousing elicitation. These findings contribute to the understanding of the use of VR in emotion recognition research; future research is needed to study arousal and emotion elicitation in immersive VR.
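The recognition pipeline described, a support vector machine with recursive feature selection, can be sketched with scikit-learn. The synthetic features below are placeholders standing in for the study's heart rate variability features:

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.feature_selection import RFE
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Synthetic stand-ins for HRV features (e.g. vagal-activity indices), 60 subjects.
X = rng.normal(size=(60, 10))
y = (X[:, 0] + X[:, 1] > 0).astype(int)   # binary low/high arousal label

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0)

# Recursive feature elimination wrapped around a linear SVM: features are
# repeatedly dropped based on the smallest absolute SVM coefficients.
model = RFE(SVC(kernel="linear"), n_features_to_select=4).fit(X_tr, y_tr)
selected = np.flatnonzero(model.support_)  # indices of the retained features
accuracy = model.score(X_te, y_te)
```

`RFE` requires an estimator exposing per-feature weights, which is why the linear kernel is used here; the study's exact kernel and feature count are not stated in this abstract.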


1998 ◽  
Vol 21 (6) ◽  
pp. 348-352 ◽  
Author(s):  
T. Yambe ◽  
S. Kobayashi ◽  
S. Nanka ◽  
M. Yoshizawa ◽  
K. Tabayashi ◽  
...  

For the development of totally implantable artificial organs, monitoring the condition of the implanted devices is an important problem, especially in clinical cases. In this study, we used position sensors from a 3-dimensional (3-D) virtual reality (VR) system to monitor an implantable artificial heart. The sensors used in the experiments were 3-space Fastrak sensors (Polhemus, USA). The position sensors, which use electromagnetic fields, were attached to the inner actuating zone. The sensitivity of the position sensors was on the order of 0.8 mm. Using these VR position sensors, we could easily detect all six degrees of freedom: x, y, z, and the pitch, yaw, and roll of the sensors. Experimental evaluation using a model circulation loop and healthy adult goats was performed. The experimental results suggest that our newly developed implantable sensors for monitoring the implantable artificial heart system were useful for sensing the driving condition, and thus potentially useful for implantable devices in clinical usage.
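The six quantities reported (x, y, z plus pitch, yaw, roll) are a position vector and an orientation; trackers of this kind commonly expose orientation as a rotation matrix from which the three angles are extracted. A minimal round-trip sketch assuming a Z-Y-X (yaw-pitch-roll) convention, which is illustrative rather than the Fastrak's documented output format:

```python
import math
import numpy as np

def euler_zyx(R: np.ndarray) -> tuple:
    """Extract (yaw, pitch, roll) from R = Rz(yaw) @ Ry(pitch) @ Rx(roll)."""
    pitch = -math.asin(max(-1.0, min(1.0, R[2, 0])))  # clamp guards rounding error
    yaw = math.atan2(R[1, 0], R[0, 0])
    roll = math.atan2(R[2, 1], R[2, 2])
    return yaw, pitch, roll

def rot_zyx(yaw: float, pitch: float, roll: float) -> np.ndarray:
    """Compose the same Z-Y-X rotation, useful for round-trip checks."""
    cz, sz = math.cos(yaw), math.sin(yaw)
    cy, sy = math.cos(pitch), math.sin(pitch)
    cx, sx = math.cos(roll), math.sin(roll)
    Rz = np.array([[cz, -sz, 0], [sz, cz, 0], [0, 0, 1]])
    Ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])
    Rx = np.array([[1, 0, 0], [0, cx, -sx], [0, sx, cx]])
    return Rz @ Ry @ Rx
```

A full 6DoF pose sample is then the translation (x, y, z) paired with the three recovered angles.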

