Simulation Sickness Evaluation While Using a Fully Autonomous Car in a Head Mounted Display Virtual Environment

Author(s):  
Stanislava Rangelova ◽  
Daniel Decker ◽  
Marc Eckel ◽  
Elisabeth André
Sensors ◽  
2021 ◽  
Vol 21 (2) ◽  
pp. 397
Author(s):  
Qimeng Zhang ◽  
Ji-Su Ban ◽  
Mingyu Kim ◽  
Hae Won Byun ◽  
Chang-Hun Kim

We propose a low-asymmetry interface to improve the presence of non-head-mounted-display (non-HMD) users in shared virtual reality (VR) experiences with HMD users. The low-asymmetry interface ensures that the HMD and non-HMD users’ perception of the VR environment is nearly the same; that is, the point-of-view (PoV) asymmetry and behavior asymmetry between HMD and non-HMD users are reduced. Our system comprises a portable mobile device as a visual display to provide a changing PoV for the non-HMD user and a walking simulator as an in-place walking detection sensor to enable the same level of realistic and unrestricted physical-walking-based locomotion for all users. Because this allows non-HMD users to experience the same level of visualization and free movement as HMD users, both can engage as the main actors in movement scenarios. Our user study revealed that the low-asymmetry interface enables non-HMD users to feel a presence similar to that of the HMD users when performing equivalent locomotion tasks in a virtual environment. Furthermore, our system can enable one HMD user and multiple non-HMD users to participate together in a virtual world; moreover, our experiments show that non-HMD user satisfaction increases with the number of non-HMD participants owing to increased presence and enjoyment.
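
The walking-simulator component is described only at a high level above. As a rough illustration (not the authors' implementation), the sketch below shows one common way in-place walking detection can drive virtual locomotion: peak detection on a gravity-removed vertical acceleration signal advances the non-HMD user's avatar one step at a time. The thresholds, step length, and the advance_avatar callback are assumptions introduced here for illustration.

```python
# Hypothetical sketch of in-place walking detection driving avatar locomotion.
# Thresholds, step length, and the advance_avatar() callback are assumed values
# for illustration; they do not reproduce the authors' walking simulator.

STEP_THRESHOLD = 1.5      # m/s^2 above the gravity-removed baseline (assumed)
MIN_STEP_INTERVAL = 0.3   # minimum seconds between detected steps (assumed)
STEP_LENGTH = 0.7         # meters of virtual forward travel per step (assumed)

class InPlaceWalkingDetector:
    def __init__(self, advance_avatar):
        self.advance_avatar = advance_avatar   # callback: move the avatar forward by d meters
        self.last_step_time = float("-inf")

    def on_sample(self, t, vertical_accel):
        """Feed one gravity-removed vertical acceleration sample taken at time t (s)."""
        if (vertical_accel > STEP_THRESHOLD
                and t - self.last_step_time > MIN_STEP_INTERVAL):
            self.last_step_time = t
            self.advance_avatar(STEP_LENGTH)

# Usage: feed accelerometer samples; each detected step advances the avatar.
detector = InPlaceWalkingDetector(lambda d: print(f"advance avatar {d} m"))
detector.on_sample(0.00, 0.2)   # below threshold, no step
detector.on_sample(0.35, 2.1)   # step detected
```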


Ergonomics ◽  
1996 ◽  
Vol 39 (11) ◽  
pp. 1370-1380 ◽  
Author(s):  
Tetsuo Kawara ◽  
Masao Ohmi ◽  
Tatsuya Yoshizawa

2021 ◽  
Author(s):  
Silvia Arias ◽  
Axel Mossberg ◽  
Daniel Nilsson ◽  
Jonathan Wahlqvist

Comparing results obtained in Virtual Reality to those obtained in physical experiments is key for validating Virtual Reality as a research method in the field of Human Behavior in Fire. A series of experiments based on similar evacuation scenarios in a high-rise building with evacuation elevators was conducted. The experiments consisted of a physical experiment in a building and two Virtual Reality experiments in a virtual representation of the same building: one using a Cave Automatic Virtual Environment (CAVE) and one using a head-mounted display (HMD). The data obtained in the HMD experiment are compared to data obtained in the CAVE and physical experiments. The three datasets were compared in terms of pre-evacuation time, noticing escape routes, walking paths, exit choice, waiting times for the elevators, and eye-tracking data related to emergency signage. The HMD experiment was able to reproduce the data obtained in the physical experiment in terms of pre-evacuation time and exit choice, but there were large differences from the results of the CAVE experiment. Possible factors affecting the data produced using Virtual Reality are identified, such as spatial orientation and movement in the virtual environment.


2021 ◽  
Vol 10 (5) ◽  
pp. 3546-3551
Author(s):  
Tamanna Nurai

Cybersickness remains a negative consequence that degrades the experience of users immersed in virtual worlds created for Virtual Reality (VR). Various abnormalities can cause quantifiable changes in body awareness when donning a Head-Mounted Display (HMD) in a Virtual Environment (VE). VR headsets provide a VE that matches the real world and allows users to have a range of experiences. Motion sickness and simulator sickness measures provide self-report assessments of cybersickness in VEs. In this study, a simulator sickness questionnaire is used to measure the after-effects of the virtual environment. This research aims to answer whether immersive VR induces cybersickness and affects equilibrium coordination. The present research is designed as a cross-sectional observational analysis. According to the selection criteria, a total of 40 subjects will be recruited from AVBRH, Sawangi Meghe. With the intervention applied, the experiment lasted 6 months. The simulator sickness questionnaire is administered once to measure motion sickness, and the equilibrium tests are evaluated twice, at exit and after 10 min. Virtual reality in video games is still in its development, and integrating gameplay action into the VR experience will require a significant amount of research and development. The study evaluated whether immersive VR induces cybersickness and affects equilibrium coordination. Numerous scales have been developed to measure cybersickness, and the nature of cybersickness has been revealed through work on motion sickness in simulated systems.
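
The simulator sickness questionnaire (SSQ) referred to above is conventionally scored following Kennedy et al. (1993): sixteen symptoms rated 0-3 are grouped into nausea, oculomotor, and disorientation subscales, the raw subscale sums are multiplied by fixed weights, and the total score is 3.74 times the sum of the three raw sums. The sketch below shows that standard weighting only; the study's exact scoring procedure is not stated in the abstract.

```python
# Standard SSQ weighting (Kennedy et al., 1993); the study's exact scoring
# procedure may differ from this sketch.

def ssq_scores(nausea_raw, oculomotor_raw, disorientation_raw):
    """Weighted SSQ subscale and total scores from the raw subscale sums.

    Each raw sum is the total of the 0-3 symptom ratings assigned to that
    subscale (some symptoms contribute to more than one subscale).
    """
    return {
        "Nausea": nausea_raw * 9.54,
        "Oculomotor": oculomotor_raw * 7.58,
        "Disorientation": disorientation_raw * 13.92,
        "Total": (nausea_raw + oculomotor_raw + disorientation_raw) * 3.74,
    }

# Example: raw subscale sums of 3, 4, and 2 give a total severity score of 33.66.
print(ssq_scores(3, 4, 2))
```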


2016 ◽  
Vol 48 ◽  
pp. 261-266 ◽  
Author(s):  
Maxime T. Robert ◽  
Laurent Ballaz ◽  
Martin Lemay

2020 ◽  
Vol 2020 (9) ◽  
pp. 288-1-288-8 ◽  
Author(s):  
Anjali K. Jogeshwar ◽  
Gabriel J. Diaz ◽  
Susan P. Farnand ◽  
Jeff B. Pelz

Eye tracking is used by psychologists, neurologists, vision researchers, and many others to understand the nuances of the human visual system, and to provide insight into a person’s allocation of attention across the visual environment. When tracking the gaze behavior of an observer immersed in a virtual environment displayed on a head-mounted display, estimated gaze direction is encoded as a three-dimensional vector extending from the estimated location of the eyes into the 3D virtual environment. Additional computation is required to detect the target object at which gaze was directed. These methods must be robust to calibration error or eye tracker noise, which may cause the gaze vector to miss the target object and hit an incorrect object at a different distance. Thus, the straightforward solution involving a single vector-to-object collision could be inaccurate in indicating object gaze. More involved metrics that rely upon an estimation of the angular distance from the ray to the center of the object must account for an object’s angular size based on distance, or irregularly shaped edges, information that is not made readily available by popular game engines (e.g., Unity or Unreal) or rendering pipelines (e.g., OpenGL). The approach presented here avoids this limitation by projecting many rays distributed across an angular space that is centered upon the estimated gaze direction.
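
As a rough illustration of the multi-ray idea summarized above (not the authors' code), the sketch below distributes sample rays within a small cone around the estimated gaze direction, casts each one against the scene, and reports the object hit most often. The intersect_scene argument stands in for whatever ray-cast query the host engine provides; the ray count and cone angle are assumed defaults.

```python
import math
import random
from collections import Counter

def _orthonormal_basis(d):
    """Return two unit vectors orthogonal to the unit vector d."""
    dx, dy, dz = d
    hx, hy, hz = (1.0, 0.0, 0.0) if abs(dx) < 0.9 else (0.0, 1.0, 0.0)
    ux, uy, uz = dy * hz - dz * hy, dz * hx - dx * hz, dx * hy - dy * hx  # d x h
    n = math.sqrt(ux * ux + uy * uy + uz * uz)
    ux, uy, uz = ux / n, uy / n, uz / n
    vx, vy, vz = dy * uz - dz * uy, dz * ux - dx * uz, dx * uy - dy * ux  # d x u
    return (ux, uy, uz), (vx, vy, vz)

def perturb_direction(gaze_dir, max_angle_deg):
    """Return a unit vector at most max_angle_deg away from the unit gaze direction."""
    (ux, uy, uz), (vx, vy, vz) = _orthonormal_basis(gaze_dir)
    theta = math.radians(max_angle_deg) * math.sqrt(random.random())  # ~uniform over the cone
    phi = random.uniform(0.0, 2.0 * math.pi)
    s, c = math.sin(theta), math.cos(theta)
    ox, oy, oz = (math.cos(phi) * ux + math.sin(phi) * vx,
                  math.cos(phi) * uy + math.sin(phi) * vy,
                  math.cos(phi) * uz + math.sin(phi) * vz)
    gx, gy, gz = gaze_dir
    return (c * gx + s * ox, c * gy + s * oy, c * gz + s * oz)

def gazed_object(eye_pos, gaze_dir, intersect_scene, n_rays=100, cone_deg=2.0):
    """Cast n_rays within a cone_deg cone around gaze_dir and return the object
    hit most often, or None. intersect_scene(origin, direction) is a placeholder
    for the engine's ray-cast query and must be supplied by the caller."""
    hits = Counter()
    for _ in range(n_rays):
        obj = intersect_scene(eye_pos, perturb_direction(gaze_dir, cone_deg))
        if obj is not None:
            hits[obj] += 1
    return hits.most_common(1)[0][0] if hits else None
```

Tallying hits over many jittered rays makes the object decision tolerant to a gaze vector that narrowly misses the target because of calibration error or tracker noise.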


2021 ◽  
Author(s):  
S. Pastel ◽  
D. Bürger ◽  
C. H. Chen ◽  
K. Petri ◽  
K. Witte

Virtual reality (VR) is a promising tool and is increasingly used in many different fields in which virtual walking can be generalized through detailed modeling of the physical environment, such as sports science and medicine. However, the visualization of a virtual environment using a head-mounted display (HMD) differs from reality, and it is still not clear whether visual perception works equally well within VR. The purpose of the current study is to compare spatial orientation between the real world (RW) and VR. Therefore, the participants had to walk blindfolded to differently placed objects in a real and a virtual environment, which did not differ in physical properties. They were equipped with passive markers to track the position of the back of their hand, which was used to specify each object’s location. The first task was to walk blindfolded from one starting position to differently placed sport-specific objects requiring different degrees of rotation (0°, 45°, 180°, and 225°) after observing them for 15 s. The three-way ANOVA with repeated measures indicated no significant difference between RW and VR within the different degrees of rotation (p > 0.05). In addition, the participants were asked to walk blindfolded three times from a new starting position to two objects, which were ordered differently across the conditions. Except for one case, no significant differences in the pathways between RW and VR were found (p > 0.05). This study indicates that the use of VR elicits behavior similar to that in real-world interactions, supporting its use.


2009 ◽  
Vol 18 (3) ◽  
pp. 185-199 ◽  
Author(s):  
Joel Jordan ◽  
Mel Slater

A sign of presence in virtual environments is that people respond to situations and events as if they were real, where response may be considered at many different levels, ranging from unconscious physiological responses through to overt behavior, emotions, and thoughts. In this paper we consider two responses that gave different indications of the onset of presence in a gradually forming environment. Two aspects of the response of people to an immersive virtual environment were recorded: their eye scanpath, and their skin conductance response (SCR). The scenario was formed over a period of 2 min, by introducing an increasing number of its polygons in random order in a head-tracked head-mounted display. For one group of experimental participants (n = 8) the environment formed into one in which they found themselves standing on top of a 3 m high column. For a second group of participants (n = 6) the environment was otherwise the same except that the column was only 1 cm high, so that they would be standing at normal ground level. For a third group of participants (n = 14) the polygons never formed into a meaningful environment. The participants who stood on top of the tall column exhibited a significant decrease in entropy of the eye scanpath and an increase in the number of SCRs by 99 s into the scenario, at a time when only 65% of the polygons had been displayed. The ground level participants exhibited a similar decrease in scanpath entropy, but not the increase in SCRs. The random scenario group did not exhibit this decrease in eye scanpath entropy. A drop in scanpath entropy indicates that the environment had cohered into a meaningful perception. An increase in the rate of SCRs indicates the perception of an aversive stimulus. These results suggest that on these two dimensions (scanpath entropy and rate of SCRs) participants were responding realistically to the scenario shown in the virtual environment. In addition, the response occurred well before the entire scenario had been displayed, suggesting that once a set of minimal cues exists within a scenario, it is enough to form a meaningful perception. Moreover, at the level of the sympathetic nervous system, the participants who were standing on top of the column exhibited arousal as if their experience might be real. This is an important practical aspect of the concept of presence.
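
The abstract does not state how scanpath entropy was computed. One common formulation, sketched below under that assumption, discretizes fixations into regions of interest and takes the Shannon entropy of the transitions between successive regions, where lower entropy corresponds to a more structured, coherent scanpath. The region labels in the example are hypothetical.

```python
import math
from collections import Counter

def scanpath_transition_entropy(fixation_regions):
    """Shannon entropy (bits) of transitions between successive fixation regions.

    fixation_regions: list of region-of-interest labels, one per fixation, in
    temporal order. Lower entropy indicates a more regular, coherent scanpath.
    """
    transitions = list(zip(fixation_regions, fixation_regions[1:]))
    if not transitions:
        return 0.0
    counts = Counter(transitions)
    total = len(transitions)
    return -sum((c / total) * math.log2(c / total) for c in counts.values())

# A scanpath alternating between two regions has lower transition entropy than
# one that jumps among four regions with no repeated pattern.
print(scanpath_transition_entropy(["A", "B", "A", "B", "A", "B", "A"]))      # 1.0 bit
print(scanpath_transition_entropy(["A", "C", "B", "D", "A", "D", "C"]))      # ~2.58 bits
```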


2021 ◽  
Vol 9 (3) ◽  
pp. 1504-1513
Author(s):  
Muhammad Azam ◽  
Asif Ali ◽  
Saddam Akbar ◽  
Marrium Bashir ◽  
Hyun Chae Chung

Purpose of the study: The aim of this paper was to study gender differences in perceptual judgment and movement behavior in a road crossing task. Methodology: A simulated road crossing environment outside the Human Motor Behavior Laboratory (HMBL) was used to examine individuals’ perceptual-motor behavior. Twenty-four young adults performed the road crossing task in the virtual environment, judging whether the available gap was crossable or not and then initiating movement depending on the perceptual information. Main Findings: Participants’ gap selection revealed that their crossability judgments were influenced by vehicle speed; however, female participants made more errors relative to males. In addition, females took longer to cross and made unnecessary adjustments during crossings. The study findings suggest that females’ erroneous perceptual decisions and inconsistent locomotion behavior in road crossing put them at higher risk relative to their male counterparts. Application of this study: The findings of this study may be applied to developing training programs for pedestrians. Training on road-crossing tasks may prove helpful for refining individuals’ perceptual judgment and movement behavior to minimize the chance of accidents in road crossing. Specifically, having experience with the road-crossing task in a virtual environment may reduce the tendency towards risk-taking behavior. The novelty of this study: Most past research on pedestrians’ road crossing behavior examined participants’ perceptual judgment in a standing position only or did not analyze movement behavior in an actual walking setup. The approach utilized in our experiment was novel in this regard: individuals could choose to cross a gap and walk while wearing a head-mounted display.
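
In its simplest form, the crossability judgment described above compares the time until the approaching vehicle closes the gap with the time the pedestrian needs to cross the lane. The sketch below illustrates that comparison; the walking speed, lane width, and safety margin are assumed example values, and the study's actual gap parameters are not given in the abstract.

```python
# Minimal sketch of a gap-crossability check: the gap is crossable when the
# time until the approaching vehicle arrives exceeds the pedestrian's crossing
# time plus a safety margin. All parameter defaults are assumed example values.

def is_gap_crossable(gap_distance_m, vehicle_speed_mps,
                     walking_speed_mps=1.4, lane_width_m=3.5,
                     safety_margin_s=0.5):
    time_to_arrival = gap_distance_m / vehicle_speed_mps
    crossing_time = lane_width_m / walking_speed_mps
    return time_to_arrival > crossing_time + safety_margin_s

# Example: a 40 m gap with a vehicle approaching at 15 m/s (~2.7 s to arrival)
# is not crossable at 1.4 m/s walking speed (2.5 s crossing + 0.5 s margin).
print(is_gap_crossable(40.0, 15.0))  # False
```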


2021 ◽  
Vol 2 ◽  
Author(s):  
Juno Kim ◽  
Stephen Palmisano ◽  
Wilson Luu ◽  
Shinichi Iwasaki

Humans rely on multiple senses to perceive their self-motion in the real world. For example, a sideways linear head translation can be sensed either by lamellar optic flow of the visual scene projected on the retina of the eye or by stimulation of vestibular hair cell receptors found in the otolith macula of the inner ear. Mismatches in visual and vestibular information can induce cybersickness during head-mounted display (HMD) based virtual reality (VR). In this pilot study, participants were immersed in a virtual environment using two recent consumer-grade HMDs: the Oculus Go (3DOF angular only head tracking) and the Oculus Quest (6DOF angular and linear head tracking). On each trial they generated horizontal linear head oscillations along the interaural axis at a rate of 0.5 Hz. This head movement should generate greater sensory conflict when viewing the virtual environment on the Oculus Go (compared to the Quest) due to the absence of linear tracking. We found that perceived scene instability always increased with the degree of linear visual-vestibular conflict. However, cybersickness was not experienced by 7/14 participants, but was experienced by the remaining participants in at least one of the stereoscopic viewing conditions (six of whom also reported cybersickness in monoscopic viewing conditions). No statistical difference in spatial presence was found across conditions, suggesting that participants could tolerate considerable scene instability while retaining the feeling of being there in the virtual environment. Levels of perceived scene instability, spatial presence and cybersickness were found to be similar between the Oculus Go and the Oculus Quest with linear tracking disabled. The limited effect of linear coupling on cybersickness, compared with its strong effect on perceived scene instability, suggests that perceived scene instability may not always be associated with cybersickness. However, perceived scene instability does appear to provide explanatory power over the cybersickness observed in stereoscopic viewing conditions.
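
The visual-vestibular conflict on the 3DOF headset arises because a real linear head translation is not reflected in the rendered viewpoint, whereas under 6DOF tracking the camera follows it. A minimal sketch of that difference is below, using the 0.5 Hz interaural oscillation from the study and an assumed 0.1 m peak amplitude (the actual amplitude is not reported in the abstract).

```python
import math

FREQ_HZ = 0.5        # head oscillation rate reported in the study
AMPLITUDE_M = 0.1    # assumed peak displacement along the interaural axis

def head_x(t):
    """Physical lateral head position (m) at time t for a sinusoidal oscillation."""
    return AMPLITUDE_M * math.sin(2.0 * math.pi * FREQ_HZ * t)

def rendered_camera_x(t, linear_tracking):
    """Lateral camera position used for rendering: follows the head only when
    linear (6DOF) tracking is available; a 3DOF headset ignores translation."""
    return head_x(t) if linear_tracking else 0.0

def visual_vestibular_mismatch(t, linear_tracking):
    """Difference between the physical head position and the position implied by
    the rendered view; nonzero values correspond to the added 3DOF conflict."""
    return head_x(t) - rendered_camera_x(t, linear_tracking)

for t in (0.0, 0.5, 1.0, 1.5):
    print(t,
          visual_vestibular_mismatch(t, linear_tracking=False),   # Oculus Go-like
          visual_vestibular_mismatch(t, linear_tracking=True))    # Oculus Quest-like
```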

