Configuring Virtual Reality Displays in a Mixed-Reality Environment for LVC Training

Author(s): Brandon J. Newendorp, Christian Noon, Joe Holub, Eliot H. Winer, Stephen Gilbert, ...

In order to adapt to an ever-changing set of threats, military forces need to find new methods of training. The prevalence of commercial game engines, combined with virtual reality (VR) and mixed reality environments, can prove beneficial to training. Live, virtual and constructive (LVC) training combines live people, virtual environments and simulated actors to create a better training environment. However, integrating virtual reality displays, software simulations and artificial weapons into a mixed reality environment poses numerous challenges. A mixed reality environment known as The Veldt was constructed to research these challenges. The Veldt consists of numerous independent displays, along with movable walls, doors and windows, allowing it to simulate numerous training scenarios. Several challenges were encountered in creating this system. Displays were precisely located using the tracking system, then configured using VR Juggler. The ideal viewpoint for each display was set based on the expected location from which users would view it, and the displays were then accurately aligned to the virtual terrain model. This paper describes how the displays were configured in The Veldt, as well as how it was used for two training scenarios.
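The abstract does not reproduce the actual VR Juggler configuration, but fixing an "ideal viewpoint" per display corresponds to the standard generalized perspective (off-axis) projection computed from a display's tracked corners and an assumed eye position. The sketch below illustrates that construction only; all corner coordinates and the eye position are made-up values, not measurements from The Veldt.

```python
# Minimal sketch: derive an off-axis (asymmetric) view frustum for one wall
# display from its tracked corner positions and a fixed "ideal" viewpoint.
# Values below are illustrative assumptions, not The Veldt's configuration.
import numpy as np

def off_axis_frustum(lower_left, lower_right, upper_left, eye, near=0.1):
    """Return (left, right, bottom, top, near) for an off-axis projection,
    following the standard generalized perspective projection construction."""
    pa, pb, pc, pe = (np.asarray(p, dtype=float)
                      for p in (lower_left, lower_right, upper_left, eye))

    vr = (pb - pa) / np.linalg.norm(pb - pa)        # screen-right axis
    vu = (pc - pa) / np.linalg.norm(pc - pa)        # screen-up axis
    vn = np.cross(vr, vu)
    vn /= np.linalg.norm(vn)                        # screen normal (toward eye)

    va, vb, vc = pa - pe, pb - pe, pc - pe          # eye-to-corner vectors
    d = -np.dot(va, vn)                             # eye-to-screen-plane distance

    left   = np.dot(vr, va) * near / d
    right  = np.dot(vr, vb) * near / d
    bottom = np.dot(vu, va) * near / d
    top    = np.dot(vu, vc) * near / d
    return left, right, bottom, top, near

# Example: a 3 m x 2 m wall whose corners were measured with the tracker,
# viewed from a point 2 m in front of its centre (all values hypothetical).
print(off_axis_frustum(lower_left=(0.0, 0.0, 0.0),
                       lower_right=(3.0, 0.0, 0.0),
                       upper_left=(0.0, 2.0, 0.0),
                       eye=(1.5, 1.0, 2.0)))
```

In such a setup, each display's tracked corner positions and its expected viewing location feed a computation like this, and the resulting frustum is recorded in that display's configuration.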

Author(s): S Leinster-Evans, J Newell, S Luck

This paper looks to expand on the INEC 2016 paper ‘The future role of virtual reality within warship support solutions for the Queen Elizabeth Class aircraft carriers’, presented by Ross Basketter, Craig Birchmore and Abbi Fisher from BAE Systems in May 2016, and the EAAW VII paper ‘Testing the boundaries of virtual reality within ship support’, presented by John Newell from BAE Systems and Simon Luck from BMT DSL in June 2017. BAE Systems and BMT have developed a 3D walkthrough training system that supports the teams working closely with the QEC Aircraft Carriers in Portsmouth, and this work was presented at EAAW VII. Since then the work has been extended to demonstrate the art of the possible on Type 26. This latter piece of work is designed to explore the role of 3D immersive environments in the development and fielding of support and training solutions across the range of support disciplines. The combined team is looking at how this digital thread leads from the design of platforms, both surface and subsurface, through build and into in-service support and training. The paper proposes ways in which this rich data could be used across the whole lifecycle of the ship, from design and development (used for spatial acceptance, HazID, etc.) through to operational support and maintenance (in conjunction with big data coming off the ship, coupled with digital tech docs for maintenance procedures), using constantly developing technologies such as 3D, virtual reality, augmented reality and mixed reality. The drive towards gamification in the training environment, to keep younger recruits interested and to shorten course lengths, is also explored. The paper develops the options and looks at how this technology can be used and where the value proposition lies.


Author(s): Randall Spain, Benjamin Goldberg, Jeffrey Hansberger, Tami Griffith, Jeremy Flynn, ...

Recent advances in technology have made virtual environments, virtual reality, augmented reality, and simulations more affordable and accessible to researchers, companies, and the general public, which has led to many novel use cases and applications. A key objective of human factors research and practice is determining how these technology-rich applications can be designed and applied to improve human performance across a variety of contexts. This session will demonstrate some of the distinct and diverse uses of virtual environments and mixed reality environments in an alternative format. The session will begin with each demonstrator providing a brief overview of their virtual environment (VE) and a description of how it has been used to address a particular problem or research need. Following the description portion of the session, each VE will be set up at a demonstration station in the room, and session attendees will be encouraged to interact directly with the virtual environments, ask demonstrators questions about their research, and inquire about the effectiveness of using VEs for research, training, and evaluation purposes. The overall objective of this alternative session is to increase awareness of how human factors professionals use VE technologies and of the capabilities and limitations of VEs in supporting the work of HF professionals.


Sensors, 2021, Vol. 21(12), pp. 4006
Author(s): Razeen Hussain, Manuela Chessa, Fabio Solari

Cybersickness is one of the major roadblocks to the widespread adoption of mixed reality devices. Prolonged exposure to these devices, especially virtual reality devices, can cause users to feel discomfort and nausea, spoiling the immersive experience. Incorporating spatial blur in stereoscopic 3D stimuli has been shown to reduce cybersickness. In this paper, we develop a technique, inspired by the human physiological system, for incorporating spatial blur in VR systems. The technique makes use of concepts from foveated imaging and depth-of-field. It can be applied to any eye-tracker-equipped VR system as a post-processing step to provide an artifact-free scene. We verify the usefulness of the proposed system by conducting a user study on cybersickness evaluation. We used a custom-built rollercoaster VR environment developed in Unity and an HTC Vive Pro Eye headset to interact with the user. A Simulator Sickness Questionnaire was used to measure the induced sickness, while gaze and heart rate data were recorded for quantitative analysis. The experimental analysis highlighted the aptness of our foveated depth-of-field effect for mitigating cybersickness in virtual environments, reducing sickness scores by approximately 66%.
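The abstract does not give implementation details, so the following is only a minimal sketch of the underlying idea: a per-pixel blur radius that grows with angular eccentricity from the tracked gaze point (foveation) and with dioptric distance from the fixated depth (depth-of-field). The constants and the simple clipped ramps are assumptions for illustration, not the authors' Unity implementation.

```python
# Illustrative sketch: combine a foveation term and a depth-of-field term
# into one per-pixel blur radius map. All constants are assumed values.
import numpy as np

def blur_radius_map(depth, gaze_px, fov_deg=(100.0, 100.0),
                    fovea_deg=5.0, max_ecc_blur=8.0,
                    aperture=4.0, max_dof_blur=8.0):
    """depth: (H, W) array of positive view-space depths in metres.
    gaze_px: (x, y) gaze position in pixels.
    Returns an (H, W) array of blur radii in pixels."""
    h, w = depth.shape
    deg_per_px = np.array(fov_deg) / np.array([w, h])  # rough linear mapping

    ys, xs = np.mgrid[0:h, 0:w]
    ecc = np.hypot((xs - gaze_px[0]) * deg_per_px[0],
                   (ys - gaze_px[1]) * deg_per_px[1])   # eccentricity (deg)

    # Foveation: no blur inside the fovea, then ramp up with eccentricity.
    fov_blur = np.clip((ecc - fovea_deg) / fovea_deg, 0.0, 1.0) * max_ecc_blur

    # Depth-of-field: circle of confusion grows with dioptric distance
    # from the depth the eye is currently fixating.
    fixated_depth = depth[int(gaze_px[1]), int(gaze_px[0])]
    coc = aperture * np.abs(1.0 / depth - 1.0 / fixated_depth)
    dof_blur = np.clip(coc, 0.0, max_dof_blur)

    return np.maximum(fov_blur, dof_blur)  # final per-pixel blur radius
```

In a real pipeline this radius map would drive a variable-radius blur applied as a post-processing pass on the rendered frame.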


2019, Vol. 2019, pp. 1-10
Author(s): Zhibao Qin, Yonghang Tai, Chengqi Xia, Jun Peng, Xiaoqiao Huang, ...

The aim of this study is to develop a peg transfer training module and to assess its face, content and construct validity on box, virtual reality (VR), cognitive virtual reality (CVR), augmented reality (AR) and mixed reality (MR) trainers, thereby comparing the advantages and disadvantages of these simulators. The training system (VatsSim-XR) comprises customized haptic-enabled thoracoscopic instruments, a virtual reality headset, an endoscope kit with navigation, and the corresponding patient-specific training environment. A cohort of 32 trainees, comprising 24 novices and 8 experts, used the real and virtual simulators in the department of thoracic surgery of Yunnan First People’s Hospital. Both subjective and objective evaluations were developed to explore the potential visual and haptic benefits for peg transfer education. Experiments and evaluations involving both expert and novice thoracic surgeons show that experts' surgical skills are better than novices' overall, that the AR trainer provides a more balanced training environment in terms of visuohaptic fidelity and accuracy, that the box trainer and the MR trainer demonstrate the best realistic 3D perception and surgical immersion, respectively, and that the CVR trainer shows a better clinical effect than the traditional VR trainer. By combining these in a systematic approach tuned to specific fidelity requirements, medical simulation systems would be able to provide a more immersive and effective training environment.


Author(s): Randall Spain, Benjamin Goldberg, Pete Khooshabeh, David Krum, Joshua Biro, ...

Virtual reality, augmented reality, and other forms of virtual environments have the potential to dramatically change how individuals work, learn, and interact with each other. A key objective of human factors research and practice is to determine how these environments should be designed to maximize performance efficiency, ensure health and safety, and circumvent potential human–virtual environment interaction problems. This session will demonstrate some of the distinct and diverse uses of virtual reality, mixed reality, and virtual environments in an alternative format. The session will begin with each demonstrator providing a brief overview of their virtual environment and describing how it has been used to address a particular problem or research need. Following the description portion of the session, all demonstrations will be set up around the room, and session attendees will be encouraged to interact directly with the environments, ask demonstrators questions about their research, and inquire about the effectiveness of using their virtual environment for research, training, and evaluation purposes. The overall objective of this alternative session is to provoke ideas among the attendees for how virtual reality, mixed reality, and virtual environments can help address their research, training, education, or business needs.


2020, Vol. 10(1)
Author(s): Muyinat Y. Osaba, Dario Martelli, Antonio Prado, Sunil K. Agrawal, Anil K. Lalwani

Older adults have difficulty adapting to new visual information, which poses a challenge to maintaining balance during walking. Virtual reality can be used to study gait adaptability in response to discordant sensorimotor stimulation. This study aimed to investigate age-related modifications and the propensity for visuomotor adaptation due to continuous visual perturbations during overground walking in a virtual reality headset. Twenty older and twelve young subjects walked on an instrumented walkway in real and virtual environments while reacting to antero-posterior and medio-lateral oscillations of the visual field. The mean and variability of spatiotemporal gait parameters were calculated during the first and fifth minutes of walking. A 3-way mixed-design ANOVA was performed to determine the main and interaction effects of group, condition and time. Both groups modified gait similarly, but older adults walked with shorter and slower strides and did not reduce stride velocity or increase stride width variability during medio-lateral perturbations. This may be related to a more conservative and anticipatory strategy as well as a reduced perception of the optic flow. Over time, participants adapted similarly to the perturbations, but only younger participants reduced their stride velocity variability. The results provide novel evidence of age- and context-dependent visuomotor adaptations in response to visual perturbations during overground walking and may help to establish new methods for early identification and remediation of gait deficits.
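As a rough illustration of the reported analysis (not the authors' code), a 3-way mixed design with one between-subject factor (group) and two within-subject factors (condition, time) can be approximated with a linear mixed model that includes a per-subject random intercept. All file and column names below are hypothetical.

```python
# Illustrative sketch: fit one spatiotemporal gait outcome with a linear
# mixed model as a stand-in for the 3-way mixed-design ANOVA.
import pandas as pd
import statsmodels.formula.api as smf

# Expected long-format table: one row per subject x condition x time window.
# Hypothetical columns: subject, group (old/young), condition, time (min1/min5),
# stride_velocity, stride_width_variability, ...
gait = pd.read_csv("gait_parameters.csv")   # hypothetical file

model = smf.mixedlm("stride_velocity ~ C(group) * C(condition) * C(time)",
                    data=gait, groups=gait["subject"])
result = model.fit()
print(result.summary())   # main effects and interactions of group, condition, time
```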


2020, Vol. 11(1), pp. 99-106
Author(s): Marián Hudák, Štefan Korečko, Branislav Sobota

Recent advances in the field of web technologies, including the increasing support of virtual reality hardware, have allowed for shared virtual environments, reachable by just entering a URL in a browser. One contemporary solution that provides such a shared virtual reality is LIRKIS Global Collaborative Virtual Environments (LIRKIS G-CVE). It is a web-based software system, built on top of the A-Frame and Networked-Aframe frameworks. This paper describes LIRKIS G-CVE and introduces its two original components. The first one is the Smart-Client Interface, which turns smart devices, such as smartphones and tablets, into input devices. The advantage of this component over the standard way of user input is demonstrated by a series of experiments. The second component is the Enhanced Client Access layer, which provides access to positions and orientations of clients that share a virtual environment. The layer also stores a history of connected clients and provides limited control over the clients. The paper also outlines an ongoing experiment aimed at an evaluation of LIRKIS G-CVE in the area of virtual prototype testing.
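LIRKIS G-CVE itself is built on A-Frame and Networked-Aframe (JavaScript); the snippet below is only a conceptual sketch, in Python, of the kind of per-client pose registry with connection history that an Enhanced Client Access-style layer exposes. All class and method names are hypothetical.

```python
# Conceptual sketch of a server-side registry of client poses with history.
from dataclasses import dataclass, field
from datetime import datetime
from typing import Dict, List, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class ClientRecord:
    client_id: str
    connected_at: datetime
    position: Vec3 = (0.0, 0.0, 0.0)
    rotation: Vec3 = (0.0, 0.0, 0.0)           # Euler angles, degrees
    history: List[Tuple[datetime, Vec3, Vec3]] = field(default_factory=list)

class ClientAccessLayer:
    """Tracks poses of connected clients and keeps records of past ones."""
    def __init__(self) -> None:
        self.active: Dict[str, ClientRecord] = {}
        self.past: List[ClientRecord] = []

    def connect(self, client_id: str) -> None:
        self.active[client_id] = ClientRecord(client_id, datetime.now())

    def update_pose(self, client_id: str, position: Vec3, rotation: Vec3) -> None:
        rec = self.active[client_id]
        rec.position, rec.rotation = position, rotation
        rec.history.append((datetime.now(), position, rotation))

    def disconnect(self, client_id: str) -> None:
        self.past.append(self.active.pop(client_id))
```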

