User Generated Content for Site Based Training in Virtual Reality

2021 ◽  
Author(s):  
Ivory Mayhorn ◽  
Kyle R. Daughtry ◽  
Athicha Dhanormchitphong ◽  
Mitchell B. Bray

Abstract Objectives/Scope: Through a partnership with Global Projects and ExxonMobil IT, we set out to change the way enhanced site-specific operator training and model reviews are conducted on large, complex models. The concept was to enable viewing and navigating complex greenfield 3D CAD models in virtual reality (VR) to aid in training, model reviews, procedure development, rounds development, maintenance planning and execution, emergency response planning and drills, and project planning. The automated toolset can be used to conduct model reviews, followed by training that prepares operators for commissioning both before and after site construction is completed. Methods, Procedures, Process: Leveraging the 3D CAD files from Engineering, Procurement & Construction (EPC) contractors, the feature set allows the creation of a fully textured 3D model walk-through (annotated model review) and includes a content creation application for easily building user-generated training scenarios (similar to PowerPoint drag and drop). In the past few months, 20 onsite stations were set up and over 100 first- and second-line supervisors and operators used the toolset. Baseline metrics were captured, with overwhelming success. Ongoing metrics collection will continue for several months to drive further adjustments to the toolset and ensure high value capture. Once fully refined, this toolset will allow other capital and global projects to train operators before a unit is built and on an ongoing basis for operations activities, bringing pieces of the Digital Twin concept to life.

Author(s):  
Margherita Peruzzini ◽  
Maura Mengoni ◽  
Michele Germani

The promise of Virtual Reality in design environments is to facilitate interaction with digital models and to enhance the results of design activity. Design education is one of its most recent and interesting applications. Thanks to technological advances in human-computer interfaces, Virtual Reality represents a new way to stimulate design students and to develop innovative teaching methods. The paper explores the impact of Virtual Reality technologies on design learning, with particular attention to mechanical product design. It focuses on the analysis of the cognitive and technical aspects of learning processes and on the definition of a proper evaluation protocol. The protocol is based on a classification of the most meaningful activities in mechanical engineering teaching and the identification of a set of metrics that enable objective evaluation of the learning process. To assess how VR supports design education, an experimental study is proposed. It is based on a comparison of three different approaches: two-dimensional drawings, 3D CAD models and, finally, virtual reality technologies.


Author(s):  
Nikola Horvat ◽  
Stanko Škec ◽  
Tomislav Martinec ◽  
Fanika Lukačević ◽  
Marija Majda Perišić

Abstract Use of virtual reality (VR) is considered beneficial for reviewing 3D models throughout product design. However, research on its usability in the design field is still explorative, and previous studies are often contradictory regarding the usability of VR for 3D model review. This paper argues that the usability of VR should be assessed by analysing human factors such as spatial perception and by taking the complexity of the reviewed product into consideration. Hence, a comparative evaluation study was conducted to assess spatial perception in desktop interface-based and VR-based review of 3D models of products with different levels of complexity. The results show that participants in VR could better perceive the fit of user interface elements and that their estimation of the model dimensions had a lower relative error than in the desktop interface. It was found that various sensory cues are used to perceive the model size and that the employed sensory cues depend on the level of complexity. Finally, it is proposed that differences between a desktop interface and VR for reviewing models are more evident when reviewing models of higher complexity levels.
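The paper's exact error metric is not given in the abstract; as a point of reference, a relative-error comparison of dimension estimates is commonly computed along the following lines (a minimal sketch with hypothetical numbers, not the study's data):

```python
# Minimal sketch of a relative-error comparison of dimension estimates across
# two review conditions. The study's actual metric and data are not published
# in the abstract; absolute relative error is assumed here for illustration.

def relative_error(estimated_mm: float, actual_mm: float) -> float:
    """Absolute relative error of a single dimension estimate."""
    return abs(estimated_mm - actual_mm) / actual_mm

# Hypothetical estimates of a 120 mm feature from the two conditions.
desktop_estimates = [150.0, 100.0, 135.0]
vr_estimates = [125.0, 115.0, 118.0]
actual = 120.0

for label, estimates in (("desktop", desktop_estimates), ("VR", vr_estimates)):
    mean_err = sum(relative_error(e, actual) for e in estimates) / len(estimates)
    print(f"{label}: mean relative error = {mean_err:.2%}")
```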


Sensors ◽  
2021 ◽  
Vol 21 (14) ◽  
pp. 4663
Author(s):  
Janaina Cavalcanti ◽  
Victor Valls ◽  
Manuel Contero ◽  
David Fonseca

An effective warning attracts attention, elicits knowledge, and enables compliance behavior. Game mechanics, which are directly linked to human desires, stand out as training, evaluation, and improvement tools. Immersive virtual reality (VR) facilitates training without risk to participants, enables evaluation of the impact of an incorrect action or decision, and creates a smart training environment. The present study analyzes the user experience in a gamified virtual risk environment using the HTC Vive head-mounted display. The game was developed in the Unreal game engine and consisted of a walk-through maze containing evident dangers and different signaling variables, while user action data were recorded. To determine which aspects provide better interaction, experience, perception, and memory, three different warning configurations (dynamic, static, and smart) and two different levels of danger (low and high) were presented. To properly assess the impact of the experience, we conducted a survey about personality and knowledge before and after using the game. We then followed a qualitative approach, using bipolar laddering questions whose responses were compared with the data recorded during the game. The findings indicate that when users are engaged in VR, they tend to test the consequences of their actions rather than maintain safety. The results also reveal that textual signal variables are not consulted when users face time pressure as a stress factor. Progress is needed in implementing new technologies for warnings and advance notifications to improve the evaluation of human behavior in virtual environments representing high-risk surroundings.


2020 ◽  
Vol 22 (Supplement_3) ◽  
pp. iii461-iii461
Author(s):  
Andrea Carai ◽  
Angela Mastronuzzi ◽  
Giovanna Stefania Colafati ◽  
Paul Voicu ◽  
Nicola Onorini ◽  
...  

Abstract Tridimensional (3D) rendering of volumetric neuroimaging is increasingly being used to assist the surgical management of brain tumors. New technologies allowing immersive virtual reality (VR) visualization of the obtained models offer the opportunity to appreciate neuroanatomical details and the spatial relationship between the tumor and normal neuroanatomical structures to a level never seen before. We present our preliminary experience with the Surgical Theatre, a commercially available 3D VR system, in 60 consecutive neurosurgical oncology cases. 3D models were developed from volumetric CT scans and standard and advanced MR sequences. The system allows the loading of six different layers at the same time, with the possibility to modulate opacity and threshold in real time. 3D VR was used during preoperative planning, allowing better definition of the surgical strategy. A tailored craniotomy and brain dissection can be simulated in advance and precisely performed in the OR by connecting the system to intraoperative neuronavigation. Smaller blood vessels are generally not included in the 3D rendering; however, real-time intraoperative threshold modulation of the 3D model assisted in their identification, improving surgical confidence and safety during the procedure. VR was also used offline, both before and after surgery, in the setting of case discussion within the neurosurgical team and during MDT discussion. Finally, 3D VR was used during informed consent, improving communication with families and young patients. 3D VR allows surgical strategies to be tailored to the individual patient, contributing to procedural safety and efficacy and to the overall improvement of neurosurgical oncology care.
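The Surgical Theatre system is commercial and its rendering pipeline is not described in the abstract; the sketch below only illustrates the general threshold-to-surface idea behind the real-time threshold modulation mentioned above, using NumPy and scikit-image as an assumed toolchain:

```python
# Illustrative sketch: extracting a renderable surface from a volumetric scan
# by thresholding. This is NOT the Surgical Theatre pipeline; it only
# demonstrates the threshold-to-mesh idea mentioned in the abstract.
import numpy as np
from skimage import measure

# Hypothetical volume (z, y, x) standing in for a CT/MR dataset.
volume = np.random.normal(loc=0.0, scale=50.0, size=(64, 64, 64))

def extract_surface(volume: np.ndarray, threshold: float):
    """Return vertices and faces of the iso-surface at the given threshold.

    Lowering the threshold typically includes fainter structures
    (e.g. smaller vessels), at the cost of more noise.
    """
    verts, faces, normals, values = measure.marching_cubes(volume, level=threshold)
    return verts, faces

verts, faces = extract_surface(volume, threshold=100.0)
print(f"{len(verts)} vertices, {len(faces)} faces at this threshold")
```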


2021 ◽  
Vol 11 (4) ◽  
pp. 145
Author(s):  
Nenad Bojcetic ◽  
Filip Valjak ◽  
Dragan Zezelj ◽  
Tomislav Martinec

The article describes an attempt to address the automated evaluation of student three-dimensional (3D) computer-aided design (CAD) models. The idea was conceived under the constraints of the COVID pandemic, driven by the problem of evaluating a large number of student 3D CAD models. The described computer solution can be implemented using any CAD application that supports customization. Test cases showed that the proposed solution was valid and could be used to evaluate many students' 3D CAD models. The computer solution can also be used to help students better understand how to create a 3D CAD model, thereby complying with the requirements of particular teachers.
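The abstract states only that the solution can be built on any CAD application that supports customization; the paper's actual checks are not reproduced here. A minimal sketch of the general idea, assuming FreeCAD's Python scripting API and a simple volume check as hypothetical examples:

```python
# Minimal sketch of rule-based, automated checking of a student 3D CAD model.
# Assumes FreeCAD's Python API as one example of a customizable CAD system;
# the paper does not specify which CAD application or which checks were used.
import FreeCAD  # available when run inside FreeCAD or with its libs on PYTHONPATH

def evaluate_model(path: str, expected_volume_mm3: float, tolerance: float = 0.02) -> dict:
    """Open a student's model and compare simple geometric properties
    against reference values supplied by the teacher."""
    doc = FreeCAD.openDocument(path)
    report = {"file": path, "checks": []}
    solids = [obj for obj in doc.Objects if hasattr(obj, "Shape") and obj.Shape.Solids]
    report["checks"].append(("has_solid_geometry", bool(solids)))
    if solids:
        total_volume = sum(obj.Shape.Volume for obj in solids)
        rel_dev = abs(total_volume - expected_volume_mm3) / expected_volume_mm3
        report["checks"].append(("volume_within_tolerance", rel_dev <= tolerance))
    FreeCAD.closeDocument(doc.Name)
    return report

# Hypothetical usage: batch-evaluate submissions against a reference volume.
# for f in submissions:
#     print(evaluate_model(f, expected_volume_mm3=125000.0))
```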


2021 ◽  
Author(s):  
Weijuan Cao ◽  
Trevor Robinson ◽  
Hua Yang ◽  
Flavien Boussuge ◽  
Andrew Colligan ◽  
...  

2021 ◽  
Vol 16 (11) ◽  
pp. C11013
Author(s):  
J.M. Santos ◽  
E. Ricardo ◽  
F.J. da Silva ◽  
T. Ribeiro ◽  
S. Heuraux ◽  
...  

Abstract The use of advanced simulation has become increasingly important in the planning, design, and assessment phases of future fusion plasma diagnostics, and in the interpretation of experimental data from existing ones. The design cycle of complex reflectometry systems, such as those being planned for next-generation machines (IDTT and DEMO), relies heavily on the results produced by synthetic diagnostics, which are used for system performance evaluation and prediction, both crucial to design decision making. These synthetic diagnostics need realistic representations of all system components to incorporate the main effects that shape their behavior. Some of the most important elements that need to be well modelled and integrated into simulations are the wave launcher structures, such as the waveguides, tapers, and antennas, as well as the vessel wall structures and the access to the plasma. The latter are of paramount importance and are often neglected in studies of this type. Modelling them faithfully is not an easy task, especially in 3D simulations. The procedure proposed here consists of using CAD models of a given machine, together with parameterizable models of the launcher, to produce a description suited for Finite-Difference Time-Domain (FDTD) 3D simulation, combining the capabilities of real-world CAD design with the power of simulation. However, CAD model geometric descriptions are incompatible with those used by standard FDTD codes: CAD software usually outputs models as a tessellated mesh, while FDTD simulators use Volumetric Pixel (VOXEL) descriptions. To solve this interface problem, we implemented a pipeline to automatically convert complex CAD models of tokamak vessel components and wave launcher structures into the VOXEL input required by REFMUL3, a full-wave 3D Maxwell FDTD parallel code. To illustrate the full procedure, a complex reflectometry synthetic diagnostic for IDTT was set up, converted, and simulated. This setup includes three antennas recessed into the vessel wall for thermal protection: one for transmission and reception, and two for reception only.
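The abstract does not detail the conversion pipeline itself; a minimal sketch of the central mesh-to-voxel step, assuming an STL export from CAD and the open-source trimesh library (the actual REFMUL3 input format is not reproduced here):

```python
# Illustrative mesh-to-voxel conversion for FDTD-style input.
# Assumes a tessellated CAD export (STL) and the trimesh library; the actual
# REFMUL3 pipeline and its input format are not described in the abstract.
import numpy as np
import trimesh

def voxelize_component(stl_path: str, cell_size: float) -> np.ndarray:
    """Return a dense boolean occupancy grid for one CAD component.

    cell_size is the voxel edge length in the same units as the CAD model;
    it should be chosen to resolve the shortest wavelength of interest.
    """
    mesh = trimesh.load(stl_path)             # tessellated surface from CAD
    voxels = mesh.voxelized(pitch=cell_size)  # surface voxelization
    voxels = voxels.fill()                    # fill the enclosed interior
    return voxels.matrix                      # dense boolean (i, j, k) array

# Hypothetical usage: voxelize a vessel wall segment at 1 mm resolution and
# mark the occupied cells as metal in the simulation grid.
# wall = voxelize_component("vessel_wall_segment.stl", cell_size=1.0)
# grid[wall] = MATERIAL_METAL
```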


Author(s):  
Meisha Rosenberg ◽  
Judy M. Vance

Successful collaborative design requires in-depth communication between experts from different disciplines. Many design decisions are made based on a shared mental model and an understanding of key features and functions before the first prototype is built. Large-Scale Immersive Computing Environments (LSICEs) provide the opportunity for teams of experts to view and interact with 3D CAD models using natural human motions to explore potential design configurations. This paper presents the results of a class exercise in which student design teams used an LSICE to examine their design ideas and make decisions during the design process. The goal of this research is to gain an understanding of (1) whether the decisions made by the students are improved by full-scale visualizations of their designs in LSICEs, (2) how the use of LSICEs affects students' communication with collaborators and clients, and (3) how the interaction methods provided in LSICEs affect the design process. The results of this research indicate that the use of LSICEs improves communication among design team members.


2009 ◽  
Vol 14 (4) ◽  
pp. 283-286 ◽  
Author(s):  
Vera Leibovici ◽  
Florella Magora ◽  
Sarale Cohen ◽  
Arieh Ingber

BACKGROUND: Virtual reality immersion (VRI), an advanced computer-generated technique, decreased subjective reports of pain in experimental and procedural medical therapies. Furthermore, VRI significantly reduced pain-related brain activity as measured by functional magnetic resonance imaging. Resemblance between the anatomical and neuroendocrine pathways of pain and pruritus may prove VRI to be a suitable adjunct for basic and clinical studies of the complex aspects of pruritus. OBJECTIVES: To compare the effects of VRI with audiovisual distraction (AVD) techniques for attenuation of pruritus in patients with atopic dermatitis and psoriasis vulgaris. METHODS: Twenty-four patients suffering from chronic pruritus – 16 due to atopic dermatitis and eight due to psoriasis vulgaris – were randomly assigned to play an interactive computer game using either a special visor or a computer screen. Pruritus intensity was self-rated before, during and 10 min after exposure using a visual analogue scale ranging from 0 to 10. The interviewer rated observed scratching on a three-point scale during each distraction program. RESULTS: Student's t tests showed a significant reduction in pruritus intensity between ratings before and during both VRI and AVD (P=0.0002 and P=0.01, respectively), whereas the difference between ratings before and after was significant only for VRI (P=0.017). Scratching was mostly absent or mild during both programs. CONCLUSIONS: VRI and AVD techniques demonstrated the ability to diminish itching sensations temporarily. Further studies on the immediate and late effects of interactive computer distraction techniques to interrupt itching episodes will open potential paths for future pruritus research.

