SURG-03. IMMERSIVE VIRTUAL REALITY APPLICATIONS IN NEUROSURGICAL ONCOLOGY

2020 ◽  
Vol 22 (Supplement_3) ◽  
pp. iii461-iii461
Author(s):  
Andrea Carai ◽  
Angela Mastronuzzi ◽  
Giovanna Stefania Colafati ◽  
Paul Voicu ◽  
Nicola Onorini ◽  
...  

Abstract Tridimensional (3D) rendering of volumetric neuroimaging is increasingly being used to assist the surgical management of brain tumors. New technologies allowing immersive virtual reality (VR) visualization of the resulting models offer the opportunity to appreciate neuroanatomical details and the spatial relationship between the tumor and normal neuroanatomical structures to a degree never seen before. We present our preliminary experience with the Surgical Theatre, a commercially available 3D VR system, in 60 consecutive neurosurgical oncology cases. 3D models were developed from volumetric CT scans and standard and advanced MR sequences. The system allows up to six different layers to be loaded at the same time, with the possibility of modulating opacity and threshold in real time. The 3D VR system was used during preoperative planning, allowing a better definition of surgical strategy. A tailored craniotomy and brain dissection can be simulated in advance and precisely performed in the OR by connecting the system to intraoperative neuronavigation. Smaller blood vessels are generally not included in the 3D rendering; however, real-time intraoperative threshold modulation of the 3D model assisted in their identification, improving surgical confidence and safety during the procedure. VR was also used offline, both before and after surgery, for case discussion within the neurosurgical team and during MDT discussion. Finally, 3D VR was used during informed consent, improving communication with families and young patients. 3D VR allows surgical strategies to be tailored to the individual patient, contributing to procedural safety and efficacy and to the overall improvement of neurosurgical oncology care.
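The real-time threshold and opacity modulation described above can be illustrated with a minimal sketch (the function and field names are illustrative, not part of the Surgical Theatre system):

```python
import numpy as np

def apply_layer(volume, threshold, opacity):
    """Mask out voxels below `threshold` and scale the rest by `opacity` (0..1).

    Lowering `threshold` brings fainter structures (e.g. small vessels)
    back into the rendering; `opacity` blends the layer with the others.
    """
    mask = volume >= threshold
    return np.where(mask, volume * opacity, 0.0)

# A tiny 2x2 "volume": raising the threshold removes the faint voxels,
# while the surviving ones are blended at half opacity.
vol = np.array([[10.0, 80.0], [120.0, 200.0]])
print(apply_layer(vol, threshold=100.0, opacity=0.5))
```

In a real pipeline each of the six layers would carry its own threshold and opacity, and the composited result would be re-rendered on every slider change.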

Sensors ◽  
2021 ◽  
Vol 21 (14) ◽  
pp. 4663
Author(s):  
Janaina Cavalcanti ◽  
Victor Valls ◽  
Manuel Contero ◽  
David Fonseca

An effective warning attracts attention, elicits knowledge, and enables compliance behavior. Game mechanics, which are directly linked to human desires, stand out as training, evaluation, and improvement tools. Immersive virtual reality (VR) facilitates training without risk to participants, evaluates the impact of an incorrect action/decision, and creates a smart training environment. The present study analyzes the user experience in a gamified virtual environment of risks using the HTC Vive head-mounted display. The game was developed in the Unreal game engine and consisted of a walk-through maze composed of evident dangers and different signaling variables while user action data were recorded. To demonstrate which aspects provide better interaction, experience, perception and memory, three different warning configurations (dynamic, static and smart) and two different levels of danger (low and high) were presented. To properly assess the impact of the experience, we conducted a survey about personality and knowledge before and after using the game. We proceeded with the qualitative approach by using questions in a bipolar laddering assessment that was compared with the recorded data during the game. The findings indicate that when users are engaged in VR, they tend to test the consequences of their actions rather than maintaining safety. The results also reveal that textual signal variables are not accessed when users are faced with the stress factor of time. Progress is needed in implementing new technologies for warnings and advance notifications to improve the evaluation of human behavior in virtual environments of high-risk surroundings.


Author(s):  
P. Clini ◽  
L. Ruggeri ◽  
R. Angeloni ◽  
M. Sasso

Thanks to their playful and educational approach, Virtual Museum systems are very effective for the communication of Cultural Heritage. Among the latest technologies, Immersive Virtual Reality is probably the most appealing and potentially the most effective for this purpose; nevertheless, owing to poor user-system interaction, caused by the incomplete maturity of a specific technology for museum applications, immersive installations are still quite uncommon in museums.

This paper explores the possibilities offered by this technology and presents a workflow that, starting from digital documentation, makes possible an interaction with archaeological finds or any other cultural heritage inside different kinds of immersive virtual reality spaces.

Two different case studies are presented: the National Archaeological Museum of Marche in Ancona and the 3D reconstruction of the Roman Forum of Fanum Fortunae. The two approaches differ not only conceptually but also in content: while the Archaeological Museum is represented in the application simply using spherical panoramas to give the perception of the third dimension, the Roman Forum is a 3D model that allows visitors to move through the virtual space as in the real one.

In both cases, the acquisition phase of the artefacts is central; artefacts are digitized with the photogrammetric technique Structure from Motion and then integrated into the immersive virtual space using a PC with an HTC Vive system that allows the user to interact with the 3D models, turning the manipulation of objects into a fun and exciting experience.

The challenge, taking advantage of the latest opportunities made available by photogrammetry and ICT, is to enrich visitors’ experience in the real museum, making possible interaction with perishable, damaged or lost objects and public access to inaccessible or no longer existing places, thereby promoting the preservation of fragile sites.
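At the core of the Structure from Motion technique mentioned above is the triangulation of 3D points from matched 2D features seen by two (or more) calibrated cameras. A minimal sketch of linear (DLT) triangulation, with assumed camera matrices rather than ones recovered from real photographs:

```python
import numpy as np

def triangulate(P1, P2, x1, x2):
    """Linear (DLT) triangulation: recover a 3D point from its projections
    x1, x2 (normalized pixel coordinates) under 3x4 camera matrices P1, P2."""
    A = np.vstack([
        x1[0] * P1[2] - P1[0],
        x1[1] * P1[2] - P1[1],
        x2[0] * P2[2] - P2[0],
        x2[1] * P2[2] - P2[1],
    ])
    _, _, Vt = np.linalg.svd(A)
    X = Vt[-1]              # null vector of A = homogeneous 3D point
    return X[:3] / X[3]     # dehomogenize

def project(P, X):
    """Project a 3D point through camera matrix P."""
    x = P @ np.append(X, 1.0)
    return x[:2] / x[2]

# Two simple cameras: identity pose, and a unit translation along x.
P1 = np.hstack([np.eye(3), np.zeros((3, 1))])
P2 = np.hstack([np.eye(3), np.array([[-1.0], [0.0], [0.0]])])
X_true = np.array([0.5, 0.2, 4.0])
x1, x2 = project(P1, X_true), project(P2, X_true)
print(triangulate(P1, P2, x1, x2))  # recovers X_true
```

A full SfM pipeline repeats this over thousands of matched features and refines the result with bundle adjustment; the triangulation step above is the geometric kernel it is built on.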


2021 ◽  
pp. e20210009
Author(s):  
Katherine McCaw ◽  
Andrew West ◽  
Colleen Duncan ◽  
Danielle Frey ◽  
Felix Duerr

The COVID-19 pandemic has catalyzed the use of novel teaching modalities to enhance the provision of remote veterinary education. In this study, we describe the use of immersive virtual reality (iVR) as a teaching aid for veterinary medicine students during their orthopedics clinical rotation. Student sentiments were assessed using voluntary electronic surveys taken by veterinary students before and after the rotation. The most noteworthy benefits students reported were improved engagement with the course content, information retention, radiographic interpretation, and clinical reasoning skills. Obstacles encountered during the initial stages of the program included financial and temporal investment in equipment and content development, technical troubleshooting, and motion sickness. Though it is unlikely that iVR will ever fully replace hands-on learning experiences, it presents an educational opportunity to supplement traditional learning methods, motivate students, and fill information gaps. As iVR technology continues to evolve and improve, potential applications in the veterinary curriculum grow, making the modality’s use progressively more advantageous. Although this study describes its application in an orthopedic setting, the versatility of the iVR modality lends the potential for it to be implemented in a number of clinical and didactic settings.


2018 ◽  
Vol 9 (6) ◽  
pp. 2825 ◽  
Author(s):  
Mark Draelos ◽  
Brenton Keller ◽  
Christian Viehland ◽  
Oscar M. Carrasco-Zevallos ◽  
Anthony Kuo ◽  
...  

Author(s):  
Gabriele Montecchiari ◽  
Gabriele Bulian ◽  
Paolo Gallina

The analysis of the ship layout from the point of view of safe and orderly evacuation represents an important step in ship design, which can be carried out through agent-based evacuation simulation tools, typically run in batch mode. Introducing the possibility for humans to interactively participate in a simulated evacuation process together with computer-controlled agents opens a series of interesting possibilities for design, research and development. To this aim, this article presents the development of a validated agent-based evacuation simulation tool which allows real-time human participation through immersive virtual reality. The main characteristics of the underlying social-force-based modelling technique are described. The tool is verified and validated against International Maritime Organization test cases, experimental data and FDS+Evac simulations. An approach for supporting real-time human participation is then presented. An initial experiment embedding immersive virtual reality human participation is described, together with comparisons between human-controlled avatars and computer-controlled agents. Results from this initial testing are encouraging for pursuing the use of virtual reality as a tool to obtain information on human behaviour during evacuation.
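The social-force-based modelling the abstract refers to can be sketched with one integration step of a much-simplified Helbing-style model (the parameter values here are purely illustrative, not those of the validated tool, and wall forces are omitted):

```python
import math

def social_force_step(pos, vel, goal, others, v0=1.4, tau=0.5,
                      A=2.0, B=0.3, dt=0.05):
    """One Euler step of a simplified social-force model: the agent is
    driven toward `goal` at desired speed v0 and repelled by other agents."""
    # Driving force: relax the velocity toward v0 in the goal direction.
    dx, dy = goal[0] - pos[0], goal[1] - pos[1]
    dist = math.hypot(dx, dy) or 1e-9
    ex, ey = dx / dist, dy / dist
    fx = (v0 * ex - vel[0]) / tau
    fy = (v0 * ey - vel[1]) / tau
    # Exponential repulsion from the other agents.
    for ox, oy in others:
        rx, ry = pos[0] - ox, pos[1] - oy
        r = math.hypot(rx, ry) or 1e-9
        mag = A * math.exp(-r / B)
        fx += mag * rx / r
        fy += mag * ry / r
    vel = (vel[0] + fx * dt, vel[1] + fy * dt)
    pos = (pos[0] + vel[0] * dt, pos[1] + vel[1] * dt)
    return pos, vel

# An agent walks toward an exit at (10, 0), skirting a bystander.
pos, vel = (0.0, 0.0), (0.0, 0.0)
for _ in range(100):
    pos, vel = social_force_step(pos, vel, goal=(10.0, 0.0), others=[(0.5, 0.1)])
print(pos)
```

In the tool described above, one such agent can be replaced by a human-controlled avatar whose position is driven by the VR headset instead of the force model, while the remaining agents continue to follow it.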


Author(s):  
Kevin Lesniak ◽  
Conrad S. Tucker ◽  
Sven Bilen ◽  
Janis Terpenny ◽  
Chimay Anumba

Immersive virtual reality systems have the potential to transform the manner in which designers create prototypes and collaborate in teams. Using technologies such as the Oculus Rift or the HTC Vive, a designer can attain a sense of “presence” and “immersion” typically not experienced with traditional CAD-based platforms. However, one of the fundamental challenges of creating a high-quality immersive virtual reality experience is creating the immersive virtual reality environment itself. Typically, designers spend a considerable amount of time manually designing virtual models that replicate physical, real-world artifacts. While it is possible to import standard 3D models into these immersive virtual reality environments, such models are typically generic in nature and do not represent the designer’s intent. To mitigate these challenges, the authors of this work propose the real-time translation of physical objects into an immersive virtual reality environment using readily available RGB-D sensing systems and standard networking connections. The emergence of commercial, off-the-shelf RGB-D sensing systems such as the Microsoft Kinect has enabled the rapid 3D reconstruction of physical environments. The authors present a methodology that employs 3D mesh reconstruction algorithms and real-time rendering techniques to capture physical objects in the real world and represent their 3D reconstructions in an immersive virtual reality environment with which the user can then interact. A case study involving a commodity RGB-D sensor and multiple computers connected through standard TCP internet connections is presented to demonstrate the viability of the proposed methodology.
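Streaming a reconstructed mesh between computers, as in the case study above, requires serializing vertices and triangle indices into a byte stream for the TCP link. A minimal sketch of one possible wire format (hypothetical; the paper does not specify the authors' protocol):

```python
import struct

def pack_mesh(vertices, triangles):
    """Serialize a mesh as: vertex count and triangle count (uint32 each),
    then float32 (x, y, z) per vertex and uint32 index triples per triangle,
    all little-endian."""
    buf = struct.pack("<II", len(vertices), len(triangles))
    for x, y, z in vertices:
        buf += struct.pack("<fff", x, y, z)
    for a, b, c in triangles:
        buf += struct.pack("<III", a, b, c)
    return buf

def unpack_mesh(buf):
    """Inverse of pack_mesh: rebuild vertex and triangle lists from bytes."""
    nv, nt = struct.unpack_from("<II", buf, 0)
    off = 8
    vertices = [struct.unpack_from("<fff", buf, off + 12 * i) for i in range(nv)]
    off += 12 * nv
    triangles = [struct.unpack_from("<III", buf, off + 12 * i) for i in range(nt)]
    return vertices, triangles

# A single triangle round-trips through the byte stream unchanged.
verts = [(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0)]
tris = [(0, 1, 2)]
data = pack_mesh(verts, tris)
print(unpack_mesh(data) == (verts, tris))
```

For real-time use the same framing would be written to a TCP socket per reconstructed frame, with the receiver updating the VR scene as each mesh arrives.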


2021 ◽  
Vol 17 (3) ◽  
pp. 415-431 ◽  
Author(s):  
Martina Paatela-Nieminen

This article explores digital material/ism by examining student teachers’ experiences, processes and products with fully immersive virtual reality (VR) as part of visual art education. The students created and painted a virtual world, given the name Gretan puutarha (‘Greta’s Garden’), using the Google application Tilt Brush. They also applied photogrammetry techniques to scan 3D objects from the real world in order to create 3D models for their VR world. Additionally, they imported 2D photographs and drawings and applied animated effects to construct their VR world digitally, thereby remixing elements from real life and fantasy. The students were asked open-ended questions to find out how they created art virtually, and the results were analysed using Burdea’s VR concepts of immersion, interaction and imagination. Digital material was created intersubjectively and intermedially, while also remixing the real and the imaginary. Various webs of meaning were created, both intertextual and rhizomatic in nature.


Author(s):  
Christian Boedecker ◽  
Florentine Huettl ◽  
Patrick Saalfeld ◽  
Markus Paschold ◽  
Werner Kneist ◽  
...  

Abstract Purpose Three-dimensional (3D) surgical planning is widely accepted in liver surgery. Currently, the 3D reconstructions are usually presented as 3D PDF data on regular monitors. 3D-printed liver models are sometimes used for education and planning. Methods We developed an immersive virtual reality (VR) application that enables the presentation of preoperative 3D models. The 3D reconstructions are exported as STL files and easily imported into the application, which creates the virtual model automatically. The presentation is possible in “OpenVR”-ready VR headsets. To interact with the 3D liver model, VR controllers are used. Scaling is possible, as well as changing the opacity from invisible through transparent to fully opaque. In addition, the surgeon can draw potential resection lines on the surface of the liver. All these functions can be used in single- or multi-user mode. Results Five highly experienced HPB surgeons of our department evaluated the VR application after using it for the very first time and considered it helpful, giving a “System Usability Scale” (SUS) score of 76.6. The subitem “necessary learning effort” in particular showed that the application is easy to use. Conclusion We introduce an immersive, interactive presentation of medical volume data for preoperative 3D liver surgery planning. The application is easy to use and may have advantages over 3D PDF and 3D print in preoperative liver surgery planning. Prospective trials are needed to evaluate the optimal presentation mode of 3D liver models.
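The SUS score reported above follows the standard scoring rule for the ten-item questionnaire, which can be sketched as:

```python
def sus_score(responses):
    """System Usability Scale: 10 items rated 1-5. Odd-numbered items
    contribute (rating - 1), even-numbered items (5 - rating); the sum
    is scaled by 2.5 to give a 0-100 score."""
    assert len(responses) == 10
    total = sum(r - 1 if i % 2 == 0 else 5 - r   # i is 0-based, so even i = odd item
                for i, r in enumerate(responses))
    return total * 2.5

# All-neutral answers (all 3s) land at the midpoint of the scale.
print(sus_score([3] * 10))  # 50.0
```

A score of 76.6 therefore sits well above the commonly cited "average usability" benchmark of 68.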

