The Holographic Human for Surgical Navigation using Microsoft HoloLens

10.29007/wjjx ◽  
2018 ◽  
Author(s):  
Tomoki Itamiya ◽  
Toshinori Iwai ◽  
Tsuyoshi Kaneko

In surgical navigation, it is very important to know the position of a surgical instrument in a patient's body accurately. Transparent smart glasses are very useful for surgical navigation because the surgeon does not need to move his/her line of sight away from the operative field. We propose a new application software development method that can show a stereoscopic view of highly precise 3D-CG medical models and surgical instruments on transparent smart glasses for surgical navigation. We used Mixed Reality (MR), a concept that goes beyond Augmented Reality (AR), by using Microsoft HoloLens. In Mixed Reality, persons, places, and objects from the physical and virtual worlds merge together in a blended environment. Unlike competing devices, HoloLens can recognize its surroundings with front-facing cameras and 3D depth sensors; external markers and sensors are not required. Once a 3D-CG medical model is placed in the blended environment, it stays fixed in place and does not move on its own. Therefore, we can see a stereoscopic view of a precise medical model projected into our surroundings, such as a holographic human. A holographic human appears as if he/she were really there, a uniquely immersive experience. A holographic human can not only be seen but also moved by the user's hand gestures, so interactive manipulation is possible. A holographic human and a 3D-CG surgical instrument can be displayed simultaneously in the blended environment, and the movement of the 3D-CG surgical instrument can be linked to the actual surgical instrument in the operating room. In the operating room, the holographic human is superimposed on the actual patient's position. Because the holographic human and the surgical instruments overlap, their positional relationship is clear, which is very useful for surgical navigation. Multiple people can see one holographic human at the same time using multiple HoloLens devices.
We developed the holographic human application software for surgical navigation using Unity and Vuforia, a development environment and a software library, respectively. A holographic view of a 3D-CG medical model made from an actual patient's CT/MRI image data is possible using our application software development method. A user can build the application software in only five minutes by preparing a 3D-CG medical model file, for instance in STL format. Therefore, surgeons, dentists, and clinical staff can create holographic human content easily by themselves. As a result, the method can be used daily in routine medical treatment and education.
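The pipeline above starts from a patient-specific model exported as an STL file. As a rough illustration of what such tooling consumes, the sketch below parses the binary STL layout (80-byte header, a uint32 triangle count, then 50 bytes per triangle) in pure Python; this is a hypothetical reader for explanation only, not the authors' Unity/Vuforia implementation.

```python
import struct

def read_binary_stl(data: bytes):
    """Parse a binary STL buffer into a list of triangles.

    Each triangle is (normal, v0, v1, v2), each a 3-tuple of floats.
    Binary STL layout: 80-byte header, little-endian uint32 triangle
    count, then 50 bytes per triangle (12 float32 + uint16 attribute).
    """
    n_triangles = struct.unpack_from("<I", data, 80)[0]
    triangles = []
    offset = 84
    for _ in range(n_triangles):
        floats = struct.unpack_from("<12f", data, offset)
        normal = floats[0:3]
        v0, v1, v2 = floats[3:6], floats[6:9], floats[9:12]
        triangles.append((normal, v0, v1, v2))
        offset += 50  # 48 bytes of floats + 2-byte attribute count
    return triangles
```

In practice a game engine's asset importer would do this step; the point is only that an STL mesh is a flat list of triangles, which is why patient models are easy to prepare.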

2021 ◽  
Vol 108 (Supplement_2) ◽  
Author(s):  
D Vijayan ◽  
K Malik ◽  
K Natarajan ◽  
J Berland ◽  
D Morton ◽  
...  

Abstract Aim The COVID-19 pandemic has accelerated the need for staff to work remotely. Our aim was to demonstrate how a next-generation digital platform could be used to create a virtual MDT ecosystem in order to manipulate holographic 2D and 3D images in real time. Method This study involved setting up a mock virtual MDT using de-identified DICOM files from a patient who had been treated for colorectal cancer and subsequently found to have a liver metastasis. The image file was segmented and converted into 2D and 3D formats for visualisation within Microsoft HoloLens 2® (smart glasses) using Holocare Solutions® (Mixed Reality software). Results A seamless cross-border pipeline was developed that involved "clinician" training, DICOM segmentation, and virtual connection. We successfully performed a virtual MDT in which participants were able to visualise and manipulate a virtual 3D organ in real time. The digital network remotely connected sites in England and Norway. The streaming quality was stable and HIPAA compliant. Each participant could observe the others as "avatars" interacting with images within the virtual ecosystem, allowing image characteristics to be highlighted. Conclusions We successfully conducted a virtual MDT using novel hardware and software. Our intention is to conduct a large-scale study to assess the platform's effectiveness in "real-world" MDTs.
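The segmentation step in the pipeline above turns DICOM intensity data into a binary mask from which a 3D surface can be built. The abstract does not say which method Holocare Solutions® uses; as a generic, much-simplified illustration, the sketch below applies intensity thresholding to a small 3D volume represented as nested lists.

```python
def threshold_segment(volume, lo, hi):
    """Binary-mask segmentation of a 3D intensity volume (nested
    lists, slice -> row -> voxel): voxels with intensity in [lo, hi]
    become 1, all others 0. Real pipelines work on calibrated
    Hounsfield units and add smoothing; this is illustrative only.
    """
    return [[[1 if lo <= v <= hi else 0 for v in row]
             for row in slc]
            for slc in volume]
```

A surface-extraction step (e.g. marching cubes) would then convert such a mask into the 3D organ model shown in the headset.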


2021 ◽  
Vol 7 (2) ◽  
pp. 61-66
Author(s):  
Jozef Husár ◽  
Lucia Knapčíková

The presented article addresses the combination of mixed reality with advanced robotics and manipulators. This is a current trend, synonymous with the term Industry 5.0, in which human-machine interaction is a key element; here, that element is collaborative robots working in cooperation with intelligent smart glasses. In the article, we first define the basic elements of the investigated system. We then show how to operate them to control a collaborative robot online and offline using mixed reality, and describe the software and hardware sides of a specific design. In the practical part, we provide illustrative examples of a robotic workplace displayed using the Microsoft HoloLens 2 smart glasses. In conclusion, we can say that current Industry 4.0 trends significantly affect and accelerate activities in manufacturing companies. It is therefore necessary to prepare for the arrival of Industry 5.0, which will focus primarily on collaborative robotics.


Author(s):  
Andrea Teatini ◽  
Rahul P. Kumar ◽  
Ole Jakob Elle ◽  
Ola Wiig

Abstract Purpose This study presents a novel surgical navigation tool developed in a mixed reality environment for orthopaedic surgery. Joint and skeletal deformities affect all age groups and greatly reduce the range of motion of the joints. These deformities are notoriously difficult to diagnose and to correct through surgery. Method We have developed a surgical tool which integrates surgical instrument tracking and augmented reality through a head-mounted display. This allows the surgeon to visualise bones with the illusion of possessing "X-ray" vision. The studies presented below aim to assess the accuracy of the surgical navigation tool in tracking a location at the tip of the surgical instrument in holographic space. Results Results show that the average accuracy provided by the navigation tool is around 8 mm, and qualitative assessment by the orthopaedic surgeons provided positive feedback in terms of its capabilities for diagnostic use. Conclusions More improvements are necessary for the navigation tool to be accurate enough for surgical applications; however, this new tool has the potential to improve diagnostic accuracy, allow for safer and more precise surgeries, and provide better learning conditions for orthopaedic surgeons in training.
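The accuracy figure above is a distance between where the system believes the instrument tip is in holographic space and where it actually is. As a minimal sketch of how such a tip-localization error could be computed from paired 3D points (not the authors' evaluation code), one can take the Euclidean distance per measurement and average over a session:

```python
import math

def tip_error_mm(tracked, reference):
    """Euclidean distance (in mm, assuming mm coordinates) between a
    tracked instrument-tip position and its known reference position,
    each given as a 3D point (x, y, z)."""
    return math.dist(tracked, reference)

def mean_error_mm(pairs):
    """Average tip error over a series of (tracked, reference) pairs."""
    return sum(tip_error_mm(t, r) for t, r in pairs) / len(pairs)
```

An "around 8 mm" result would mean `mean_error_mm` over the trial measurements came out near 8 in millimetre coordinates.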


2020 ◽  
Vol 15 (12) ◽  
pp. 2027-2039
Author(s):  
Javier A. Luzon ◽  
Bojan V. Stimec ◽  
Arne O. Bakka ◽  
Bjørn Edwin ◽  
Dejan Ignjatovic

Abstract Purpose Mixed reality (MR) is being evaluated as a visual tool for surgical navigation. Current literature presents unclear results on intraoperative accuracy using the Microsoft HoloLens 1®. This study aims to assess the impact of the surgeon's sightline in an inside-out, marker-based MR navigation system for open surgery. Methods Surgeons at Akershus University Hospital tested this system. A custom-made phantom was used, containing 18 wire target crosses within its inner walls. A CT scan was obtained in order to segment all wire targets into a single 3D model (hologram). An in-house software application (CTrue), developed for the Microsoft HoloLens 1, uploaded 3D models and automatically registered the 3D model with the phantom. Based on the surgeon's sightline while registering and targeting (free sightline, F, or a strictly perpendicular sightline, P), 4 scenarios were defined (FF, PF, FP, PP). Target error distance (TED) was obtained along three working axes (X, Y, Z). Results Six surgeons (5 males, age 29–62) were enrolled. A total of 864 measurements were collected in the 4 scenarios, twice. Scenario PP showed the smallest TED in the X, Y, and Z axes (mean ± SD: 2.98 ± 1.33 mm, 2.28 ± 1.45 mm, and 2.78 ± 1.91 mm, respectively). Scenario FF showed the largest TED in the X, Y, and Z axes (mean ± SD: 10.03 ± 3.19 mm, 6.36 ± 3.36 mm, and 16.11 ± 8.91 mm, respectively). Multiple comparison tests, grouped by scenario and axis, showed that the majority of scenario comparisons had significantly different TED values (p < 0.05). The Y-axis always presented the smallest TED regardless of the scenario tested. Conclusion A strictly perpendicular working sightline in relation to the 3D model achieves the best accuracy results. Shortcomings in this technology as an intraoperative visual cue can be overcome by sightline correction. Incidentally, this is the preferred working angle for open surgery.
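The per-axis mean ± SD figures above summarise absolute target errors along each working axis. As an illustrative sketch (synthetic data, not the study's analysis scripts), per-axis TED statistics can be computed from paired measured/target 3D points with the standard library's `statistics` module:

```python
from statistics import mean, stdev

def per_axis_ted(measured, target):
    """Per-axis target error distances |measured - target| for paired
    3D points; returns {'X': [...], 'Y': [...], 'Z': [...]}."""
    axes = {"X": [], "Y": [], "Z": []}
    for m, t in zip(measured, target):
        for name, mi, ti in zip(axes, m, t):
            axes[name].append(abs(mi - ti))
    return axes

def summarize(errors):
    """Mean and sample SD per axis, matching a 'mean ± SD' report."""
    return {a: (mean(v), stdev(v)) for a, v in errors.items()}
```

Grouping errors by axis like this is what makes an observation such as "the Y-axis always presented the smallest TED" visible in the summary.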


2021 ◽  
Vol 1 ◽  
pp. 2107-2116
Author(s):  
Agnese Brunzini ◽  
Alessandra Papetti ◽  
Michele Germani ◽  
Erica Adrario

Abstract In the medical education field, the use of highly sophisticated simulators and extended reality (XR) simulations allows for training complex procedures and acquiring new knowledge and attitudes. XR is considered useful for the enhancement of healthcare education; however, several issues need further research. The main aim of this study is to define a comprehensive method to design and optimize any kind of simulator and simulation, integrating all the relevant elements concerning scenario design and prototype development. A complete framework for the design of any kind of advanced clinical simulation is proposed, and it has been applied to realize a mixed reality (MR) prototype for the simulation of rachicentesis (lumbar puncture). The purpose of the MR application is to immerse the trainee in a more realistic environment and to put him/her under pressure during the simulation, as in real practice. The application was tested with two different devices: the Vox Gear Plus headset for smartphones and the Microsoft HoloLens. Eighteen students in the 6th year of the Medicine and Surgery course were enrolled in the study. Results show the comparison of user experience with the two devices and the simulation performance using the HoloLens.

