HoloLens-Based Vascular Localization System: Precision Evaluation Study With a Three-Dimensional Printed Model

10.2196/16852 ◽  
2020 ◽  
Vol 22 (4) ◽  
pp. e16852
Author(s):  
Taoran Jiang ◽  
Dewang Yu ◽  
Yuqi Wang ◽  
Tao Zan ◽  
Shuyi Wang ◽  
...  

Background Vascular localization is crucial for perforator flap transfer. Augmented reality offers a novel method to seamlessly combine real information with virtual objects created from computed tomographic angiography, helping the surgeon “see through” the skin and precisely localize the perforator. The head-mounted display augmented reality system HoloLens (Microsoft) could facilitate augmented reality–based perforator localization for a more convenient and safer procedure. Objective The aim of this study was to evaluate the precision of the HoloLens-based vascular localization system, the most important performance indicator of a new localization system. Methods The precision of the HoloLens-based vascular localization system was tested in a simulated operating room under different conditions with a three-dimensional (3D) printed model. The coordinates of five pairs of points on the vascular map that could be easily identified on both the 3D printed model and the virtual model were detected by a probe, and the distance between corresponding points was calculated as the navigation error. Results The mean errors were determined under different conditions, with a minimum of 1.35 mm (SD 0.43) and a maximum of 3.18 mm (SD 1.32), both within the clinically acceptable range. There were no significant differences in the errors obtained under different viewing angles, different light intensities, or different states (static or in motion). However, the error was larger when tested with light than without it. Conclusions This precision evaluation demonstrated that the HoloLens system can precisely localize the perforator and potentially help the surgeon accomplish the operation. The authors recommend using HoloLens-based surgical navigation without light.
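The navigation error described in the Methods — the distance between corresponding points detected on the 3D printed model and the virtual model — is a per-pair Euclidean distance. A minimal sketch of that computation follows; the probe coordinates below are hypothetical, purely for illustration:

```python
import math

def navigation_errors(real_points, virtual_points):
    """Euclidean distance (mm) between each pair of corresponding points."""
    return [math.dist(p, q) for p, q in zip(real_points, virtual_points)]

# Hypothetical probe readings for five point pairs (x, y, z in mm)
real = [(10.0, 20.0, 5.0), (15.2, 22.1, 6.3), (30.5, 18.0, 4.2),
        (42.0, 25.5, 7.1), (55.3, 30.2, 5.8)]
virtual = [(10.9, 20.5, 5.4), (16.0, 22.9, 6.8), (31.2, 19.1, 4.9),
           (43.1, 26.2, 7.9), (56.4, 31.0, 6.5)]

errors = navigation_errors(real, virtual)
mean_error = sum(errors) / len(errors)
```

Averaging these per-pair distances under each test condition yields the mean errors the study reports (e.g., 1.35 mm to 3.18 mm).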


Sensors ◽  
2021 ◽  
Vol 21 (4) ◽  
pp. 1370
Author(s):  
Rafael Moreta-Martinez ◽  
Alicia Pose-Díez-de-la-Lastra ◽  
José Antonio Calvo-Haro ◽  
Lydia Mediavilla-Santos ◽  
Rubén Pérez-Mañanes ◽  
...  

During the last decade, orthopedic oncology has benefited from computerized medical imaging, which reduces human dependency and improves accuracy and clinical outcomes. However, traditional surgical navigation systems do not always adapt well to this kind of intervention. Augmented reality (AR) and three-dimensional (3D) printing are technologies recently introduced into the surgical environment with promising results. Here we present an innovative solution combining 3D printing and AR in orthopedic oncological surgery. A new surgical workflow is proposed, including 3D printed models and a novel AR-based smartphone application (app). This app can display the patient’s anatomy and the tumor’s location. A 3D-printed reference marker, designed to fit a unique position on the affected bone tissue, enables automatic registration. The system has been evaluated in terms of visualization accuracy and usability throughout the surgical workflow. Experiments on six realistic phantoms yielded a visualization error below 3 mm. The AR system was tested in two clinical cases during surgical planning, patient communication, and surgical intervention. These results, and the positive feedback obtained from surgeons and patients, suggest that the combination of AR and 3D printing can improve efficacy, accuracy, and patients’ experience.


Sensors ◽  
2021 ◽  
Vol 21 (23) ◽  
pp. 7824
Author(s):  
Mónica García-Sevilla ◽  
Rafael Moreta-Martinez ◽  
David García-Mato ◽  
Alicia Pose-Diez-de-la-Lastra ◽  
Rubén Pérez-Mañanes ◽  
...  

Patient-specific instruments (PSIs) have become a valuable tool for osteotomy guidance in complex surgical scenarios such as pelvic tumor resection. They provide accuracy similar to surgical navigation systems but are generally more convenient and faster. However, their correct placement can be challenging in some anatomical regions and cannot be verified objectively during the intervention. Incorrect installation can result in large deviations from the planned osteotomy, increasing the risk of positive resection margins. In this work, we propose using augmented reality (AR) to guide and verify PSI placement. We designed an experiment to assess the accuracy provided by the system using a smartphone and the HoloLens 2, and compared the results with the conventional freehand method. The results showed significant differences: AR guidance prevented large osteotomy deviations, reducing the maximal deviation from 54.03 mm with freehand placement to less than 5 mm with AR guidance. The experiment was performed on two versions of a plastic three-dimensional (3D) printed phantom, one including a silicone layer to simulate soft tissue for greater realism. We also studied how differences in the shape and location of PSIs affect their accuracy, concluding that smaller PSIs with a homogeneous target surface are more prone to error. Our study presents promising results that demonstrate AR’s potential to overcome the present limitations of PSIs conveniently and effectively.


2015 ◽  
Vol 77 (27) ◽  
Author(s):  
Chew Sze Soon ◽  
Raja Ariffin Raja Ghazilla ◽  
Yap Hwa Jen ◽  
Pai Yun Suen

Human factors studies such as ergonomic evaluation are becoming increasingly important in the engineering, design, and support of new advanced products, particularly in the automotive industry. Designing in-car devices that can be operated within appropriate safety bounds is an ergonomic challenge. Several tools and methods have been developed for ergonomic evaluation. However, several factors complicate such evaluations, including the subjectivity of comfort, the high cost of mock-up systems and computerized tools, and the difficulty of reconfiguring adjustments. The proposed system allows the user or engineer to view a three-dimensional visual model through an augmented reality head-mounted display (HMD), reducing the number of physical prototype components required. The user or engineer can then position interior components to identify the most comfortable ergonomic reaching zone.


2021 ◽  
Vol 51 (2) ◽  
pp. E20
Author(s):  
Gorkem Yavas ◽  
Kadri Emre Caliskan ◽  
Mehmet Sedat Cagli

OBJECTIVE The aim of this study was to assess the precision and feasibility of 3D-printed marker–based augmented reality (AR) neurosurgical navigation and its intraoperative use compared with optical tracking neuronavigation systems (OTNSs). METHODS Three-dimensional–printed markers for CT, MRI, and intraoperative use were applied with mobile devices using an AR light detection and ranging (LIDAR) camera. The 3D segmentations of intracranial tumors were created from CT and MR images, and preoperative registration of the marker and pathology was performed. A patient-specific, surgeon-facilitated mobile application was developed, and a mobile device camera was used for neuronavigation with high accuracy, ease, and cost-effectiveness. After accuracy values were preliminarily assessed, the technique was used intraoperatively in 8 patients. RESULTS The mobile device LIDAR camera successfully overlaid images of virtual tumor segmentations according to the position of a 3D-printed marker. The measured targeting error ranged from 0.5 to 3.5 mm (mean 1.70 ± 1.02 mm, median 1.58 mm). The mean preoperative preparation time was 35.7 ± 5.56 minutes, longer than that for routine OTNSs, but the time required for preoperative registration and placement of the intraoperative marker was very brief compared with other neurosurgical navigation systems (mean 1.02 ± 0.3 minutes). CONCLUSIONS The 3D-printed marker–based AR neuronavigation system was a clinically feasible, highly precise, low-cost, and easy-to-use navigation technique. Three-dimensional segmentations of intracranial tumors were targeted on the brain and clearly visualized from the skin incision to the end of surgery.


2019 ◽  
Vol 9 (6) ◽  
pp. 1182 ◽  
Author(s):  
Hongyue Gao ◽  
Fan Xu ◽  
Jicheng Liu ◽  
Zehang Dai ◽  
Wen Zhou ◽  
...  

In this paper, we propose a holographic three-dimensional (3D) head-mounted display based on 4K spatial light modulators (SLMs). This work aims to overcome the limitations of stereoscopic 3D virtual reality and augmented reality head-mounted displays. We build and compare two systems using 2K and 4K SLMs with pixel pitches of 8.1 μm and 3.74 μm, respectively. One is a monocular system for each eye; the other is a binocular system using two tiled SLMs for two eyes. The viewing angle of the holographic head-mounted 3D display is enlarged from 3.8° to 16.4° by SLM tiling, which demonstrates the potential of true 3D displays in virtual reality and augmented reality applications.
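The viewing angles quoted above follow from the grating equation for a pixelated SLM: the maximum diffraction half-angle θ satisfies sin θ = λ/(2p) for pixel pitch p, and tiling SLMs side by side multiplies the angular coverage. A quick sanity check of the reported figures, assuming a 532 nm (green) illumination wavelength, which the abstract does not state:

```python
import math

def full_viewing_angle_deg(pixel_pitch_um, wavelength_um=0.532, tiles=1):
    """Full diffraction viewing angle (degrees) of a hologram on an SLM.

    sin(theta_half) = wavelength / (2 * pixel_pitch); tiling n SLMs
    side by side multiplies the angular coverage by n.
    """
    half = math.degrees(math.asin(wavelength_um / (2 * pixel_pitch_um)))
    return 2 * half * tiles

angle_2k = full_viewing_angle_deg(8.1)                  # ~3.8 degrees
angle_4k_tiled = full_viewing_angle_deg(3.74, tiles=2)  # ~16.3 degrees
```

With these assumed parameters the model reproduces the paper's 3.8° (single 2K SLM, 8.1 μm pitch) and 16.4° (two tiled 4K SLMs, 3.74 μm pitch) figures, showing that the enlargement comes from the finer pixel pitch combined with tiling.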


2016 ◽  
Vol 13 (3) ◽  
pp. 186-201 ◽  
Author(s):  
Junya Kawai ◽  
Hiroyuki Mitsuhara ◽  
Masami Shishibori

Purpose Evacuation drills should be more realistic and interactive. Focusing on situational and audio-visual realities and scenario-based interactivity, the authors have developed a game-based evacuation drill (GBED) system that presents augmented reality (AR) materials on tablet computers. The purpose of the current research is to improve the visual reality (AR materials) of this GBED system. Design/methodology/approach The authors' approach is to develop a new GBED system that superimposes digital objects [e.g. three-dimensional computer graphics (3DCG) elements] onto real-time vision using a marker-based AR library, a binocular opaque head-mounted display (HMD) and other readily available technologies. Findings The findings from a trial experiment are that the new GBED system improves visual reality and is appropriate for disaster education, although a few problems remain for practical use. Research limitations/implications When using the GBED system, participants (i.e. HMD wearers) can suffer from 3D sickness and have difficulty moving; these are important safety problems in HMD-based systems. Social implications The combination of AR and HMDs for GBEDs (i.e. integrating virtual and real worlds) will raise questions about its merits (pros and cons). Originality/value The originality of the research lies in applying the combination of AR and an HMD to a GBED, which has previously been realized primarily as simulation games in virtual worlds. The authors believe that their research has the potential to expand disaster education.


2018 ◽  
Vol 15 (5) ◽  
pp. 551-556 ◽  
Author(s):  
Keisuke Maruyama ◽  
Eiju Watanabe ◽  
Taichi Kin ◽  
Kuniaki Saito ◽  
Atsushi Kumakiri ◽  
...  

Abstract BACKGROUND Wearable devices with heads-up displays or smart glasses can overlay images onto the sight of the wearer. This technology has never before been applied to surgical navigation. OBJECTIVE To assess the applicability and accuracy of smart glasses for augmented reality (AR)-based neurosurgical navigation. METHODS Smart glasses were applied to AR-based neurosurgical navigation. Three-dimensional computer graphics were created from preoperative magnetic resonance images and visualized in see-through smart glasses. Optical markers were attached to the smart glasses and the patient's head for accurate navigation. Two motion capture cameras were used for registration and continuous monitoring of the location of the smart glasses relative to the patient's head. After the accuracy was assessed with a phantom, the technique was applied in 2 patients with brain tumors located at the brain surface. RESULTS A stereoscopic view by image overlay through the smart glasses was successful in the phantom and in both patients. Hands-free neuronavigation inside the operative field was available from any angle and distance. The targeting error in the phantom, measured at 75 points, ranged from 0.2 to 8.1 mm (mean 3.1 ± 1.9 mm, median 2.7 mm). The intraoperative targeting error between the visualized and real locations in the 2 patients (measured at 40 points) ranged from 0.6 to 4.9 mm (mean 2.1 ± 1.1 mm, median 1.8 mm). CONCLUSION Smart glasses enabled AR-based neurosurgical navigation in a hands-free fashion. Stereoscopic computer graphics of targeted brain tumors corresponding to the surgical field were clearly visualized during surgery.

