surgical navigation system
Recently Published Documents


TOTAL DOCUMENTS

166
(FIVE YEARS 16)

H-INDEX

13
(FIVE YEARS 0)

Nanoscale ◽  
2022 ◽  
Author(s):  
Xiaojie Zhang ◽  
Changsheng Zhou ◽  
Fanghua Wu ◽  
Chang Gao ◽  
Qianqian Liu ◽  
...  

Abstract Long-standing, unsolved health problems arising from pre-, intra-, and postoperative complications and from thermal-ablation complications threaten liver cancer patients. To reduce these threats, we propose a multimodal-imaging-guided surgical navigation system...



2021 ◽  
Vol 70 (12) ◽  
pp. 1898-1905
Author(s):  
Min-Hyuk Choi ◽  
Won-Jin Yi ◽  
Si-Eun Choi ◽  
Se-Ryong Kang ◽  
Ji-Yong Yoo ◽  
...  


2021 ◽  
Author(s):  
Sunghwan Lim ◽  
Junhyoung Ha ◽  
Seongmin Yoon ◽  
Young Tae Sohn ◽  
Joonho Seo ◽  
...  


2021 ◽  
Author(s):  
Shin-Yan Chiou ◽  
Zhi-Yue Zhang ◽  
Hao-Li Liu ◽  
Jiun-Lin Yan ◽  
Kuo-Chen Wei ◽  
...  

Abstract Augmented reality (AR) surgery systems play an important role in assisting physicians during operations. Applying such systems to brain neurosurgery, however, is challenging: beyond using AR to display the 3D position of the surgical target in real time, the system must also display the scalpel entry point and scalpel orientation and superimpose them accurately on the patient. This paper proposes a mixed reality surgical navigation system that accurately superimposes the surgical target position, scalpel entry point, and scalpel direction on a patient's head and displays them on a tablet, providing a visual and intuitive aid for brain neurosurgery. We integrated mixed reality technology into a current neurosurgery navigation system. We first independently tested the accuracy of the optical measurement system (NDI Polaris Vicra), then designed functions with which a physician can quickly mark the surgical target position and choose an entry point, while a tablet displays the superimposed images of the target, entry point, and scalpel and verifies the scalpel orientation. We then used the patient's CT DICOM data to create a phantom and its AR model, imported the AR model into the app, and installed and ran the app on the tablet. In the preoperative phase, the technician first superimposed the AR images of the head and the scalpel in 5-7 minutes; the physician then marked the target and entry-point positions on the tablet in 2 minutes, after which the tablet dynamically displayed the superimposed image of the head, target position, entry-point position, and scalpel (including the scalpel tip and its spatial direction). We successfully conducted multiple experiments on the phantom and six experiments in clinical neurosurgical external ventricular drain (EVD) practice.
In the 2D-plane-superposition model (n = 60), the optical measurement system (NDI Polaris Vicra) achieved high accuracy across the visualization space (mean error ± standard deviation (SD): 2.013 ± 1.118 mm). In the clinical trials in the hospital (n = 4), the average technician preparation time was 6.317 minutes, and the average time required for the physician to set the target and entry-point positions and accurately overlay the orientation with a surgical stick was 3.5 minutes. In the preparation phase, the average time required for DICOM image processing and program import was 120 ± 30 minutes. The proposed mixed reality optical surgical navigation system achieves clinical accuracy and guides physicians to perform brain surgery visually and intuitively. In addition, the physician can use the tablet app to instantly obtain the designated patient's DICOM images, change the position of the surgical entry point, and immediately obtain the corrected surgical path and angle. This design can serve as the basis for future AR or MR brain surgery navigation systems.
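Optical trackers such as the NDI Polaris Vicra localize a rigid marker body by fitting the measured marker positions to a known geometry; per-point residuals of that fit are then summarized as a mean ± SD error of the kind quoted above. A minimal sketch of the underlying point-based rigid registration (Kabsch/SVD), with all coordinates invented for illustration:

```python
import numpy as np

def rigid_register(src, dst):
    """Least-squares rigid transform (R, t) mapping src onto dst (Kabsch/SVD)."""
    src_c, dst_c = src.mean(axis=0), dst.mean(axis=0)
    H = (src - src_c).T @ (dst - dst_c)              # 3x3 cross-covariance
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))           # guard against reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t

# Hypothetical 4-marker rigid body (mm) and its measured, moved copy.
src = np.array([[0., 0., 0.], [50., 0., 0.], [0., 50., 0.], [0., 0., 50.]])
theta = np.deg2rad(30.0)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0.],
                   [np.sin(theta),  np.cos(theta), 0.],
                   [0., 0., 1.]])
dst = src @ R_true.T + np.array([10.0, -5.0, 2.0])

R, t = rigid_register(src, dst)
errors = np.linalg.norm(src @ R.T + t - dst, axis=1)  # per-marker residual (mm)
print(f"mean ± SD: {errors.mean():.3f} ± {errors.std():.3f} mm")
```

With noise-free points the residual is essentially zero; on real tracker data the same statistic yields figures of the form reported in the study.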



2021 ◽  
Vol 15 (1) ◽  
Author(s):  
Ryo Miyazaki ◽  
Akinori Iwasaki ◽  
Fumi Nakai ◽  
Minoru Miyake

Abstract Background Computer-assisted surgical navigation systems were initially introduced for use in neurosurgery and have been applied in craniomaxillofacial surgery for 20 years. The anatomy of the oral and maxillofacial region is relatively complicated and includes critical contiguous organs. A surgical navigation system makes it possible to achieve real-time positioning during surgery and to transfer the preoperative design to the actual operation. Temporomandibular joint ankylosis limits mouth opening, deforms the face, and increases dental caries. Although early surgical treatment is recommended, the optimal surgical technique remains controversial. In addition, pediatric treatment is difficult because the skull is thinner in children than in adults. There are few reports of pediatric temporomandibular joint ankylosis surgery performed with a navigation system. Case presentation A 7-year-old Japanese girl presented with severe restriction of mouth opening and lateral movement due to a temporomandibular joint bruise sustained 2 years earlier. Computed tomography and magnetic resonance imaging demonstrated left condyle deformation, disappearance of the joint cavity, and a skull thickness of 0.7 mm. We diagnosed left temporomandibular joint ankylosis and performed temporomandibular joint arthroplasty using a surgical navigation system in order to avoid damage to the patient's brain. A preauricular incision was made, and interpositional gap arthroplasty with temporal muscle was performed. After surgery, the maximum mouth opening was 38 mm, and the limitation of lateral movement was eliminated. Conclusions A navigation system is helpful for confirming exact target locations and ensuring safe surgery. In our patient's case, pediatric temporomandibular joint ankylosis surgery was performed using a navigation system without complications.



2021 ◽  
Author(s):  
Yu-Ying Chu ◽  
Jia-Ruei Yang ◽  
Han Tsung Liao ◽  
Bo-Ru Lai

Abstract This study analyzed the outcomes of zygomatico-orbital fracture reconstruction using a real-time navigation system with intraoperative three-dimensional (3D) C-arm computed tomography (CT). Fifteen patients with zygomatico-orbital or isolated orbital/zygoma fractures were enrolled in this prospective cohort. For zygoma reduction, the displacement at five key sutures and the differences between preoperative and intraoperative CT images were compared. For orbital reconstruction, the bilateral orbital volume differences and the angles at the anterior, middle, and posterior thirds over the medial transitional buttress were measured. Two patients required one implant adjustment after the intraoperative 3D C-arm assessment. Comparing preoperative and postoperative findings for the zygoma, the average sum of displacement was 19.48 (range, 5.1–34.65) vs. 1.96 (0–3.95) mm (P < 0.001), and the deviation index was 13.56 (10–24.35) vs. 2.44 (0.6–4.85) (P < 0.001). For the orbit, the mean preoperative vs. postoperative bilateral orbital volume difference was 3.93 (0.35–10.95) vs. 1.05 (0.12–3.61) mm³ (P < 0.001). The mean difference in bilateral angles at the transitional buttress decreased significantly postoperatively at the middle and posterior thirds. The surgical navigation system with intraoperative 3D C-arm imaging can effectively improve the accuracy of zygomatico-orbital fracture reconstruction and reduce the number of implant adjustments.
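The "sum of displacement" metric above reduces to per-landmark Euclidean distances between the fractured side and a reference (e.g. mirrored intact-side) position, summed over the key sutures. A minimal sketch with invented coordinates; the suture labels and the aggregation are illustrative assumptions, not the study's exact definitions:

```python
import numpy as np

# Hypothetical 3D offsets (mm) of five key zygoma sutures from their
# mirrored intact-side reference positions, before and after reduction.
sutures = ["ZF", "ZT", "ZM", "IO", "ZS"]               # illustrative labels
pre  = np.array([[4.1, 1.0, 0.5], [3.2, 2.0, 1.1], [5.0, 0.4, 0.8],
                 [2.5, 1.5, 0.2], [3.8, 0.9, 1.4]])
post = np.array([[0.4, 0.1, 0.0], [0.3, 0.2, 0.1], [0.5, 0.0, 0.1],
                 [0.2, 0.1, 0.0], [0.3, 0.1, 0.2]])

def sum_displacement(offsets):
    """Sum of Euclidean landmark displacements (mm)."""
    return np.linalg.norm(offsets, axis=1).sum()

print(f"pre-reduction:  {sum_displacement(pre):.2f} mm")
print(f"post-reduction: {sum_displacement(post):.2f} mm")
```

A large drop in this sum after reduction is what the paired pre/post comparison in the abstract quantifies.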



PLoS ONE ◽  
2021 ◽  
Vol 16 (4) ◽  
pp. e0250558
Author(s):  
Harley H. L. Chan ◽  
Stephan K. Haerle ◽  
Michael J. Daly ◽  
Jinzi Zheng ◽  
Lauren Philp ◽  
...  

We present an integrated augmented reality (AR) surgical navigation system that potentially improves intra-operative visualization of concealed anatomical structures. Integrating real-time tracking technology with a laser pico-projector allows the surgical surface to be augmented by projecting virtual images of lesions and critical structures created from multimodality imaging. We aimed to quantitatively and qualitatively evaluate the performance of a prototype interactive AR surgical navigation system through a series of pre-clinical studies. Four pre-clinical animal studies using xenograft mouse models were conducted to investigate system performance. A combination of CT, PET, SPECT, and MRI images was used to augment the mouse body during image-guided procedures to assess feasibility. A phantom with machined features was employed to quantitatively estimate system accuracy. All image-guided procedures were performed successfully. The tracked pico-projector correctly and reliably depicted virtual images on the animal body, highlighting the locations of tumours and anatomical structures. The phantom study demonstrated that the system was accurate to 0.55 ± 0.33 mm. This paper presents a prototype real-time-tracked AR surgical navigation system that improves visualization of underlying critical structures by overlaying virtual images onto the surgical site. This proof-of-concept pre-clinical study demonstrated both the clinical applicability and the high precision of the system, which was accurate to <1 mm.
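Projector-based augmentation of the kind described here hinges on mapping tracked 3D points into projector pixel coordinates through a calibrated pinhole model, so the projected image lands on the correct anatomy. A minimal sketch, with the intrinsics and pose entirely invented for illustration:

```python
import numpy as np

# Hypothetical projector intrinsics (pixels) and tracker-to-projector pose.
K = np.array([[800.,   0., 320.],
              [  0., 800., 240.],
              [  0.,   0.,   1.]])
R = np.eye(3)                        # projector aligned with tracker axes
t = np.array([0., 0., 200.])         # scene 200 mm in front of projector

def project(X):
    """Pinhole projection of a tracker-frame 3D point to projector pixels."""
    x = R @ X + t                    # tracker frame -> projector frame (mm)
    u = K @ x
    return u[:2] / u[2]              # perspective divide

lesion = np.array([25.0, 0.0, 0.0])  # hypothetical lesion point (mm)
print(project(lesion))               # -> [420. 240.]
```

Accuracy figures like the 0.55 ± 0.33 mm above come from comparing where such projected points land against machined phantom features.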



Author(s):  
Hiroshi Noborio ◽  
Katsuhiko Onishi ◽  
Masanao Koeda ◽  
Kaoru Watanabe ◽  
...  

