Comparison of manual and semi-automatic registration in augmented reality image-guided liver surgery: a clinical feasibility study

2020 ◽  
Vol 34 (10) ◽  
pp. 4702-4711
Author(s):  
C. Schneider ◽  
S. Thompson ◽  
J. Totz ◽  
Y. Song ◽  
M. Allam ◽  
...  

Abstract
Background: The laparoscopic approach to liver resection may reduce morbidity and hospital stay. However, uptake has been slow due to concerns about patient safety and oncological radicality. Image guidance systems may improve patient safety by enabling 3D visualisation of critical intra- and extrahepatic structures. Current systems suffer from non-intuitive visualisation and a complicated setup process. A novel image guidance system (SmartLiver), offering augmented reality visualisation and semi-automatic registration, has been developed to address these issues. A clinical feasibility study evaluated the performance and usability of SmartLiver with either manual or semi-automatic registration.
Methods: Intraoperative image guidance data were recorded and analysed in patients undergoing laparoscopic liver resection or cancer staging. Stereoscopic surface reconstruction and iterative closest point matching facilitated semi-automatic registration. The primary endpoint was successful registration as determined by the operating surgeon. Secondary endpoints were system usability, assessed by a surgeon questionnaire, and comparison of manual vs. semi-automatic registration accuracy. Since SmartLiver is still in development, no attempt was made to evaluate its impact on perioperative outcomes.
Results: The primary endpoint was achieved in 16 out of 18 patients. Initially, semi-automatic registration failed because the IGS could not distinguish the liver surface from surrounding structures. Implementation of a deep learning algorithm enabled the IGS to overcome this issue and facilitate semi-automatic registration. Mean registration accuracy was 10.9 ± 4.2 mm (manual) vs. 13.9 ± 4.4 mm (semi-automatic) (mean difference −3 mm; p = 0.158). Surgeon feedback was positive about IGS handling and improved intraoperative orientation but also highlighted the need for a simpler setup process and better integration with laparoscopic ultrasound.
Conclusion: The technical feasibility of using SmartLiver intraoperatively has been demonstrated. With further improvements, semi-automatic registration may enhance the user friendliness and workflow of SmartLiver. Manual and semi-automatic registration accuracy were comparable, but evaluation on a larger patient cohort is required to confirm these findings.
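The semi-automatic registration described in this abstract pairs stereoscopic surface reconstruction with iterative closest point (ICP) matching. As an illustrative sketch only, not the SmartLiver implementation, a minimal point-to-point ICP with a closed-form Kabsch alignment step might look like this; the brute-force nearest-neighbour search stands in for the k-d tree a real system would use:

```python
import numpy as np

def best_rigid_transform(src, dst):
    """Least-squares rigid transform (Kabsch) mapping src points onto dst."""
    c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
    H = (src - c_src).T @ (dst - c_dst)
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])          # guard against reflections
    R = Vt.T @ D @ U.T
    t = c_dst - R @ c_src
    return R, t

def icp(source, target, iterations=50):
    """Point-to-point ICP: align a reconstructed surface to a model surface."""
    src = source.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    for _ in range(iterations):
        # Brute-force nearest neighbours; real systems use a k-d tree.
        dists = np.linalg.norm(src[:, None, :] - target[None, :, :], axis=2)
        matched = target[dists.argmin(axis=1)]
        R, t = best_rigid_transform(src, matched)
        src = src @ R.T + t
        # Compose the incremental transform into the running total.
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total
```

ICP of this kind only converges from a reasonable starting pose, which is why the study's pipeline still needs an initialisation step before surface matching takes over.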

HPB ◽  
2019 ◽  
Vol 21 ◽  
pp. S671-S672
Author(s):  
C. Schneider ◽  
S. Thompson ◽  
K. Gurusamy ◽  
D. Stoyanov ◽  
D.J. Hawkes ◽  
...  

2015 ◽  
Vol 11 (4) ◽  
pp. 504-511 ◽  
Author(s):  
Sven R Kantelhardt ◽  
Angelika Gutenberg ◽  
Axel Neulen ◽  
Naureen Keric ◽  
Mirjam Renovanz ◽  
...  

Abstract
BACKGROUND: Information supplied by an image-guidance system can be superimposed on the operating microscope oculars or on a screen, generating augmented reality. Recently, the outline of a patient's head and skull, injected into the oculars of a standard operating microscope, has been used to check the registration accuracy of image guidance.
OBJECTIVE: To propose the use of the brain surface relief and superficial vessels for real-time intraoperative visualization, for monitoring image-guidance accuracy, and for intraoperative adjustment for brain shift.
METHODS: A commercially available image-guidance system and a standard operating microscope were used. Segmentation of the brain surface and cortical blood vessel relief was performed manually on preoperative computed tomography and magnetic resonance images. The overlay of segmented digital and real operating-microscope images was used to monitor image-guidance accuracy. Adjustment for brain shift was performed by manually matching digital images onto real structures.
RESULTS: Experimental manipulation on a phantom proved that the brain surface relief could be used to restore accuracy if the primary registration shifted. Afterward, the technique was used to assist during surgery of 5 consecutive patients with 7 deep-seated brain tumors. The brain surface relief could be successfully used to monitor registration accuracy after craniotomy and during the whole procedure. When brain shift occurred after craniotomy, the accuracy could be restored, and corticotomies were correctly centered, in all cases.
CONCLUSION: The proposed method was easy to perform and augmented image-guidance accuracy when operating on small deep-seated lesions.
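Registration accuracy in studies such as these is typically quantified as target registration error (TRE): the residual distance between corresponding landmarks after the registration transform is applied. A minimal generic sketch, assuming paired 3D landmark arrays and a rigid transform (R, t); this is an illustration of the metric, not the authors' software:

```python
import numpy as np

def target_registration_error(R, t, image_pts, physical_pts):
    """Mean distance between registered image-space landmarks and their
    physical-space counterparts (rows are corresponding 3D points, in mm)."""
    mapped = image_pts @ R.T + t          # apply the rigid registration
    return np.linalg.norm(mapped - physical_pts, axis=1).mean()
```

A surgeon-facing system would recompute this whenever new landmark correspondences (here, brain surface relief and vessels) become visible, flagging when brain shift has degraded the registration.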


Author(s):  
Gerard M. Guiraudon ◽  
Douglas L. Jones ◽  
Daniel Bainbridge ◽  
Cristian Linte ◽  
Danielle Pace ◽  
...  

Objective: We report our experience with ultrasound augmented reality (US-AR) guidance for mitral valve prosthesis (MVP) implantation in the pig using off-pump, closed, beating intracardiac access through the Guiraudon Universal Cardiac Introducer attached to the left atrial appendage.
Methods: Before testing US-AR guidance, a feasibility pilot study on nine pigs was performed using US alone. US-AR guidance, tested on a heart phantom, was subsequently used in three pigs (~65 kg) using a tracked transesophageal echocardiography probe, augmented with registration of a 3D computed tomography scan and virtual representations of the MVP and clip-delivering tool (Clipper); three further pigs were used to test feature-based registration.
Results: Navigation of the MVP was facilitated by the 3D anatomic display. AR displayed the MVP and the Clipper within the Atamai Viewer, with excellent accuracy for tool placement. Positioning the Clipper was hampered by the design of the MVP holder and Clipper. These limitations were well displayed by AR, which provided guidance for improved design of tools.
Conclusions: US-AR provided informative image guidance. It documented the flaws of the current implantation technology; this information could not be obtained by any other method of evaluation. These evaluations provided guidance for designing an integrated tool combining an unobtrusive valve holder, which allows the MVP to function properly as soon as it is positioned, with an anchoring system whose clips can be released one at a time and retracted if necessary, for optimal results. The portability of real-time US-AR may make it the ideal practical image-guidance system for all closed intracardiac interventions.


2003 ◽  
Vol 10 (2) ◽  
pp. 226-230 ◽  
Author(s):  
Hiroyuki Nakagawa ◽  
Mikio Kamimura ◽  
Shigeharu Uchiyama ◽  
Kenji Takahara ◽  
Toshiro Itsubo ◽  
...  

Author(s):  
Nina Montaña-Brown ◽  
João Ramalhinho ◽  
Moustafa Allam ◽  
Brian Davidson ◽  
Yipeng Hu ◽  
...  

Abstract
Purpose: Registration of laparoscopic ultrasound (LUS) to a pre-operative scan such as computed tomography (CT) using blood vessel information has been proposed as a method to enable image guidance for laparoscopic liver resection. Current solutions to this problem could enable clinical translation by bypassing the need for manual initialisation and tracking information. However, no reliable framework for the segmentation of vessels in 2D untracked LUS images has been presented.
Methods: We propose the use of a 2D U-Net for the segmentation of liver vessels in 2D LUS images. We integrate these results into a previously developed registration method and show the feasibility of a fully automatic initialisation of the LUS-to-CT registration problem without a tracking device.
Results: We validate our segmentation using LUS data from 6 patients. We test multiple models by placing patient datasets into different combinations of training, testing and hold-out, and obtain mean Dice scores ranging from 0.543 to 0.706. Using these segmentations, we obtain registration accuracies between 6.3 and 16.6 mm in 50% of cases.
Conclusions: We demonstrate the first instance of deep learning (DL) for the segmentation of liver vessels in LUS. Our results show the feasibility of the U-Net in detecting multiple vessel instances in 2D LUS images, and potentially automating an LUS-to-CT registration pipeline.
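The Dice scores quoted in this abstract measure the overlap between a predicted vessel mask and its reference annotation. A minimal sketch of the metric for binary masks (a generic NumPy illustration; the study's own evaluation code is not shown here):

```python
import numpy as np

def dice_score(pred, truth, eps=1e-7):
    """Dice coefficient between two binary masks: 2|A ∩ B| / (|A| + |B|).
    eps avoids division by zero when both masks are empty."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    return 2.0 * intersection / (pred.sum() + truth.sum() + eps)
```

A score of 1.0 means perfect overlap and 0.0 none, so the reported range of 0.543 to 0.706 indicates moderate but usable vessel delineation for downstream registration.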


HPB ◽  
2019 ◽  
Vol 21 ◽  
pp. S973
Author(s):  
C. Schneider ◽  
S. Thompson ◽  
K. Gurusamy ◽  
D. Stoyanov ◽  
D.J. Hawkes ◽  
...  

Author(s):  
Fabian Joeres ◽  
Tonia Mielke ◽  
Christian Hansen

Abstract
Purpose: Resection site repair during laparoscopic oncological surgery (e.g. laparoscopic partial nephrectomy) poses some unique challenges and opportunities for augmented reality (AR) navigation support. This work introduces an AR registration workflow that addresses the time pressure present during resection site repair.
Methods: We propose a two-step registration process: the AR content is registered as accurately as possible prior to the tumour resection (the primary registration). This accurate registration is used to apply artificial fiducials to the physical organ and the virtual model. After the resection, these fiducials can be used for rapid re-registration (the secondary registration). We tested this pipeline in a simulated-use study with N = 18 participants, comparing the registration accuracy and speed of our method against landmark-based registration as a reference.
Results: Acquisition of, and thereby registration with, the artificial fiducials was significantly faster than the initial use of anatomical landmarks. Our method also tended to be more accurate in cases in which the primary registration was successful. The accuracy loss between the elaborate primary registration and the rapid secondary registration was quantified as a mean target registration error increase of 2.35 mm.
Conclusion: This work introduces a registration pipeline for AR navigation support during laparoscopic resection site repair and provides a successful proof-of-concept evaluation. Our results indicate that the concept is better suited than landmark-based registration during this phase, but further work is required to demonstrate clinical suitability and applicability.
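The secondary registration described in this abstract is, in essence, point-based rigid registration on corresponding fiducials, for which a closed-form least-squares solution exists (the Kabsch/Horn method). A hedged sketch using SciPy's rotation utilities; the fiducial arrays and the function name are illustrative stand-ins, not the authors' code:

```python
import numpy as np
from scipy.spatial.transform import Rotation

def fiducial_registration(model_pts, organ_pts):
    """Closed-form rigid registration of virtual-model fiducials onto the
    physical-organ fiducials (rows are corresponding 3D points)."""
    c_model, c_organ = model_pts.mean(axis=0), organ_pts.mean(axis=0)
    # align_vectors finds R minimising || (organ - c_organ) - R (model - c_model) ||
    rot, _ = Rotation.align_vectors(organ_pts - c_organ, model_pts - c_model)
    R = rot.as_matrix()
    t = c_organ - R @ c_model
    return R, t
```

Because the solution is closed-form, re-registration cost is dominated by touching the fiducials with a tracked instrument rather than by computation, which is what makes the secondary step fast under intraoperative time pressure.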


Author(s):  
Ruotong Li ◽  
Tianpei Yang ◽  
Weixin Si ◽  
Xiangyun Liao ◽  
Qiong Wang ◽  
...  
