Surgical Tool Tracking and Pose Estimation in Retinal Microsurgery

Author(s):  
Nicola Rieke ◽  
David Joseph Tan ◽  
Mohamed Alsheakhali ◽  
Federico Tombari ◽  
Chiara Amat di San Filippo ◽  
...  
Author(s):  
Lin Zhang ◽  
Menglong Ye ◽  
Po-Ling Chan ◽  
Guang-Zhong Yang

Author(s):  
Josué Page Vizcaíno ◽  
Nicola Rieke ◽  
David Joseph Tan ◽  
Federico Tombari ◽  
Abouzar Eslami ◽  
...  

Author(s):  
N. Parnian ◽  
M. F. Golnaraghi

This paper presents a hybrid vision/INS system for a microsurgical tool tracking application. Surgical MEMS devices must not only cope with all of the challenges faced by conventional MEMS devices, but also address the integration of electronics and signal processing, calibration, reliability, accuracy, and testing. A hybrid vision/INS system incorporating an Extended Kalman Filter precisely computes the 6D position and orientation of a microsurgical tool during surgery, guaranteeing real-time tracking of the instrument. The vision system compensates for the IMU's drift, but the position error grows dramatically in the absence of vision data. This paper therefore proposes a tool motion model that bounds the error within an acceptable range over short periods of missing data. The motion model is updated whenever the instrument is in the camera's field of view, and is applied in the estimation algorithm whenever the tool is out of the camera's line of sight and the optical data is missing.
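The fusion scheme described above can be sketched with a minimal Kalman filter: IMU acceleration drives the prediction step, and a vision-based position measurement, when the tool is in the camera's field of view, corrects the accumulated drift. This is an illustrative toy (1D, linear) rather than the authors' 6D EKF; all noise parameters and the measurement schedule are assumptions.

```python
import numpy as np

np.random.seed(0)

dt = 0.01                                # IMU sample period [s] (assumed)
F = np.array([[1, dt], [0, 1]])          # constant-velocity state transition
B = np.array([[0.5 * dt**2], [dt]])      # acceleration input matrix
H = np.array([[1.0, 0.0]])               # vision measures position only
Q = 1e-4 * np.eye(2)                     # process noise covariance (assumed)
R = np.array([[1e-3]])                   # vision measurement noise (assumed)

def predict(x, P, accel):
    """Propagate the state [position, velocity] with the IMU acceleration."""
    x = F @ x + B * accel
    P = F @ P @ F.T + Q
    return x, P

def update(x, P, z):
    """Correct the prediction with a vision position measurement z."""
    y = z - H @ x                        # innovation
    S = H @ P @ H.T + R                  # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)       # Kalman gain
    x = x + K @ y
    P = (np.eye(2) - K @ H) @ P
    return x, P

# Simulate: tool moves at a constant 0.1 m/s; the camera sees the tool
# only every 5th IMU step, mimicking intermittent optical availability.
x, P = np.zeros((2, 1)), np.eye(2)
true_p, true_v = 0.0, 0.1
for k in range(200):
    true_p += true_v * dt
    x, P = predict(x, P, accel=0.0)      # zero accel: constant velocity
    if k % 5 == 0:                       # vision measurement available
        z = np.array([[true_p + np.random.normal(0, 0.03)]])
        x, P = update(x, P, z)

print(f"estimated position: {x[0, 0]:.3f} m (true: {true_p:.3f} m)")
```

Between vision updates the state is propagated purely from the motion model, which is exactly the regime where the paper's tool motion model bounds the drift.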


2019 ◽  
Vol 2019 (14) ◽  
pp. 467-472 ◽  
Author(s):  
Zijian Zhao ◽  
Sandrine Voros ◽  
Zhaorui Chen ◽  
Xiaolin Cheng

Author(s):  
Maria Robu ◽  
Abdolrahim Kadkhodamohammadi ◽  
Imanol Luengo ◽  
Danail Stoyanov

Author(s):  
Jonas Hein ◽  
Matthias Seibold ◽  
Federica Bogo ◽  
Mazda Farshad ◽  
Marc Pollefeys ◽  
...  

Abstract Purpose:  Tracking of tools and surgical activity is becoming increasingly important in the context of computer-assisted surgery. In this work, we present a data generation framework, a dataset, and baseline methods to facilitate further research toward markerless hand and instrument pose estimation in realistic surgical scenarios. Methods:  We developed a rendering pipeline to create inexpensive and realistic synthetic data for model pretraining. Subsequently, we propose a pipeline to capture and label high-quality real data with hand and object pose ground truth in an experimental setup. We furthermore present three state-of-the-art RGB-based pose estimation baselines. Results:  We evaluate the three baseline models on the proposed datasets. The best-performing baseline achieves an average tool 3D vertex error of 16.7 mm on synthetic data and 13.8 mm on real data, which is comparable to the state of the art in RGB-based hand/object pose estimation. Conclusion:  To the best of our knowledge, we propose the first synthetic and real data generation pipelines to produce hand and object pose labels for open surgery. We present three baseline models for RGB-based object and object/hand pose estimation from RGB frames. Our realistic synthetic data generation pipeline may help overcome the data bottleneck in the surgical domain and can easily be transferred to other medical applications.
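The "average tool 3D vertex error" reported above is, in the usual formulation, the mean Euclidean distance between corresponding mesh vertices under the predicted and ground-truth poses. A minimal sketch of that metric, with purely synthetic vertex data (the mesh and the 10 mm translation error are made up for illustration):

```python
import numpy as np

def mean_vertex_error(verts_pred, verts_gt):
    """Mean per-vertex Euclidean distance, in the same unit as the inputs."""
    return np.linalg.norm(verts_pred - verts_gt, axis=1).mean()

# Example: an N x 3 tool mesh (in mm) whose predicted pose is off by a
# pure 10 mm translation along the x-axis.
rng = np.random.default_rng(0)
verts_gt = rng.uniform(-50.0, 50.0, size=(500, 3))
verts_pred = verts_gt + np.array([10.0, 0.0, 0.0])

print(f"{mean_vertex_error(verts_pred, verts_gt):.1f} mm")  # → 10.0 mm
```

Under a pure translation every vertex moves by the same distance, so the metric equals the translation magnitude; rotational pose errors instead penalize vertices far from the rotation axis more heavily.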


2019 ◽  
Vol 6 (6) ◽  
pp. 231-236 ◽  
Author(s):  
Eung‐Joo Lee ◽  
William Plishker ◽  
Xinyang Liu ◽  
Shuvra S. Bhattacharyya ◽  
Raj Shekhar
