needle placement
Recently Published Documents


TOTAL DOCUMENTS: 423 (FIVE YEARS 63)

H-INDEX: 35 (FIVE YEARS 2)

Author(s):  
Stephanie R. Albin ◽  
Larisa R. Hoffman ◽  
Cameron W. MacDonald ◽  
Micah Boriack ◽  
Lauren Heyn ◽  
...  

2021 ◽  
pp. 1-1
Author(s):  
Nishchint Sharma ◽  
Bharti Choudhary

The use of USG in regional nerve blocks is increasing day by day. With the help of USG, the clinician can view a real-time image of the patient's anatomy, which offers a new standard in nerve location and needle placement. It allows direct visualization of the local anesthetic spreading around the nerve. USG-guided nerve blocks provide reliable and safe anaesthesia and analgesia. USG is a blessing in that it offers a high success rate with a low complication rate in regional nerve blocks.


2021 ◽  
Vol 7 (2) ◽  
pp. 219-222
Author(s):  
Hanbal Arif ◽  
Uwe Bernd Liehr ◽  
Johann Jakob Wendler ◽  
Michael Friebe ◽  
Axel Boese

Abstract Irreversible electroporation (IRE) is a non-thermal tumor ablation treatment applicable to prostate cancer. IRE uses ultra-short but strong electrical pulses to destroy cancer cells non-thermally [1]. Clinically available IRE therapy requires two or more needle electrodes placed around the target tissue to apply the electric fields. A prerequisite for successful and effective ablation is accurate and parallel needle placement to cover the tumor zone. Differences in tissue density, organ surface curvature, as well as organ and patient motion, in combination with long and highly flexible needle electrodes, make it difficult to achieve the desired target accuracy during needle placement. We propose the concept of a shooting mechanism combined with a grid template support to improve the parallel needle placement process for prostate cancer treatment. Instead of conventionally inserting the needle into the body by hand, it can be placed at high speed using a shooting device setup that works similarly to a biopsy gun.
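
As a rough illustration of the parallelism requirement mentioned above (not part of the paper): the short Python sketch below checks the angle between two planned electrode trajectories and their tip spacing, two quantities that typically determine whether an IRE electrode pair is acceptable. The 5 degree and 20 mm thresholds are placeholder assumptions.

import numpy as np

def electrode_pair_check(entry_a, tip_a, entry_b, tip_b,
                         max_angle_deg=5.0, max_spacing_mm=20.0):
    """Check whether two planned IRE electrode trajectories are sufficiently
    parallel and not too far apart. Points are 3D coordinates in millimetres;
    the thresholds are illustrative placeholders, not values from the paper."""
    dir_a = (tip_a - entry_a) / np.linalg.norm(tip_a - entry_a)
    dir_b = (tip_b - entry_b) / np.linalg.norm(tip_b - entry_b)

    # Angle between the two needle axes.
    angle_deg = np.degrees(np.arccos(np.clip(np.dot(dir_a, dir_b), -1.0, 1.0)))

    # Distance between the two tips as a simple spacing measure.
    spacing_mm = np.linalg.norm(tip_a - tip_b)

    ok = angle_deg <= max_angle_deg and spacing_mm <= max_spacing_mm
    return ok, angle_deg, spacing_mm

# Example: two nearly parallel electrodes, about 15 mm apart.
ok, angle, spacing = electrode_pair_check(
    np.array([0.0, 0.0, 0.0]), np.array([0.0, 1.0, 60.0]),
    np.array([15.0, 0.0, 0.0]), np.array([15.0, 0.0, 60.0]))
print(ok, round(angle, 2), round(spacing, 1))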


2021 ◽  
Vol 7 (2) ◽  
pp. 472-475
Author(s):  
Maximilian Neidhardt ◽  
Stefan Gerlach ◽  
Max-Heinrich Laves ◽  
Sarah Latus ◽  
Carolin Stapper ◽  
...  

Abstract Needles are key tools for realizing minimally invasive interventions. Physicians commonly rely on subjectively perceived insertion forces at the distal end of the needle when advancing the needle tip to the desired target. However, detecting tissue transitions at the distal end of the needle is difficult, since the sensed forces are dominated by shaft forces. Disentangling insertion forces has the potential to substantially improve needle placement accuracy. We propose a collaborative system for robotic needle insertion that relays haptic information sensed directly at the needle tip to the physician through haptic feedback from a lightweight robot. We integrate optical fibers into medical needles and use optical coherence tomography (OCT) to image a moving surface at the tip of the needle. Using a convolutional neural network, we estimate the forces acting on the needle tip from the OCT data. The forces estimated at the needle tip are fed back for real-time haptic feedback and robot control. When inserting the needle at constant velocity, the force change estimated at the tip when penetrating deep tissue layers is up to 94 %, compared to a force change of 2.36 % at the needle handle. Collaborative needle insertion yields a more perceptible force change at tissue transitions with haptic feedback from the tip ((49.79 ± 25.51) %) than with conventional shaft feedback ((15.17 ± 15.92) %). Tissue transitions are more prominent when utilizing forces estimated at the needle tip than forces at the needle shaft, indicating that a more informed advancement of the needle is possible with our system.
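
A minimal sketch of the kind of model the abstract describes: a small convolutional network that regresses a scalar tip force from an OCT image patch. The architecture, patch size, and units are assumptions for illustration and not the network used in the paper.

import torch
import torch.nn as nn

class TipForceCNN(nn.Module):
    """Toy CNN mapping a single-channel OCT image patch to a scalar force estimate."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.regressor = nn.Linear(64, 1)  # scalar axial tip force (unit assumed)

    def forward(self, x):
        x = self.features(x).flatten(1)
        return self.regressor(x)

# Example: a batch of 8 hypothetical 64x64 OCT patches.
model = TipForceCNN()
patches = torch.randn(8, 1, 64, 64)
forces = model(patches)                                    # shape (8, 1)
loss = nn.functional.mse_loss(forces, torch.zeros_like(forces))
print(forces.shape, loss.item())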


2021 ◽  
Vol 7 (2) ◽  
pp. 779-782
Author(s):  
Stefan Gerlach ◽  
Maximilian Neidhardt ◽  
Thorben Weiß ◽  
Max-Heinrich Laves ◽  
Carolin Stapper ◽  
...  

Abstract Understanding the underlying pathology in different tissues and organs is crucial when fighting pandemics like COVID-19. During a conventional autopsy, large tissue sample sets from multiple organs can be collected from cadavers. However, direct contact with an infectious corpse is associated with a risk of disease transmission, and relatives of the deceased might object to a conventional autopsy. To overcome these drawbacks, we consider minimally invasive autopsies with robotic needle placement as a practical alternative. One challenge in needle-based biopsies is the avoidance of dense obstacles, including bones or embedded medical devices such as pacemakers. We demonstrate an approach for automatically planning and visualising suitable needle insertion points based on computed tomography (CT) scans. Needle paths are modeled as a line between insertion and target point, and occlusion of the insertion path by obstacles is determined using central projections from the biopsy target to the surface of the skin. We project the maximum and minimum CT attenuation, the insertion depth, and the standard deviation of CT attenuation along the needle path, and create two-dimensional intensity maps projected onto the skin. A cost function considering these metrics is introduced and minimized to find an optimal biopsy needle path. Furthermore, we disregard insertion points without sufficient room for needle placement. For visualisation, we display the color-coded cost function so that suitable points for needle insertion become visible. We evaluate our system on 10 post-mortem CT scans with six biopsy targets in the abdomen and thorax annotated by medical experts. An optimal insertion path is found for all patients and targets. The mean distance to the target ranges from (49.9 ± 12.9) mm for the spleen to (90.1 ± 25.8) mm for the pancreas.
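
The planning step lends itself to a short sketch. The Python code below, an illustration under assumed data layouts rather than the authors' implementation, samples a CT volume along one candidate straight needle path and combines maximum attenuation, insertion depth, and attenuation standard deviation into a single cost; in a system like the one described, such a cost would be evaluated for every candidate skin entry point to build the projected cost map.

import numpy as np

def path_cost(ct_volume, spacing_mm, target_vox, entry_vox,
              w_max=1.0, w_depth=0.01, w_std=0.1, n_samples=100):
    """Cost of a straight needle path from a skin entry voxel to the biopsy target.

    ct_volume : 3D array of CT attenuation (HU); spacing_mm : voxel size per axis.
    The weights and the linear combination are illustrative assumptions."""
    entry = np.asarray(entry_vox, dtype=float)
    target = np.asarray(target_vox, dtype=float)

    # Sample attenuation along the line with nearest-neighbour lookup.
    t = np.linspace(0.0, 1.0, n_samples)[:, None]
    samples_vox = np.round(entry + t * (target - entry)).astype(int)
    hu = ct_volume[samples_vox[:, 0], samples_vox[:, 1], samples_vox[:, 2]]

    depth_mm = np.linalg.norm((target - entry) * spacing_mm)

    # High maximum attenuation (bone, metal), long paths, and inhomogeneous
    # tissue along the path all increase the cost.
    return w_max * hu.max() + w_depth * depth_mm + w_std * hu.std()

# Example on a synthetic volume with 1 mm isotropic voxels.
vol = np.random.normal(40, 15, size=(100, 100, 100))   # soft tissue background
vol[50:60, 50:60, 50:60] = 1000                         # a bony obstacle
print(path_cost(vol, np.array([1.0, 1.0, 1.0]), (80, 80, 80), (10, 10, 10)))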


Author(s):  
Michael Kostrzewa ◽  
Andreas Rothfuss ◽  
Torben Pätz ◽  
Markus Kühne ◽  
Stefan O. Schoenberg ◽  
...  

Abstract Purpose The study aimed to evaluate a new robotic assistance system (RAS) for needle placement in combination with a multi-axis C-arm angiography system for cone-beam computed tomography (CBCT) in a phantom setting. Materials and Methods The RAS consisted of a tool holder, dedicated planning software, and a mobile platform with a lightweight robotic arm to enable image-guided needle placement in conjunction with CBCT imaging. A CBCT scan of the phantom was performed to calibrate the robotic arm in the scan volume and to plan the different needle trajectories. The trajectory data were sent to the robot, which then positioned the tool holder along the trajectory. A 19G needle was then manually inserted into the phantom. In the control CBCT scan, the exact needle position was evaluated and any deviation from the target lesion was measured. Results In total, 16 needle insertions targeting eight in- and out-of-plane sites were performed. Mean angular deviation between the planned and the actual needle trajectory was 1.12°. Mean deviation between the target point and the actual needle tip position was 2.74 mm, and mean depth deviation from the target lesion to the actual needle tip position was 2.14 mm. Mean time for needle placement was 361 s. Only the difference in time required for needle placement between in- and out-of-plane trajectories (337 s vs. 380 s) was statistically significant (p = 0.0214). Conclusion Using this RAS for image-guided percutaneous needle placement with CBCT was precise and efficient in the phantom setting.
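
For reference, the positional metrics reported above can be reproduced from the planned trajectory and the needle tip position identified in the control scan. The snippet below is a generic illustration of such an error decomposition (total tip-to-target error and its component along the planned axis), not the evaluation code used in the study.

import numpy as np

def placement_errors(entry, planned_target, actual_tip):
    """Decompose needle placement error relative to a planned straight trajectory.

    All points are 3D coordinates in millimetres. Returns the total Euclidean
    tip-to-target error, its component along the planned needle axis (depth),
    and the remaining lateral component."""
    axis = planned_target - entry
    axis = axis / np.linalg.norm(axis)

    error_vec = actual_tip - planned_target
    total_error = np.linalg.norm(error_vec)        # corresponds to a tip-to-target distance
    depth_error = abs(np.dot(error_vec, axis))     # corresponds to a depth deviation
    lateral_error = np.sqrt(max(total_error**2 - depth_error**2, 0.0))
    return total_error, depth_error, lateral_error

# Hypothetical example: the tip ends up 2 mm short and 1.5 mm off-axis.
entry = np.array([0.0, 0.0, 0.0])
planned_target = np.array([0.0, 0.0, 80.0])
actual_tip = np.array([1.5, 0.0, 78.0])
print(placement_errors(entry, planned_target, actual_tip))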


2021 ◽  
Vol 7 (1) ◽  
pp. 126-129
Author(s):  
Eva Currle ◽  
Johannes Hemm ◽  
Armin Schäfer ◽  
Philipp Beckerle ◽  
Johannes Horsch ◽  
...  

Abstract Robotic assistance systems for surgery enable fast and precise interventions with reduced complication rates. However, these benefits come with a more complex operating room (OR) and the risk of collisions with the robotic assistance systems. Current strategies for collision avoidance and for minimizing possible injuries require adaptation of the robotic trajectories and a computational model of the surroundings. In contrast, this work presents a novel companion system for collision avoidance that does not influence the robotic trajectories. The companion system consists of a preoperative planning application and an augmented reality application for intraoperative support. The companion system visualizes the workflow within the OR and allows robot movements to be viewed virtually before they are executed by the actual robotic assistance system. Preliminary experiments with users suggest that the companion system leads to a positive user experience and enables users to follow a predefined workflow in the OR, but that it requires further refinement to improve accuracy for practical collision avoidance.
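
As a rough sketch of the kind of preview check such a companion system could support, the code below tests a planned sequence of end-effector positions against tracked obstacles modelled as spheres before the motion is executed. The representation, clearance threshold, and data are assumptions for illustration, not details of the system described above.

import numpy as np

def preview_collisions(planned_positions, obstacles, clearance_mm=50.0):
    """Flag planned robot end-effector positions that come too close to obstacles.

    planned_positions : (N, 3) array of upcoming end-effector positions in mm.
    obstacles         : list of (centre, radius_mm) tuples for tracked objects.
    Returns the indices of planned positions that violate the clearance."""
    violations = []
    for i, p in enumerate(planned_positions):
        for centre, radius in obstacles:
            if np.linalg.norm(p - centre) < radius + clearance_mm:
                violations.append(i)
                break
    return violations

# Example: a short planned motion and one tracked obstacle (e.g. an OR lamp).
plan = np.array([[0, 0, 400], [100, 0, 400], [200, 0, 400]], dtype=float)
obstacles = [(np.array([210.0, 0.0, 400.0]), 30.0)]
print(preview_collisions(plan, obstacles))   # -> [2]; warn before executing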


2021 ◽  
Vol 5 (1) ◽  
Author(s):  
T. Boers ◽  
S. J. Braak ◽  
M. Versluis ◽  
S. Manohar

Abstract Background Two-dimensional (2D) ultrasound is well established for thyroid nodule assessment and treatment guidance. However, it is hampered by a limited field of view and by observer variability that may lead to inaccurate nodule classification and treatment. To cope with these limitations, we investigated the use of real-time three-dimensional (3D) ultrasound to improve the accuracy of volume estimation and needle placement during radiofrequency ablation. We assess a new 3D matrix transducer for nodule volume estimation and image-guided radiofrequency ablation. Methods Thirty thyroid nodule phantoms with thermochromic dye underwent volume estimation and ablation guided by a 2D linear array, a 3D mechanically swept array, and a 3D matrix transducer. Results The 3D matrix nodule volume estimations had a lower median difference from the ground truth (0.4 mL) than the standard 2D approach (2.2 mL, p < 0.001) and the mechanically swept 3D transducer (2.0 mL, p = 0.016). The 3D matrix-guided ablation resulted in nodule ablation coverage similar to 2D guidance (76.7% versus 80.8%, p = 0.542), while the 3D mechanically swept transducer performed worse (60.1%, p = 0.015). However, 3D matrix and 2D guidance ablations led to a larger ablated volume outside the nodule than the 3D mechanically swept approach (5.1 mL, 4.2 mL (p = 0.274), and 0.5 mL (p < 0.001), respectively). The 3D matrix and mechanically swept approaches were faster, at 80 and 72.5 s per mL ablated, than 2D guidance at 105.5 s per mL ablated. Conclusions The 3D matrix transducer estimates volumes more accurately and can facilitate accurate needle placement while reducing procedure time.
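
The coverage figures quoted above can be understood as voxel-overlap statistics between a nodule segmentation and an ablation-zone segmentation. The sketch below illustrates that computation on binary masks; the synthetic masks and voxel size are placeholders, not data from the study.

import numpy as np

def ablation_metrics(nodule_mask, ablation_mask, voxel_volume_ml):
    """Coverage of the nodule by the ablation zone and volume ablated outside it.

    Both masks are boolean 3D arrays on the same grid; voxel_volume_ml is the
    volume of one voxel in millilitres."""
    covered = np.logical_and(nodule_mask, ablation_mask).sum()
    coverage_pct = 100.0 * covered / nodule_mask.sum()
    outside_ml = np.logical_and(ablation_mask, ~nodule_mask).sum() * voxel_volume_ml
    return coverage_pct, outside_ml

# Synthetic example: a cubic "nodule" partially covered by a shifted ablation zone.
nodule = np.zeros((60, 60, 60), dtype=bool)
nodule[20:40, 20:40, 20:40] = True
ablation = np.zeros_like(nodule)
ablation[25:45, 20:40, 20:40] = True
print(ablation_metrics(nodule, ablation, voxel_volume_ml=0.001))  # ~75 % coverage, 2 mL outside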


2021 ◽  
pp. e20200137
Author(s):  
Alexandra Beaulieu ◽  
Stephanie Nykamp ◽  
John Phillips ◽  
Luis G. Arroyo ◽  
Judith Koenig ◽  
...  

Intra-articular injections are routinely performed to alleviate pain and inflammation associated with osteoarthritis in horses. Intra-articular injections require accurate needle placement to optimize clinical outcomes and minimize complications. This study's objectives were to develop and validate a three-dimensional (3D) printed model of an equine cervical articular process joint for teaching ultrasound-guided injections. Five identical models of an equine cervical articular process joint were 3D printed and embedded in 10% ballistic gelatin. Experts' and novices' ability to successfully insert a needle into the joint space of the model under ultrasound guidance was assessed and graded using an objective structured clinical examination (OSCE). Scores from experts and novices were compared to evaluate the construct validity of the model. Participants also answered a survey assessing the face and content validity of the model. Experts required less time (22.51 seconds) for correct needle placement into the model joint space than novices (35.96 seconds); however, this difference was not significant (p = .53). Experts' median total OSCE score (14) was significantly higher (p = .03) than novices' (12), supporting the model's construct validity. Participants agreed on the face and content validity of the model, rating all survey questions above 7 on a 10-point Likert-type scale. In summary, we successfully developed a 3D printed model of an equine cervical articular process joint, partially demonstrated the construct validity of the model, and established the face and content validity of this new training tool.
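
The expert-versus-novice comparison above is the kind of analysis commonly done with a non-parametric rank test, given small groups and ordinal OSCE scores. The snippet below runs such a comparison on made-up scores; both the data and the choice of test are illustrative assumptions, not necessarily those used in the study.

from scipy.stats import mannwhitneyu

# Hypothetical OSCE total scores (maximum assumed to be 15) for two small groups.
expert_scores = [14, 15, 13, 14, 14]
novice_scores = [12, 11, 13, 12, 10]

# Two-sided Mann-Whitney U test comparing the two score distributions.
stat, p_value = mannwhitneyu(expert_scores, novice_scores, alternative="two-sided")
print(f"U = {stat}, p = {p_value:.3f}")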

