Interactive force-sensing feedback system for remote robotic laparoscopic surgery

2011 ◽  
Vol 34 (4) ◽  
pp. 376-387
Author(s):  
Ian Mack ◽  
Stuart Ferguson ◽  
Karen Rafferty ◽  
Stephen Potts ◽  
Alistair Dick

This paper presents the details of a combined hardware/software system developed to provide haptic feedback for teleoperated laparoscopic surgical robots. Surgical instruments incorporating quantum tunnelling composite (QTC) force-measuring sensors have been developed and mounted on a pair of Mitsubishi PA-10 industrial robots. Feedback forces are rendered on pseudo-surgical instruments based on a pair of PHANTOM Omni devices, which are also used to remotely manipulate the robotic arms. Measurements of the behaviour of the QTC sensors during a simulated teleoperated procedure are given. In addition, a method is proposed to compensate for their non-linear characteristics so that a ‘realistic feel’ is conveyed to the surgeon through the haptic feedback channel. The paper concludes by explaining how the force feedback channel is combined with a visual feedback channel so that a surgeon can perform a two-handed surgical procedure on a remote patient more effectively, controlling a pair of robot arms more accurately over a computer network.
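The compensation step can be illustrated with a minimal sketch: a bench calibration curve relating QTC sensor readings to applied force is fitted once and evaluated at run time so that the force rendered on the PHANTOM Omni scales realistically. The calibration values, polynomial fit, and function names below are illustrative assumptions, not the authors' implementation.

```python
import numpy as np

# Hypothetical calibration data: applied force (N) vs. raw QTC sensor reading (ADC counts).
# A QTC sensor's resistance-vs-force response is strongly non-linear, so a fitted
# calibration curve is used to linearize the rendered force.
calib_force = np.array([0.0, 0.5, 1.0, 2.0, 4.0, 8.0])
calib_reading = np.array([5, 120, 310, 620, 850, 980])

# Fit reading -> force with a cubic polynomial (assumed model).
coeffs = np.polyfit(calib_reading, calib_force, deg=3)

def estimate_force(reading: float) -> float:
    """Map a raw QTC reading to an estimated grip force in newtons."""
    return float(np.clip(np.polyval(coeffs, reading), 0.0, None))

def feedback_force(reading: float, scale: float = 1.0) -> float:
    """Force commanded to the haptic device, scaled to its output range."""
    return scale * estimate_force(reading)
```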

2011 ◽  
Vol 2011 ◽  
pp. 1-8 ◽  
Author(s):  
Futoshi Kobayashi ◽  
George Ikai ◽  
Wataru Fukui ◽  
Fumio Kojima

A haptic feedback system is required to assist telerehabilitation with a robot hand. The system should provide the operator with the reaction force measured in the robot hand. In this paper, we have developed a force feedback device that presents a reaction force to the distal segments of the operator's thumb and middle finger, and to the basipodite of the middle finger, when the robot hand grasps an object. The device uses a shape memory alloy as an actuator, which makes it very compact, lightweight, and accurate.


Author(s):  
Jean-Claude Leon ◽  
Thomas Dupeux ◽  
Jean-Rémy Chardonnet ◽  
Jérôme Perret

The simulation of grasping operations in virtual reality (VR) is required for many applications, especially in the domain of industrial product design, but it is very difficult to achieve without any haptic feedback. Force feedback on the fingers can be provided by a hand exoskeleton, but such a device is very complex, invasive, and costly. In this paper, we present a new device, called HaptiHand, which provides position and force input as well as haptic output for four fingers in a noninvasive way and is mounted on a standard force-feedback arm. The device incorporates four independent modules, one for each finger, inside an ergonomic shape, allowing the user to generate a wide range of virtual hand configurations to naturally grasp an object. It is also possible to reconfigure the virtual finger positions while holding an object. The paper explains how the device is used to control a virtual hand in order to perform dexterous grasping operations. The structure of the HaptiHand is described through the major technical solutions required, and tests of key functions serve as a validation process for key requirements. Finally, an effective grasping task illustrates some of the HaptiHand's capabilities.


2007 ◽  
Vol 16 (5) ◽  
pp. 459-470 ◽  
Author(s):  
Hermann Mayer ◽  
Istvan Nagy ◽  
Alois Knoll ◽  
Eva U Braun ◽  
Robert Bauernschmitt ◽  
...  

The implementation of telemanipulator systems for cardiac surgery has enabled heart surgeons to perform delicate minimally invasive procedures with high precision under stereoscopic view. At present, commercially available systems provide neither force feedback nor Cartesian control for the operating surgeon. The lack of haptic feedback can lead to tissue damage and broken suture material. In addition, minimally invasive procedures are very tiring for the surgeon, because the missing force feedback must be compensated for visually. While the lack of Cartesian control of the end effectors is acceptable for surgeons (because every movement is visually supervised), it prevents research on partial automation. To improve this situation, we have built an experimental telemanipulator for endoscopic surgery that provides both force feedback (to improve the feeling of immersion) and Cartesian control as a prerequisite for automation. In this article, we focus on the inclusion of force feedback and its evaluation. We completed our first bimanual system, EndoPAR (Endoscopic Partial Autonomous Robot), in early 2003. Each robot arm consists of a standard robot and a surgical instrument, providing eight DOF that enable free manipulation via trocar kinematics. Based on the experience with this system, we introduced an improved version in early 2005. The new ARAMIS system (Autonomous Robot Assisted Minimally Invasive Surgery) has four multi-purpose robotic arms mounted on a gantry above the working space. Again, the arms are controlled by two force-feedback devices, and 3D vision is provided. In addition, all surgical instruments have been equipped with strain gauge force sensors that can measure forces along all translational directions of the instrument's shaft. Force feedback of this system was evaluated in a scenario of robotic heart surgery, which offers a highly immersive impression very similar to standard open procedures. It enables the surgeon to palpate arteriosclerosis, to tie surgical knots with real suture material, and to feel the rupture of suture material. Therefore, the hypothesis that haptic feedback in the form of sensory substitution facilitates the performance of surgical tasks was evaluated on the experimental platform described in the article (the EndoPAR version). In addition, a further hypothesis was explored: the high fatigue of surgeons during and after robotic operations may be caused by visual compensation for the lack of force feedback (Thompson, J., Ottensmeier, M., & Sheridan, T. 1999. Human factors in telesurgery. Telemedicine Journal, 5(2), 129–137).
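As a rough illustration of how strain gauge readings can be turned into the three translational force components mentioned above, the sketch below applies an assumed calibration matrix to the bridge voltages; the matrix, offsets, and function name are placeholders, not values from the ARAMIS instruments.

```python
import numpy as np

# Assumed calibration: N per volt for each bridge channel, identified in practice by
# loading the instrument shaft with known weights along each axis.
CAL = np.array([
    [12.0,  0.3, -0.1],
    [ 0.2, 11.5,  0.4],
    [-0.3,  0.1, 14.2],
])
OFFSET = np.array([0.01, -0.02, 0.00])   # bridge zero offsets in volts (assumed)

def shaft_forces(bridge_volts) -> np.ndarray:
    """Estimate the 3-axis force (N) acting along the instrument shaft."""
    return CAL @ (np.asarray(bridge_volts, dtype=float) - OFFSET)
```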


2011 ◽  
Vol 8 (2) ◽  
pp. 221-236 ◽  
Author(s):  
Christoph Staub ◽  
Keita Ono ◽  
Hermann Mayer ◽  
Alois Knoll ◽  
Heinz Ulbrich ◽  
...  

The automation of recurrent tasks and force feedback are complex problems in medical robotics. We present a novel approach that extends human-machine skill transfer by a scaffolding framework. It assumes a consolidated working environment for both the trainee and the trainer. The trainer provides hints and cues within a basic structure that is already understood by the learner. In this work, the scaffolding is constituted by abstract patterns, which facilitate the structuring and segmentation of information during “Learning by Demonstration” (LbD). The concept is exemplified and evaluated with the concrete example of knot-tying for suturing. During the evaluation, most problems and failures arose from intrinsic imprecisions of the medical robot system; these inaccuracies were then mitigated by visual guidance of the surgical instruments. While the benefits of force feedback in telesurgery have already been demonstrated, and measured forces are also used during task learning, the transmission of signals between the operator console and the robot system over long-distance or cross-network remote connections is still a challenge due to time delay. Especially during incision of tissue with a scalpel, delayed force feedback leads to unpredictable force perception on the operator side and can harm the tissue the robot is interacting with. We propose an XFEM-based incision force prediction algorithm that simulates the incision contact forces in real time and compensates for the delayed force sensor readings. A realistic four-arm system for minimally invasive robotic heart surgery is used as a platform for the research.
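The compensation idea can be sketched in a few lines: a local model predicts the incision contact force in real time and is blended with the last (delayed) sensor reading arriving over the network. The paper uses an XFEM tissue simulation for the prediction; the placeholder force model, blend weight, and delay length below are illustrative assumptions only.

```python
from collections import deque
from typing import Optional

def force_model(tool_depth_mm: float) -> float:
    """Placeholder incision force model: resistance grows as the scalpel cuts deeper."""
    if tool_depth_mm <= 0.0:
        return 0.0
    return 0.8 * tool_depth_mm + 0.05 * tool_depth_mm ** 2  # assumed parameters

def compensated_force(predicted: float, delayed_measurement: Optional[float],
                      blend: float = 0.7) -> float:
    """Blend the real-time prediction with the last delayed sensor reading."""
    if delayed_measurement is None:          # nothing received from the remote side yet
        return predicted
    return blend * predicted + (1.0 - blend) * delayed_measurement

# Example loop: sensor readings arrive five control cycles late.
delay_line = deque([None] * 5)
for depth in [0.0, 0.5, 1.0, 1.5, 2.0, 2.5]:
    delay_line.append(force_model(depth))    # what the sensor will eventually report
    stale = delay_line.popleft()
    print(round(compensated_force(force_model(depth), stale), 3))
```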


Author(s):  
Zhenhua Zhu ◽  
Shuming Gao ◽  
Huagen Wan ◽  
Yang Luo ◽  
Wenzhen Yang

The sense of touch is an important way for humans to feel the world. Providing realistic haptic feedback in virtual assembly applications is very important for enhancing the immersion experience and improving efficiency. This paper presents a novel approach to grasp identification and multi-finger haptic feedback for virtual assembly. Firstly, the Voxmap-PointShell (VPS) algorithm is adapted and utilized to detect collisions between a dexterous virtual hand and a mechanical component or between two mechanical components, and the collision detection results are used to guide the motion of the virtual hand. Then collision forces at the collision points are calculated (using Hooke's law), classified and converted. Finally, the forces received at the fingertips of the virtual hand are used to identify whether or not the virtual hand can grasp a mechanical component, and are mapped to forces exerted at the user's fingertips with a CyberGrasp force feedback system. Our approach has been incorporated and verified in a CAVE virtual environment.
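The force computation described above follows a standard penalty formulation: each collision point contributes a spring force proportional to its penetration depth (Hooke's law), the per-finger sums drive the force feedback glove, and a threshold test decides whether the virtual hand holds a grasp. The sketch below illustrates that idea; the stiffness value, data layout, and grasp test are assumptions, not the paper's actual VPS/CyberGrasp code.

```python
import numpy as np

STIFFNESS = 400.0  # N/m, assumed contact stiffness

def contact_force(penetration_depth: float, contact_normal: np.ndarray) -> np.ndarray:
    """Spring force along the contact normal, proportional to penetration depth."""
    n = contact_normal / np.linalg.norm(contact_normal)
    return STIFFNESS * penetration_depth * n

def fingertip_forces(contacts):
    """Sum per-finger contact forces from (finger, depth, normal) tuples."""
    totals = {}
    for finger, depth, normal in contacts:
        f = contact_force(depth, np.asarray(normal, dtype=float))
        totals[finger] = totals.get(finger, np.zeros(3)) + f
    return totals

def can_grasp(totals, threshold: float = 0.5) -> bool:
    """Simple grasp test: the thumb and at least one opposing finger both press the part."""
    thumb = np.linalg.norm(totals.get("thumb", np.zeros(3)))
    others = max((np.linalg.norm(f) for k, f in totals.items() if k != "thumb"), default=0.0)
    return thumb > threshold and others > threshold
```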


2010 ◽  
Vol 19 (5) ◽  
pp. 400-414 ◽  
Author(s):  
Andreas Tobergte

This paper presents MiroSurge, a telepresence system for minimally invasive surgery developed at the German Aerospace Center (DLR), and introduces MiroSurge's new user interaction modalities: (1) haptic feedback with software-based preservation of the fulcrum point, (2) an ultrasound-based approach to the quasi-tactile detection of pulsating vessels, and (3) a contact-free interface between surgeon and telesurgery system, where stereo vision is augmented with force vectors at the tool tip. All interaction modalities aim to increase the user's perception beyond stereo imaging by either augmenting the images or by using haptic interfaces. MiroSurge currently provides surgeons with two different interfaces. The first option, bimanual haptic interaction with force and partial tactile feedback, allows for direct perception of the remote environment. Alternatively, users can choose to control the surgical instruments with optically tracked forceps held in their hands. Force feedback is then provided in the augmented stereo images by constantly updated force vectors displayed at the centers of the teleoperated instruments, regardless of the instruments' position within the video image. To determine the center points of the instruments, artificial markers are attached and optically tracked. A new approach to detecting pulsating vessels beneath covering tissue with an omnidirectional ultrasound Doppler sensor is presented. The measurement results are computed and can be provided acoustically (by playing the typical Doppler sound), optically (by augmenting the endoscopic video stream), or kinesthetically (by a gentle twitching of the haptic input devices). The control structure preserves the fulcrum point required in minimally invasive surgery while the surgical instrument follows the user's commands. Haptic feedback allows the user to distinguish between interaction with soft and hard environments. The paper includes technical evaluations of the features presented, as well as an overview of the system integration of MiroSurge.
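The augmented-vision mode can be illustrated with a small sketch that draws a force vector anchored at the tracked instrument centre in the video frame. Marker tracking and camera projection are omitted; the pixel coordinates, scaling, and colours below are assumptions, not MiroSurge's implementation.

```python
import cv2
import numpy as np

def draw_force_vector(frame: np.ndarray, tip_px, force_n: np.ndarray,
                      px_per_newton: float = 40.0) -> np.ndarray:
    """Overlay an arrow at the instrument tip whose length scales with the in-plane force."""
    tip = np.asarray(tip_px, dtype=float)
    end = tip + px_per_newton * np.asarray(force_n, dtype=float)[:2]
    p1 = (int(tip[0]), int(tip[1]))
    p2 = (int(end[0]), int(end[1]))
    cv2.arrowedLine(frame, p1, p2, color=(0, 0, 255), thickness=2, tipLength=0.2)
    label = f"{np.linalg.norm(force_n):.2f} N"   # force magnitude next to the arrow
    cv2.putText(frame, label, (p1[0] + 5, p1[1] - 5),
                cv2.FONT_HERSHEY_SIMPLEX, 0.5, (0, 0, 255), 1)
    return frame
```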


2020 ◽  
Vol 6 (3) ◽  
pp. 571-574
Author(s):  
Anna Schaufler ◽  
Alfredo Illanes ◽  
Ivan Maldonado ◽  
Axel Boese ◽  
Roland Croner ◽  
...  

In robot-assisted procedures, the surgeon controls the surgical instruments from a remote console while visually monitoring the procedure through the endoscope. There is no haptic feedback available to the surgeon, which impedes the assessment of diseased tissue and the detection of hidden structures beneath the tissue, such as vessels. Only visual cues are available to the surgeon to control the force applied to the tissue by the instruments, which poses a risk of iatrogenic injuries. Additional information on the haptic interactions between the employed instruments and the treated tissue, provided to the surgeon during robotic surgery, could compensate for this deficit. Acoustic emissions (AE) from the instrument/tissue interactions, transmitted by the instrument, are a potential source of this information. AE can be recorded by audio sensors that do not have to be integrated into the instruments but can be modularly attached to the outside of the instrument's shaft or enclosure. The location of the sensor on a robotic system is essential for the applicability of the concept in real situations. While the signal strength of the acoustic emissions decreases with distance from the point of interaction, an installation close to the patient would require sterilization measures. The aim of this work is to investigate whether it is feasible to install the audio sensor in non-sterile areas far away from the patient and still receive useful AE signals. To determine whether signals can be recorded at different potential mounting locations, instrument/tissue interactions with different textures were simulated in an experimental setup. The results showed that meaningful and valuable AE can be recorded in the non-sterile area of a robotic surgical system despite the expected signal losses.
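To give a concrete picture of how recorded snippets might be compared across candidate mounting locations, the sketch below computes simple energy and spectral features and a signal-to-noise ratio against a silent baseline. The sampling rate, feature choice, and function names are assumptions, not the study's actual processing pipeline.

```python
import numpy as np

FS = 44_100  # Hz, assumed audio sampling rate

def ae_features(signal: np.ndarray):
    """Return RMS level and spectral centroid (Hz) of one AE snippet."""
    rms = float(np.sqrt(np.mean(signal ** 2)))
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / FS)
    centroid = float(np.sum(freqs * spectrum) / (np.sum(spectrum) + 1e-12))
    return rms, centroid

def snr_db(signal: np.ndarray, noise: np.ndarray) -> float:
    """Signal-to-noise ratio of an interaction snippet vs. a silent baseline recording."""
    ps = np.mean(signal ** 2)
    pn = np.mean(noise ** 2) + 1e-12
    return 10.0 * np.log10(ps / pn)
```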


Author(s):  
Xiaoli Zhang ◽  
Carl A. Nelson

The size and limited dexterity of current surgical robotic systems are factors that limit their usefulness. To improve the level of assimilation of surgical robots in minimally invasive surgery (MIS), a compact, lightweight surgical robotic positioning mechanism with four degrees of freedom (DOF), three rotational and one translational, is proposed in this paper. This spatial mechanism, based on a bevel-gear wrist, is remotely driven with three rotation axes intersecting at a remote rotation center (the MIS entry port). Forward and inverse kinematics are derived and used to optimize the mechanism structure given the workspace requirements. By evaluating different spherical gear configurations with various link angles and pitch angles, an optimal design is achieved that positions the surgical tool throughout the desired kinematic workspace while occupying a small space bounded by a hemisphere of radius 13.7 cm. This optimized workspace conservatively accounts for collision avoidance between patient and robot and internally between the robot links. The resulting mechanism is highly compact yet has the dexterity to cover the extended workspace typically required in telesurgery. It can also be used for tool tracking and skills assessment. Due to the linear nature of the gearing relationships, it may also be well suited for implementing force feedback for telesurgery.
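The forward and inverse kinematics mentioned above can be illustrated for a generic remote-centre-of-motion mechanism: two spherical angles orient the tool axis about the fixed entry port, a roll about the tool axis leaves the tip position unchanged, and an insertion translation slides the tool along the axis. The sketch below shows that generic geometry, not the paper's bevel-gear parameterisation.

```python
import numpy as np

def rcm_forward(azimuth: float, elevation: float, insertion: float) -> np.ndarray:
    """Tool-tip position (m) relative to the entry port for given joint values (rad, m)."""
    axis = np.array([
        np.cos(elevation) * np.cos(azimuth),
        np.cos(elevation) * np.sin(azimuth),
        -np.sin(elevation),                 # unit axis pointing into the patient
    ])
    return insertion * axis

def rcm_inverse(tip: np.ndarray):
    """Recover (azimuth, elevation, insertion) from a tip position relative to the port."""
    insertion = float(np.linalg.norm(tip))
    axis = tip / insertion
    elevation = float(np.arcsin(-axis[2]))
    azimuth = float(np.arctan2(axis[1], axis[0]))
    return azimuth, elevation, insertion
```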


Machines ◽  
2020 ◽  
Vol 8 (3) ◽  
pp. 47 ◽  
Author(s):  
Luca Salvati ◽  
Matteo d’Amore ◽  
Anita Fiorentino ◽  
Arcangelo Pellegrino ◽  
Pasquale Sena ◽  
...  

In recent years, driving simulators have been widely used by automotive manufacturers and researchers in human-in-the-loop experiments, because they reduce time and prototyping costs and provide unlimited parametrization, more safety, and higher repeatability. Simulators play an important role in studies of driver behaviour in demanding operating conditions or with unstable vehicles. The aim of this research is to study the effects that the force feedback (f.f.b.) provided to the steering wheel by a lane-keeping-assist (LKA) system has on a driver's response in simulators. The steering force feedback system is tested by reproducing critical conditions of the LKA system in order to minimize the distance required to recover driving stability as a function of the set f.f.b. intensity and speed. The results, obtained in three specific critical conditions, show that the behaviour of the LKA system reproduced in the simulator is not immediately understood by the driver and sometimes opposes the interventions performed by the driver to ensure driving safety. The results also compare the performance of the subjects, both overall and classified into subgroups, with reference to their perception of the LKA system, evaluated by means of a questionnaire. The proposed experimental methodology is to be regarded as a contribution to the integration of acceptance tests into the evaluation of automation systems.

