Fast implementation of direct robot kinematics with CORDIC systolic arrays

1998 ◽  
Vol 67 (3-4) ◽  
pp. 239-260 ◽  
Author(s):  
Basil G. Mertzios

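
The listing gives only this entry's title, but the technique it names is concrete: CORDIC evaluates rotations with shift-add iterations only, which is why it maps well onto systolic hardware for direct kinematics (chains of joint rotations). A minimal illustrative sketch of one CORDIC plane rotation, not taken from the cited paper:

```python
import math

def cordic_rotate(x, y, theta, iters=32):
    """Rotate the vector (x, y) by angle theta (radians) using CORDIC.

    Each iteration applies a micro-rotation by atan(2^-i) using only
    shifts (multiplication by 2^-i) and adds; the constant gain K is
    corrected once at the end. Illustrative sketch only.
    """
    angles = [math.atan(2.0 ** -i) for i in range(iters)]
    # Accumulated gain of the micro-rotations, folded into one constant.
    K = 1.0
    for i in range(iters):
        K /= math.sqrt(1.0 + 2.0 ** (-2 * i))
    z = theta  # residual angle still to rotate through
    for i, a in enumerate(angles):
        d = 1.0 if z >= 0 else -1.0  # rotate toward zero residual
        x, y = x - d * y * 2.0 ** -i, y + d * x * 2.0 ** -i
        z -= d * a
    return x * K, y * K
```

In a systolic array, one such iteration stage would be one processing element, with the (x, y, z) triple pipelined through the stages.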
Author(s):  
Martin Wagner ◽  
Andreas Bihlmaier ◽  
Hannes Götz Kenngott ◽  
Patrick Mietkowski ◽  
Paul Maria Scheikl ◽  
...  

Abstract

Background: We demonstrate the first self-learning, context-sensitive, autonomous camera-guiding robot applicable to minimally invasive surgery. The majority of surgical robots today are telemanipulators without autonomous capabilities. Autonomous systems have been developed for laparoscopic camera guidance, but they follow simple rules and do not adapt their behavior to specific tasks, procedures, or surgeons.

Methods: The methodology presented here allows different robot kinematics to perceive their environment, interpret it according to a knowledge base, and perform context-aware actions. For training, twenty operations were conducted with human camera guidance by a single surgeon. Subsequently, we experimentally evaluated the cognitive robotic camera control. A VIKY EP system and a KUKA LWR 4 robot were trained on data from manual camera guidance after completion of the surgeon's learning curve. Second, only data from the VIKY EP were used to train the LWR, and finally data from training with the LWR were used to re-train the LWR.

Results: The duration of each operation decreased with the robot's increasing experience, from 1704 s ± 244 s to 1406 s ± 112 s and 1197 s. Camera guidance quality (good/neutral/poor) improved from 38.6/53.4/7.9% to 49.4/46.3/4.1% and 56.2/41.0/2.8%.

Conclusions: The cognitive camera robot improved its performance with experience, laying the foundation for a new generation of cognitive surgical robots that adapt to a surgeon's needs.


2021 ◽  
Vol 54 (1-2) ◽  
pp. 102-115
Author(s):  
Wenhui Si ◽  
Lingyan Zhao ◽  
Jianping Wei ◽  
Zhiguang Guan

Extensive research efforts have been made in the literature to address the motion control of rigid-link electrically-driven (RLED) robots. However, most existing results were designed in joint space and need to be converted to task space, as more and more control tasks are defined in the robot's operational space. In this work, the direct task-space regulation of RLED robots with uncertain kinematics is studied using a neural network (NN) technique. Radial basis function (RBF) neural networks are used to estimate the complicated and calibration-heavy robot kinematics and dynamics. The NN weights are updated online through two adaptation laws, without the need for off-line training. Compared with most existing NN-based robot control results, the novelty of the proposed method lies in achieving asymptotic stability of the overall system rather than merely uniformly ultimately bounded (UUB) stability. Moreover, the proposed controller tolerates not only uncertainty in the actuator dynamics but also uncertainty in the robot kinematics, by adopting an adaptive Jacobian matrix. The asymptotic stability of the overall system is proven rigorously through Lyapunov analysis. Numerical studies verify the efficiency of the proposed method.
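
The abstract's core mechanism, an RBF network whose weights are adapted online from the tracking error rather than trained off-line, can be sketched in isolation. The network structure, gains, and target function below are illustrative assumptions, not the paper's actual adaptation laws:

```python
import numpy as np

def rbf_features(x, centers, width=1.0):
    """Gaussian radial basis functions phi_i(x) = exp(-||x - c_i||^2 / width^2)."""
    d = np.linalg.norm(centers - x, axis=1)
    return np.exp(-(d / width) ** 2)

def adapt_step(W, x, error, centers, gamma=0.2):
    """One online adaptation step: W <- W + gamma * phi(x) * error^T.

    This is a gradient-type update law (gamma is an assumed gain); it
    drives the estimate y_hat = W^T phi(x) toward the unknown mapping
    without any off-line training phase.
    """
    phi = rbf_features(x, centers)
    return W + gamma * np.outer(phi, error)

# Toy usage: approximate an unknown scalar mapping (here sin, as a
# stand-in for an uncertain kinematic term) from streaming samples.
rng = np.random.default_rng(0)
centers = np.linspace(-3.0, 3.0, 15).reshape(-1, 1)
W = np.zeros((15, 1))  # weights start at zero: no prior training
for _ in range(5000):
    x = rng.uniform(-3.0, 3.0, size=1)
    y = np.sin(x)                                # unknown target output
    y_hat = W.T @ rbf_features(x, centers)       # current NN estimate
    W = adapt_step(W, x, y - y_hat, centers)
```

In the paper's setting the estimated quantity would enter a task-space control law through the adaptive Jacobian; here the sketch only shows the estimator converging from streaming data.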


2020 ◽  
Vol 1624 ◽  
pp. 042029
Author(s):  
Dongkang He ◽  
Fangping Liu ◽  
Fuchun Wang

1996 ◽  
Vol 7 (1) ◽  
pp. 7-26 ◽  
Author(s):  
F. El-Guibaly ◽  
A. Tawfik
