Situation-Specific Intention Recognition for Human-Robot Cooperation

Author(s):  
Peter Krauthausen ◽  
Uwe D. Hanebeck
Author(s):  
Yiyun Wang ◽  
Hongbing Li

In lumbar puncture surgeries, force and position information throughout the insertion procedure is vital for needle tip localization, because it reflects the properties of the different tissues traversed. In pediatric cases especially, these changes are often too subtle for surgeons to sense the crucial feeling of loss of resistance. In this study, a robot system is developed to tackle these clinical difficulties. Four control algorithms with intention recognition ability are applied to a novel lumbar puncture robot system for better human–robot cooperation. A penetration detection scheme based on force and position derivatives captures the feeling of loss of resistance, which is deemed crucial for needle tip localization. Kinematic and actuation modeling provides a clear description of the hardware setup. A control algorithm experiment compares the human–robot cooperation performance of the proposed algorithms and demonstrates the clear role of the designed penetration detection criteria in capturing penetration, improving the success rate, and ensuring operational safety.
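The abstract describes penetration detection built from force and position derivatives. As a loose illustration only (the thresholds, units, and exact decision rule below are assumptions, not taken from the paper), a loss-of-resistance detector along those lines might look like:

```python
import numpy as np

def detect_penetration(force, position, dt,
                       force_drop_rate=-0.5, min_velocity=1e-3):
    """Flag loss-of-resistance samples: the force derivative drops sharply
    while the needle is still advancing (positive position derivative).

    force, position: 1-D arrays sampled at interval dt (hypothetical
    units: N, mm, s). The threshold values are illustrative defaults.
    """
    dF = np.gradient(force, dt)     # force rate of change (N/s)
    v = np.gradient(position, dt)   # insertion velocity (mm/s)
    return (dF < force_drop_rate) & (v > min_velocity)
```

Combining both derivatives distinguishes a genuine dural puncture (force falls while the needle keeps moving forward) from simply pausing the insertion, where the force derivative may also drop but velocity is near zero.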


2021 ◽  
pp. 1-16
Author(s):  
Wenbo Huang ◽  
Changyuan Wang ◽  
Hongbo Jia

Traditional intention inference methods rely solely on EEG, eye movement, or tactile feedback, and their recognition rates are low. To improve the accuracy of pilot intention recognition, this paper proposes a human-computer interaction intention inference method that fuses EEG, eye movement, and tactile feedback. Firstly, EEG signals are collected near the frontal lobe of the human brain from eight channels (AF7, F7, FT7, T7, AF8, F8, FT8, and T8) to extract features. Secondly, the signal data are preprocessed by baseline removal, normalization, and least-squares noise reduction. Thirdly, a support vector machine (SVM) is applied to carry out multiple binary classifications of the eye movement direction. Finally, 8-direction recognition of the eye movement direction is realized through data fusion. Experimental results show that the classification accuracy of the proposed method reaches 75.77%, 76.7%, 83.38%, 83.64%, 60.49%, 60.93%, 66.03%, and 64.49% for the eight directions, respectively. Compared with traditional methods, the proposed algorithm achieves higher classification accuracy with a simpler realization process. These results further verify the feasibility and effectiveness of using EEG signals to identify eye movement directions for intention recognition.
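The abstract's scheme of fusing multiple binary classifications into an 8-direction decision can be sketched as pairwise (one-vs-one) binary classifiers combined by majority vote. This is a minimal illustration on synthetic stand-in features: the paper uses SVMs on preprocessed EEG, whereas the midpoint-between-class-means linear classifiers and all data dimensions here are simplifying assumptions.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(0)

# Synthetic stand-in for preprocessed per-trial feature vectors,
# one cluster per eye-movement direction (8 classes).
n_classes, n_per_class, n_features = 8, 40, 8
centroids = rng.normal(scale=3.0, size=(n_classes, n_features))
X = np.vstack([centroids[k] + rng.normal(scale=0.3, size=(n_per_class, n_features))
               for k in range(n_classes)])
y = np.repeat(np.arange(n_classes), n_per_class)

def fit_pairwise(X, y, n_classes):
    """One linear binary classifier per class pair (a stand-in for the
    paper's SVMs): separate class i from class j by the hyperplane
    through the midpoint of their class means."""
    mus = np.array([X[y == k].mean(axis=0) for k in range(n_classes)])
    models = {}
    for i, j in combinations(range(n_classes), 2):
        w = mus[i] - mus[j]
        b = -w @ (mus[i] + mus[j]) / 2.0
        models[(i, j)] = (w, b)
    return models

def predict(models, X, n_classes):
    """Fuse the binary decisions by majority vote into one of 8 directions."""
    votes = np.zeros((len(X), n_classes), dtype=int)
    for (i, j), (w, b) in models.items():
        side_i = X @ w + b > 0          # True -> this pair votes for class i
        votes[side_i, i] += 1
        votes[~side_i, j] += 1
    return votes.argmax(axis=1)

models = fit_pairwise(X, y, n_classes)
accuracy = (predict(models, X, n_classes) == y).mean()
```

With 8 classes the one-vs-one decomposition yields 28 binary classifiers; the majority-vote fusion is one common way such binary decisions are combined into a multi-class label.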


Author(s):  
Kristina Tornbjerg ◽  
Anne Marie Kanstrup ◽  
Mikael B. Skov ◽  
Matthias Rehm
