Real-time vision-based telepresence robot hand control

Author(s): Chuanyun Deng, Jie Lu, Tin Lun Lam
Sensors, 2021, Vol. 21 (2), pp. 663

Author(s): Yuji Yamakawa, Yutaro Matsui, Masatoshi Ishikawa

In this research, we focused on human-robot collaboration. There were two goals: (1) to develop and evaluate a real-time human-robot collaborative system, and (2) to achieve concrete tasks, such as collaborative peg-in-hole, using the developed system. We proposed an algorithm for visual sensing and robot hand control to perform collaborative motion, and we analyzed the stability of the collaborative system and the so-called collaborative error caused by image processing and latency. We achieved collaborative motion with the developed system and evaluated the collaborative error on the basis of the analysis results. Moreover, we aimed to realize a collaborative peg-in-hole task, which requires a system with high speed and high accuracy. To achieve this goal, we analyzed the conditions required for performing the collaborative peg-in-hole task from the viewpoints of geometric, force, and posture conditions. Finally, we present the experimental results and data of the collaborative peg-in-hole task and examine the effectiveness of our collaborative system.
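The core of such a system is a high-rate loop that turns each camera frame into a hand-motion command, with image-processing time and actuation latency feeding directly into the collaborative error. The sketch below is a minimal illustration of that kind of visual-feedback loop, not the authors' implementation: the `detect_target` step and the simplified 2-D "hand position" are assumptions made for the example.

```python
import time
import numpy as np
import cv2  # OpenCV for capture and image processing


def detect_target(frame):
    """Hypothetical image-processing step: centroid (x, y) of a bright
    marker in the frame, or None if nothing is found."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)
    m = cv2.moments(mask)
    if m["m00"] == 0:
        return None
    return np.array([m["m10"] / m["m00"], m["m01"] / m["m00"]])


def control_loop(camera_index=0, gain=0.5, frames=300):
    """Proportional visual-servoing loop: the (simplified, pixel-space)
    hand position is driven toward the detected target each frame.
    The measured cycle time is the latency that contributes to the
    collaborative error discussed in the abstract."""
    cap = cv2.VideoCapture(camera_index)
    hand_pos = np.zeros(2)
    for _ in range(frames):
        t0 = time.perf_counter()
        ok, frame = cap.read()
        if not ok:
            break
        target = detect_target(frame)
        if target is not None:
            # A real system would send this correction to the hand controller.
            hand_pos += gain * (target - hand_pos)
        latency = time.perf_counter() - t0
        print(f"cycle {latency * 1e3:.1f} ms, hand at {hand_pos}")
    cap.release()


if __name__ == "__main__":
    control_loop()
```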


2010, Vol. 37-38, pp. 923-926
Author(s): Ming He Jin, Dong Jian Wang, Shao Wei Fan, Zhao Peng Chen

This paper presents a control platform for the HIT/DLR II robot hand based on the QNX real-time operating system and Simulink. The platform accelerates the development of robot hand control algorithms and hardware-in-the-loop (HIL) testing by using the Model-Based Design techniques of MATLAB/Simulink. The performance of the platform is validated with two applications: single-finger position control and a grasp application.
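As a rough illustration of the kind of controller prototyped on such a platform, the following is a minimal discrete PID position loop for a single finger joint. The first-order joint model and the gains are invented for the example and are not taken from the paper.

```python
import numpy as np


def pid_position_control(target_deg, steps=500, dt=0.001,
                         kp=8.0, ki=40.0, kd=0.2):
    """Discrete PID loop for one finger joint, modelled as a crude
    first-order actuator (illustrative assumptions only)."""
    angle, velocity = 0.0, 0.0
    integral, prev_error = 0.0, 0.0
    history = []
    for _ in range(steps):
        error = target_deg - angle
        integral += error * dt
        derivative = (error - prev_error) / dt
        torque = kp * error + ki * integral + kd * derivative
        prev_error = error
        # Torque drives joint velocity against viscous damping.
        velocity += (torque - 2.0 * velocity) * dt
        angle += velocity * dt
        history.append(angle)
    return np.array(history)


trajectory = pid_position_control(target_deg=30.0)
print(f"final angle: {trajectory[-1]:.2f} deg")
```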


1989, Vol. 9 (3), pp. 38-43
Author(s): H. Liu, T. Iberall, G.A. Bekey

2000, Vol. 2000.1 (0), pp. 467-468
Author(s): Haruhisa Kawasaki, Tatsuhisa Abe, Tetsuya Mouri, Kazunao Utiyama

2012, Vol. 463-464, pp. 1147-1150
Author(s): Constantin Catalin Moldovan, Ionel Staretu

Object tracking in three-dimensional environments is an area of research that has attracted considerable attention lately because of its potential for human-machine interaction. Real-time hand gesture detection and recognition from a video stream plays a significant role in human-computer interaction, and it remains a difficult task for current digital image processing applications. This paper presents a new method for human hand control in virtual environments that eliminates the need for the external devices currently used for hand motion capture and digitization. The first step in this direction is the detection of the human hand, followed by the detection of gestures and their use to control a virtual hand in a virtual environment.
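As a sketch of that first step, detecting the hand in a video stream without any external device, the example below segments skin-coloured pixels in HSV space and keeps the largest contour. The colour thresholds are generic assumptions for illustration, not the segmentation method reported in the paper.

```python
import cv2
import numpy as np

# Generic HSV skin-colour range; these bounds are illustrative assumptions.
SKIN_LOW = np.array([0, 30, 60], dtype=np.uint8)
SKIN_HIGH = np.array([20, 150, 255], dtype=np.uint8)


def detect_hand(frame):
    """Return the contour of the largest skin-coloured region, or None."""
    hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
    mask = cv2.inRange(hsv, SKIN_LOW, SKIN_HIGH)
    mask = cv2.morphologyEx(mask, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    if not contours:
        return None
    return max(contours, key=cv2.contourArea)


cap = cv2.VideoCapture(0)
while cap.isOpened():
    ok, frame = cap.read()
    if not ok:
        break
    hand = detect_hand(frame)
    if hand is not None:
        # Outline the detected hand region in the live preview.
        cv2.drawContours(frame, [hand], -1, (0, 255, 0), 2)
    cv2.imshow("hand", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```

The detected contour (fingertip count, convexity defects, centroid motion) would then feed a gesture classifier that drives the virtual hand.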

