Upper Body Gesture Recognition for Human-Robot Interaction

Author(s):  
Chi-Min Oh ◽  
Md. Zahidul Islam ◽  
Jun-Sung Lee ◽  
Chil-Woo Lee ◽  
In-So Kweon

2019 ◽  
Vol 2019 ◽  
pp. 1-12
Author(s):  
José Carlos Castillo ◽  
Fernando Alonso-Martín ◽  
David Cáceres-Domínguez ◽  
María Malfaz ◽  
Miguel A. Salichs

Human communication relies on several aspects beyond speech. One of them is gesture: gestures express intentions, interests, feelings, or ideas and complement speech. Social robots need to interpret these messages to allow more natural Human-Robot Interaction (HRI). In this sense, our aim is to study the effect of position and speed features on dynamic gesture recognition. We use 3D information to extract the user's skeleton and compute the normalized position of each of its joints, and from the temporal variation of these positions we compute their speeds. Our three datasets comprise 1355 samples from 30 users and cover 14 gestures common in HRI that involve upper-body movements. A set of classification techniques is evaluated on these three datasets to determine which features perform best. Results indicate that the union of speed and position achieves the best performance of the three possibilities, with an F-score of 0.999. The combination that performs best for detecting dynamic gestures in real time is finally integrated into our social robot with a simple HRI application, as a proof-of-concept test of how the proposal behaves in a realistic scenario.
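A minimal sketch of the kind of position + speed feature extraction described above, not the authors' exact pipeline: joint positions are normalized (here, centered on a torso joint and scaled by the torso-neck distance), speeds are taken as the temporal derivative of those positions, and both are concatenated into one feature vector per gesture sample. The joint indices, the pooling over time, and the RandomForest classifier are assumptions for illustration.

import numpy as np
from sklearn.ensemble import RandomForestClassifier

TORSO, NECK = 0, 1  # hypothetical joint indices in the skeleton array

def position_speed_features(skeleton_seq, dt=1.0 / 30.0):
    """skeleton_seq: (T, J, 3) array of 3D joint positions for one gesture sample."""
    # Normalize positions: center on the torso joint, scale by the torso-neck distance
    centered = skeleton_seq - skeleton_seq[:, TORSO:TORSO + 1, :]
    scale = np.linalg.norm(skeleton_seq[:, NECK] - skeleton_seq[:, TORSO], axis=-1)
    positions = centered / np.maximum(scale, 1e-6)[:, None, None]

    # Speeds: temporal variation of the normalized joint positions
    speeds = np.gradient(positions, dt, axis=0)

    # Concatenate position and speed features, then pool over time
    frame_feats = np.concatenate([positions, speeds], axis=-1)
    return frame_feats.reshape(len(skeleton_seq), -1).mean(axis=0)

# Usage (samples: list of (T, J, 3) arrays; labels: gesture classes):
# X = np.stack([position_speed_features(s) for s in samples])
# clf = RandomForestClassifier(n_estimators=200).fit(X, labels)

Dropping either the position or the speed half of the feature vector gives the single-feature baselines the abstract compares against.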


2014 ◽  
Vol 23 (2) ◽  
pp. 133-154 ◽  
Author(s):  
Yang Xiao ◽  
Zhijun Zhang ◽  
Aryel Beck ◽  
Junsong Yuan ◽  
Daniel Thalmann

In this paper, a human–robot interaction system based on a novel combination of sensors is proposed. It allows one person to interact with a humanoid social robot using natural body language. The robot understands the meaning of human upper-body gestures and expresses itself through a combination of body movements, facial expressions, and verbal language. A set of 12 upper-body gestures, including gestures with human–object interactions, is used for communication. The gestures are characterized by head, arm, and hand posture information. The wearable Immersion CyberGlove II captures hand posture, which is combined with the head and arm posture captured by a Microsoft Kinect; this is a new sensor solution for human gesture capture. Based on the posture data from the CyberGlove II and the Kinect, an effective, real-time human gesture recognition method is proposed. This gesture understanding approach, built on an innovative combination of sensors, is the main contribution of the paper. To verify the effectiveness of the proposed gesture recognition method, a human body gesture dataset is built. The experimental results demonstrate that the approach recognizes upper-body gestures with high accuracy in real time. In addition, for robot motion generation and control, a novel online motion planning method is proposed. To generate appropriate dynamic motion, a quadratic programming (QP)-based dual-arm kinematic motion generation scheme is formulated, and a simplified recurrent neural network is employed to solve the QP problem. The integration of a handshake within the HRI system illustrates the effectiveness of the proposed online generation method.
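A minimal sketch of a QP-based dual-arm kinematic step of the kind described above: at each control cycle, joint velocities are chosen to minimize effort while tracking desired end-effector velocities for both arms, subject to joint-rate limits. The paper solves the QP with a simplified recurrent neural network; here a generic QP solver (cvxpy) stands in, and the Jacobians, task dimensions, and limits are placeholders for illustration.

import numpy as np
import cvxpy as cp

def dual_arm_step(J_left, J_right, xdot_left, xdot_right, qdot_max):
    """One resolved-rate step: joint velocities for both arms from desired end-effector velocities."""
    n = J_left.shape[1] + J_right.shape[1]
    qdot = cp.Variable(n)
    # Block-diagonal Jacobian: each arm is coupled only to its own end-effector task
    J = np.block([
        [J_left, np.zeros((J_left.shape[0], J_right.shape[1]))],
        [np.zeros((J_right.shape[0], J_left.shape[1])), J_right],
    ])
    xdot = np.concatenate([xdot_left, xdot_right])
    objective = cp.Minimize(cp.sum_squares(qdot))        # minimum-effort criterion
    constraints = [J @ qdot == xdot,                      # track desired end-effector velocities
                   cp.abs(qdot) <= qdot_max]              # joint-rate limits
    cp.Problem(objective, constraints).solve()
    return qdot.value

# Usage with placeholder 6-DOF arms and 3D linear-velocity tasks:
# Jl = np.random.randn(3, 6); Jr = np.random.randn(3, 6)
# qdot = dual_arm_step(Jl, Jr, np.array([0.0, 0.0, 0.05]), np.array([0.0, 0.0, 0.05]), 1.0)

Repeating this step at the control rate, with Jacobians updated from the current joint configuration, yields the kind of online motion generation used for behaviors such as the handshake mentioned above.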

