Human-robot interaction for manipulation tasks based on stroke gesture recognition
Purpose
The purpose of this paper is to extend the use of stroke gestures to manipulation tasks, making the interaction between human and robot more efficient.

Design/methodology/approach
In this paper, a set of stroke gestures is designed for typical manipulation tasks. A gesture recognition and parameter extraction system is proposed to exploit the information in stroke gestures drawn by the users.

Findings
The results show that the designed gesture recognition subsystem reaches a recognition accuracy of 99.00 per cent. The parameter extraction subsystem successfully extracts the parameters needed for typical manipulation tasks with a success rate of about 86.30 per cent. The system shows acceptable performance in the experiments.

Practical implications
Using stroke gestures in manipulation tasks makes the transmission of human intentions to robots more efficient. The proposed gesture recognition subsystem is based on a convolutional neural network, which is robust to varied inputs. The parameter extraction subsystem extracts the spatial information encoded in stroke gestures.

Originality/value
The author designs stroke gestures for manipulation tasks, which extends the usage of stroke gestures. The proposed gesture recognition and parameter extraction system can use a stroke gesture to obtain both the type of the task and the important parameters for the task simultaneously.
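To illustrate the two subsystem roles described above, the sketch below rasterizes a drawn stroke into a fixed-size grid (a common input format for a convolutional classifier) and recovers simple spatial parameters from the same stroke. This is a minimal illustration only: the paper's actual preprocessing, network, and parameter set are not specified here, so the function names, the grid size, and the choice of start point, end point, and direction angle as example parameters are all assumptions.

```python
import math

def rasterize_stroke(points, grid=28):
    """Normalize the stroke's bounding box and draw its sample points
    onto a grid x grid binary image, a typical CNN input format.
    (Assumed preprocessing; the paper's exact pipeline may differ.)"""
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    x_min, y_min = min(xs), min(ys)
    # Avoid division by zero for degenerate (single-point or axis-aligned) strokes.
    span = max(max(xs) - x_min, max(ys) - y_min) or 1.0
    scale = (grid - 1) / span
    img = [[0] * grid for _ in range(grid)]
    for x, y in points:
        img[int((y - y_min) * scale)][int((x - x_min) * scale)] = 1
    return img

def extract_parameters(points):
    """Extract simple spatial parameters encoded in the stroke:
    start point, end point, and direction angle. These are
    hypothetical stand-ins for task parameters such as a target
    location and an approach direction."""
    (x0, y0), (x1, y1) = points[0], points[-1]
    angle = math.degrees(math.atan2(y1 - y0, x1 - x0))
    return {"start": (x0, y0), "end": (x1, y1), "angle_deg": angle}

# A straight horizontal stroke from left to right:
stroke = [(0.0, 0.0), (5.0, 0.0), (10.0, 0.0)]
image = rasterize_stroke(stroke, grid=4)
params = extract_parameters(stroke)  # angle_deg is 0.0 for this stroke
```

In this sketch the classifier would consume `image` to decide the task type, while `params` supplies the spatial arguments for that task, so a single stroke yields both pieces of information at once, as the abstract describes.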