Towards a robust drum stroke recognition system for human robot interaction

Author(s):  
Gokhan Ince ◽  
Taha Berkay Duman ◽  
Rabia Yorganci ◽  
Hatice Kose


Robotica ◽
2014 ◽  
Vol 33 (1) ◽  
pp. 1-18 ◽  
Author(s):  
Alberto Poncela ◽  
Leticia Gallardo-Estrella

SUMMARY: Verbal communication is the most natural way of human–robot interaction. Such interaction is usually achieved by means of a human–robot interface (HRI). In this paper, an HRI is presented to teleoperate a robotic platform via the user's voice; hence, a speech recognition system is necessary. In this work, a user-dependent acoustic model for Spanish speakers has been developed to teleoperate a robot with a set of commands. Experimental results have been successful, both in terms of a high recognition rate and in the navigation of the robot under the control of the user's voice.
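The voice-teleoperation pipeline the abstract describes can be sketched as a dispatcher that maps recognized command words to robot velocities. The Spanish command vocabulary, velocity values, and confidence threshold below are illustrative assumptions, not the paper's actual command set:

```python
# Minimal sketch of a voice-command dispatcher for robot teleoperation.
# Command words, velocities, and the confidence threshold are assumed
# for illustration; the paper's real command set is not given here.

COMMANDS = {
    "avanza":    (0.2, 0.0),   # forward: (linear m/s, angular rad/s)
    "retrocede": (-0.2, 0.0),  # backward
    "izquierda": (0.0, 0.5),   # turn left
    "derecha":   (0.0, -0.5),  # turn right
    "para":      (0.0, 0.0),   # stop
}

def dispatch(recognized_word, confidence, threshold=0.7):
    """Map a recognized command to a velocity pair; halt when unsure."""
    if confidence < threshold or recognized_word not in COMMANDS:
        return (0.0, 0.0)  # fail safe: stop on low-confidence output
    return COMMANDS[recognized_word]
```

The fail-safe default (stopping on low-confidence or out-of-vocabulary output) is one common design choice for keeping a user-dependent recognizer safe during navigation.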


2014 ◽  
Vol 11 (04) ◽  
pp. 1442008 ◽  
Author(s):  
Kwang-Eun Ko ◽  
Kwee-Bo Sim

This paper is concerned with an imitative neural mechanism for recognizing behavioral intention in a human–robot interaction system. The intention recognition process is inspired by the neural mechanism of the mirror neurons in the macaque monkey brain. We adapt a standard neural network with parametric biases as a reference model for imitating sensory-motor data pairs. The imitation process is primarily directed toward reproducing the goals of observed actions rather than the exact action trajectories. Several experiments and their results show that the proposed model supports the development of useful robotic applications for human–robot interaction systems.
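The core idea of a network with parametric biases can be sketched in a toy form: shared weights encode common dynamics, while a small per-behavior bias vector is inferred by gradient descent so the same model reproduces different observed behaviors. The one-dimensional model and numbers below are illustrative assumptions, far simpler than the paper's network:

```python
# Toy sketch of the parametric-bias idea: keep the shared weight w
# fixed and adapt only a per-behavior bias to fit observed
# (input, target) pairs. All values are illustrative.

def forward(w, bias, x):
    return w * x + bias

def infer_bias(w, pairs, lr=0.1, steps=200):
    """Gradient descent on the bias alone, minimizing squared error."""
    bias = 0.0
    for _ in range(steps):
        for x, target in pairs:
            err = forward(w, bias, x) - target
            bias -= lr * err  # d/d(bias) of 0.5 * err**2 is err
    return bias

# Two "behaviors" share w = 1.0 but differ in their inferred bias,
# so the bias acts as a compact code for the observed behavior.
w = 1.0
bias_a = infer_bias(w, [(0.0, 1.0), (1.0, 2.0)])   # offset +1 behavior
bias_b = infer_bias(w, [(0.0, -1.0), (1.0, 0.0)])  # offset -1 behavior
```

Recognizing an observed behavior then amounts to inferring which bias value best explains it, which matches the goal-level (rather than trajectory-level) imitation the abstract emphasizes.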


2013 ◽  
Vol 10 (01) ◽  
pp. 1350010 ◽  
Author(s):  
GINEVRA CASTELLANO ◽  
IOLANDA LEITE ◽  
ANDRÉ PEREIRA ◽  
CARLOS MARTINHO ◽  
ANA PAIVA ◽  
...  

Affect recognition for socially perceptive robots relies on representative data. While many of the existing affective corpora and databases contain posed and decontextualized affective expressions, affect resources for designing an affect recognition system in naturalistic human–robot interaction (HRI) must include context-rich expressions that emerge in the same scenario of the final application. In this paper, we propose a context-based approach to the collection and modeling of representative data for building an affect-sensitive robotic game companion. To illustrate our approach we present the key features of the Inter-ACT (INTEracting with Robots–Affect Context Task) corpus, an affective and contextually rich multimodal video corpus containing affective expressions of children playing chess with an iCat robot. We show how this corpus can be successfully used to train a context-sensitive affect recognition system (a valence detector) for a robotic game companion. Finally, we demonstrate how the integration of the affect recognition system in a modular platform for adaptive HRI makes the interaction with the robot more engaging.
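The context-based valence detection described above can be sketched as a classifier that combines expression-level cues with task-context cues before deciding. The feature names and weights below are illustrative assumptions, not the Inter-ACT corpus feature set:

```python
# Minimal sketch of a context-sensitive valence detector: expression
# features and task-context features (e.g. game state) feed one
# weighted score. Names and weights are assumed for illustration.

WEIGHTS = {
    "smile":         1.5,   # facial-expression cue
    "gaze_at_robot": 0.5,   # engagement cue
    "losing_game":  -1.0,   # task context: a losing position lowers valence
}

def valence(features):
    """Return 'positive' or 'negative' from a weighted feature sum."""
    score = sum(WEIGHTS[name] * value for name, value in features.items())
    return "positive" if score >= 0 else "negative"
```

The point of the sketch is the abstract's argument: the same smile reads differently depending on the game state, so context features must enter the model alongside expression features.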


2019 ◽  
Vol 16 (4) ◽  
pp. 172988141986176 ◽  
Author(s):  
Bo Chen ◽  
Chunsheng Hua ◽  
Bo Dai ◽  
Yuqing He ◽  
Jianda Han

This article proposes an online control programming algorithm for human–robot interaction systems, where robot actions are controlled by the recognition results of gestures performed by human operators based on visual images. In contrast to traditional robot control systems that use pre-defined programs to control a robot, where the robot cannot change its tasks freely, this system allows the operator to train online and replan human–robot interaction tasks in real time. The proposed system comprises three components: an online personal feature pretraining system, a gesture recognition system, and a task replanning system for robot control. First, we collected and analyzed features extracted from images of human gestures and used those features to train the recognition program in real time. Second, a multifeature cascade classifier algorithm was applied to guarantee both the accuracy and real-time processing of our gesture recognition method. Finally, to confirm the effectiveness of our algorithm, we selected a flight robot as our test platform to conduct an online robot control experiment based on the visual gesture recognition algorithm. Through extensive experiments, the effectiveness and efficiency of our method have been confirmed.
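The multifeature cascade structure mentioned above can be sketched as a chain of per-feature stages where cheap tests run first and reject non-gesture candidates early, so expensive stages only see promising regions. The stage functions and thresholds below are illustrative stand-ins, not the paper's actual image features:

```python
# Sketch of a multifeature cascade classifier: each stage tests one
# feature and early rejection keeps the pipeline fast enough for
# real-time use. Feature names and thresholds are assumed.

def skin_color_stage(candidate):
    return candidate.get("skin_ratio", 0.0) > 0.3

def shape_stage(candidate):
    return candidate.get("contour_match", 0.0) > 0.5

def motion_stage(candidate):
    return candidate.get("motion_energy", 0.0) > 0.2

# Stages ordered cheapest-first; all() short-circuits on the first
# failing stage, which is where the cascade's speed comes from.
CASCADE = [skin_color_stage, shape_stage, motion_stage]

def classify(candidate):
    """Accept only candidates that pass every cascade stage."""
    return all(stage(candidate) for stage in CASCADE)
```

Ordering stages cheapest-first is the usual cascade design choice: most image regions fail an early, inexpensive test, which is what makes real-time gesture recognition on a flying platform plausible.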

