Multimodal social scenario perception model for initial human-robot interaction

Author(s):  
Diego Cardoso Alves
Paula Dornhofer Paro Costa

Human-robot interaction imposes many challenges, and artificial intelligence researchers are pressed to improve scene perception, social navigation, and engagement. Great attention is being dedicated to computer vision and multimodal sensing approaches focused on the evolution of social robotic systems and the improvement of social model accuracy. Most recent work in social robotics addresses the engagement process with a focus on maintaining a previously established conversation. This work instead studies initial human-robot interaction contexts, proposing a system able to analyze a social scenario by detecting and analyzing persons and surrounding features in a scene. RGB and depth frames, as well as audio data, were used to achieve better performance in indoor scene monitoring and human behavior analysis.
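The abstract does not detail the perception pipeline, so the following Python sketch only illustrates the general idea of fusing RGB, depth, and audio for initial scene analysis: people are detected with an off-the-shelf OpenCV detector, their distance is estimated from the aligned depth map, and audio energy serves as a coarse activity cue. All function names, thresholds, and the choice of detector are assumptions for illustration, not taken from the paper.

```python
# Hypothetical sketch of multimodal (RGB + depth + audio) scene perception
# for an initial human-robot interaction scenario.
import cv2
import numpy as np

# HOG-based person detector from OpenCV (a stand-in for whatever detector
# the authors actually use).
_hog = cv2.HOGDescriptor()
_hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

def perceive_scene(rgb_frame, depth_frame, audio_chunk):
    """Return a coarse social-scene descriptor from synchronized sensor data.

    rgb_frame:   HxWx3 uint8 image
    depth_frame: HxW float32 depth map in metres, aligned with rgb_frame
    audio_chunk: 1-D float32 waveform covering the same time window
    """
    # 1. Detect people in the RGB frame.
    boxes, _ = _hog.detectMultiScale(rgb_frame, winStride=(8, 8))

    # 2. Use depth to estimate each detected person's distance to the robot.
    persons = []
    for (x, y, w, h) in boxes:
        region = depth_frame[y:y + h, x:x + w]
        distance = float(np.nanmedian(region)) if region.size else float("nan")
        persons.append({"bbox": (int(x), int(y), int(w), int(h)),
                        "distance_m": distance})

    # 3. Use audio energy as a crude cue for ongoing conversation/activity.
    rms = float(np.sqrt(np.mean(np.square(audio_chunk)))) if audio_chunk.size else 0.0

    # 4. Fuse the modalities into one descriptor an interaction policy can consume.
    return {
        "num_people": len(persons),
        "persons": persons,
        "audio_rms": rms,
        "speech_likely": rms > 0.02,   # illustrative threshold
    }
```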

Author(s):  
Mauro Dragone
Joe Saunders
Kerstin Dautenhahn

Abstract: Enabling robots to seamlessly operate as part of smart spaces is an important and long-standing challenge for robotics R&D and a key enabler for a range of advanced robotic applications, such as Ambient Assisted Living (AAL) and home automation. The integration of these technologies is currently being pursued from two largely distinct viewpoints. On the one hand, people-centred initiatives focus on improving the user's acceptance by tackling human-robot interaction (HRI) issues, often adopting a social robotics approach, and by giving the designer and, to a limited degree, the final user(s) control over personalization and product customisation features. On the other hand, technologically driven initiatives are building impersonal but intelligent systems that are able to proactively and autonomously adapt their operations to fit changing requirements and evolving users' needs, but which largely ignore and do not leverage human-robot interaction and may thus lead to poor user experience and acceptance. In order to inform the development of a new generation of smart robotic spaces, this paper analyses and compares different research strands with a view to proposing possible integrated solutions with both advanced HRI and online adaptation capabilities.


Author(s):  
WenDong Wang
JunBo Zhang
Xin Wang
XiaoQing Yuan
Peng Zhang

Abstract: The motion intensity of the patient is significant for the trajectory control of an exoskeleton robot during rehabilitation, as it may have an important influence on the training effect and on human–robot interaction. To design rehabilitation training tasks according to the patient's situation, a novel control method for a rehabilitation exoskeleton robot is designed based on a motion intensity perception model. The motion signals of the robot and the heart rate signal of the patient are collected and fused into multi-modal information used as the input layer vector of a deep learning framework, which forms the human–robot interaction model of the control system. A previously designed 6-degree-of-freedom (DOF) upper limb rehabilitation exoskeleton robot is used to implement the test. The parameters of the model are iteratively optimized by grouping the experimental data, and the identification performance of the model is analyzed and compared. The average recognition accuracy of the proposed model reaches up to 99.0% on the training data set and 95.7% on the test data set. The experimental results show that the proposed motion intensity perception model based on a deep neural network (DNN) and the trajectory control method can improve the performance of human–robot interaction, and it is possible to further improve the effect of rehabilitation training.
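The abstract does not specify the network architecture, so the following is only a minimal PyTorch sketch of the general idea: robot motion features and heart-rate features are concatenated into one multi-modal input vector and fed to a small feed-forward network that predicts a motion intensity class. Layer sizes, feature dimensions, and the three intensity classes are assumptions for illustration.

```python
# Hypothetical sketch of a DNN-based motion intensity classifier that fuses
# exoskeleton motion signals with the patient's heart-rate features.
import torch
import torch.nn as nn

class MotionIntensityNet(nn.Module):
    def __init__(self, n_motion_features=12, n_hr_features=4, n_classes=3):
        super().__init__()
        self.net = nn.Sequential(
            nn.Linear(n_motion_features + n_hr_features, 64),
            nn.ReLU(),
            nn.Linear(64, 32),
            nn.ReLU(),
            nn.Linear(32, n_classes),   # e.g. low / medium / high intensity
        )

    def forward(self, motion, heart_rate):
        # Multi-modal fusion by simple feature concatenation.
        x = torch.cat([motion, heart_rate], dim=-1)
        return self.net(x)

# Example usage on a batch of 8 synchronized samples.
model = MotionIntensityNet()
motion = torch.randn(8, 12)        # joint angles / velocities from the exoskeleton
heart_rate = torch.randn(8, 4)     # heart-rate derived features
logits = model(motion, heart_rate)
intensity = logits.argmax(dim=-1)  # predicted intensity class per sample
```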


2009
Author(s):
Matthew S. Prewett
Kristin N. Saboe
Ryan C. Johnson
Michael D. Coovert
Linda R. Elliott

2010
Author(s):
Eleanore Edson
Judith Lytle
Thomas McKenna
