Handshake: Realistic Human-Robot Interaction in Haptic Enhanced Virtual Reality

2011 ◽  
Vol 20 (4) ◽  
pp. 371-392 ◽  
Author(s):  
Zheng Wang ◽  
Elias Giannopoulos ◽  
Mel Slater ◽  
Angelika Peer

This paper focuses on the development and evaluation of a haptic-enhanced virtual reality system that allows a human user to perform physical handshakes with a virtual partner through a haptic interface. Multimodal feedback signals are designed to generate the illusion that a handshake with a robotic arm is a handshake with another human. Advanced controllers for the haptic interface are developed to respond to user behavior online, using techniques such as a hidden-Markov-model approach to estimating the human's interaction strategy. Human-robot handshake experiments were carried out to evaluate the performance of the system, comparing two approaches to haptic rendering: a basic-mode controller that plays back a trajectory embedded in the robot and disregards the human partner, and an interactive controller that generates behavior online. Both were compared against a ground truth in which another human drove the robot via teleoperation in place of the virtual-partner controller. In the evaluation, the human-driven condition was rated most human-like, followed closely by the interactive controller and then by the basic-mode controller. The paper concentrates on the development of the haptic rendering algorithm for the handshaking system and its integration with visual and haptic cues, and reports the results of the subjective evaluation experiments.
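
The abstract names a hidden-Markov-model approach to estimating the human's interaction strategy but gives no implementation detail. Below is a minimal sketch of the general idea under assumed details: online HMM filtering over discretized grip-force observations, with two hypothetical strategy states ("leading" and "following") and made-up transition and emission probabilities. It illustrates the filtering recursion, not the authors' model.

```python
import numpy as np

# Hypothetical two-state HMM: does the human lead or follow the handshake?
states = ["leading", "following"]
A = np.array([[0.9, 0.1],          # assumed state-transition probabilities
              [0.1, 0.9]])
# Assumed emission probabilities over three discretized force levels: low, mid, high
B = np.array([[0.1, 0.3, 0.6],     # "leading" tends to produce high force
              [0.6, 0.3, 0.1]])    # "following" tends to produce low force
belief = np.array([0.5, 0.5])      # uniform prior over strategies

def update(belief, obs):
    """One step of the HMM forward (filtering) recursion."""
    predicted = A.T @ belief              # propagate belief through transitions
    posterior = predicted * B[:, obs]     # weight by observation likelihood
    return posterior / posterior.sum()    # renormalize

# Example: a stream of force observations sampled during the handshake
for obs in [2, 2, 1, 2, 0]:
    belief = update(belief, obs)
    print(dict(zip(states, belief.round(3))))
```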

Author(s):  
Ming C. Leu ◽  
Aditya Velivelli ◽  
Xiaobo Peng

This paper presents the development of a virtual sculpting system that enables the user to create a freeform model by carving a virtual workpiece with a virtual tool, with a haptic interface providing force feedback during the sculpting process. A virtual reality approach provides stereoscopic viewing and force feedback, making model creation in the virtual environment easier and more intuitive. The development of this system involves integrating techniques and algorithms from geometric modeling, computer graphics, and haptic rendering. Multithreading is used to address the different update rates required by the graphic and haptic displays.
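
Haptic displays are typically updated near 1 kHz while graphics refresh at roughly 30-60 Hz, which is why a single loop cannot serve both. The abstract does not describe the authors' threading design, so the following is only a minimal sketch of the common pattern: two loops at different rates sharing tool state behind a lock, with all names and rates being illustrative assumptions.

```python
import threading, time

state_lock = threading.Lock()
tool_position = [0.0, 0.0, 0.0]   # shared tool state (illustrative)
running = True

def haptic_loop():
    """High-rate loop (~1 kHz): read the device, compute contact force."""
    while running:
        with state_lock:
            pos = list(tool_position)      # snapshot shared state under the lock
        # ... force computation against the workpiece would go here ...
        time.sleep(0.001)                  # ~1 kHz target rate

def graphics_loop():
    """Low-rate loop (~60 Hz): redraw the scene from the latest state."""
    while running:
        with state_lock:
            pos = list(tool_position)
        # ... scene redraw would go here ...
        time.sleep(1 / 60)                 # ~60 Hz target rate

threads = [threading.Thread(target=haptic_loop, daemon=True),
           threading.Thread(target=graphics_loop, daemon=True)]
for t in threads:
    t.start()
time.sleep(1.0)   # let the loops run briefly for demonstration
running = False
```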


2020 ◽  
Author(s):  
Agnieszka Wykowska ◽  
Jairo Pérez-Osorio ◽  
Stefan Kopp

This booklet is a collection of the position statements accepted for the HRI’20 conference workshop “Social Cognition for HRI: Exploring the relationship between mindreading and social attunement in human-robot interaction” (Wykowska, Perez-Osorio & Kopp, 2020). Unfortunately, due to the rapid unfolding of the novel coronavirus at the beginning of the year, the conference, and consequently our workshop, was canceled. In light of these events, we decided to compile the position statements accepted for the workshop. The contributions collected in these pages highlight the role of the attribution of mental states to artificial agents in human-robot interaction, and specifically the presence and quality of the social attunement mechanisms that are known to make human interaction smooth, efficient, and robust. The papers also accentuate the importance of a multidisciplinary approach to advancing our understanding of the factors and consequences of social interaction with artificial agents.


2021 ◽  
Author(s):  
Polona Caserman ◽  
Augusto Garcia-Agundez ◽  
Alvar Gámez Zerban ◽  
Stefan Göbel

Abstract Cybersickness (CS) is a term used to refer to symptoms, such as nausea, headache, and dizziness, that users experience during or after immersion in virtual reality. Initially discovered in flight simulators, CS also appears to be caused by current-generation commercial virtual reality (VR) head-mounted displays (HMDs), albeit with a different character and severity. The goal of this work is to summarize recent literature on CS with modern HMDs, to determine the specific profile of CS caused by immersive VR, and to provide an outlook on future research areas. A systematic review was performed on the IEEE Xplore, PubMed, ACM, and Scopus databases covering 2013 to 2019, and 49 publications were selected. The synthesis describes how different VR HMDs impact CS, how the nature of movement in VR HMDs contributes to CS, and how biosensors can be used to detect CS. The results of the meta-analysis show that although current-generation VR HMDs cause significantly less CS ($$p < 0.001$$), some symptoms remain just as intense. Further results show that the nature of movement, and in particular sensory mismatch and perceived motion, has been the leading cause of CS. We suggest an outlook on future research, including the use of galvanic skin response to evaluate CS in combination with the gold standard (Simulator Sickness Questionnaire, SSQ), as well as an update to the subjective evaluation scores of the SSQ.
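
For context, the SSQ cited above as the gold standard is conventionally scored from 16 symptom ratings (0-3) grouped into Nausea, Oculomotor, and Disorientation subscales with fixed weights (Kennedy et al., 1993). The helper below applies the standard published weights to raw subscale sums; the function itself is an illustrative sketch, not code from the review.

```python
def ssq_scores(nausea_raw, oculomotor_raw, disorientation_raw):
    """Weighted SSQ subscale and total scores from raw item sums.

    Each raw argument is the sum of the 0-3 ratings of the items
    belonging to that subscale (Kennedy et al., 1993 weighting).
    """
    nausea = nausea_raw * 9.54
    oculomotor = oculomotor_raw * 7.58
    disorientation = disorientation_raw * 13.92
    total = (nausea_raw + oculomotor_raw + disorientation_raw) * 3.74
    return {"N": nausea, "O": oculomotor, "D": disorientation, "TS": total}

# Example: mild symptoms reported after a VR session
print(ssq_scores(nausea_raw=3, oculomotor_raw=4, disorientation_raw=2))
```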


2018 ◽  
Vol 9 (1) ◽  
pp. 168-182 ◽  
Author(s):  
Mina Marmpena ◽  
Angelica Lim ◽  
Torbjørn S. Dahl

Abstract Human-robot interaction in social robotics applications could be greatly enhanced by robotic behaviors that incorporate emotional body language. Starting from a set of pre-designed, emotion-conveying animations created by professional animators for the Pepper robot, we explore how humans perceive their affective content and increase their usability by annotating them with reliable labels of valence and arousal on a continuous interval scale. We conducted an experiment in which 20 participants were presented with the animations and rated them in the two-dimensional affect space. An inter-rater reliability analysis was applied to support the aggregation of the ratings into the final labels. The resulting set of emotional body language animations, labeled with valence and arousal, is available and can be useful to other researchers as a ground truth for behavioral experiments on robotic expression of emotion, or for the automatic selection of robotic emotional behaviors with respect to valence and arousal. To further utilize the collected data, we analyzed it with an exploratory approach and present some trends in the human perception of Pepper’s emotional body language that may be worth further investigation.
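
The abstract does not name the inter-rater reliability statistic used. A common choice for continuous ratings that will be aggregated across raters is the two-way random-effects, average-measures intraclass correlation, ICC(2,k) (Shrout & Fleiss, 1979); the sketch below computes it from a targets-by-raters matrix and should be read as an illustration of that general procedure, not the authors' analysis.

```python
import numpy as np

def icc2k(ratings):
    """ICC(2,k): two-way random-effects, average-measures agreement.

    `ratings` is an (n targets x k raters) array, e.g. n animations
    rated for valence by k participants (Shrout & Fleiss, 1979).
    """
    x = np.asarray(ratings, dtype=float)
    n, k = x.shape
    grand = x.mean()
    ss_rows = k * ((x.mean(axis=1) - grand) ** 2).sum()    # between targets
    ss_cols = n * ((x.mean(axis=0) - grand) ** 2).sum()    # between raters
    ss_err = ((x - grand) ** 2).sum() - ss_rows - ss_cols  # residual
    ms_rows = ss_rows / (n - 1)
    ms_cols = ss_cols / (k - 1)
    ms_err = ss_err / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (ms_cols - ms_err) / n)

# Example: 4 animations rated by 3 raters on the valence axis
ratings = [[0.8, 0.7, 0.9],
           [-0.5, -0.4, -0.6],
           [0.1, 0.0, 0.2],
           [0.6, 0.5, 0.7]]
print(f"ICC(2,k) = {icc2k(ratings):.3f}")
```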


2019 ◽  
Vol 24 (4) ◽  
pp. 542-549 ◽  
Author(s):  
Arif Pramudwiatmoko ◽  
Satoru Tsutoh ◽  
Gregory Gutmann ◽  
Yutaka Ueno ◽  
Akihiko Konagaya
