Body Language in Affective Human-Robot Interaction

Author(s):  
Darja Stoeva ◽  
Margrit Gelautz


2018 ◽  
Vol 9 (1) ◽  
pp. 168-182 ◽  
Author(s):  
Mina Marmpena ◽  
Angelica Lim ◽  
Torbjørn S. Dahl

Abstract Human-robot interaction in social robotics applications could be greatly enhanced by robotic behaviors that incorporate emotional body language. Using as our starting point a set of pre-designed, emotion-conveying animations created by professional animators for the Pepper robot, we explore how humans perceive their affective content, and we increase their usability by annotating them with reliable labels of valence and arousal in a continuous interval space. We conducted an experiment in which 20 participants were presented with the animations and rated them in the two-dimensional affect space. An inter-rater reliability analysis was applied to support the aggregation of the ratings into the final labels. The set of emotional body language animations with the valence and arousal labels is available and can potentially be useful to other researchers as ground truth for behavioral experiments on robotic expression of emotion, or for the automatic selection of robotic emotional behaviors with respect to valence and arousal. To further utilize the collected data, we analyzed it with an exploratory approach and present some interesting trends in the human perception of Pepper's emotional body language that may be worth further investigation.
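The labeling pipeline described above (collect per-rater valence/arousal ratings, check inter-rater agreement, then aggregate into one label per animation) can be sketched as follows. This is an illustrative sketch only, not the authors' actual analysis: the data is synthetic, the number of animations and the rating scale are assumptions, and mean pairwise correlation is used here as a simple reliability proxy rather than the specific reliability statistic used in the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in data: 20 raters x 10 animations, valence ratings
# in [-1, 1]. (The paper's real animation set and scale may differ.)
true_valence = rng.uniform(-1, 1, size=10)            # latent affect per animation
ratings = true_valence + rng.normal(0, 0.2, size=(20, 10))
ratings = np.clip(ratings, -1, 1)

# Reliability proxy: mean pairwise Pearson correlation between raters.
n_raters = ratings.shape[0]
corr = np.corrcoef(ratings)                           # raters x raters matrix
pairwise = corr[np.triu_indices(n_raters, k=1)]       # upper triangle, no diagonal
mean_r = pairwise.mean()

# If agreement is acceptable, derive one label per animation by averaging.
labels = ratings.mean(axis=0)

print(f"mean inter-rater correlation: {mean_r:.2f}")
print("aggregated valence labels:", np.round(labels, 2))
```

The same procedure would be run twice in practice, once for valence and once for arousal, yielding a (valence, arousal) pair per animation.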


2012 ◽  
Vol 605-607 ◽  
pp. 1656-1660
Author(s):  
Temsiri Sapsaman ◽  
Teerawat Benjawilaikul

To enhance human-robot interaction, social robots have been developed with a focus on facial expression and verbal language. However, little has been done on emotional expression through a robot's body language. This work uses a parameterization method, grounded in human emotion theory and experiments, to find robot parameters for expressing emotions through body language. Mapping is performed in 2- and 3-dimensional emotion spaces, and the obtained coefficients can be used to determine how strongly each motion parameter influences each emotion dimension.
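The mapping the abstract describes, from motion parameters to coordinates in an emotion space, can be illustrated with a least-squares fit. Everything here is hypothetical: the motion parameters (speed, amplitude, head tilt), the 2-D emotion space (valence, arousal), and the ratings are assumptions made only to show the shape of such a mapping, not the paper's actual parameterization.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical motion parameters (columns): speed, amplitude, head tilt,
# for 30 synthetic gestures. The abstract does not name the real parameters.
X = rng.uniform(0, 1, size=(30, 3))

# Synthetic (valence, arousal) ratings generated from an assumed linear
# rule plus noise, purely so the fit below recovers something meaningful.
true_W = np.array([[0.2, 0.9],     # speed     -> (valence, arousal)
                   [0.7, 0.3],     # amplitude -> (valence, arousal)
                   [-0.5, 0.1]])   # head tilt -> (valence, arousal)
Y = X @ true_W + rng.normal(0, 0.05, size=(30, 2))

# Least-squares fit: each coefficient estimates the influence level of
# one motion parameter on one emotion dimension.
W, residuals, rank, _ = np.linalg.lstsq(X, Y, rcond=None)

print("estimated coefficients (rows: parameters, cols: valence/arousal):")
print(np.round(W, 2))
```

A 3-dimensional emotion space would work identically with a third column in `Y` and `W`.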


2015 ◽  
Vol 29 (6) ◽  
pp. 1216-1248 ◽  
Author(s):  
Junchao Xu ◽  
Joost Broekens ◽  
Koen Hindriks ◽  
Mark A. Neerincx

2009 ◽  
Author(s):  
Matthew S. Prewett ◽  
Kristin N. Saboe ◽  
Ryan C. Johnson ◽  
Michael D. Coovert ◽  
Linda R. Elliott

2010 ◽  
Author(s):  
Eleanore Edson ◽  
Judith Lytle ◽  
Thomas McKenna

2020 ◽  
Author(s):  
Agnieszka Wykowska ◽  
Jairo Pérez-Osorio ◽  
Stefan Kopp

This booklet is a collection of the position statements accepted for the HRI'20 conference workshop "Social Cognition for HRI: Exploring the relationship between mindreading and social attunement in human-robot interaction" (Wykowska, Perez-Osorio & Kopp, 2020). Unfortunately, due to the rapid unfolding of the novel coronavirus at the beginning of the year, the conference, and consequently our workshop, were canceled. In light of these events, we decided to put together the position statements accepted for the workshop. The contributions collected in these pages highlight the role of the attribution of mental states to artificial agents in human-robot interaction, and specifically the quality and presence of the social attunement mechanisms that are known to make human interaction smooth, efficient, and robust. These papers also accentuate the importance of a multidisciplinary approach to advancing the understanding of the factors and consequences of social interactions with artificial agents.

