Comparison of Human-Robot Interaction Torque Estimation Methods in a Wrist Rehabilitation Exoskeleton

2018 ◽  
Vol 94 (3-4) ◽  
pp. 565-581 ◽  
Author(s):  
Mohammadhossein Saadatzi ◽  
David C. Long ◽  
Ozkan Celik

Sensors ◽  
2019 ◽  
Vol 19 (14) ◽  
pp. 3142 ◽  
Author(s):  
Sai Krishna Pathi ◽  
Andrey Kiselev ◽  
Annica Kristoffersson ◽  
Dirk Repsilber ◽  
Amy Loutfi

Estimating distances between people and robots plays a crucial role in understanding social Human–Robot Interaction (HRI) from an egocentric view. It is a key step if robots are to engage in social interactions and to collaborate with people as part of human–robot teams. Different sensors can be employed for distance estimation between a person and a robot, and the number of challenges a distance estimation method must address rises with the simplicity of the sensor's technology. When estimating distances using individual images from a single camera in an egocentric position, it is often required that individuals in the scene face the camera, do not occlude each other, and are sufficiently visible that specific facial or body features can be identified. In this paper, we propose a novel method for estimating distances between a robot and people using single images from a single egocentric camera. The method builds on established 2D pose estimation, which tolerates partial occlusions, cluttered backgrounds, and relatively low resolution. It estimates the distance to the camera from the Euclidean distance between the ear and torso keypoints of each person in the image plane. These characteristic points were selected for their relatively high visibility regardless of a person's orientation and for their reasonable uniformity across age and gender. Experimental validation demonstrates the effectiveness of the proposed method.
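The abstract's core idea — measuring the ear-to-torso Euclidean distance in the image plane and mapping it to a camera-to-person distance — can be sketched as follows. This is a minimal illustration, not the paper's method: the keypoint coordinates are assumed to come from some 2D pose estimator, and the inverse-proportional distance model with calibration constant `k` is an illustrative assumption rather than the authors' fitted model.

```python
import math

def ear_torso_pixel_distance(ear_xy, torso_xy):
    """Euclidean distance (in pixels) between the ear and torso
    keypoints of one person in the image plane."""
    dx = ear_xy[0] - torso_xy[0]
    dy = ear_xy[1] - torso_xy[1]
    return math.hypot(dx, dy)

def estimate_distance(ear_xy, torso_xy, k=1.0):
    """Map the pixel distance to a metric camera-person distance.

    A person farther from the camera appears smaller, so the pixel
    distance shrinks roughly in proportion; the inverse model and the
    calibration constant k are hypothetical stand-ins for whatever
    mapping the paper actually fits to its data.
    """
    d_px = ear_torso_pixel_distance(ear_xy, torso_xy)
    if d_px == 0:
        raise ValueError("ear and torso keypoints coincide")
    return k / d_px
```

For example, with keypoints 5 pixels apart and `k = 10.0`, the sketch returns an estimated distance of 2.0 (in whatever metric units `k` was calibrated against).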


2009 ◽  
Author(s):  
Matthew S. Prewett ◽  
Kristin N. Saboe ◽  
Ryan C. Johnson ◽  
Michael D. Coovert ◽  
Linda R. Elliott

2010 ◽  
Author(s):  
Eleanore Edson ◽  
Judith Lytle ◽  
Thomas McKenna

2020 ◽  
Author(s):  
Agnieszka Wykowska ◽  
Jairo Pérez-Osorio ◽  
Stefan Kopp

This booklet is a collection of the position statements accepted for the HRI’20 conference workshop “Social Cognition for HRI: Exploring the relationship between mindreading and social attunement in human-robot interaction” (Wykowska, Perez-Osorio & Kopp, 2020). Unfortunately, due to the rapid spread of the novel coronavirus at the beginning of 2020, the conference, and consequently our workshop, were canceled. In light of these events, we decided to compile the position statements accepted for the workshop. The contributions collected in these pages highlight the role of the attribution of mental states to artificial agents in human-robot interaction, and specifically the quality and presence of the social attunement mechanisms that are known to make human interaction smooth, efficient, and robust. These papers also underscore the importance of a multidisciplinary approach to advancing the understanding of the factors and consequences of social interactions with artificial agents.


2019 ◽  
Author(s):  
Cinzia Di Dio ◽  
Federico Manzi ◽  
Giulia Peretti ◽  
Angelo Cangelosi ◽  
Paul L. Harris ◽  
...  

Studying trust within human-robot interaction is of great importance given the social relevance of robotic agents in a variety of contexts. We investigated the acquisition, loss, and restoration of trust when preschool and school-age children played with either a human or a humanoid robot in vivo. The relationship between trust and the quality of attachment relationships, Theory of Mind, and executive function skills was also investigated. No differences were found in children’s trust in the play partner as a function of agency (human or robot). Nevertheless, 3-year-olds showed a trend toward trusting the human more than the robot, while 7-year-olds displayed the reverse behavioral pattern, highlighting the developing interplay between the affective and cognitive correlates of trust.
