Turn-Taking Based on Information Flow for Fluent Human-Robot Interaction

AI Magazine ◽  
2011 ◽  
Vol 32 (4) ◽  
pp. 53-63 ◽  
Author(s):  
Andrea L. Thomaz ◽  
Crystal Chao

Turn-taking is a fundamental part of human communication. Our goal is to devise a turn-taking framework for human-robot interaction that, like the human skill, captures something fundamental about interaction and generalizes across contexts and domains. We propose a model of turn-taking and conduct an experiment with human subjects to inform this model. Our findings from this study suggest that information flow is an integral part of human floor-passing behavior. Following this, we implement autonomous floor relinquishing on a robot and discuss our insights into the nature of a general turn-taking model for human-robot interaction.

2018 ◽  
Vol 9 (1) ◽  
pp. 221-234 ◽  
Author(s):  
João Avelino ◽  
Tiago Paulino ◽  
Carlos Cardoso ◽  
Ricardo Nunes ◽  
Plinio Moreno ◽  
...  

Abstract Handshaking is a fundamental part of human physical interaction that is transversal to various cultural backgrounds. It is also a very challenging task in the field of Physical Human-Robot Interaction (pHRI), requiring compliant force control in order to plan the arm’s motion and to achieve a confident yet pleasant grasp of the human user’s hand. In this paper, we focus on the study of hand grip strength for comfortable handshakes and perform three sets of physical interaction experiments, with twenty human subjects in the first experiment, thirty-five in the second, and thirty-eight in the third. Tests are made with a social robot whose hands are instrumented with tactile sensors that provide skin-like sensation. From these experiments, we: (i) learn the preferred grip closure for each user group; (ii) analyze the tactile feedback provided by the sensors for each closure; (iii) develop and evaluate a hand grip controller based on the collected data. In addition to the robot-human interactions, we also study the robot’s handshakes with inanimate objects, in order to detect whether it is shaking hands with a human or an inanimate object. This work adds physical human-robot interaction to the repertoire of social skills of our robot, fulfilling a demand previously identified by many users of the robot.
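The abstract does not give the controller's form; as a minimal illustrative sketch (the gain, stiffness, and linear sensor model below are hypothetical, not the authors' controller), a tactile-feedback grip loop can drive hand closure toward a target pressure learned from the user-preference data:

```python
def grip_step(closure, pressure, target_pressure, gain=0.2,
              min_closure=0.0, max_closure=1.0):
    """One control step: nudge hand closure toward a target tactile pressure."""
    closure += gain * (target_pressure - pressure)
    return max(min_closure, min(max_closure, closure))

def simulate_handshake(target_pressure=0.6, stiffness=2.0, steps=50):
    """Toy plant: tactile pressure grows linearly with closure (illustrative)."""
    closure = 0.0
    for _ in range(steps):
        pressure = stiffness * closure   # stand-in for the tactile sensor reading
        closure = grip_step(closure, pressure, target_pressure)
    return closure, stiffness * closure
```

A learned component would set `target_pressure` per user group from the preferred grip closures; the clamp keeps the commanded closure within the hand's mechanical limits.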


2021 ◽  
Vol 10 (3) ◽  
pp. 1-25
Author(s):  
Ajung Moon ◽  
Maneezhay Hashmi ◽  
H. F. Machiel Van Der Loos ◽  
Elizabeth A. Croft ◽  
Aude Billard

When the question of who should get access to a communal resource first is uncertain, people often negotiate via nonverbal communication to resolve the conflict. What should a robot be programmed to do when such conflicts arise in Human-Robot Interaction? The answer to this question varies depending on the context of the situation. Learning from how humans use hesitation gestures to negotiate a solution in such conflict situations, we present a human-inspired design of nonverbal hesitation gestures that can be used for Human-Robot Negotiation. We extracted characteristic features of such negotiative hesitations humans use, and subsequently designed a trajectory generator (Negotiative Hesitation Generator) that can re-create the features in robot responses to conflicts. Our human-subjects experiment demonstrates the efficacy of the designed robot behaviour against non-negotiative stopping behaviour of a robot. With positive results from our human-robot interaction experiment, we provide a validated trajectory generator with which one can explore the dynamics of human-robot nonverbal negotiation of resource conflicts.
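The characteristic trajectory features are not detailed in this abstract; purely as an illustrative sketch (the waypoints and proportions below are hypothetical, not the authors' Negotiative Hesitation Generator), a one-dimensional reach with a single hesitation can be produced by interpolating through an advance-retreat-re-advance waypoint sequence:

```python
import numpy as np

def hesitation_trajectory(start=0.0, goal=1.0, n=200):
    """1-D reach containing one hesitation: advance, retreat partway, re-advance."""
    t = np.linspace(0.0, 1.0, n)
    waypoints_t = [0.0, 0.4, 0.6, 1.0]
    waypoints_x = [start,
                   start + 0.7 * (goal - start),  # initial advance toward the resource
                   start + 0.4 * (goal - start),  # brief retreat: the hesitation cue
                   goal]                          # re-approach once the conflict resolves
    return np.interp(t, waypoints_t, waypoints_x)
```

A real generator would also shape the velocity and acceleration profiles; the retreat phase is the nonverbal cue that distinguishes negotiation from a plain stop.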


2020 ◽  
Vol 10 (22) ◽  
pp. 7992
Author(s):  
Jinseok Woo ◽  
Yasuhiro Ohyama ◽  
Naoyuki Kubota

This paper presents a robot partner development platform based on smart devices. Humans communicate with others based on the basic motivations of human cooperation and have communicative motives grounded in social attributes. Understanding and applying these communicative motives is important in the development of socially embedded robot partners. It is therefore becoming more important to develop robots that can be adapted to different needs while taking these elements of human communication into consideration. Robot partners play an increasingly important role not only in the industrial sector but also in households, although widespread adoption will likely take time. In the field of service robots, developing robots according to various needs is important, and the system integration of hardware and software becomes crucial. Therefore, in this paper, we propose a robot partner development platform for human-robot interaction. First, we propose a modularized architecture for robot partners using a smart device, enabling flexible updates through the re-usability of hardware and software modules. We then show examples of implementing a robot system using the proposed architecture and focus on the development of various robots using the modular robot partner system. Finally, we discuss the effectiveness of the proposed system through social implementation and experiments.
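The platform's actual interfaces are not specified in this abstract; as a hypothetical sketch of the modular idea (all class and method names below are invented for illustration), hardware and software modules can share one interface so they can be registered and swapped independently:

```python
from abc import ABC, abstractmethod

class RobotModule(ABC):
    """Hypothetical interface shared by hardware and software modules."""
    @abstractmethod
    def start(self):
        """Initialize the module (e.g., open a sensor or load a model)."""
    @abstractmethod
    def update(self, dt):
        """Advance the module by one control cycle of length dt seconds."""

class RobotPartner:
    """Composes interchangeable modules, so parts can be updated independently."""
    def __init__(self):
        self.modules = {}
    def register(self, name, module):
        self.modules[name] = module
    def start(self):
        for m in self.modules.values():
            m.start()
    def update(self, dt):
        for m in self.modules.values():
            m.update(dt)
```

Re-usability here comes from the fact that replacing, say, a speech module never touches the other modules or the composition logic.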


Author(s):  
Wei Quan ◽  
Jinseok Woo ◽  
Yuichiro Toda ◽  
Naoyuki Kubota ◽  
...  

Human posture recognition has been a popular research topic with the development of related fields such as human-robot interaction and simulated operation. Most existing methods are based on supervised learning, and a large amount of training data is required to achieve an ideal assessment. In this study, we address this by applying unsupervised learning algorithms based on the forward kinematics model of the human skeleton. We then refine the proposed method by integrating particle swarm optimization (PSO). The advantage of the proposed method is that no pre-training data is required for human posture generation and recognition. We validate the method through a series of experiments with human subjects.
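As an illustrative sketch of the idea (the two-link planar "skeleton" and the PSO constants below are hypothetical, not the authors' model), particle swarm optimization can search for joint angles whose forward kinematics match an observed landmark position, with no pre-training data at all:

```python
import numpy as np

def forward_kinematics(angles, lengths=(1.0, 1.0)):
    """End-point position of a toy planar two-link limb."""
    a1, a2 = angles
    l1, l2 = lengths
    x = l1 * np.cos(a1) + l2 * np.cos(a1 + a2)
    y = l1 * np.sin(a1) + l2 * np.sin(a1 + a2)
    return np.array([x, y])

def pso_fit(target, n_particles=30, iters=200, seed=0):
    """PSO over joint angles minimizing distance between FK output and target."""
    rng = np.random.default_rng(seed)
    pos = rng.uniform(-np.pi, np.pi, size=(n_particles, 2))
    vel = np.zeros_like(pos)
    pbest = pos.copy()
    pbest_err = np.array([np.linalg.norm(forward_kinematics(p) - target) for p in pos])
    gbest = pbest[np.argmin(pbest_err)].copy()
    for _ in range(iters):
        r1, r2 = rng.random(pos.shape), rng.random(pos.shape)
        # Standard velocity update: inertia + cognitive + social terms.
        vel = 0.7 * vel + 1.5 * r1 * (pbest - pos) + 1.5 * r2 * (gbest - pos)
        pos = pos + vel
        err = np.array([np.linalg.norm(forward_kinematics(p) - target) for p in pos])
        improved = err < pbest_err
        pbest[improved], pbest_err[improved] = pos[improved], err[improved]
        gbest = pbest[np.argmin(pbest_err)].copy()
    return gbest
```

Because the objective is just a geometric error, no labelled postures are needed; the same search can be run per limb of a full skeleton.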


2022 ◽  
Vol 8 ◽  
Author(s):  
Niyati Rawal ◽  
Dorothea Koert ◽  
Cigdem Turan ◽  
Kristian Kersting ◽  
Jan Peters ◽  
...  

The ability of a robot to generate appropriate facial expressions is a key aspect of perceived sociability in human-robot interaction. Yet many existing approaches rely on a set of fixed, preprogrammed joint configurations for expression generation. Automating this process offers the potential to scale better across robot types and expressions. To this end, we introduce ExGenNet, a novel deep generative approach for facial expressions on humanoid robots. ExGenNet combines a generator network, which reconstructs simplified facial images from robot joint configurations, with a classifier network for state-of-the-art facial expression recognition. The robots’ joint configurations are optimized for various expressions by backpropagating the loss between the predicted and intended expression through the classification network and the generator network. To improve the transfer between human training images and images of different robots, we propose to use extracted features in the classifier as well as in the generator network. Unlike most studies on facial expression generation, ExGenNet can produce multiple configurations for each facial expression and be transferred between robots. Experimental evaluations on two robots with highly human-like faces, Alfie (Furhat Robot) and the android robot Elenoide, show that ExGenNet can successfully generate sets of joint configurations for predefined facial expressions on both robots. This ability of ExGenNet to generate realistic facial expressions was further validated in a pilot study where the majority of human subjects could accurately recognize most of the generated facial expressions on both robots.
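As a toy sketch of the optimization idea only (the tiny linear "networks", dimensions, and learning rate below are stand-ins, not ExGenNet itself), the expression loss can be backpropagated through a frozen generator and classifier all the way down to the joint configuration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Frozen stand-in "networks" (random linear maps, purely illustrative):
W_gen = 0.3 * rng.normal(size=(8, 64))   # joint configuration -> image features
W_cls = 0.3 * rng.normal(size=(64, 4))   # image features -> expression logits

def softmax(z):
    e = np.exp(z - z.max())
    return e / e.sum()

joints = np.zeros(8)   # the joint configuration being optimized
target = 2             # index of the intended expression
lr = 0.1

losses = []
for _ in range(500):
    feats = joints @ W_gen               # "generator" forward pass
    logits = feats @ W_cls               # "classifier" forward pass
    probs = softmax(logits)
    losses.append(-np.log(probs[target]))
    # Backpropagate the cross-entropy loss through both frozen maps into the joints:
    dlogits = probs.copy()
    dlogits[target] -= 1.0
    joints -= lr * (dlogits @ W_cls.T @ W_gen.T)
```

The network weights never change; only the input joint vector is updated, which is why different starting points can yield multiple valid configurations for the same expression.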


Author(s):  
Gabriele Trovato ◽  
Massimiliano Zecca ◽  
Salvatore Sessa ◽  
Lorenzo Jamone ◽  
Jaap Ham ◽  
...  

Abstract As witnessed in several behavioural studies, a complex relationship exists between people’s cultural background and their general acceptance of robots. However, very few studies have investigated whether a robot’s language and gestures, rooted in a particular culture, affect people from different cultures. The purpose of this work is to provide experimental evidence supporting the idea that humans may more easily accept a robot that can adapt to their specific culture. Indeed, improving acceptance and reducing discomfort is fundamental for the future deployment of robots as assistive, health-care, or companion devices in society. We conducted a Human-Robot Interaction experiment in both Egypt and Japan. Human subjects were engaged in a simulated video conference with robots that greeted and spoke either in Arabic or in Japanese. The subjects completed a questionnaire assessing their preferences and emotional state, while their spontaneous reactions were recorded in different ways. The results suggest that Egyptians prefer the Arabic robot and feel a sense of discomfort when interacting with the Japanese robot; the opposite holds for the Japanese. These findings confirm the importance of localising a robot in order to improve human acceptance during social human-robot interaction.


2017 ◽  
Vol 02 (03) ◽  
pp. 1740008 ◽  
Author(s):  
Ioannis Georgilas ◽  
Giulio Dagnino ◽  
Sanja Dogramadzi

This paper presents a safety analysis of a Robotic Fracture Surgery System using the Systems-Theoretic Process Analysis (STPA). It focuses particularly on hazards caused by the human in the loop. The robotic system and operating staff are modeled including information flow between different components of the system. The analysis has generated a set of requirements for the system design that can ultimately mitigate the identified hazards, as well as a preliminary set of human factors that can improve safety.


Machines ◽  
2021 ◽  
Vol 10 (1) ◽  
pp. 15
Author(s):  
Akiyoshi Hayashi ◽  
Liz Katherine Rincon-Ardila ◽  
Gentiane Venture

Human-robot interaction (HRI) will be an important field of research in a future society where robots and humans live together. While most HRI studies focus on appearance and dialogue, touch communication has received little attention despite the importance of its role in human-human communication. This paper investigates how and where humans touch an inorganic, non-zoomorphic robot arm. Based on these results, we install touch sensors on the robot arm and conduct experiments to collect data on users’ impressions of the robot when touching it. Our results suggest two main things. First, touch gestures collected with two sensors can be analyzed using machine learning to classify the gestures. Second, communication between humans and robots using touch can improve the user’s impression of the robots.
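The classifier itself is not specified in this abstract; as an illustrative sketch (the synthetic pressure traces, gesture names, and hand-crafted features below are all hypothetical), touch gestures can be separated with simple features and a nearest-centroid rule:

```python
import numpy as np

rng = np.random.default_rng(1)

def make_gesture(kind, n=100):
    """Synthetic 1-D pressure trace from a touch sensor (illustrative only)."""
    t = np.linspace(0, 1, n)
    if kind == "pat":     # short repeated pulses
        sig = (np.sin(2 * np.pi * 5 * t) > 0.9).astype(float)
    else:                 # "stroke": one slow, smooth hump
        sig = np.exp(-((t - 0.5) ** 2) / 0.05)
    return sig + 0.02 * rng.normal(size=n)

def features(sig):
    """Hand-crafted features: mean pressure, peak pressure, contact ratio."""
    return np.array([sig.mean(), sig.max(), (sig > 0.5).mean()])

# Per-class centroids learned from a tiny labelled set of example traces.
train = {k: np.mean([features(make_gesture(k)) for _ in range(20)], axis=0)
         for k in ("pat", "stroke")}

def classify(sig):
    """Assign the gesture whose feature centroid is closest."""
    f = features(sig)
    return min(train, key=lambda k: np.linalg.norm(f - train[k]))
```

A real system would use two sensor channels and a stronger learner, but the pipeline shape is the same: signal, features, then classification.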

