Symmetric Evaluation of Multimodal Human–Robot Interaction with Gaze and Standard Control

Symmetry, 2018, Vol. 10 (12), pp. 680
Author(s): Ethan Jones, Winyu Chinthammit, Weidong Huang, Ulrich Engelke, Christopher Lueg

Control of robot arms is often required in engineering and can be performed using different methods. This study examined and symmetrically compared the use of a controller, an eye gaze tracker, and a combination thereof in a multimodal setup for control of a robot arm. Tasks of different complexities were defined, and twenty participants completed an experiment using these interaction modalities to solve them. More specifically, there were three tasks: the first was to navigate a chess piece from one square to another pre-specified square; the second was the same as the first but required more moves to complete; and the third was to move multiple pieces to reach a pre-defined arrangement. Further, while gaze control has the potential to be more intuitive than a hand controller, it suffers from limitations with regard to spatial accuracy and target selection. The multimodal setup aimed to mitigate the weaknesses of the eye gaze tracker, creating a superior system without simply relying on the controller. The experiment shows that the multimodal setup improves performance over the eye gaze tracker alone (p < 0.05) and was competitive with the controller-only setup, although it did not outperform it (p > 0.05).
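The multimodal scheme summarized in this abstract could be sketched as follows: gaze supplies a coarse pointing cue, and a controller button press confirms the target, compensating for the tracker's limited spatial accuracy. This is a minimal illustration only; the square names, coordinates, and confirmation mechanism are assumptions, not details from the paper.

```python
# Hypothetical sketch: snap a noisy gaze point to the nearest chessboard
# square, then require a controller button press to confirm the selection.

def nearest_square(gaze_xy, squares):
    """Return the name of the square whose centre is closest to the gaze point."""
    return min(squares, key=lambda name: (squares[name][0] - gaze_xy[0]) ** 2
                                         + (squares[name][1] - gaze_xy[1]) ** 2)

def select_target(gaze_xy, squares, button_pressed):
    """Return the gaze-selected square only when the controller confirms it."""
    candidate = nearest_square(gaze_xy, squares)
    return candidate if button_pressed else None
```

For instance, with `squares = {"e2": (4, 1), "e4": (4, 3)}`, a gaze point of `(3.8, 2.9)` together with a button press would select `"e4"`, while the same gaze point without confirmation selects nothing.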

2013, Vol. 14 (3), pp. 390-418
Author(s): Tian Xu, Hui Zhang, Chen Yu

When humans address multiple robots with informative speech acts (Clark & Carlson 1982), their cognitive resources are shared among all the participating robot agents. At each moment, the user's behavior is not only determined by the actions of the robot they are directly gazing at, but also shaped by the behaviors of all the other robots in the shared environment. We define cooperative behavior as the actions performed by the robots that are not capturing the user's direct attention. In this paper, we are interested in how human participants adjust and coordinate their own behavioral cues when the robot agents perform different cooperative gaze behaviors. A novel gaze-contingent platform was designed and implemented, in which the robots' behaviors were triggered by the participant's attentional shifts in real time. Results showed that the human participants were highly sensitive to the robot agents' different cooperative gazing behaviors.
Keywords: human-robot interaction; multi-robot interaction; multiparty interaction; eye gaze cue; embodied conversational agent
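The core of the gaze-contingent logic described above could be sketched as a simple dispatch: the robot under the participant's direct gaze responds to that attention, while every other robot performs a configurable cooperative gaze behavior. The robot names and behavior labels below are illustrative assumptions, not the paper's actual implementation.

```python
# Illustrative sketch of gaze-contingent behavior assignment in a
# multi-robot setting: one robot reacts to direct gaze, the rest perform
# a chosen "cooperative" behavior in real time.

def assign_behaviors(robots, gazed_robot, cooperative_behavior="look_at_user"):
    """Map each robot to a behavior based on the user's current gaze target."""
    return {r: ("respond_to_gaze" if r == gazed_robot else cooperative_behavior)
            for r in robots}
```

In a real-time loop, this mapping would be recomputed on every attentional shift reported by the eye tracker.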


2021, Vol. 8
Author(s): Hua Minh Tuan, Filippo Sanfilippo, Nguyen Vinh Hao

Collaborative robots (or cobots) are robots that can safely work together with, or interact with, humans in a shared space, and they are gradually becoming more common. Compliant actuators are highly relevant for the design of cobots, since this type of actuation scheme mitigates the damage caused by unexpected collisions. Elastic joints are therefore considered to outperform rigid joints when operating in a dynamic environment. However, most available elastic robots are relatively costly or difficult to construct. To give researchers a solution that is inexpensive, easily customisable, and fast to fabricate, a newly designed low-cost, open-source elastic joint is presented in this work. Based on this elastic joint, a highly compliant multi-purpose 2-DOF robot arm for safe human-robot interaction is also introduced. The mechanical design of the robot and a position control algorithm are presented, and the mechanical prototype is 3D-printed. The control algorithm is a two-loop control scheme: the inner loop is designed as a model reference adaptive controller (MRAC) to deal with uncertainties in the system parameters, while the outer loop utilises a fuzzy proportional-integral controller to reduce the effect of external disturbances on the load. The control algorithm is first validated in simulation, and its effectiveness is then demonstrated in experiments on the mechanical prototype.
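The cascaded structure of the two-loop scheme could be sketched as below, under heavy simplifying assumptions: the joint is modeled as a double integrator, the outer loop is a plain PI controller (the paper uses a fuzzy PI), and the inner MRAC is reduced to a proportional velocity tracker. Gains, time step, and plant model are all illustrative, not values from the paper.

```python
# Simplified cascade sketch of a two-loop position controller:
# outer PI loop produces a velocity reference; an inner loop (standing in
# for the paper's MRAC) drives the joint velocity toward that reference.

def cascade_step_response(setpoint, steps=2000, dt=0.01):
    pos, vel, integ = 0.0, 0.0, 0.0
    kp_outer, ki_outer = 2.0, 1.0   # outer position loop (PI), illustrative gains
    kp_inner = 20.0                 # inner velocity loop, stands in for MRAC
    for _ in range(steps):
        err = setpoint - pos
        integ += err * dt
        vel_ref = kp_outer * err + ki_outer * integ   # outer loop output
        accel = kp_inner * (vel_ref - vel)            # inner loop tracks vel_ref
        vel += accel * dt                             # double-integrator plant
        pos += vel * dt
    return pos
```

The point of the cascade is separation of concerns: the fast inner loop absorbs plant uncertainty, so the slower outer loop can be tuned against a nearly ideal velocity source.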


2017, Vol. 6 (1), pp. 25
Author(s): Henny Admoni, Brian Scassellati

2021, Vol. 12 (1), pp. 379-391
Author(s): Matthew Story, Cyril Jaksic, Sarah R. Fletcher, Philip Webb, Gilbert Tang, ...

Abstract: Although modern standards for interaction between humans and robots follow the First Law of Robotics popularized in science fiction in the 1960s, they emphasize physical safety while remaining less developed in another key dimension: psychological safety. As sales of industrial robots have increased over recent years, so has the frequency of human–robot interaction (HRI). The present article looks at the current safety guidelines for HRI in an industrial setting and assesses their suitability. It then presents a means of improving current standards by drawing on lessons learned from studies of human aware navigation (HAN), which has seen increasing use in mobile robotics. The article highlights limitations in current research, where the relationships established in mobile robotics have not been carried over to industrial robot arms. To understand this, it is necessary to focus less on how a robot arm avoids humans and more on how humans react when a robot is within the same space. The safety guidelines currently lag behind the technological advances; however, with further studies aimed at understanding HRI and applying it to newly developed path-finding and obstacle-avoidance methods, science fiction can become science fact.


2019, Vol. 8 (1), pp. 34-44
Author(s): Maike Klein

Within both popular media and (some) scientific contexts, affective and ‘emotional’ machines are assumed to already exist. The aim of this paper is to draw attention to some of the key conceptual and theoretical issues raised by this ostensible affectivity. My investigation starts with three robotic encounters: a robot arm; Pepper, the first (according to the media) ‘emotional’ robot; and Mako, a robotic cat. To make sense of affectivity in these encounters, I discuss the implications of emotion theory for affectivity in human–machine interaction. Which theories have been implemented in the creation of the encountered robots? Aware that no given robot strictly implements a single emotion theory, I focus on two commonly used emotion theories: Russell and Mehrabian’s Three-Factor Theory of Emotion (the computational models derived from it are known as PAD models) and Ekman’s Basic Emotion Theory. An alternative way to approach affectivity in artificial systems is the Relational Approach of Damiano et al., which emphasizes human–robot interaction in social robotics. In considering this alternative, I also raise questions about the possibility of affectivity in robot–robot relations.
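The relationship between the two theory families mentioned above can be made concrete with a small sketch: in a PAD (pleasure–arousal–dominance) model an affective state is a point in a three-dimensional space, and one common engineering move is to label it with the nearest Ekman-style basic-emotion prototype. The prototype coordinates below are illustrative assumptions, not values from the paper or from Mehrabian's tables.

```python
# Minimal sketch: classify a PAD state by its nearest basic-emotion
# prototype. Coordinates are illustrative placements, not canonical values.
import math

BASIC_EMOTIONS = {
    "joy":     ( 0.8,  0.5,  0.4),
    "anger":   (-0.5,  0.6,  0.3),
    "sadness": (-0.6, -0.4, -0.3),
    "fear":    (-0.6,  0.6, -0.4),
}

def classify_pad(state):
    """Label a (pleasure, arousal, dominance) state with the closest prototype."""
    return min(BASIC_EMOTIONS,
               key=lambda e: math.dist(state, BASIC_EMOTIONS[e]))
```

This kind of mapping is precisely where the paper's conceptual worry bites: the continuous PAD state and the discrete basic-emotion label embody different theoretical commitments, yet implementations routinely glue them together.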


Machines, 2021, Vol. 10 (1), pp. 15
Author(s): Akiyoshi Hayashi, Liz Katherine Rincon-Ardila, Gentiane Venture

In a future society where robots and humans live together, human–robot interaction (HRI) will be an important field of research. While most HRI studies focus on appearance and dialogue, touch communication has received little attention despite the important role it plays in human–human communication. This paper investigates how and where humans touch an inorganic, non-zoomorphic robot arm. Based on these results, we install touch sensors on the robot arm and conduct experiments to collect data on users' impressions of the robot when touching it. Our results suggest two main things. First, touch gestures collected with the two sensors can be analyzed using machine learning to classify the gestures. Second, communication between humans and robots using touch can improve the user's impression of the robots.
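The gesture-classification step mentioned above could be sketched, under assumptions, as a nearest-centroid classifier over per-sensor features (for example, mean contact duration and intensity for each of the two touch sensors). The gesture labels, feature set, and classifier choice are illustrative; the paper's actual pipeline may differ.

```python
# Hedged sketch of touch-gesture classification: learn one centroid per
# gesture from labeled feature vectors, then assign new samples to the
# gesture with the closest centroid.
import math

def train_centroids(samples):
    """samples: {label: [feature_vector, ...]} -> {label: centroid tuple}"""
    return {label: tuple(sum(col) / len(col) for col in zip(*vecs))
            for label, vecs in samples.items()}

def classify(centroids, features):
    """Assign the gesture whose centroid is nearest to the feature vector."""
    return min(centroids, key=lambda g: math.dist(features, centroids[g]))
```

A nearest-centroid model is a deliberately simple baseline; with more collected data, the same interface could back onto any standard classifier.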

