Tracking Human Interactions with a Commercially-available Robot over Multiple Days: A Tutorial

2020
Author(s):  
Bishakha Chaudhury ◽  
Ruud Hortensius ◽  
Martin Hoffmann ◽  
Emily S. Cross

As research examining human-robot interaction moves from the laboratory to the real world, studies seeking to examine how people interact with robots face the question of which robotic platform to employ to collect data in situ. To facilitate the study of a broad range of individuals, from children to clinical populations, across diverse environments, from homes to schools, a robust, reproducible, low-cost and easy-to-use robotic platform is needed. Here, we describe how a commercially available off-the-shelf robot, Cozmo, can be used to study embodied human-robot interactions in a wide variety of settings, including the user’s home. In this Tutorial, we describe the steps required to use this affordable and flexible platform for longitudinal human-robot interaction studies. First, we outline the technical specifications and requirements of this platform and its accessories. We present findings from validation work we performed to map the behavioural repertoire of the Cozmo robot and introduce an accompanying interactive emotion classification tool to use with this robot. We then show how log files containing detailed data on the human-robot interaction can be collected and extracted. Finally, we detail the types of information that can be retrieved from these data. This low-cost robotic platform will provide the field with a variety of valuable new possibilities to study human-robot interactions within and beyond the research laboratory, enabling studies that are user-driven and unconstrained in both time and place.
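As a minimal illustration of the log-extraction step, the sketch below summarises interaction events from a session log. It assumes a hypothetical JSON-lines format with 'timestamp' and 'event' fields and a hypothetical file name; the actual log structure produced by the Cozmo platform and described in the Tutorial may differ.

```python
import json
from collections import Counter
from pathlib import Path

def summarise_interaction_log(path):
    """Summarise a robot session log (hypothetical JSON-lines format).

    Each line is assumed to be a JSON object with a 'timestamp' field
    (seconds) and an 'event' field (e.g. the animation or game played);
    the real Cozmo log format may differ.
    """
    event_counts = Counter()
    timestamps = []
    for line in Path(path).read_text().splitlines():
        if not line.strip():
            continue
        record = json.loads(line)
        event_counts[record["event"]] += 1
        timestamps.append(record["timestamp"])
    duration = max(timestamps) - min(timestamps) if timestamps else 0.0
    return {"session_duration_s": duration, "event_counts": dict(event_counts)}

if __name__ == "__main__":
    # Hypothetical file name for a single home session.
    print(summarise_interaction_log("cozmo_session.log"))
```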

2018
Vol 9 (1)
pp. 221-234
Author(s):  
João Avelino ◽  
Tiago Paulino ◽  
Carlos Cardoso ◽  
Ricardo Nunes ◽  
Plinio Moreno ◽  
...  

Abstract Handshaking is a fundamental part of human physical interaction that is transversal to various cultural backgrounds. It is also a very challenging task in the field of Physical Human-Robot Interaction (pHRI), requiring compliant force control in order to plan the arm's motion and to achieve a confident, yet pleasant, grasp of the human user's hand. In this paper, we focus on the study of hand grip strength for comfortable handshakes and perform three sets of physical interaction experiments, with twenty human subjects in the first experiment, thirty-five in the second, and thirty-eight in the third. Tests are made with a social robot whose hands are instrumented with tactile sensors that provide skin-like sensation. From these experiments, we: (i) learn the preferred grip closure for each user group; (ii) analyze the tactile feedback provided by the sensors for each closure; and (iii) develop and evaluate a hand grip controller based on these data. In addition to the robot-human interactions, we also study handshakes executed by the robot with inanimate objects, in order to detect whether it is shaking hands with a human or an inanimate object. This work adds physical human-robot interaction to the repertoire of social skills of our robot, fulfilling a demand previously identified by many of its users.
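The grip controller in the paper is learned from the experimental data; the sketch below only illustrates the general feedback idea of closing the hand towards a comfortable tactile-pressure target, with hypothetical names, units and gains.

```python
def grip_closure_step(closure, tactile_pressure, target_pressure,
                      gain=0.05, min_closure=0.0, max_closure=1.0):
    """One proportional update of a hypothetical grip-closure command.

    closure:          current hand closure in [0, 1] (0 = open, 1 = fully closed)
    tactile_pressure: mean reading from the hand's tactile sensors
    target_pressure:  comfortable grip pressure for the current user group
    The controller described in the paper is derived from user data;
    this update rule is only an illustration of the feedback loop.
    """
    error = target_pressure - tactile_pressure
    new_closure = closure + gain * error
    return max(min_closure, min(max_closure, new_closure))

# Example: tighten gradually while the measured pressure is below target.
closure = 0.4
for pressure in (0.10, 0.18, 0.24):          # arbitrary sensor readings
    closure = grip_closure_step(closure, pressure, target_pressure=0.25)
```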


Sensor Review
2015
Vol 35 (3)
pp. 244-250
Author(s):  
Pedro Neto ◽  
Nuno Mendes ◽  
A. Paulo Moreira

Purpose – The purpose of this paper is to achieve reliable estimation of yaw angles by fusing data from low-cost inertial and magnetic sensing. Design/methodology/approach – In this paper, the yaw angle is estimated by fusing inertial and magnetic sensing from a gyroscope and a digital compass, respectively. A Kalman filter estimates the error produced by the gyroscope. Findings – The drift effect produced by the gyroscope is significantly reduced and, at the same time, the system retains the ability to react quickly to orientation changes. The system combines the best of each sensor: the stability of the magnetic sensor and the fast response of the inertial sensor. Research limitations/implications – The system does not present stable behavior in the presence of large vibrations. Considerable calibration effort is needed. Practical implications – Today, most human–robot interaction technologies need the ability to estimate orientation, especially the yaw angle, from small-sized and low-cost sensors. Originality/value – Existing methods for inertial and magnetic sensor fusion are combined to achieve reliable estimation of the yaw angle. Experimental tests in a human–robot interaction scenario show the performance of the system.
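A minimal sketch of one possible gyro/compass Kalman filter for yaw is given below, with the gyro bias included in the state so that the drift error can be estimated, as the abstract describes. The noise parameters are hypothetical and angle wrapping is ignored for brevity; this is not the paper's implementation.

```python
import numpy as np

class YawKalmanFilter:
    """Illustrative gyro/compass fusion for yaw (not the paper's filter).

    State x = [yaw, gyro_bias]. The gyroscope drives the prediction,
    the digital compass provides the absolute yaw measurement, and the
    filter estimates the slowly drifting gyro bias.
    """

    def __init__(self, q_yaw=1e-4, q_bias=1e-6, r_compass=1e-2):
        self.x = np.zeros(2)                  # [yaw (rad), bias (rad/s)]
        self.P = np.eye(2)                    # state covariance
        self.Q = np.diag([q_yaw, q_bias])     # process noise (hypothetical)
        self.R = np.array([[r_compass]])      # compass noise (hypothetical)
        self.H = np.array([[1.0, 0.0]])       # compass measures yaw only

    def predict(self, gyro_rate, dt):
        # Integrate the bias-corrected gyro rate.
        F = np.array([[1.0, -dt], [0.0, 1.0]])
        self.x = np.array([self.x[0] + dt * (gyro_rate - self.x[1]), self.x[1]])
        self.P = F @ self.P @ F.T + self.Q

    def update(self, compass_yaw):
        y = compass_yaw - self.H @ self.x               # innovation
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)        # Kalman gain
        self.x = self.x + K @ y
        self.P = (np.eye(2) - K @ self.H) @ self.P

# Per time step: kf.predict(gyro_rate_rad_s, dt); kf.update(compass_yaw_rad)
```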


2021
Vol 8
Author(s):  
Hua Minh Tuan ◽  
Filippo Sanfilippo ◽  
Nguyen Vinh Hao

Collaborative robots (or cobots) are robots that can safely work together or interact with humans in a shared space, and they are becoming increasingly common. Compliant actuators are highly relevant for the design of cobots, since this type of actuation scheme mitigates the damage caused by unexpected collisions. Elastic joints are therefore considered to outperform rigid joints when operating in a dynamic environment. However, most of the available elastic robots are relatively costly or difficult to construct. To give researchers a solution that is inexpensive, easily customisable, and fast to fabricate, a newly-designed, low-cost, open-source elastic joint is presented in this work. Based on this elastic joint, a highly-compliant multi-purpose 2-DOF robot arm for safe human-robot interaction is also introduced. The mechanical design of the robot and a position control algorithm are presented. The mechanical prototype is 3D-printed. The control algorithm is a two-loop scheme: the inner loop is a model reference adaptive controller (MRAC) that deals with uncertainties in the system parameters, while the outer loop uses a fuzzy proportional-integral controller to reduce the effect of external disturbances on the load. The control algorithm is first validated in simulation, and the effectiveness of the controller is then demonstrated through experiments on the mechanical prototype.
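To illustrate the inner MRAC loop only, the sketch below applies the classic MIT-rule gain adaptation to a hypothetical first-order joint model. The plant, reference model, gains and constant reference are all assumptions for the demo; the paper's outer fuzzy PI loop, which would shape the reference, is not reproduced here.

```python
import numpy as np

def mrac_mit_rule_demo(t_end=10.0, dt=1e-3, gamma=1.0):
    """Sketch of an MRAC inner loop (MIT rule) on a simplified joint model.

    Plant:            y'  = -a*y  + k*u        (k unknown to the controller)
    Reference model:  ym' = -a*ym + k0*r       (desired closed-loop response)
    Control law:      u = theta * r
    MIT rule:         theta' = -gamma * e * ym,  with e = y - ym
    """
    a, k, k0 = 2.0, 3.0, 2.0          # hypothetical joint/model parameters
    y = ym = theta = 0.0
    r = 1.0                           # constant reference for the demo
    history = []
    for _ in range(int(t_end / dt)):
        u = theta * r                  # adaptive feedforward control
        y += dt * (-a * y + k * u)     # plant (elastic joint, simplified)
        ym += dt * (-a * ym + k0 * r)  # reference model
        e = y - ym
        theta += dt * (-gamma * e * ym)  # gradient (MIT-rule) adaptation
        history.append((y, ym, theta))
    return np.array(history)

# theta should converge towards k0/k ≈ 0.67 so that y tracks ym.
```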


Electronics
2020
Vol 9 (11)
pp. 1761
Author(s):  
Martina Szabóová ◽  
Martin Sarnovský ◽  
Viera Maslej Krešňáková ◽  
Kristína Machová

This paper connects two large research areas, namely sentiment analysis and human–robot interaction. Emotion analysis, as a subfield of sentiment analysis, explores text data and, based on the characteristics of the text and generally known emotional models, evaluates which emotion is expressed in it. The analysis of emotions in human–robot interaction aims to evaluate the emotional state of the human and, on this basis, to decide how the robot should adapt its behavior towards the human. There are several approaches and algorithms for detecting emotions in text data. We decided to apply a dictionary approach combined with machine learning algorithms. As a result of the ambiguity and subjectivity of labeling emotions, more than one emotion could be assigned to a sentence; thus, we were dealing with a multi-label problem. Based on this overview of the problem, we performed experiments with Naive Bayes, Support Vector Machine and Neural Network classifiers. The results obtained from classification were subsequently used in human–robot experiments. Despite the limited accuracy of the emotion classification, we demonstrated the importance of the robot expressing emotion through gestures based on the words spoken to it.
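A minimal sketch of one way to set up such a multi-label emotion classifier with scikit-learn is shown below. The sentences, labels, and choice of a linear SVM over TF-IDF features are purely illustrative and do not reproduce the paper's corpus or models.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.multiclass import OneVsRestClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import MultiLabelBinarizer
from sklearn.svm import LinearSVC

# Tiny hypothetical training set; each sentence may carry several emotions.
texts = [
    "I am so happy to see you again",
    "This is terrifying and makes me angry",
    "What a wonderful surprise",
    "I feel sad and afraid about tomorrow",
]
labels = [["joy"], ["fear", "anger"], ["joy", "surprise"], ["sadness", "fear"]]

binarizer = MultiLabelBinarizer()
y = binarizer.fit_transform(labels)          # multi-label indicator matrix

# One binary SVM per emotion, trained on TF-IDF features.
model = make_pipeline(
    TfidfVectorizer(ngram_range=(1, 2)),
    OneVsRestClassifier(LinearSVC()),
)
model.fit(texts, y)

pred = model.predict(["you scared me but I am glad you are here"])
print(binarizer.inverse_transform(pred))     # emotions whose classifiers fired
```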


Author(s):  
Xiaoran Fan ◽  
Daewon Lee ◽  
Lawrence Jackel ◽  
Richard Howard ◽  
Daniel Lee ◽  
...  

Author(s):  
Sergio D. Sierra ◽  
Juan F. Molina ◽  
Daniel A. Gomez ◽  
Marcela C. Munera ◽  
Carlos A. Cifuentes

Robotica
2014
Vol 33 (1)
pp. 1-18
Author(s):  
Alberto Poncela ◽  
Leticia Gallardo-Estrella

SUMMARY Verbal communication is the most natural way of human–robot interaction. Such interaction is usually achieved by means of a human–robot interface (HRI). In this paper, an HRI is presented to teleoperate a robotic platform via the user’s voice; hence, a speech recognition system is necessary. In this work, a user-dependent acoustic model for Spanish speakers has been developed to teleoperate a robot with a set of commands. Experimental results have been successful, both in terms of a high recognition rate and in the navigation of the robot under the control of the user’s voice.
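A minimal sketch of how recognised utterances could be mapped to robot velocity commands is shown below. The Spanish command words and the velocity values are hypothetical, and the speech recogniser itself (the user-dependent acoustic model of the paper) is not shown.

```python
# Hypothetical command grammar for voice teleoperation.
COMMANDS = {
    "avanza":    (0.3, 0.0),   # "move forward"  -> (linear m/s, angular rad/s)
    "retrocede": (-0.3, 0.0),  # "move backward"
    "izquierda": (0.0, 0.5),   # "turn left"
    "derecha":   (0.0, -0.5),  # "turn right"
    "para":      (0.0, 0.0),   # "stop"
}

def transcript_to_velocity(transcript, default=(0.0, 0.0)):
    """Map a recognised Spanish utterance to a velocity command.

    Unknown or misrecognised utterances fall back to 'stop' so the
    robot never acts on an unrecognised command.
    """
    for word, velocity in COMMANDS.items():
        if word in transcript.lower():
            return velocity
    return default

print(transcript_to_velocity("robot, avanza despacio"))  # (0.3, 0.0)
```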

