Evaluation of Humanoid Robot Design Based on Global Eye-Tracking Metrics

Author(s):  
Fan Li ◽  
Danni Chang ◽  
Yisi Liu ◽  
Jian Cui ◽  
Shanshan Feng ◽  
...  

The first impression of a robot's appearance normally affects the interaction with physical robots. Hence, it is critically important to evaluate humanoid robot appearance design. This study aims to evaluate humanoid robot design based on global eye-tracking metrics. Two methods are selected to extract global eye-tracking metrics: bin-analysis-based entropy and approximate entropy. The data are collected from an eye-tracking experiment in which 20 participants evaluated 12 humanoid robot appearance designs while their eye movements were recorded. The humanoid robots are evaluated on five aspects, namely smartness, friendliness, pleasure, arousal, and dominance. The results show that the entropy of fixation duration and velocity and the approximate entropy of saccade amplitude are positively associated with the subjective feelings induced by robot appearance. These findings can aid in better understanding the first impression in human-robot interaction and enable eye-tracking-based evaluation of humanoid robot design. By combining design theory and bio-signal analysis, the study contributes to the field of Transdisciplinary Engineering.
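To make the two global metrics concrete, the following Python sketch shows one plausible way to compute a bin-analysis-based Shannon entropy of fixation durations and the approximate entropy of saccade amplitudes. The bin count, the ApEn parameters m and r, and the synthetic data are illustrative assumptions and are not taken from the study.

import numpy as np

def bin_entropy(values, n_bins=10):
    # Shannon entropy of a signal after discretising it into equal-width bins
    counts, _ = np.histogram(values, bins=n_bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

def approximate_entropy(x, m=2, r=None):
    # Approximate entropy (Pincus, 1991) of a one-dimensional series x
    x = np.asarray(x, dtype=float)
    n = len(x)
    if r is None:
        r = 0.2 * x.std()  # common rule of thumb for the tolerance
    def phi(mm):
        templates = np.array([x[i:i + mm] for i in range(n - mm + 1)])
        # pairwise Chebyshev distances between all length-mm templates
        dists = np.max(np.abs(templates[:, None] - templates[None, :]), axis=2)
        c = (dists <= r).mean(axis=1)
        return np.mean(np.log(c))
    return phi(m) - phi(m + 1)

# synthetic stand-ins for one participant's recording
fix_dur = np.random.gamma(shape=2.0, scale=150.0, size=200)   # fixation durations (ms)
sacc_amp = np.random.exponential(scale=4.0, size=200)         # saccade amplitudes (deg)
print(bin_entropy(fix_dur), approximate_entropy(sacc_amp))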

Author(s):  
Giorgio Metta

This chapter outlines a number of research lines that, starting from the observation of nature, attempt to mimic human behavior in humanoid robots. Humanoid robotics is one of the most exciting proving grounds for the development of biologically inspired hardware and software—machines that try to recreate billions of years of evolution with some of the abilities and characteristics of living beings. Humanoids could be especially useful for their ability to “live” in human-populated environments, occupying the same physical space as people and using tools that have been designed for people. Natural human–robot interaction is also an important facet of humanoid research. Finally, learning and adapting from experience, the hallmark of human intelligence, may require some approximation to the human body in order to attain similar capacities to humans. This chapter focuses particularly on compliant actuation, soft robotics, biomimetic robot vision, robot touch, and brain-inspired motor control in the context of the iCub humanoid robot.


2020 ◽  
Vol 12 (1) ◽  
pp. 58-73
Author(s):  
Sofia Thunberg ◽  
Tom Ziemke

Interaction between humans and robots will benefit if people have at least a rough mental model of what a robot knows about the world and what it plans to do. But how do we design human-robot interactions to facilitate this? Previous research has shown that one can change people’s mental models of robots by manipulating the robots’ physical appearance. However, this has mostly not been done in a user-centred way, i.e. without a focus on what users need and want. Starting from theories of how humans form and adapt mental models of others, we investigated how the participatory design method, PICTIVE, can be used to generate design ideas about how a humanoid robot could communicate. Five participants went through three phases based on eight scenarios from the state-of-the-art tasks in the RoboCup@Home social robotics competition. The results indicate that participatory design can be a suitable method to generate design concepts for robots’ communication in human-robot interaction.


2020 ◽  
Vol 17 (05) ◽  
pp. 2050021
Author(s):  
Grzegorz Ficht ◽  
Hafez Farazi ◽  
Diego Rodriguez ◽  
Dmytro Pavlichenko ◽  
Philipp Allgeuer ◽  
...  

For several years, high development and production costs of humanoid robots restricted researchers interested in working in the field. To overcome this problem, several research groups have opted to work with simulated or smaller robots, whose acquisition costs are significantly lower. However, due to scale differences and imperfect simulation replicability, results may not be directly reproducible on real, adult-sized robots. In this paper, we present the NimbRo-OP2X, a capable and affordable adult-sized humanoid platform aiming to significantly lower the entry barrier for humanoid robot research. With a height of 135 cm and a weight of only 19 kg, the robot can interact in an unmodified, human environment without special safety equipment. Modularity in hardware and software allows this platform enough flexibility to operate in different scenarios and applications with minimal effort. The robot is equipped with an on-board computer with GPU, which enables the implementation of state-of-the-art approaches for object detection and human perception demanded by areas such as manipulation and human–robot interaction. Finally, the capabilities of the NimbRo-OP2X, especially in terms of locomotion stability and visual perception, are evaluated. This includes the performance at RoboCup 2018, where NimbRo-OP2X won all possible awards in the AdultSize class.


2019 ◽  
Vol 10 (1) ◽  
pp. 20-33
Author(s):  
Catelyn Scholl ◽  
Susan McRoy

Gestures that co-occur with speech are a fundamental component of communication. Prior research with children suggests that gestures may help them to resolve certain forms of lexical ambiguity, including homophones. To test this idea in the context of human-robot interaction, the effects of iconic and deictic gestures on the understanding of homophones were assessed in an experiment where a humanoid robot told a short story containing pairs of homophones to small groups of young participants, accompanied by either expressive gestures or no gestures. Both groups of subjects completed a pretest and a post-test measuring their ability to discriminate between pairs of homophones, and aggregated precision was calculated. The results show that the use of iconic and deictic gestures aids in general understanding of homophones, providing additional evidence for the importance of gesture to the development of children’s language and communication skills.
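As a hypothetical illustration of the aggregated-precision measure mentioned above (assuming it pools correct identifications over attempted ones across all participants in a condition, which is our reading rather than the authors' stated definition), a minimal Python sketch might look like this:

def aggregated_precision(results):
    # results: list of (correct, attempted) pairs, one per participant
    correct = sum(c for c, _ in results)
    attempted = sum(a for _, a in results)
    return correct / attempted if attempted else 0.0

# illustrative numbers only, not data from the experiment
gesture_group = [(7, 8), (6, 8), (8, 8)]
no_gesture_group = [(5, 8), (4, 8), (6, 8)]
print(aggregated_precision(gesture_group), aggregated_precision(no_gesture_group))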


2018 ◽  
Vol 161 ◽  
pp. 01001 ◽  
Author(s):  
Karsten Berns ◽  
Zuhair Zafar

Human-machine interaction is a major challenge in the development of complex humanoid robots. In addition to verbal communication, the use of non-verbal cues such as hand, arm and body gestures or facial expressions can improve the understanding of the robot's intention. Conversely, by perceiving such cues from a human in a typical interaction scenario, the humanoid robot can better adapt its interaction skills. In this work, the perception systems of two social robots, ROMAN and ROBIN of the RRLAB at TU Kaiserslautern, are presented in the context of human-robot interaction.


2014 ◽  
Vol 11 (01) ◽  
pp. 1450003 ◽  
Author(s):  
Hatice Kose ◽  
Neziha Akalin ◽  
Pinar Uluer

This paper investigates the role of interaction and communication kinesics in human–robot interaction. This study is part of a novel research project on sign language (SL) tutoring through interaction games with humanoid robots. The main goal is to motivate children with communication problems to understand and imitate the signs implemented by the robot using basic upper-torso gestures and sound. We present an empirical and exploratory study investigating the effect of basic nonverbal gestures, consisting of hand movements and body and face gestures, expressed by a humanoid robot; having comprehended the word, the participants give relevant feedback in SL. In this way the participant is both a passive observer and an active imitator throughout the learning process in different phases of the game. A five-fingered R3 robot platform and a three-fingered Nao H-25 robot are employed within the games. Vision-, sound-, touch- and motion-based cues are used for multimodal communication between the robot, child and therapist/parent within the study. This paper presents the preliminary results of the proposed game tested with adult participants. The aim is to evaluate the participants' ability to learn SL from a robot and to compare different robot platforms within this setup.


Author(s):  
Marie D. Manner

We describe experiments performed with a large number of preschool children (ages 1.5 to 4 years) in a two-task eye tracking experiment and a human-robot interaction experiment. The resulting data of mostly neuro-typical children forms a baseline with which to compare children with autism, allowing us to further characterize the autism phenotype. Eye tracking task results indicate a strong preference for a humanoid robot and a social being (a four-year-old girl) over other robot types. Results from the human-robot interaction task, a semi-structured play interaction between child and robot, showed that we can cluster participants based on social distance and other social responsiveness metrics.
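The clustering step described above could, for example, be realised as k-means over standardised social-interaction features. The feature choices, the number of clusters, and the toy data in the Python sketch below are assumptions for illustration, not the study's actual protocol.

import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# rows = participants; columns = [mean child-robot distance (m),
# responses to robot prompts, gaze shifts toward robot]
features = np.array([
    [0.6, 14, 22],
    [1.4,  3,  5],
    [0.8, 11, 18],
    [1.6,  2,  4],
])

X = StandardScaler().fit_transform(features)  # put features on a common scale
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print(labels)  # cluster assignment per participant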


Author(s):  
Yuan Wei ◽  
Jing Zhao

Purpose: This paper aims to deal with the problem of designing robot behaviors (mainly for robotic arms) to express emotions. The authors study the effects of robot behaviors from the humanoid robot NAO on the subject’s emotion expression in human–robot interaction (HRI).
Design/methodology/approach: A method to design robot behavior through movement primitives is proposed. Then, a novel dimensional affective model is built. Finally, the concept of action semantics is adopted to combine the robot behaviors with emotion expression.
Findings: For the evaluation of this combination, the authors assess positive (excited and happy) and negative (frightened and sad) emotional patterns on 20 subjects, divided into two groups according to whether they were familiar with robots. The results show that recognition of the different emotion patterns does not differ between the two groups and that the subjects could recognize the robot behaviors with emotions.
Practical implications: Using affective models to guide robots’ behavior or express their intentions is highly beneficial in human–robot interaction. The authors envisage several applications of emotional motion: improving efficiency in HRI, directing people during disasters, achieving better understanding with human partners, or helping people perform their tasks better.
Originality/value: This paper presents a method to design robot behaviors with emotion expression. A similar methodology can be used for other parts (leg, torso, head and so on) of humanoid robots or for non-humanoid robots, such as industrial robots.
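As a rough sketch of how a dimensional affective model can drive movement primitives, the Python snippet below maps a valence-arousal point to hypothetical motion parameters (speed, amplitude, smoothness). The parameter names and the linear mapping are assumptions made for illustration; they do not reproduce the authors' model.

from dataclasses import dataclass

@dataclass
class MotionParams:
    speed: float       # normalised joint speed, 0..1
    amplitude: float   # normalised range of motion, 0..1
    smoothness: float  # 1 = smooth, 0 = abrupt

def clip01(v):
    return max(0.0, min(1.0, v))

def emotion_to_motion(valence, arousal):
    # valence and arousal are assumed to lie in [-1, 1]
    speed = clip01(0.5 + 0.5 * arousal)             # higher arousal -> faster motion
    amplitude = clip01(0.5 + 0.25 * (valence + arousal))
    smoothness = clip01(0.5 + 0.5 * valence)        # negative valence -> more abrupt motion
    return MotionParams(speed, amplitude, smoothness)

print(emotion_to_motion(valence=0.8, arousal=0.7))    # e.g. "excited"
print(emotion_to_motion(valence=-0.7, arousal=-0.5))  # e.g. "sad"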


Author(s):  
Hilberto Ayala ◽  
Yujian Fu

Research in humanoid robot design and implementation is quite challenging due to the complexity of the system and the multiple objectives involved. Stability, gait generation, navigation, and object detection and recognition are all key factors in humanoid robot design. Researchers in humanoid robot design have typically put great effort into one aspect while making assumptions about many others. Humanoid robot research involves challenging issues of motion stability, body movement, and navigation, in addition to the issues of path generation, object detection, and collision avoidance familiar from wheeled robots. Rooted in a previous experimental study of wheeled robotic systems, the BIOLOID humanoid robot research project was started in Fall 2013 and supported by the Title III Strengthening Grant Program (HBGI) (DAAD17-02-C-0113). In this paper, we give an overview of the design and implementation of the BIOLOID humanoid robot, including hardware architecture, firmware design, and device management, within an overall research perspective on the motion planning of humanoid robots. In addition, a broad discussion of the issues we faced and the challenges of the research work is presented, together with results of the current, ongoing progress. This work covers the overall hardware architecture, model-based system design, and behavior analysis using a systematic approach. The work is implemented in a soccer game scenario with a goalie and an offender role. This project has demonstrated a successful development process for collaborative humanoid robotics on a complex research and education platform, BIOLOID, using a software engineering approach.


Author(s):  
Elizabeth Phillips ◽  
Daniel Ullman ◽  
Maartje M. A. de Graaf ◽  
Bertram F. Malle

Robot design is a critical component of human-robot interaction. A robot’s appearance shapes people’s expectations of that robot, which in turn affect human-robot interaction. This paper reports on an exploratory analysis of 155 drawings of robots that were collected across three studies. The purpose was to gain a better understanding of people’s a priori expectations about the appearance of robots across a variety of robot types (household, military, humanoid, generic, and AI). The findings suggest that people’s visualizations of robots have common features that can be grouped into five broad components. People seem to distinguish between human-like and machine-like robots, with a default visualization of robots having a human-like appearance. In addition, expectations about robot appearance may be dependent on application domain.

