Robot Gaze Behavior Affects Honesty in Human-Robot Interaction

2021, Vol. 4
Author(s): Elef Schellen, Francesco Bossi, Agnieszka Wykowska

As the use of humanoid robots proliferates, an increasing number of people may find themselves face-to-“face” with a robot in everyday life. Although there is a plethora of information available on facial social cues and how we interpret them in the field of human-human social interaction, we cannot assume that these findings transfer flawlessly to human-robot interaction. Therefore, more research on facial cues in human-robot interaction is required. This study investigated deception in a human-robot interaction context, focusing on the effect that eye contact with a robot has on honesty toward this robot. In an iterative task, participants could assist a humanoid robot by providing it with correct information, or potentially secure a reward for themselves by providing it with incorrect information. Results show that participants are increasingly honest after the robot establishes eye contact with them, but only if this is in response to deceptive behavior. Behavior is not influenced by the establishment of eye contact if the participant is actively engaging in honest behavior. These findings support the notion that humanoid robots can be perceived as, and treated like, social agents, since the effect described here mirrors one present in human-human social interaction.
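To make the contingency described in this abstract concrete, the sketch below simulates the structure of such an iterative task: the robot establishes eye contact only in response to a deceptive report, and honesty on the following trial is the measure of interest. All names, values, and probabilities are illustrative assumptions, not the authors' implementation.

```python
# Hypothetical sketch of a contingent-gaze trial loop (not the authors' code).
import random
from dataclasses import dataclass

@dataclass
class Trial:
    true_value: int               # the correct information available to the participant
    reported_value: int           # what the participant actually told the robot
    robot_made_eye_contact: bool  # gaze response on this trial

def run_block(n_trials: int = 20) -> list[Trial]:
    trials = []
    for _ in range(n_trials):
        true_value = random.randint(1, 6)
        # In the real task the participant chooses; here the choice is simulated.
        reported_value = true_value if random.random() < 0.7 else random.randint(1, 6)
        deceptive = reported_value != true_value
        # Key manipulation: eye contact only follows a deceptive report.
        trials.append(Trial(true_value, reported_value, robot_made_eye_contact=deceptive))
    return trials

def honesty_rate_after(trials: list[Trial], eye_contact: bool) -> float:
    """Proportion of honest reports on trials that follow eye contact (or no eye contact)."""
    following = [t2 for t1, t2 in zip(trials, trials[1:])
                 if t1.robot_made_eye_contact == eye_contact]
    if not following:
        return float("nan")
    return sum(t.reported_value == t.true_value for t in following) / len(following)

block = run_block()
print("honesty after eye contact:   ", honesty_rate_after(block, True))
print("honesty after no eye contact:", honesty_rate_after(block, False))
```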


2009, Vol. 6 (3-4), pp. 369-397
Author(s): Kerstin Dautenhahn, Chrystopher L. Nehaniv, Michael L. Walters, Ben Robins, Hatice Kose-Bagci, ...

This paper provides a comprehensive introduction to the design of the minimally expressive robot KASPAR, which is particularly suitable for human–robot interaction studies. A low-cost design with off-the-shelf components has been used in a novel design inspired by a multi-disciplinary viewpoint, including comics design and Japanese Noh theatre. The design rationale of the robot and its technical features are described in detail. Three research studies that have used KASPAR extensively are presented. Firstly, we present its application in robot-assisted play and therapy for children with autism. Secondly, we illustrate its use in human–robot interaction studies investigating the role of interaction kinesics and gestures. Lastly, we describe a study in the field of developmental robotics into computational architectures based on interaction histories for robot ontogeny. The three areas differ in how the robot is operated and in its role in the social interaction scenarios. Each will be introduced briefly and examples of the results will be presented. Reflections on the specific design features of KASPAR that were important in these studies, and lessons learnt from them concerning the design of humanoid robots for social interaction, will also be discussed. An assessment of the robot in terms of the utility of the design for human–robot interaction experiments concludes the paper.


Author(s): Giorgio Metta

This chapter outlines a number of research lines that, starting from the observation of nature, attempt to mimic human behavior in humanoid robots. Humanoid robotics is one of the most exciting proving grounds for the development of biologically inspired hardware and software—machines that try to recreate billions of years of evolution with some of the abilities and characteristics of living beings. Humanoids could be especially useful for their ability to “live” in human-populated environments, occupying the same physical space as people and using tools that have been designed for people. Natural human–robot interaction is also an important facet of humanoid research. Finally, learning and adapting from experience, the hallmark of human intelligence, may require some approximation to the human body in order to attain similar capacities to humans. This chapter focuses particularly on compliant actuation, soft robotics, biomimetic robot vision, robot touch, and brain-inspired motor control in the context of the iCub humanoid robot.


2020, Vol. 12 (1), pp. 58-73
Author(s): Sofia Thunberg, Tom Ziemke

Interaction between humans and robots will benefit if people have at least a rough mental model of what a robot knows about the world and what it plans to do. But how do we design human-robot interactions to facilitate this? Previous research has shown that one can change people’s mental models of robots by manipulating the robots’ physical appearance. However, this has mostly not been done in a user-centred way, i.e. without a focus on what users need and want. Starting from theories of how humans form and adapt mental models of others, we investigated how the participatory design method PICTIVE can be used to generate design ideas about how a humanoid robot could communicate. Five participants went through three phases based on eight scenarios from the state-of-the-art tasks in the RoboCup@Home social robotics competition. The results indicate that participatory design can be a suitable method to generate design concepts for robots’ communication in human-robot interaction.


Author(s): Faezeh Rahbar, Salvatore M. Anzalone, Giovanna Varni, Elisabetta Zibetti, Serena Ivaldi, ...

Author(s): Mark Tee Kit Tsun, Lau Bee Theng, Hudyjaya Siswoyo Jo, Patrick Then Hang Hui

This chapter summarizes the findings of a study on robotics research and applications for assisting children with disabilities between the years 2009 and 2013. The disabilities in question include impairments of motor skills, locomotion, and social interaction commonly attributed to children with Autistic Spectrum Disorders (ASD) and Cerebral Palsy (CP). While assistive technologies for disabilities largely address the restoration of physical capabilities, disabled children also require dedicated rehabilitation for social interaction and mental health. As such, the breadth of this study covers existing efforts in rehabilitation in both the physical and socio-psychological domains that involve human-robot interaction. Topics covered include assisted locomotion training, passive stretching and active movement rehabilitation, upper-extremity motor function, social interactivity, therapist-mediators, active play encouragement, as well as several life-long assistive robots in current use. The chapter concludes by drawing attention to ethical and adoption issues that may limit the field's effectiveness.


2019, Vol. 374 (1771), pp. 20180033
Author(s): Birgit Rauchbauer, Bruno Nazarian, Morgane Bourhis, Magalie Ochs, Laurent Prévot, ...

We present a novel functional magnetic resonance imaging paradigm for second-person neuroscience. The paradigm compares a human social interaction (human–human interaction, HHI) to an interaction with a conversational robot (human–robot interaction, HRI). The social interaction consists of 1 min blocks of live bidirectional discussion between the scanned participant and the human or robot agent. A final sample of 21 participants is included in the corpus comprising physiological (blood oxygen level-dependent, respiration and peripheral blood flow) and behavioural (recorded speech from all interlocutors, eye tracking from the scanned participant, face recording of the human and robot agents) data. Here, we present the first analysis of this corpus, contrasting neural activity between HHI and HRI. We hypothesized that independently of differences in behaviour between interactions with the human and robot agent, neural markers of mentalizing (temporoparietal junction (TPJ) and medial prefrontal cortex) and social motivation (hypothalamus and amygdala) would only be active in HHI. Results confirmed significantly increased response associated with HHI in the TPJ, hypothalamus and amygdala, but not in the medial prefrontal cortex. Future analysis of this corpus will include fine-grained characterization of verbal and non-verbal behaviours recorded during the interaction to investigate their neural correlates. This article is part of the theme issue ‘From social brains to social robots: applying neurocognitive insights to human–robot interaction'.
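As a rough illustration of how such a block design can be analysed, the sketch below builds boxcar regressors for alternating HHI and HRI blocks and estimates an HHI > HRI contrast with ordinary least squares on synthetic data. The timings, the omission of HRF convolution, and all variable names are simplifying assumptions, not the study's actual pipeline.

```python
# Simplified HHI-vs-HRI block contrast on synthetic data (illustrative only).
import numpy as np

TR = 1.0          # repetition time in seconds (assumed)
n_scans = 480     # 8 minutes of data (assumed)
block_len = 60    # 1-minute interaction blocks, as in the paradigm

# Alternating HHI / HRI blocks.
condition = np.array(["HHI" if (t // block_len) % 2 == 0 else "HRI"
                      for t in range(n_scans)])

# Design matrix: one boxcar regressor per condition plus an intercept.
X = np.column_stack([
    (condition == "HHI").astype(float),
    (condition == "HRI").astype(float),
    np.ones(n_scans),
])

# Synthetic voxel time series with a stronger response during HHI blocks,
# mimicking the reported TPJ effect.
rng = np.random.default_rng(0)
y = 1.5 * X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 1, n_scans)

# Ordinary least squares fit and the HHI > HRI contrast.
beta, *_ = np.linalg.lstsq(X, y, rcond=None)
contrast = np.array([1.0, -1.0, 0.0])
print("HHI > HRI contrast estimate:", contrast @ beta)
```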


Author(s): Ali Momen, Eva Wiese

Social robots with expressive gaze have positive effects on human-robot interaction. In particular, research suggests that when robots are programmed to express introverted or extroverted gaze behavior, individuals enjoy interacting more with robots that match their personality. However, how this affects social-cognitive performance during human-robot interactions has not yet been thoroughly examined. In the current paper, we examine whether the perceived match between human and robot personality positively affects the degree to which the robot’s gaze is followed (i.e., gaze cueing, as a proxy for more complex social-cognitive behavior). While social attention has been examined extensively outside of human-robot interaction, recent research shows that a robot’s gaze is attended to in a similar way as a human’s gaze. While our results did not support the hypothesis that gaze cueing would be strongest when the participant’s personality matched the robot’s personality, we did find evidence that participants followed the gaze of introverted robots more strongly than the gaze of extroverted robots. This finding suggests that agents displaying extroverted gaze behavior may hurt performance in human-robot interaction.
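For readers unfamiliar with the proxy measure, the sketch below shows how a gaze-cueing effect is typically quantified: mean reaction time on invalid-cue trials minus mean reaction time on valid-cue trials, computed separately per robot personality. The trial data and labels here are invented for illustration, not taken from the study.

```python
# Illustrative gaze-cueing effect computation on made-up trial data.
import statistics

# (cue_validity, robot_personality, reaction_time_ms) -- hypothetical trials.
trials = [
    ("valid", "introvert", 312), ("invalid", "introvert", 355),
    ("valid", "introvert", 298), ("invalid", "introvert", 341),
    ("valid", "extrovert", 320), ("invalid", "extrovert", 338),
    ("valid", "extrovert", 305), ("invalid", "extrovert", 321),
]

def cueing_effect(trials, personality):
    """Mean RT on invalid-cue trials minus mean RT on valid-cue trials."""
    def mean_rt(validity):
        return statistics.mean(
            rt for v, p, rt in trials if v == validity and p == personality)
    return mean_rt("invalid") - mean_rt("valid")

# A larger effect indicates stronger following of that robot's gaze.
print("introvert robot:", cueing_effect(trials, "introvert"))  # 43.0 ms
print("extrovert robot:", cueing_effect(trials, "extrovert"))  # 17.0 ms
```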


Author(s): Louise LePage

Stage plays, theories of theatre, narrative studies, and robotics research can serve to identify, explore, and interrogate theatrical elements that support the effective performance of sociable humanoid robots. Theatre, including its parts of performance, aesthetics, character, and genre, can also reveal features of human–robot interaction key to creating humanoid robots that are likeable rather than uncanny. In particular, this can be achieved by relating Mori's (1970/2012) concept of total appearance to realism. Realism is broader and more subtle in its workings than is generally recognised in its operationalization in studies that focus solely on appearance. For example, it is complicated by genre. A realistic character cast in a detective drama will convey different qualities and expectations than the same character in a dystopian drama or romantic comedy. The implications of realism and genre carry over into real life. As stage performances and robotics studies reveal, likeability depends on creating aesthetically coherent representations of character, where all the parts coalesce to produce a socially identifiable figure demonstrating predictable behaviour.


2012, Vol. 3 (2), pp. 68-83
Author(s): David K. Grunberg, Alyssa M. Batula, Erik M. Schmidt, Youngmoo E. Kim

The recognition and display of synthetic emotions in humanoid robots is a critical attribute for facilitating natural human-robot interaction. The authors utilize an efficient algorithm to estimate the mood in acoustic music, and then use the results of that algorithm to drive movement generation systems that provide motions for the robot suitable for the music. This system is evaluated on multiple sets of humanoid robots to determine whether the choice of robot platform or the number of robots influences the perceived emotional content of the motions. The tests verify that the system can accurately identify the emotional content of acoustic music and produce motions that convey a similar emotion to that in the audio. The authors also determine the perceptual effects of using differently sized robots, or different numbers of robots, in the motion performances.
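The abstract describes a pipeline in which a mood estimate drives motion generation. The sketch below illustrates one plausible final stage of such a pipeline, mapping a valence/arousal estimate for a music excerpt to simple motion parameters; the mapping, ranges, and parameter names are assumptions made for illustration, not the authors' algorithm.

```python
# Hypothetical mood-to-motion mapping (illustrative stand-in for the pipeline's last stage).
from dataclasses import dataclass

@dataclass
class MotionParams:
    tempo_scale: float   # how fast gestures are played back
    amplitude: float     # how large the joint excursions are
    smoothness: float    # how strongly trajectories are low-pass filtered

def mood_to_motion(valence: float, arousal: float) -> MotionParams:
    """Map a mood estimate in [-1, 1] x [-1, 1] to motion parameters.

    High arousal -> faster, larger movements; low valence -> smoother,
    more subdued trajectories.
    """
    tempo_scale = 1.0 + 0.5 * arousal
    amplitude = 0.5 + 0.4 * arousal + 0.1 * valence
    smoothness = 0.5 - 0.3 * valence
    return MotionParams(tempo_scale, amplitude, smoothness)

# Example: an excited, happy excerpt vs. a sad, calm one.
print(mood_to_motion(valence=0.8, arousal=0.9))
print(mood_to_motion(valence=-0.7, arousal=-0.5))
```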

