Importance of Touch for Conveying Affection in a Multimodal Interaction with a Small Humanoid Robot

2015 ◽  
Vol 12 (01) ◽  
pp. 1550002 ◽  
Author(s):  
Martin D. Cooney ◽  
Shuichi Nishio ◽  
Hiroshi Ishiguro

To be accepted as a part of our everyday lives, companion robots will require the capability to communicate socially, recognizing people's behavior and responding appropriately. In particular, we hypothesized that a humanoid robot should be able to recognize affectionate touches conveying liking or dislike because (a) a humanoid form elicits expectations of a high degree of social intelligence, (b) touch behavior plays a fundamental and crucial role in human bonding, and (c) robotic responses providing affection could contribute to people's quality of life. The hypothesis that people will seek to affectionately touch a robot needed to be verified because robots are typically not soft or warm like humans, and people can communicate through various other modalities such as vision and sound. The main challenge faced was that people's social norms are highly complex, involving behavior in multiple channels. To deal with this challenge, we adopted an approach in which we analyzed free interactions and also asked participants to rate short video-clips depicting human–robot interaction. As a result, we verified that touch plays an important part in the communication of affection from a person to a humanoid robot considered capable of recognizing cues in touch, vision, and sound. Our results suggest that designers of affectionate interactions with a humanoid robot should not ignore the fundamental modality of touch.

2021 ◽  
Vol 7 ◽  
pp. e674
Author(s):  
Jiaji Yang ◽  
Esyin Chew ◽  
Pengcheng Liu

At present, industrial robotics focuses more on motion control and vision, whereas humanoid service robots (HSRs) are increasingly being investigated and researched in the field of speech interaction. The quality of human-robot interaction (HRI) has become a widely debated topic in academia, especially when HSRs are applied in the hospitality industry, where some researchers believe that the current HRI model is not well adapted to the complex social environment. HSRs generally lack the ability to accurately recognize human intentions and understand social scenarios. This study proposes a novel interactive framework suitable for HSRs. The proposed framework is grounded in a novel integration of Trevarthen’s (2001) companionship theory and a neural image captioning (NIC) generation algorithm. By integrating image-to-natural-language generation with communication about the environment, the robot can interact better with stakeholders, thereby shifting from mere interaction to bionic companionship. In contrast to previous research, a novel interactive system was developed based on the bionic-companionship framework, and a humanoid service robot was integrated with the system to conduct preliminary tests. The results show that the interactive system based on the bionic-companionship framework can help the service humanoid robot respond effectively to changes in the interactive environment, for example by giving different responses to the same person in different scenes.
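To make the captioning component concrete, here is a minimal sketch, in PyTorch, of the kind of encoder-decoder NIC model such a framework builds on: a CNN encodes the camera frame into a fixed-size vector, and an LSTM decodes a description token by token. This is not the authors' implementation; the ResNet-18 backbone, layer sizes, and class names are illustrative assumptions.

import torch
import torch.nn as nn
import torchvision.models as models

class NICEncoder(nn.Module):
    """CNN encoder: maps a camera frame to a fixed-size embedding."""
    def __init__(self, embed_size=256):
        super().__init__()
        resnet = models.resnet18(weights=None)  # pretrained weights would be loaded in practice
        self.backbone = nn.Sequential(*list(resnet.children())[:-1])
        self.fc = nn.Linear(512, embed_size)

    def forward(self, images):                    # images: (B, 3, H, W)
        feats = self.backbone(images).flatten(1)  # (B, 512)
        return self.fc(feats)                     # (B, embed_size)

class NICDecoder(nn.Module):
    """LSTM decoder: generates caption tokens conditioned on the image embedding."""
    def __init__(self, vocab_size, embed_size=256, hidden_size=512):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_size)
        self.lstm = nn.LSTM(embed_size, hidden_size, batch_first=True)
        self.out = nn.Linear(hidden_size, vocab_size)

    def forward(self, img_embed, captions):       # captions: (B, T) token ids
        tokens = self.embed(captions)             # (B, T, embed_size)
        inputs = torch.cat([img_embed.unsqueeze(1), tokens], dim=1)
        hidden, _ = self.lstm(inputs)
        return self.out(hidden)                   # (B, T+1, vocab_size) logits

In a bionic-companionship loop, the caption produced for the current frame would be combined with the user's utterance before the robot selects its spoken response.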


2019 ◽  
Author(s):  
Cinzia Di Dio ◽  
Federico Manzi ◽  
Giulia Peretti ◽  
Angelo Cangelosi ◽  
Paul L. Harris ◽  
...  

Studying trust within human-robot interaction is of great importance given the social relevance of robotic agents in a variety of contexts. We investigated the acquisition, loss and restoration of trust when preschool and school-age children played with either a human or a humanoid robot in vivo. The relationship between trust and the quality of attachment relationships, Theory of Mind, and executive function skills was also investigated. No differences were found in children’s trust in the play-partner as a function of agency (human or robot). Nevertheless, 3-year-olds showed a trend toward trusting the human more than the robot, while 7-year-olds displayed the reverse behavioral pattern, thus highlighting the developing interplay between affective and cognitive correlates of trust.


Author(s):  
Giorgio Metta

This chapter outlines a number of research lines that, starting from the observation of nature, attempt to mimic human behavior in humanoid robots. Humanoid robotics is one of the most exciting proving grounds for the development of biologically inspired hardware and software—machines that try to recreate billions of years of evolution with some of the abilities and characteristics of living beings. Humanoids could be especially useful for their ability to “live” in human-populated environments, occupying the same physical space as people and using tools that have been designed for people. Natural human–robot interaction is also an important facet of humanoid research. Finally, learning and adapting from experience, the hallmark of human intelligence, may require some approximation to the human body in order to attain similar capacities to humans. This chapter focuses particularly on compliant actuation, soft robotics, biomimetic robot vision, robot touch, and brain-inspired motor control in the context of the iCub humanoid robot.


2012 ◽  
Vol 24 (9) ◽  
pp. 1867-1883 ◽  
Author(s):  
Bradley R. Buchsbaum ◽  
Sabrina Lemire-Rodger ◽  
Candice Fang ◽  
Hervé Abdi

When we have a rich and vivid memory for a past experience, it often feels like we are transported back in time to witness once again this event. Indeed, a perfect memory would exactly mimic the experiential quality of direct sensory perception. We used fMRI and multivoxel pattern analysis to map and quantify the similarity between patterns of activation evoked by direct perception of a diverse set of short video clips and the vivid remembering, with closed eyes, of these clips. We found that the patterns of distributed brain activation during vivid memory mimicked the patterns evoked during sensory perception. Using whole-brain patterns of activation evoked by perception of the videos, we were able to accurately classify brain patterns that were elicited when participants tried to vividly recall those same videos. A discriminant analysis of the activation patterns associated with each video revealed a high degree (explaining over 80% of the variance) of shared representational similarity between perception and memory. These results show that complex, multifeatured memory involves a partial reinstatement of the whole pattern of brain activity that is evoked during initial perception of the stimulus.
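As an illustration of the cross-decoding logic described above, the following sketch trains a classifier on perception-evoked voxel patterns and tests it on patterns recorded during recall. It uses synthetic data rather than the study's fMRI pipeline; the numbers of videos, repetitions, voxels, and noise levels are arbitrary assumptions.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score

rng = np.random.default_rng(0)
n_videos, n_reps, n_voxels = 9, 10, 500            # hypothetical dimensions

# Synthetic perception and memory patterns that share a video-specific signal.
prototypes = rng.normal(size=(n_videos, n_voxels))

def simulate(noise):
    X = np.repeat(prototypes, n_reps, axis=0)
    X = X + noise * rng.normal(size=X.shape)
    y = np.repeat(np.arange(n_videos), n_reps)
    return X, y

X_perc, y_perc = simulate(noise=1.0)               # viewing the clips
X_mem, y_mem = simulate(noise=2.0)                 # recalling them (assumed noisier)

# Train on perception, test on memory: above-chance accuracy indicates that
# recall partially reinstates the perception-evoked pattern.
clf = LogisticRegression(max_iter=1000).fit(X_perc, y_perc)
print("cross-decoding accuracy:", accuracy_score(y_mem, clf.predict(X_mem)))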


Author(s):  
Margot M. E. Neggers ◽  
Raymond H. Cuijpers ◽  
Peter A. M. Ruijten ◽  
Wijnand A. IJsselsteijn

Autonomous mobile robots that operate in environments with people are expected to be able to deal with human proxemics and social distances. Previous research investigated how robots can approach persons or how to implement human-aware navigation algorithms. However, experimental research on how robots can avoid a person in a comfortable way is largely missing. The aim of the current work is to experimentally determine the shape and size of the personal space of a human passed by a robot. In two studies, a humanoid as well as a non-humanoid robot passed a person at different sides and distances, after which the person was asked to rate their perceived comfort. As expected, perceived comfort increased with distance. However, the shape was not circular: passing at the back of a person is more uncomfortable than passing at the front, especially in the case of the humanoid robot. These results give us more insight into the shape and size of personal space in human–robot interaction. Furthermore, they can serve as necessary input to human-aware navigation algorithms for autonomous mobile robots in which human comfort is traded off against efficiency goals.
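To show how such findings could feed a human-aware planner, the sketch below defines an asymmetric personal-space cost in which discomfort decays with distance but is larger behind the person than in front, qualitatively matching the reported shape. The Gaussian form and all parameter values are illustrative assumptions, not the paper's fitted model.

import math

def discomfort(dx, dy, heading, sigma_front=0.8, sigma_back=1.4):
    """Discomfort of a robot at offset (dx, dy) metres from a person facing
    `heading` (radians); a larger sigma_back makes passing behind costlier."""
    # Angle of the robot's position relative to the person's facing direction.
    rel = math.atan2(dy, dx) - heading
    behind = abs(math.atan2(math.sin(rel), math.cos(rel))) > math.pi / 2
    sigma = sigma_back if behind else sigma_front
    dist = math.hypot(dx, dy)
    return math.exp(-(dist ** 2) / (2 * sigma ** 2))

# A robot passing 1 m behind a person facing +x is costlier than 1 m in front.
print(discomfort(-1.0, 0.0, heading=0.0))  # behind
print(discomfort(+1.0, 0.0, heading=0.0))  # in front

A planner could add this term, summed over nearby people, to its path cost so that routes trade off comfort against path length.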


2020 ◽  
Vol 12 (1) ◽  
pp. 58-73
Author(s):  
Sofia Thunberg ◽  
Tom Ziemke

Interaction between humans and robots will benefit if people have at least a rough mental model of what a robot knows about the world and what it plans to do. But how do we design human-robot interactions to facilitate this? Previous research has shown that one can change people’s mental models of robots by manipulating the robots’ physical appearance. However, this has mostly not been done in a user-centred way, i.e. without a focus on what users need and want. Starting from theories of how humans form and adapt mental models of others, we investigated how the participatory design method, PICTIVE, can be used to generate design ideas about how a humanoid robot could communicate. Five participants went through three phases based on eight scenarios from the state-of-the-art tasks in the RoboCup@Home social robotics competition. The results indicate that participatory design can be a suitable method to generate design concepts for robots’ communication in human-robot interaction.


Author(s):  
Stefan Schiffer ◽  
Alexander Ferrein

In this work we report on our effort to design and implement an early introduction to basic robotics principles for children at kindergarten age. The humanoid robot Pepper, which is a great platform for human-robot interaction experiments, presented the lecture by reading out the contents to the children, making use of its speech synthesis capability. One of the main challenges of this effort was to explain complex robotics content in such a way that pre-school children could follow the basic principles and ideas using examples from their world of experience. A quiz in a Runaround-game-show style after the lecture prompted the children to recap what they had learned about how mobile robots work in principle. Besides the thrill of being exposed to a mobile robot that would also react to them, the children were very excited and at the same time very concentrated. What sets our effort apart from other work is that part of the lecturing is actually done by a robot itself and that the quiz at the end of the lesson is conducted using robots as well. To the best of our knowledge, this is one of only a few attempts to use Pepper not as a tele-teaching tool but as the teacher itself in order to engage pre-school children with complex robotics content. We received very positive feedback from the children as well as from their educators.
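As a flavour of how such a robot-delivered lecture can be driven, here is a minimal sketch using the NAOqi Python SDK's text-to-speech service; it is not the authors' lecture script, and the robot's IP address and the spoken texts are placeholders.

from naoqi import ALProxy

PEPPER_IP = "192.168.1.10"   # placeholder address of the robot
tts = ALProxy("ALTextToSpeech", PEPPER_IP, 9559)

lecture = [
    "A mobile robot senses its surroundings with cameras and distance sensors.",
    "It plans a path and then drives its wheels to follow that path.",
]
for sentence in lecture:
    tts.say(sentence)        # read each lecture sentence aloud

# A quiz question in the spirit of the game-show round after the lecture.
tts.say("Quiz time! What does a robot use to see its surroundings?")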


2020 ◽  
pp. 1556-1572
Author(s):  
Jordi Vallverdú ◽  
Toyoaki Nishida ◽  
Yoshisama Ohmoto ◽  
Stuart Moran ◽  
Sarah Lázare

Empathy is a basic emotion trigger for human beings, especially while regulating social relationships and behaviour. The main challenge of this paper is to study whether people's empathic reactions towards robots change depending on the information given to humans about the robot before the interaction. The use of false data about robot skills creates different levels of what we call 'fake empathy'. This study performs an experiment in a Wizard-of-Oz (WOZ) environment in which subjects (n=17) interacted with the same robot while believing it to be one of up to three different robots. Each robot scenario provides a different 'humanoid' description, and our hypothesis is that the more human-like the robot looks, the more empathic the human responses will be. Results were obtained from questionnaires and multi-angle video recordings. Positive results reinforce the strength of our hypothesis, although we recommend a new, larger, and more robust experiment.

