Active Inference Through Energy Minimization in Multimodal Affective Human–Robot Interaction

2021 ◽  
Vol 8 ◽  
Author(s):  
Takato Horii ◽  
Yukie Nagai

During communication, humans express their emotional states through various modalities (e.g., facial expressions and gestures) and estimate the emotional states of others by attending to multimodal signals. For a communication robot with limited resources, the main challenge is to select the most effective modalities among those expressed. In this study, we propose an active perception method that selects the most informative modalities using a criterion based on energy minimization. This energy-based model learns the probability of the network state via energy values, whereby a lower energy represents a higher probability of the state. A multimodal deep belief network, which is an energy-based model, was employed to represent the relationships between emotional states and multimodal sensory signals. Compared with other active perception methods, the proposed approach demonstrated improved accuracy using limited information in several contexts associated with affective human–robot interaction. We present the differences and advantages of our method over alternatives through mathematical formulations using, for example, information gain as a criterion. Further, we evaluate the performance of our method with respect to active inference, which is based on the free energy principle. We establish that our method performs best in tasks involving mutually correlated multimodal information.
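The selection criterion described in this abstract, picking the modality whose observation yields the lowest model energy, can be sketched as follows. This is an illustrative toy using an RBM-style free energy with random weights, not the authors' multimodal deep belief network; all names, dimensions, and values are hypothetical assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def free_energy(visible, weights, bias_v, bias_h):
    """RBM-style free energy of a visible vector (lower = more probable).
    F(v) = -b_v . v - sum_j log(1 + exp(c_j + W_j . v))"""
    pre = bias_h + weights @ visible
    return -bias_v @ visible - np.sum(np.logaddexp(0.0, pre))

def select_modality(candidate_obs, weights, bias_v, bias_h):
    """Active perception step: score each candidate modality's observation
    by model energy and return the most informative (lowest-energy) one."""
    energies = {m: free_energy(v, weights, bias_v, bias_h)
                for m, v in candidate_obs.items()}
    return min(energies, key=energies.get), energies

# Hypothetical model parameters and binary sensory observations.
n_v, n_h = 6, 4
W = rng.normal(size=(n_h, n_v))
b_v = rng.normal(size=n_v)
b_h = rng.normal(size=n_h)

obs = {
    "face": rng.integers(0, 2, n_v).astype(float),
    "gesture": rng.integers(0, 2, n_v).astype(float),
    "voice": rng.integers(0, 2, n_v).astype(float),
}
best, energies = select_modality(obs, W, b_v, b_h)
```

In the paper's setting the energies would come from the trained multimodal network rather than random weights, but the decision rule, attend to the lowest-energy (highest-probability) modality, has the same shape.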

2020 ◽  
pp. 1556-1572
Author(s):  
Jordi Vallverdú ◽  
Toyoaki Nishida ◽  
Yoshisama Ohmoto ◽  
Stuart Moran ◽  
Sarah Lázare

Empathy is a basic emotion trigger for human beings, especially in regulating social relationships and behaviour. The main challenge of this paper is to study whether people's empathic reactions towards robots change depending on the information given to the human about the robot before the interaction. The use of false data about robot skills creates different levels of what we call 'fake empathy'. This study performs an experiment in a Wizard-of-Oz (WOZ) environment in which subjects (n=17) interact with the same robot while believing it to be one of up to three different robots. Each robot scenario provides a different 'humanoid' description, and our hypothesis is that the more human-like the robot appears, the more empathic the human responses will be. Results were obtained from questionnaires and multi-angle video recordings. Positive results reinforce our hypothesis, although we recommend a larger and more robust follow-up experiment.


2018 ◽  
Vol 14 (1) ◽  
pp. 44-59 ◽  
Author(s):  
Jordi Vallverdú ◽  
Toyoaki Nishida ◽  
Yoshisama Ohmoto ◽  
Stuart Moran ◽  
Sarah Lázare



2021 ◽  
Vol 11 (8) ◽  
pp. 3468
Author(s):  
Jaeryoung Lee

The use of affective speech in robotic applications has increased in recent years, especially in the development and study of emotional prosody for specific groups of people. The current work proposes a prosody-based communication system that considers the limited parameters found in speech recognition for, for example, the elderly. This work explored which types of voices were more effective for understanding the presented information, and whether the affect of robot voices was reflected in the emotional states of listeners. Using the functions of a small humanoid robot, two experiments were conducted to measure comprehension level and affective reflection, respectively. University students participated in both tests. The results showed that affective voices helped users understand the information, and that users felt corresponding negative emotions in conversations with negative voices.


2021 ◽  
Vol 28 (2) ◽  
pp. 125-146

With recent developments in technology and advances in artificial intelligence and machine learning techniques, it has become possible for a robot to acquire and display emotions as part of Human-Robot Interaction (HRI). An emotional robot can recognize the emotional states of humans, enabling it to interact more naturally with its human counterpart in different environments. In this article, a survey on emotion recognition for HRI systems is presented. The survey aims to achieve two objectives. Firstly, it discusses the main challenges that researchers face when building emotional HRI systems. Secondly, it identifies the sensing channels that can be used to detect emotions and provides a literature review of recent research published within each channel, along with the methodologies used and the results achieved. Finally, some existing emotion recognition issues and recommendations for future work are outlined.


2000 ◽  
Vol 29 (544) ◽  
Author(s):  
Dolores Canamero ◽  
Jakob Fredslund

We report work on a LEGO robot capable of displaying several emotional expressions in response to physical contact. Our motivation has been to explore believable emotional exchanges that achieve plausible interaction with a simple robot. We have worked toward this goal in two ways. First, acknowledging the importance of physical manipulation in children's interactions, interaction with the robot is through tactile stimulation; the various kinds of stimulation that can elicit the robot's emotions are grounded in a model of emotion activation based on different stimulation patterns. Second, emotional states need to be clearly conveyed. We have drawn inspiration from theories of human basic emotions with associated universal facial expressions, which we have implemented in a caricaturized face. We have conducted experiments with both children and adults to assess the recognizability of these expressions.
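A minimal sketch of grounding emotion activation in tactile stimulation patterns might look like the following. The thresholds, feature set (intensity and duration), and emotion categories are purely illustrative assumptions, not the authors' model.

```python
def classify_stimulus(intensity: float, duration: float) -> str:
    """Map a tactile stimulus to an activated emotion.

    intensity: normalized pressure in [0, 1]; duration: contact time in seconds.
    Hypothetical pattern-to-emotion mapping for illustration only.
    """
    if intensity > 0.8:
        # Hard contact: a sharp hit vs. sustained strong pressure.
        return "anger" if duration < 0.5 else "fear"
    if intensity > 0.3:
        # Moderate contact: prolonged stroking vs. a brief poke.
        return "happiness" if duration > 1.0 else "surprise"
    # Light contact: lingering light touch vs. negligible stimulus.
    return "sadness" if duration > 2.0 else "neutral"
```

A real implementation would map sensor readings to these features and drive the caricaturized face's expression from the returned label.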


Author(s):  
Paramin Neranon

This research focuses on developing conceptual frameworks of human-human interaction applied to a robotic behaviour-based approach for safe physical human-robot interaction. The control was constructed based on understanding the dynamic and kinematic behavioural characteristics of how two humans pass an object to each other. This enabled a KR-16-KUKA robot to interact naturally with a human and facilitate the dexterous transfer of an object in an effective manner. Implicit force control based on Proportional-Integral (PI) and Fuzzy Logic Control, which allows the robot end effector's trajectory to be moderated by the applied force in real time, was adopted. The experimental results confirmed that the quantitative performance of the force-controlled robot is close to that of the human and can be considered acceptable for human-robot interaction. Furthermore, Fuzzy Logic Control showed slightly superior performance compared with PI control.
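The implicit force control scheme, in which a PI loop turns the force error at the end effector into a motion command, can be sketched as follows. The gains, environment stiffness, target force, and the one-dimensional contact model are illustrative assumptions, not the paper's values.

```python
class PIForceController:
    """PI controller mapping a force error to a velocity command (implicit force control)."""

    def __init__(self, kp: float, ki: float, dt: float):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.integral = 0.0

    def step(self, target_force: float, measured_force: float) -> float:
        """Return a velocity command (m/s) driving the contact force toward target."""
        error = target_force - measured_force
        self.integral += error * self.dt
        return self.kp * error + self.ki * self.integral

ctrl = PIForceController(kp=0.01, ki=0.002, dt=0.01)

# Toy 1-D contact: force grows with penetration into a stiff environment.
force, stiffness, pos = 0.0, 500.0, 0.0
for _ in range(2000):  # 20 s of simulated time
    v = ctrl.step(target_force=5.0, measured_force=force)
    pos += v * ctrl.dt
    force = max(0.0, stiffness * pos)
```

With these illustrative gains the loop settles near the 5 N target; in the handover task the measured hand force would analogously modulate the end effector's trajectory in real time.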


2015 ◽  
Vol 12 (01) ◽  
pp. 1550002 ◽  
Author(s):  
Martin D. Cooney ◽  
Shuichi Nishio ◽  
Hiroshi Ishiguro

To be accepted as a part of our everyday lives, companion robots will require the capability to communicate socially, recognizing people's behavior and responding appropriately. In particular, we hypothesized that a humanoid robot should be able to recognize affectionate touches conveying liking or dislike because (a) a humanoid form elicits expectations of a high degree of social intelligence, (b) touch behavior plays a fundamental and crucial role in human bonding, and (c) robotic responses providing affection could contribute to people's quality of life. The hypothesis that people will seek to affectionately touch a robot needed to be verified because robots are typically not soft or warm like humans, and people can communicate through various other modalities such as vision and sound. The main challenge faced was that people's social norms are highly complex, involving behavior in multiple channels. To deal with this challenge, we adopted an approach in which we analyzed free interactions and also asked participants to rate short video-clips depicting human–robot interaction. As a result, we verified that touch plays an important part in the communication of affection from a person to a humanoid robot considered capable of recognizing cues in touch, vision, and sound. Our results suggest that designers of affectionate interactions with a humanoid robot should not ignore the fundamental modality of touch.


2020 ◽  
Vol 10 (8) ◽  
pp. 2924 ◽  
Author(s):  
Chiara Filippini ◽  
David Perpetuini ◽  
Daniela Cardone ◽  
Antonio Maria Chiarelli ◽  
Arcangelo Merla

Over recent years, robots have been increasingly employed in several aspects of modern society. Among others, social robots have the potential to benefit education, healthcare, and tourism. To achieve this purpose, robots should be able to engage humans, recognize users' emotions, and to some extent react properly and "behave" naturally in an interaction. Most robotics applications primarily use visual information for emotion recognition, which is often based on facial expressions. However, the display of emotional states through facial expression is inherently a voluntarily controlled process typical of human–human interaction; humans have not yet learned to use this channel when communicating with robotic technology. Hence, there is an urgent need to exploit emotion information channels not directly controlled by humans, such as those that can be ascribed to physiological modulations. Thermal infrared imaging-based affective computing has the potential to address this issue: it is a validated technology that allows the non-obtrusive monitoring of physiological parameters, from which it may be possible to infer affective states. This review aims to outline the advantages and the current research challenges of thermal imaging-based affective computing for human–robot interaction.

