Multimodal adapted robot behavior synthesis within a narrative human-robot interaction

Author(s):  
Amir Aly ◽  
Adriana Tapus
2019 ◽  
Author(s):  
Jairo Pérez-Osorio ◽  
Davide De Tommaso ◽  
Ebru Baykara ◽  
Agnieszka Wykowska

Robots will soon enter social environments shared with humans. We need robots that can efficiently convey social signals during interactions, and at the same time we need to understand the impact of robot behavior on the human brain. For this purpose, human behavioral and neural responses to robot behavior should be quantified, offering feedback on how to improve and adjust robot behavior. Under this premise, our approach is to use methods of experimental psychology and cognitive neuroscience to assess the human's reception of a robot in human-robot interaction protocols. As an example of this approach, we report an adaptation of a classical paradigm of experimental cognitive psychology to a naturalistic human-robot interaction scenario. We show the feasibility of such an approach with a validation pilot study, which demonstrated that our design yielded a pattern of data similar to what has previously been observed in experiments within cognitive psychology. Our approach allows for addressing specific mechanisms of human cognition that are elicited during human-robot interaction and, in a longer-term perspective, will allow for designing robots that are well-attuned to the workings of the human brain.


2020 ◽  
Vol 32 (1) ◽  
pp. 224-235
Author(s):  
Wei-Fen Hsieh ◽  
Eri Sato-Shimokawara ◽  
Toru Yamaguchi

In our daily conversations, we obtain considerable information from our interlocutors' non-verbal behaviors, such as gaze and gestures. Several studies have shown that non-verbal messages are prominent factors in smoothing human-robot interaction. Our previous studies have shown that not only a robot's appearance but also its gestures, tone, and other non-verbal factors influence a person's impression of it. This paper presents an analysis of the impressions made when human motions are implemented on a humanoid robot, with experiments conducted to evaluate the impressions made by robot expressions. The results showed a relation between robot expression patterns and human preferences. To further investigate the biofeedback elicited by different styles of robot expression, a scenario-based experiment was conducted. The results revealed that people's emotions can indeed be affected by robot behavior, and that the robot's way of expressing itself is what most influences whether it is perceived as friendly. These results suggest that it is potentially useful to integrate our concept into a robot system to meet individual needs.


2008 ◽  
Vol 24 (4) ◽  
pp. 911-916 ◽  
Author(s):  
N. Mitsunaga ◽  
C. Smith ◽  
T. Kanda ◽  
H. Ishiguro ◽  
N. Hagita

Author(s):  
Lue-Feng Chen ◽  
Zhen-Tao Liu ◽  
Min Wu ◽  
Fangyan Dong ◽  
...  

A multi-robot behavior adaptation mechanism that adapts to human intention is proposed for human-robot interaction (HRI), in which information-driven fuzzy friend-Q learning (IDFFQ) is used to generate an optimal behavior-selection policy and intention is understood mainly from human emotions. This mechanism aims to endow robots with human-oriented interaction capabilities, enabling them to understand and adapt their behaviors to human intentions. It also decreases robots' response time (RT) by embedding human identification information, such as religion, into behavior selection, and increases human satisfaction by considering deep-level information, including intention and emotion, so that interactions run smoothly. Experiments were performed in a bar-drinking scenario. Results show that the proposed method requires 51 fewer learning steps than fuzzy-production-rule-based friend-Q learning (FPRFQ), and that the robots' RT is about 25% of the time consumed by FPRFQ. Additionally, emotion recognition and intention understanding achieved accuracies of 80.36% and 85.71%, respectively. Moreover, a subjective evaluation of customers through a questionnaire yielded a rating of "satisfied." Based on these preliminary experiments, the proposal is being extended to service robots that adapt their behavior to customers' intention to drink at a bar.
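As a concrete illustration of how such a behavior-selection policy can be learned, the sketch below implements a plain epsilon-greedy Q-learning loop over a coarsely fuzzified emotion/intention state. This is a minimal sketch under stated assumptions: the state discretization, the candidate behaviors, the reward signal, and the hyperparameters are all illustrative, and the information-driven rule weighting and friend-Q coordination of the actual IDFFQ mechanism are not reproduced here.

```python
# Minimal sketch of fuzzy-state Q-learning for behavior selection, loosely in
# the spirit of the mechanism described above. All names (state labels,
# behaviors, reward) are illustrative assumptions, not the authors' code.
import random
from collections import defaultdict

ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2   # learning rate, discount, exploration

BEHAVIORS = ["greet", "offer_drink", "small_talk", "wait"]

# Q-table indexed by (discretized emotion/intention state, behavior).
Q = defaultdict(float)

def fuzzy_state(valence: float, intention: str) -> str:
    """Map a continuous emotion estimate and a recognized intention to a coarse rule label."""
    mood = "positive" if valence > 0.3 else "negative" if valence < -0.3 else "neutral"
    return f"{mood}/{intention}"

def select_behavior(state: str) -> str:
    """Epsilon-greedy selection over candidate behaviors for the active state."""
    if random.random() < EPSILON:
        return random.choice(BEHAVIORS)
    return max(BEHAVIORS, key=lambda b: Q[(state, b)])

def update(state: str, behavior: str, reward: float, next_state: str) -> None:
    """Standard Q-learning update for the fired state/behavior pair."""
    best_next = max(Q[(next_state, b)] for b in BEHAVIORS)
    Q[(state, behavior)] += ALPHA * (reward + GAMMA * best_next - Q[(state, behavior)])

# Example interaction step: the customer looks pleased and wants a drink.
s = fuzzy_state(valence=0.6, intention="wants_drink")
a = select_behavior(s)
update(s, a, reward=1.0, next_state=fuzzy_state(0.7, "satisfied"))
```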


Author(s):  
Yuan Wei ◽  
Jing Zhao

Purpose: This paper deals with the problem of designing robot behaviors (mainly for robotic arms) to express emotions. The authors study the effects of behaviors from their humanoid robot NAO on subjects' perception of emotion expression in human-robot interaction (HRI).

Design/methodology/approach: A method for designing robot behavior through movement primitives is proposed. A novel dimensional affective model is then built. Finally, the concept of action semantics is adopted to combine robot behaviors with emotion expression.

Findings: To evaluate this combination, the authors assessed positive (excited and happy) and negative (frightened and sad) emotional patterns on 20 subjects, divided into two groups according to whether they were familiar with robots. The results show that recognition of the different emotion patterns did not differ between the two groups and that the subjects could recognize the robot behaviors together with the intended emotions.

Practical implications: Using affective models to guide robots' behavior or express their intentions is highly beneficial in HRI. The authors envision several applications of emotional motion: improving efficiency in HRI, directing people during disasters, communicating better with human partners, and helping people perform their tasks better.

Originality/value: This paper presents a method for designing robot behaviors with emotion expression. A similar methodology can be used for other parts of humanoid robots (legs, torso, head, and so on) or for non-humanoid robots, such as industrial robots.
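To make the mapping from a dimensional affective model onto movement primitives concrete, the sketch below shows one plausible reading: a (valence, arousal) point linearly scales a few primitive parameters. The parameter names (amplitude, speed, smoothness) and the linear mapping are assumptions for illustration only, not the authors' actual model.

```python
# Minimal sketch of driving a movement primitive from a dimensional affect
# model. The [-1, 1] valence/arousal ranges and the linear mapping below are
# illustrative assumptions, not the paper's model.
from dataclasses import dataclass

@dataclass
class MotionPrimitive:
    amplitude: float   # joint-angle range scaling, 0..1
    speed: float       # playback speed scaling, 0..1
    smoothness: float  # blend between abrupt (0) and fluid (1) trajectories

def affect_to_primitive(valence: float, arousal: float) -> MotionPrimitive:
    """Map a (valence, arousal) point in [-1, 1]^2 onto primitive parameters.

    High arousal -> larger, faster motion; positive valence -> smoother motion.
    """
    return MotionPrimitive(
        amplitude=0.5 + 0.5 * arousal,
        speed=0.5 + 0.5 * arousal,
        smoothness=0.5 + 0.5 * valence,
    )

# "Excited" (positive valence, high arousal) vs. "sad" (negative, low arousal):
print(affect_to_primitive(valence=0.8, arousal=0.9))    # big, fast, fluid
print(affect_to_primitive(valence=-0.7, arousal=-0.6))  # small, slow, abrupt
```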

