Biologically Inspired Multimodal Integration: Interferences in a Human-Robot Interaction Game

Author(s):  
Eric Sauser ◽  
Aude Billard


Author(s):  
Giorgio Metta

This chapter outlines a number of research lines that, starting from the observation of nature, attempt to mimic human behavior in humanoid robots. Humanoid robotics is one of the most exciting proving grounds for the development of biologically inspired hardware and software: machines that try to recreate, with some of the abilities and characteristics of living beings, the results of billions of years of evolution. Humanoids could be especially useful for their ability to “live” in human-populated environments, occupying the same physical space as people and using tools that have been designed for people. Natural human–robot interaction is also an important facet of humanoid research. Finally, learning and adapting from experience, the hallmark of human intelligence, may require some approximation to the human body in order to attain capacities similar to those of humans. This chapter focuses particularly on compliant actuation, soft robotics, biomimetic robot vision, robot touch, and brain-inspired motor control in the context of the iCub humanoid robot.


Sensors ◽  
2018 ◽  
Vol 18 (8) ◽  
pp. 2691 ◽  
Author(s):  
Marcos Maroto-Gómez ◽  
Álvaro Castro-González ◽  
José Castillo ◽  
María Malfaz ◽  
Miguel Salichs

Nowadays, many robotic applications require robots to make their own decisions and adapt to different conditions and users. This work presents a biologically inspired decision-making system, based on drives, motivations, wellbeing, and self-learning, that governs the behavior of the robot considering both internal and external circumstances. In this paper we describe the biological foundations that drove the design of the system, as well as how it has been implemented on a real robot. Following a homeostatic approach, the ultimate goal of the robot is to keep its wellbeing as high as possible. In order to achieve this goal, our decision-making system uses learning mechanisms to assess the best action to execute at any moment. Because the proposed system has been implemented on a real social robot, human-robot interaction is of paramount importance, and the robot's learned behaviors are oriented towards fostering interaction with the user. The operation of the system is shown in a scenario where the robot Mini plays games with a user. In this context, we have included a robust user detection mechanism tailored for short-distance interactions. After the learning phase, the robot has learned how to lead the user to interact with it in a natural way.
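To make the homeostatic idea concrete, the sketch below illustrates one possible drive/wellbeing/learning loop of the kind described above. It is a minimal illustration under stated assumptions, not the authors' implementation: the Drive class, the action names, and the epsilon-greedy value update are introduced only for the example.

```python
# Minimal sketch of a homeostatic, drive-based decision loop.
# Illustrative only: Drive, choose_action and the update rule are assumptions,
# not the paper's actual implementation.
import random

class Drive:
    """A homeostatic variable that drifts away from its ideal value over time."""
    def __init__(self, name, ideal=0.0, drift=0.05):
        self.name, self.ideal, self.value, self.drift = name, ideal, ideal, drift

    def tick(self):
        self.value += self.drift          # unmet needs grow over time

    def deficit(self):
        return abs(self.value - self.ideal)

def wellbeing(drives):
    # Higher wellbeing = smaller total deviation from the homeostatic setpoints.
    return -sum(d.deficit() for d in drives)

actions = ["play_game", "seek_user", "rest"]   # hypothetical action set
q = {}                                          # (dominant drive, action) -> value
alpha, epsilon = 0.1, 0.2

def choose_action(dominant):
    # Epsilon-greedy selection over the learned action values.
    if random.random() < epsilon:
        return random.choice(actions)
    return max(actions, key=lambda a: q.get((dominant, a), 0.0))

def step(drives, apply_action):
    """One decision cycle: update drives, act, and learn from the wellbeing change."""
    for d in drives:
        d.tick()
    dominant = max(drives, key=lambda d: d.deficit()).name
    action = choose_action(dominant)
    before = wellbeing(drives)
    apply_action(action, drives)                # effect of acting in the world
    gain = wellbeing(drives) - before
    key = (dominant, action)
    q[key] = q.get(key, 0.0) + alpha * (gain - q.get(key, 0.0))
    return action
```

Calling `step` repeatedly with an environment-specific `apply_action` callback would, over time, bias the robot toward whichever actions best restore the drive that currently dominates.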


Robotics ◽  
2020 ◽  
Vol 9 (4) ◽  
pp. 102
Author(s):  
Gianpaolo Gulletta ◽  
Wolfram Erlhagen ◽  
Estela Bicho

In the last decade, the objectives arising from the needs of personal robotics have led to the rise of new biologically inspired techniques for arm motion planning. This paper presents a literature review of the most recent research on the generation of human-like arm movements in humanoid and manipulation robotic systems. Search methods and inclusion criteria are described. The studies are analyzed with respect to the sources of publication, the experimental settings, the type of movements, the technical approach, and the human motor principles used to inspire and assess human-likeness. The results show a strong focus on the generation of single-arm reaching movements and on biomimetic methods. However, little attention has been paid to manipulation, obstacle-avoidance mechanisms, and dual-arm motion generation. For these reasons, human-like arm motion generation may not fully respect key human behavioral and neurological features and may remain restricted to specific human-robot interaction tasks. Limitations and challenges are discussed to provide meaningful directions for future investigations.


2020 ◽  
pp. 1507-1532
Author(s):  
Marko Wehle ◽  
Alexandra Weidemann ◽  
Ivo Wilhelm Boblan

Robotic developments are seen as the next level of technology: intelligent machines that automate tedious tasks and serve our needs without complaint. Nevertheless, they have to be fair and smart enough to be intuitive to use and safe to handle. But how should this kind of intelligence be implemented? Does it need feelings and emotions? Should robots perceive the world as we do, with humans as the role model? How far should the implementation of synthetic consciousness go, and what, in this context, is actually needed for consciousness? In addition, Human-Robot Interaction research relies mainly on phenomenography, a tool that is exclusively subjective, so how can it be made suitable for Artificial Intelligence? These are the guiding questions of this chapter, which surveys research in the field of social robotics and suggests a conscious and cognitive model for smart, intuitively interacting robots, guided by biomimetics.


2011 ◽  
Vol 23 (3) ◽  
pp. 313-325 ◽  
Author(s):  
S Davis ◽  
Darwin G Caldwell

As the operation of robotic systems moves away from solely manufacturing environments to arenas where they must operate alongside humans, the essential characteristics of their design have transformed. A move from traditional robot designs to more inherently safe concepts is required. One approach is to study biological systems to determine how they achieve safe interaction, and then to mimic in robotic systems the ingredients that make this interaction safe. This is often achieved through softness, both in terms of a soft, fleshy external covering and through motor systems that introduce joint compliance for softer physical Human-Robot Interaction (pHRI). This has led to the development of new actuators whose performance characteristics, at least on a macroscopic level, try to emulate the function of organic muscle. One of the most promising among these is the pneumatic Muscle Actuator (pMA). However, as with organic muscle, these soft actuators are more susceptible to damage than many traditional actuators. Whilst organic muscle can regenerate and recover, artificial systems do not possess this ability. This article analyzes how organic muscle is able to operate even after extreme trauma and shows how functionally similar techniques can be used with pMAs.
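As a rough illustration of the fault-tolerance idea, the sketch below redistributes a force demand across a parallel bundle of pMAs when some elements are flagged as damaged. The bundle abstraction, the equal-share rule, and the numbers are assumptions for the example, not the specific technique described in the article.

```python
# Hedged sketch: redistributing a force demand across a parallel bundle of
# pneumatic Muscle Actuators (pMAs) when some elements are damaged.
# The equal-share rule and the saturation check are illustrative assumptions.

def distribute_force(demand_n, max_force_n, healthy):
    """Split a total force demand over the healthy actuators in a bundle.

    demand_n     -- total force required from the bundle [N]
    max_force_n  -- force limit of a single pMA [N]
    healthy      -- list of booleans, one per actuator (False = damaged)
    Returns a per-actuator force command; raises if the bundle cannot meet demand.
    """
    active = [i for i, ok in enumerate(healthy) if ok]
    if not active:
        raise RuntimeError("no functional actuators left in bundle")
    share = demand_n / len(active)
    if share > max_force_n:
        raise RuntimeError("demand exceeds remaining bundle capacity")
    return [share if ok else 0.0 for ok in healthy]

# Example: a 6-element bundle with two damaged pMAs still meets a 400 N demand.
print(distribute_force(400.0, 120.0, [True, True, False, True, False, True]))
```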


2019 ◽  
Vol 374 (1771) ◽  
pp. 20180031 ◽  
Author(s):  
Yasuo Kuniyoshi

Human-centred AI/Robotics are quickly becoming important. Their core claim is that AI systems or robots must be designed to work for the benefit of humans, with no harm or uneasiness. This essentially requires the realization of autonomy, sociality and their fusion at all levels of system organization, even beyond programming or pre-training. The biologically inspired core principle of such a system is described as the emergence and development of embodied behaviour and cognition. The importance of embodiment, emergence and continuous autonomous development is explained in the context of developmental robotics and the dynamical-systems view of human development. We present a hypothetical early developmental scenario that fills in the very beginning of the comprehensive scenarios proposed in developmental robotics. Then our model and experiments on emergent embodied behaviour are presented. They consist of chaotic maps embedded in sensory–motor loops and coupled via embodiment. Behaviours that are consistent with the embodiment and adaptive to environmental structure emerge within a few seconds, without any external reward or learning. Next, our model and experiments on human fetal development are presented. A precise musculo-skeletal fetal body model is placed in a uterus model. Driven by spinal nonlinear oscillator circuits coupled together via embodiment, somatosensory signals are evoked and learned by a model of the cerebral cortex with 2.6 million neurons and 5.3 billion synapses. The model acquired cortical representations of the self-body and multi-modal sensory integration. This work is important because it models very early autonomous development with realistic, detailed human embodiment. Finally, discussions toward human-like cognition are presented, including other important factors such as motivation, emotion, internal organs and genetics. This article is part of the theme issue ‘From social brains to social robots: applying neurocognitive insights to human–robot interaction’.
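The chaotic sensory–motor coupling can be pictured with a toy model such as the one below: each unit is a logistic map whose next state mixes its own chaotic dynamics with feedback from a shared "body" variable. The map parameters, the coupling constant, and the single-variable body dynamics are illustrative assumptions, not the authors' simulation.

```python
# Toy sketch of chaotic maps embedded in sensory-motor loops and coupled
# through a shared body variable. Parameters and body dynamics are assumptions.
import random

N = 4            # number of chaotic motor units
A = 3.9          # logistic-map parameter in the chaotic regime
EPS = 0.3        # strength of the sensory coupling through the body

x = [random.random() for _ in range(N)]   # internal state of each chaotic unit
body = 0.0                                 # crude shared body/environment state

def logistic(v):
    return A * v * (1.0 - v)

for t in range(200):
    motor = x[:]                           # motor command = unit state
    # The body low-pass filters the summed motor commands (embodiment).
    body = 0.9 * body + 0.1 * (sum(motor) / N)
    sense = [body] * N                     # every unit senses the same body state
    # Each unit mixes its own chaotic dynamics with the sensory feedback.
    x = [(1 - EPS) * logistic(xi) + EPS * si for xi, si in zip(x, sense)]

print("final unit states:", [round(v, 3) for v in x])
```

Even in this stripped-down form, the units tend to organize their activity around the shared body signal rather than wandering independently, which is the qualitative point of coupling chaotic elements through embodiment.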


Author(s):  
Levern Q. Currie ◽  
Eva Wiese

Robotic agents are becoming increasingly pervasive in society and have already begun advancing fields such as healthcare, education, and industry. However, despite their potential to do good for society, many people still feel unease when imagining a future where robots and humans work and live together in shared environments, partly because robots are not generally trusted or ascribed human-like socio-emotional skills such as mentalizing and empathizing. In addition, performing tasks conjointly with robots can be frustrating and ineffective, partially because neuronal networks involved in action understanding and execution (i.e., the action-perception network; APN) are underactivated in human-robot interaction (HRI). While a number of studies have linked underactivation in the APN to a reduced ability to predict a robot's actions, little is known about how performing a competitive task together with a robot affects one's own ability to execute or suppress an action. In the current experiment, we use a Go/No-Go task that requires participants to give a response on Go trials and suppress a response on No-Go trials to examine whether the performance of human players is affected by whether they play the game against a robot believed to be controlled by a human, as opposed to one believed to be pre-programmed. Preliminary data show higher false alarm rates on No-Go trials, higher hit rates on Go trials, longer reaction times on Go trials, and higher inverse efficiency scores in the human-controlled versus the pre-programmed condition. These results show that mind perception (here: perceiving actions as human-controlled) significantly impacted the action execution of human players in a competitive human-robot interaction game.
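For reference, the measures reported above can be computed as in the sketch below: hit rate, false-alarm rate, mean reaction time on correct Go trials, and an inverse efficiency score taken here as mean correct RT divided by hit rate. The trial representation and field names are assumptions for the example, not the study's analysis code.

```python
# Sketch of standard Go/No-Go performance measures. Trial format is assumed.

def go_nogo_metrics(trials):
    """trials: list of dicts with keys 'type' ('go' | 'nogo'),
    'responded' (bool), and 'rt' (seconds; None if no response)."""
    go = [t for t in trials if t["type"] == "go"]
    nogo = [t for t in trials if t["type"] == "nogo"]
    hits = [t for t in go if t["responded"]]
    false_alarms = [t for t in nogo if t["responded"]]

    hit_rate = len(hits) / len(go) if go else float("nan")
    fa_rate = len(false_alarms) / len(nogo) if nogo else float("nan")
    mean_rt = sum(t["rt"] for t in hits) / len(hits) if hits else float("nan")
    # Inverse efficiency: slower and/or less accurate responding -> higher score.
    ies = mean_rt / hit_rate if hit_rate > 0 else float("inf")
    return {"hit_rate": hit_rate, "fa_rate": fa_rate,
            "mean_go_rt": mean_rt, "ies": ies}

example = [
    {"type": "go",   "responded": True,  "rt": 0.42},
    {"type": "go",   "responded": False, "rt": None},
    {"type": "nogo", "responded": True,  "rt": 0.38},
    {"type": "nogo", "responded": False, "rt": None},
]
print(go_nogo_metrics(example))
```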

