Fabrication of robot head module using contact-resistance force sensor for human-robot interaction and its evaluation

2012 ◽ Vol 26 (10) ◽ pp. 3269-3276
Author(s): Dong-Ki Kim, Jong-Ho Kim, Hyun-Joon Kwon, Young-Ha Kwon
2014 ◽ Vol 2014 ◽ pp. 1-5
Author(s): Jizheng Yan, Zhiliang Wang, Yan Yan

Emotional robots have long been a focus of artificial intelligence (AI), and intelligent control of robot facial expression is an active research topic. This paper focuses on the design of a humanoid robot head, carried out in three steps. The first step addresses the uncanny valley in humanoid robots, identifying and avoiding the problematic region in the relationship between humans and robots. The second step establishes the association between the human face and the robot head: by analyzing the similarities and differences between humans and robots using the Facial Action Coding System (FACS), we identify the shared basis and mechanisms that guide the realization of humanoid expressions. Building on the previous two steps, the third step is the construction of the robot head. A series of experiments shows that the head can display humanoid expressions, and in human-robot interaction people are surprised by the robot head's expressions and respond with pleasure.


Author(s): Barbara Gonsior, Christian Landsiedel, Nicole Mirnig, Stefan Sosnowski, ...

This work is a first step towards an integration of multimodality, with the aim of making efficient use of both human-like and non-human-like feedback modalities to optimize proactive information retrieval in task-related Human-Robot Interaction (HRI) in human environments. The presented approach combines the human-like modalities of speech and emotional facial mimicry with non-human-like modalities. The proposed non-human-like modalities are a screen displaying the robot's retrieved knowledge to the human, and a pointer mounted above the robot head for indicating directions and referring to objects in shared visual space, as an equivalent for arm and hand gestures. Initially, pre-interaction feedback is explored in an experiment investigating different approach behaviors, in order to find socially acceptable trajectories that increase the success of interactions and thus the efficiency of information retrieval. Secondly, pre-evaluated human-like modalities are introduced. First results of a multimodal feedback study are presented in the context of the IURO project, in which a robot asks for directions to a predefined goal location.

IURO: Interactive Urban Robot, http://www.iuro-project.eu


Sensors ◽ 2017 ◽ Vol 17 (6) ◽ pp. 1294
Author(s): Victor Grosu, Svetlana Grosu, Bram Vanderborght, Dirk Lefeber, Carlos Rodriguez-Guerrero

Author(s): Yulai Weng, Andrew Specian, Mark Yim

This paper presents the design of a low-cost system that can be used as a spherical humanoid robot head to display expressive animations for social robotics. The system offers a versatile canvas for Human-Robot Interaction (HRI), especially for face-to-face communication. To maximize flexibility, both in the style and in the apparent motion of the robot's head, we exploit the relatively recent availability of low-cost portable projectors in a retro-projected animated face (RAF). The opto-mechanical system comprises a projector whose light is reflected off a hemispherical mirror onto a 360-degree section of the spherical head with sufficient resolution and illumination. We derive the forward and inverse mapping relations between pixel coordinates on the projector's projection plane and the outer spherical surface to enable fast graphics generation. Calibration of the system is achieved by controlling three parameters of image translation and scaling, resulting in a specifically devised light cone whose edges are tangential to the hemispherical mirror. Several facial expressions are tested in illuminated indoor environments to show the system's potential as a modular, low-cost robot head for HRI.
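The paper's exact forward/inverse mapping accounts for reflection off the hemispherical mirror and is not reproduced in the abstract. As a simplified illustration of what such a pixel-to-surface correspondence looks like, the sketch below assumes an equirectangular parameterization of the 360-degree spherical band: a projector pixel is mapped to azimuth and elevation, then to a 3D point on the sphere, and back. The resolution, sphere radius, and the parameterization itself are assumptions for this example, not the authors' derivation.

```python
import math

# Assumed parameters (not from the paper): projector resolution and head radius.
W, H = 1280, 720   # projector pixels
R = 0.1            # sphere radius in meters

def forward(u, v):
    """Map a projector pixel (u, v) to a 3D point on the spherical surface,
    assuming a simple equirectangular parameterization."""
    az = (u / W) * 2.0 * math.pi        # azimuth spans the full 360-degree band
    el = (v / H - 0.5) * math.pi        # elevation spans -90 to +90 degrees
    x = R * math.cos(el) * math.cos(az)
    y = R * math.cos(el) * math.sin(az)
    z = R * math.sin(el)
    return x, y, z

def inverse(x, y, z):
    """Map a point on the sphere back to projector pixel coordinates."""
    el = math.asin(max(-1.0, min(1.0, z / R)))
    az = math.atan2(y, x) % (2.0 * math.pi)
    u = az / (2.0 * math.pi) * W
    v = (el / math.pi + 0.5) * H
    return u, v
```

Precomputing this correspondence for every pixel (as a lookup table) is a common way to achieve the fast graphics generation the abstract mentions, since the per-frame cost then reduces to a texture remap.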

