Robot Head
Recently Published Documents

Total documents: 122 (five years: 17)
H-index: 14 (five years: 1)

Author(s): Nguyen Khac Toan, Le Duc Thuan, Le Bao Long, Nguyen Truong Thinh
Keyword(s):

i-com, 2020, Vol 19 (2), pp. 153-167
Author(s): Martin Westhoven, Tim van der Grinten

Abstract: In this paper we report results from a web- and video-based study on the perception of a request for help from a robot head. Colored lights, eye expressions, and the politeness of the language used were varied. We measured effects on expression identification, hedonic user experience, perceived politeness, and help intention. Additionally, sociodemographic data, a 'face blindness' questionnaire, and negative attitudes towards robots were collected to control for possible influences on the dependent variables. A total of n = 139 participants were included in the analysis. The focus of this paper is on interaction effects and on the influence of covariates. Significant effects on help intention were found for the interaction of LED lighting and eye expressions and for the interaction of language and eye expressions. Expression identification is significantly influenced by the interaction of LED lighting and eye expressions. Several significant effects of the covariates were found, both direct and in interaction with the independent variables. In particular, negative attitudes towards robots significantly influence help intention and perceived politeness. The results inform design choices for help-requesting robots.
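The study above reports interaction effects between pairs of factors (e.g. LED lighting × eye expressions) on help intention. As an illustrative sketch only (the cell means below are hypothetical, not the study's data), a 2×2 interaction can be quantified as a difference of differences of cell means:

```python
def interaction_contrast(cell_means):
    """Difference-of-differences for a 2x2 factorial design:
    (A1B1 - A1B2) - (A2B1 - A2B2). A value of zero means the two
    factors act additively, i.e. no interaction."""
    (a1b1, a1b2), (a2b1, a2b2) = cell_means
    return (a1b1 - a1b2) - (a2b1 - a2b2)

# Hypothetical help-intention cell means
# (rows: LED on/off, columns: two eye expressions):
means = [[4.2, 3.1],
         [3.0, 3.4]]
contrast = interaction_contrast(means)  # ~1.5: the lighting effect differs by eye expression
```

Whether such a contrast is statistically significant would of course be tested with an ANOVA-style model, as the study does; the function above only shows what "interaction" measures.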


2020, Vol 17 (2), pp. 172988142091149
Author(s): Birger Johansson, Trond A Tjøstheim, Christian Balkenius

Epi is a humanoid robot developed by the Lund University Cognitive Science Robotics Group. It was designed for experiments in developmental robotics, and its proportions give a childlike impression while still being decidedly robotic. The robot head has two degrees of freedom in the neck, and each eye can move laterally and independently. There is a camera in each eye to make stereo vision possible. The arms are designed to resemble those of a human. Each arm has five degrees of freedom: three in the shoulder, one in the elbow, and one in the wrist. The hands have four movable fingers and a stationary thumb. A force distribution mechanism inside the hand connects a single servo to the movable fingers and ensures the hand closes around an object regardless of its shape. The rigid parts of the hands are 3D printed in PLA and HIPS, while the flexible parts, including the joints and the tendons, are made from polyurethane rubber. The control system for Epi is based on neurophysiological data and is implemented using the Ikaros system. Most of the sensory and motor processing is done at 40 Hz to allow smooth movements. The irises of the eyes can change colour and the pupils can dilate and contract. There is also a grid of LEDs resembling a mouth that can be animated by changing colour and intensity.
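The abstract notes that most of Epi's sensory and motor processing runs at 40 Hz. A minimal fixed-rate tick loop of the kind such a controller might use can be sketched as follows; this is an illustrative sketch, and `read_sensors`/`update_motors` are placeholder callables, not Ikaros APIs:

```python
import time

TICK_HZ = 40            # processing rate stated in the abstract
TICK_S = 1.0 / TICK_HZ  # 25 ms per tick

def control_loop(read_sensors, update_motors, ticks):
    """Fixed-rate loop: read sensors, drive motors, then sleep away
    the remainder of the tick to hold a steady 40 Hz cycle."""
    for _ in range(ticks):
        start = time.monotonic()
        state = read_sensors()
        update_motors(state)
        elapsed = time.monotonic() - start
        time.sleep(max(0.0, TICK_S - elapsed))
```

Holding a constant tick by sleeping the leftover time (rather than sleeping a fixed 25 ms) keeps the cycle period stable even when sensor reads or motor updates take varying amounts of time, which is what makes smooth movement possible.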


2020, Vol 32 (1), pp. 97-112
Author(s): Masahiko Mikawa, Jiayi Lyu, Makoto Fujisawa, Wasuke Hiiragi, Toyoyuki Ishibashi, ...

When a robot works among people in a public space, its behavior can make some people feel uncomfortable. One reason for this is that it is difficult for people to infer the robot's intended behavior from its appearance. This paper presents a new intention expression method using a three-dimensional computer graphics (3D CG) face model. The 3D CG face model is displayed on a flat panel screen and has two eyes and a head that can be rotated freely. When the mobile robot is about to change its traveling direction, it rotates its head and eyes toward the direction it intends to go, so that an oncoming person can infer the robot's intention from this previous announcement. Three main types of experiment were conducted to confirm the validity and effectiveness of the proposed previous announcement method using the face interface. First, an appropriate timing for the previous announcement was determined from impression evaluations in a preliminary experiment. Second, as the main experiments of this study, two conditions in which a pedestrian and the robot passed each other in a corridor, with and without the previous announcement, were compared. Finally, differences between the proposed face interface and a conventional robot head were analyzed as a reference experiment. The experimental results confirmed the validity and effectiveness of the proposed method.
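The core idea above is that the displayed head and eyes rotate toward the intended direction some lead time before the robot actually turns. A minimal scheduling sketch of that idea (the 2-second lead time and all names are assumptions for illustration, not values or APIs from the paper):

```python
ANNOUNCE_LEAD_S = 2.0  # assumed lead time; the study tunes this timing empirically

def plan_turn_with_announcement(turn_time_s, heading_change_deg):
    """Schedule two events: the face/eye rotation announcing the turn,
    followed ANNOUNCE_LEAD_S seconds later by the body turn itself."""
    announce_time = max(0.0, turn_time_s - ANNOUNCE_LEAD_S)
    return [
        ("announce", announce_time, heading_change_deg),  # face looks toward the new direction
        ("turn", turn_time_s, heading_change_deg),        # robot changes traveling direction
    ]
```

For example, a 45-degree turn planned at t = 5 s would be announced at t = 3 s, giving an oncoming pedestrian time to read the robot's intention and adjust their own path.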


Author(s): Tudor Catalin Apostolescu, Ioana Udrea, Georgeta Ionascu, Silviu Petrache, Laurentiu Adrian Cartal, ...
Keyword(s):
