Emotion-based biped walking

Robotica
2004
Vol 22 (5)
pp. 577-586
Author(s):
Hun-ok Lim
Akinori Ishii
Atsuo Takanishi

This paper describes emotion-based walking for a biped humanoid robot. Three emotions, happiness, sadness and anger, are considered. These emotions are expressed through the walking styles of the biped humanoid robot, which are preset by parameterization of its whole-body motion. To keep the robot balanced during the emotional expressions, a trunk motion is employed that is calculated by compensatory motion control based on the motions of the head, arms and legs. We have constructed a biped humanoid robot, WABIAN-RII (WAseda BIpedal humANoid robot-Revised II), to explore emotional walking motion for smooth and natural communication. WABIAN-RII has forty-three mechanical degrees of freedom and four passive degrees of freedom. Its height is about 1.84 m and its total weight is 127 kg. Using WABIAN-RII, the three emotional expressions are performed through biped walking, including the body motion, and evaluated.
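The compensatory motion control mentioned above computes a trunk motion that keeps the robot balanced while the head, arms and legs follow the preset emotional patterns. The Python sketch below illustrates the general idea only, under a quasi-static, lumped-mass, zero-moment-point (ZMP) assumption; the masses, positions and target are hypothetical placeholders, not the controller used on WABIAN-RII.

```python
import numpy as np

G = 9.81  # gravitational acceleration [m/s^2]

def zmp_x(masses, xs, zs, xddots):
    """Sagittal ZMP of a set of point masses (vertical accelerations neglected)."""
    num = np.sum(masses * (xs * G - zs * xddots))
    den = np.sum(masses * G)
    return num / den

def trunk_offset_for_zmp(limb_masses, limb_xs, limb_zs, limb_xddots,
                         trunk_mass, zmp_target=0.0):
    """Quasi-static trunk x-position that places the ZMP at zmp_target,
    treating the trunk as a single lumped mass with negligible acceleration.
    All numbers fed into this sketch are illustrative, not WABIAN-RII data."""
    limb_moment = np.sum(limb_masses * (limb_xs * G - limb_zs * limb_xddots))
    total_weight = (np.sum(limb_masses) + trunk_mass) * G
    return (zmp_target * total_weight - limb_moment) / (trunk_mass * G)

# Example: a swinging leg and an arm perturb the ZMP; shift the trunk to cancel it.
limb_m = np.array([8.0, 3.0])       # hypothetical limb masses [kg]
limb_x = np.array([0.15, -0.10])    # limb CoM x-positions [m]
limb_z = np.array([0.40, 1.10])     # limb CoM heights [m]
limb_a = np.array([1.2, -0.8])      # limb CoM x-accelerations [m/s^2]

x_t = trunk_offset_for_zmp(limb_m, limb_x, limb_z, limb_a, trunk_mass=40.0)
all_m = np.append(limb_m, 40.0)
all_x = np.append(limb_x, x_t)
all_z = np.append(limb_z, 0.95)
all_a = np.append(limb_a, 0.0)
print(x_t, zmp_x(all_m, all_x, all_z, all_a))  # resulting ZMP is ~0 as requested
```

In a walking controller this kind of compensation would be recomputed along the whole planned limb trajectory rather than for a single instant.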

2013
Vol 479-480
pp. 617-621
Author(s):
Hsien I Lin
Zan Sheng Chen

Human-to-humanoid motion imitation is an intuitive way to teach a humanoid robot how to act by human demonstration. For example, teaching a robot how to stand is simply a matter of showing the robot how a human stands. Much of the previous work on motion imitation focuses on either upper-body or lower-body imitation. In this paper, we propose a novel approach for a humanoid robot to imitate human whole-body motion. The main problem is how to control the robot's balance while simultaneously keeping its motion as similar as possible to the demonstrated human motion. We therefore propose a balance criterion to assess how well the robot can balance, and use this criterion together with a genetic algorithm to search for a sub-optimal solution that keeps the robot balanced while its motion remains similar to the human motion. We have validated the proposed approach on an Aldebaran Robotics NAO robot with 25 degrees of freedom. The experimental results show that the robot can imitate human postures and autonomously keep itself balanced.
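The balance-plus-similarity search described above can be pictured as a genetic algorithm over joint angles whose fitness trades off a balance penalty against deviation from the demonstrated pose. The sketch below is a minimal illustration under assumed placeholders: the centre-of-mass model, weights and genetic operators are invented for the example and are not the authors' implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def com_offset(joints):
    """Placeholder balance criterion: horizontal offset of the centre of mass
    from the support centre. A real system would evaluate the robot's
    kinematic and mass model here."""
    return abs(np.sin(joints).sum()) * 0.01

def fitness(joints, human_pose, w_balance=10.0, w_similarity=1.0):
    """Higher is better: penalise the CoM offset (balance) and the deviation
    from the demonstrated human pose (similarity)."""
    balance_cost = com_offset(joints)
    similarity_cost = np.linalg.norm(joints - human_pose)
    return -(w_balance * balance_cost + w_similarity * similarity_cost)

def ga_imitate(human_pose, pop_size=60, generations=200, sigma=0.05):
    """Evolve joint configurations near the demonstrated pose that score well
    on the combined balance/similarity fitness."""
    n = human_pose.size
    pop = human_pose + rng.normal(0.0, 0.2, size=(pop_size, n))
    for _ in range(generations):
        scores = np.array([fitness(ind, human_pose) for ind in pop])
        parents = pop[np.argsort(scores)[-pop_size // 2:]]             # selection
        children = parents[rng.integers(0, len(parents), pop_size - len(parents))]
        children = children + rng.normal(0.0, sigma, children.shape)   # mutation
        pop = np.vstack([parents, children])
    return pop[np.argmax([fitness(ind, human_pose) for ind in pop])]

demo_pose = rng.uniform(-1.0, 1.0, 25)   # 25 joint angles, as on the NAO
print(ga_imitate(demo_pose)[:5])
```

The result is sub-optimal by construction: the search stops after a fixed number of generations and returns the best compromise found between staying balanced and staying close to the human posture.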


2019
Author(s):
Meghan E. Huber
Enrico Chiovetto
Martin Giese
Dagmar Sternad

Maintaining balance while walking on a narrow beam is a challenging motor task. This is presumably because the foot’s ability to exert torque on the support surface is limited by the beam width. Still, the feet serve as a critical interface between the body and the external environment, and it is unclear how the mechanical properties of the feet affect balance. Here we examined how restricting the degrees of freedom of the feet influenced balance behavior during beam walking. We recorded whole-body joint kinematics of subjects with varying skill levels as they walked on a narrow beam with and without wearing flat, rigid soles on their feet. We computed changes in whole-body motion and angular momentum across these conditions. Results showed that wearing rigid soles improved balance in the beam walking task, but that practice with rigid soles did not affect or transfer to task performance with bare feet. The absence of any after-effect suggested that the improved balance from constraining the foot was the result of a mechanical effect rather than a change in neural strategy. Though wearing rigid soles can be used to assist balance, there appear to be limited training or rehabilitation benefits from wearing rigid soles.
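Whole-body angular momentum of the kind reported above is typically computed by summing segment contributions about the body's centre of mass. The sketch below shows that standard formula with made-up two-segment data; it illustrates the quantity being measured, not the authors' processing pipeline.

```python
import numpy as np

def whole_body_angular_momentum(masses, positions, velocities, inertias, omegas):
    """Angular momentum of a multi-segment body about its centre of mass:
    L = sum_i [ m_i (r_i - r_com) x (v_i - v_com) + I_i w_i ]."""
    masses = np.asarray(masses, dtype=float)
    r_com = np.average(positions, axis=0, weights=masses)
    v_com = np.average(velocities, axis=0, weights=masses)
    L = np.zeros(3)
    for m, r, v, I, w in zip(masses, positions, velocities, inertias, omegas):
        L += m * np.cross(r - r_com, v - v_com) + I @ w
    return L

# Two-segment toy example with invented kinematics (trunk + swinging leg).
m = [30.0, 10.0]
r = np.array([[0.0, 0.0, 1.0], [0.1, 0.0, 0.5]])     # segment CoM positions [m]
v = np.array([[0.1, 0.0, 0.0], [0.4, 0.0, 0.0]])     # segment CoM velocities [m/s]
I = [np.diag([1.0, 1.2, 0.3]), np.diag([0.1, 0.1, 0.02])]  # inertia tensors
w = [np.array([0.0, 0.2, 0.0]), np.array([0.0, -0.5, 0.0])]  # angular velocities
print(whole_body_angular_momentum(m, r, v, I, w))
```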


2005
Author(s):
Sunghwan Chon
Samsu Lim
Kyumann Im
Woonchul Ham

1994
Vol 6 (2)
pp. 99-116
Author(s):
M. W. Oram
D. I. Perrett

Cells have been found in the superior temporal polysensory area (STPa) of the macaque temporal cortex that are selectively responsive to the sight of particular whole body movements (e.g., walking) under normal lighting. These cells typically discriminate the direction of walking and the view of the body (e.g., left profile walking left). We investigated the extent to which these cells are responsive under “biological motion” conditions where the form of the body is defined only by the movement of light patches attached to the points of limb articulation. One-third of the cells (25/72) selective for the form and motion of walking bodies showed sensitivity to the moving light displays. Seven of these cells showed only partial sensitivity to form from motion, in so far as the cells responded more to moving light displays than to moving controls but failed to discriminate body view. These seven cells exhibited directional selectivity. Eighteen cells showed statistical discrimination for both direction of movement and body view under biological motion conditions. Most of these cells showed reduced responses to the impoverished moving light stimuli compared to full light conditions. The 18 cells were thus sensitive to detailed form information (body view) from the pattern of articulating motion. Cellular processing of the global pattern of articulation was indicated by the observations that none of these cells were found sensitive to movement of individual limbs and that jumbling the pattern of moving limbs reduced response magnitude. A further 10 cells were tested for sensitivity to moving light displays of whole body actions other than walking. Of these cells 5/10 showed selectivity for form displayed by biological motion stimuli that paralleled the selectivity under normal lighting conditions. The cell responses thus provide direct evidence for neural mechanisms computing form from nonrigid motion. The selectivity of the cells was for body view, specific direction, and specific type of body motion presented by moving light displays and is not predicted by many current computational approaches to the extraction of form from motion.


Robotica
2005
Vol 24 (2)
pp. 257-268
Author(s):
Hun-ok Lim
Sang-ho Hyon
Samuel A. Setiawan
Atsuo Takanishi

Our goal is to develop biped humanoid robots capable of working stably in human living and working spaces, with a focus on their physical construction and motion control. As a first stage, we have developed a human-like biped robot, WABIAN (WAseda BIped humANoid), which has thirty-five mechanical degrees of freedom. Its height is 1.66 m and its weight is 107.4 kg. In this paper, a moment compensation method for stability is described, which is based on the motion of the head, legs and arms. A follow-walking method based on a pattern-switching technique is also proposed. By combining both methods, the biped robot is able to perform dynamic stamping and to walk forward and backward continuously while a person pushes or pulls its hand. Using WABIAN, human-follow walking experiments are conducted, and the effectiveness of the methods is verified.
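The pattern-switching idea behind the follow-walking method can be pictured as selecting the next pre-computed walking pattern from the force sensed at the robot's hand. The sketch below is a hypothetical illustration of that switching logic only; the thresholds and pattern names are invented, and pattern generation and moment compensation are not shown.

```python
from enum import Enum

class Pattern(Enum):
    STEP_IN_PLACE = "step_in_place"
    WALK_FORWARD = "walk_forward"
    WALK_BACKWARD = "walk_backward"

def select_pattern(hand_force_x, pull_threshold=5.0, push_threshold=-5.0):
    """Choose the next walking pattern from the sagittal hand force [N]:
    a pull (positive) triggers forward walking, a push (negative) backward
    walking, and small forces keep the robot stepping in place.
    Thresholds are illustrative placeholders."""
    if hand_force_x > pull_threshold:
        return Pattern.WALK_FORWARD
    if hand_force_x < push_threshold:
        return Pattern.WALK_BACKWARD
    return Pattern.STEP_IN_PLACE

# Evaluate the switch once per step cycle so a new pattern starts only at a
# footfall, keeping the transition between patterns dynamically consistent.
for force in [1.0, 8.0, -7.5]:
    print(force, select_pattern(force).value)
```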


2017
Vol 118 (4)
pp. 2499-2506
Author(s):
A. Pomante
L. P. J. Selen
W. P. Medendorp

The vestibular system provides information for spatial orientation. However, this information is ambiguous: because the otoliths sense the gravitoinertial force, they cannot distinguish gravitational and inertial components. As a consequence, prolonged linear acceleration of the head can be interpreted as tilt, referred to as the somatogravic effect. Previous modeling work suggests that the brain disambiguates the otolith signal according to the rules of Bayesian inference, combining noisy canal cues with the a priori assumption that prolonged linear accelerations are unlikely. Within this modeling framework the noise of the vestibular signals affects the dynamic characteristics of the tilt percept during linear whole-body motion. To test this prediction, we devised a novel paradigm to psychometrically characterize the dynamic visual vertical, as a proxy for the tilt percept, during passive sinusoidal linear motion along the interaural axis (0.33 Hz motion frequency, 1.75 m/s² peak acceleration, 80 cm displacement). While subjects (n = 10) kept fixation on a central body-fixed light, a line was briefly flashed (5 ms) at different phases of the motion, the orientation of which had to be judged relative to gravity. Consistent with the model's prediction, subjects showed a phase-dependent modulation of the dynamic visual vertical, with a subject-specific phase shift with respect to the imposed acceleration signal. The magnitude of this modulation was smaller than predicted, suggesting a contribution of nonvestibular signals to the dynamic visual vertical. Despite their dampening effect, our findings may point to a link between the noise components in the vestibular system and the characteristics of the dynamic visual vertical.

NEW & NOTEWORTHY: A fundamental question in neuroscience is how the brain processes vestibular signals to infer the orientation of the body and objects in space. We show that, under sinusoidal linear motion, systematic error patterns appear in the disambiguation of linear acceleration and spatial orientation. We discuss the dynamics of these illusory percepts in terms of a dynamic Bayesian model that combines uncertainty in the vestibular signals with priors based on the natural statistics of head motion.
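The Bayesian disambiguation described above can be illustrated, in a heavily simplified static and linearized form, as a MAP estimate that splits a measured gravitoinertial force into tilt and linear acceleration under a prior that sustained accelerations are unlikely. The sketch below uses placeholder noise levels and omits the canal cues and temporal dynamics of the actual model; it shows only why a tight acceleration prior makes sustained acceleration read as tilt.

```python
import numpy as np

G = 9.81  # gravitational acceleration [m/s^2]

def map_tilt_estimate(gif_measurement, sigma_tilt=0.2, sigma_acc=0.5, sigma_meas=0.3):
    """Linear-Gaussian MAP estimate of tilt [rad] and linear acceleration [m/s^2]
    from a single interaural gravitoinertial-force measurement:
        y = G * tilt + acceleration + noise
    with zero-mean Gaussian priors on both unknowns. A tight acceleration prior
    (small sigma_acc) attributes a sustained force mostly to tilt, which is the
    essence of the somatogravic effect. All sigmas are illustrative placeholders."""
    H = np.array([[G, 1.0]])                       # measurement model
    P = np.diag([sigma_tilt**2, sigma_acc**2])     # prior covariance of [tilt, acc]
    R = np.array([[sigma_meas**2]])                # measurement noise variance
    K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)   # Kalman-style gain
    tilt, acc = (K @ np.array([[gif_measurement]])).ravel()
    return tilt, acc

# A sustained 1.75 m/s^2 interaural force is largely attributed to tilt.
print(map_tilt_estimate(1.75))
```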

