Partial yaw moment compensation through whole-body motion

Author(s): Rafael Cisneros, Eiichi Yoshida, Kazuhito Yokoi
2016, Vol. 13(01), pp. 1550039
Author(s): Patrick M. Wensing, David E. Orin

The control of centroidal momentum has recently emerged as an important component of whole-body humanoid control, resulting in emergent upper-body motions and increased robustness to pushes when included in whole-body frameworks. Previous work has developed specialized computational algorithms for the centroidal momentum matrix (CMM) and its derivative, which relate rates of change in centroidal momentum to joint rates and accelerations of the humanoid. This paper instead shows that specialized algorithms are in fact not always required. Since the dynamics of the centroidal momentum are embedded in the joint-space dynamic equations of motion, the CMM and terms involving its derivative can be computed from the joint-space mass matrix and Coriolis terms. This new approach presents improvements in terms of its generality, compactness, and efficiency in comparison to previous specialized algorithms. The new computation method is then applied to perform whole-body control of a dynamic kicking motion, where the mass matrix and Coriolis terms are already required by the controller. This example motivates how centroidal momentum can be used as an aggregate descriptor of motion in order to ease whole-body motion authoring from a task-space perspective. It further demonstrates emergent upper-body motion from centroidal angular momentum (CAM) control that is shown to provide desirable regulation of the net yaw moment under the foot. Finally, a few perspectives are provided on the use of centroidal momentum control.
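To make the paper's central observation concrete, here is a minimal numerical sketch of extracting the CMM from a joint-space mass matrix. It uses a toy floating-base model, a single rigid body with no joints, so the mass matrix reduces to the body's 6x6 spatial inertia; all values and the angular-first spatial-vector convention are illustrative assumptions, not the paper's implementation.

```python
import numpy as np

def skew(v):
    """3x3 cross-product matrix: skew(v) @ u == np.cross(v, u)."""
    x, y, z = v
    return np.array([[0.0, -z, y], [z, 0.0, -x], [-y, x, 0.0]])

# Toy floating-base model: a single rigid body, so the joint-space mass
# matrix H is just its 6x6 spatial inertia about the body frame
# (angular-first convention). c is the CoM offset from the body frame,
# I_c the rotational inertia about the CoM, m the mass (assumed values).
m = 2.0
c = np.array([0.1, 0.0, 0.0])
I_c = np.diag([0.02, 0.03, 0.03])
cx = skew(c)
H = np.block([[I_c + m * cx @ cx.T, m * cx],
              [m * cx.T,            m * np.eye(3)]])

# The CMM follows from the first six rows of H by shifting the momentum
# reference point from the body frame to the CoM: for a spatial
# momentum (n; f), moving the reference point by c gives
# n_G = n_b - c x f, f_G = f, i.e. left-multiplication by X_star.
X_star = np.block([[np.eye(3),         -cx],
                   [np.zeros((3, 3)),  np.eye(3)]])
A_G = X_star @ H[:6, :]   # centroidal momentum matrix

# Sanity check: pure translation of the body. Angular momentum about
# the CoM must vanish, linear momentum must equal m * v.
qdot = np.array([0.0, 0.0, 0.0, 0.0, 1.0, 0.0])  # [omega; v]
h_G = A_G @ qdot
print(h_G)  # angular part ~0, linear part = [0, 2, 0]
```

With joints present, `H[:6, :]` would be the floating-base rows of the full mass matrix and the same reference-point shift applies, which is the sense in which the centroidal dynamics are already embedded in the joint-space equations.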


Author(s): Miguel Arduengo, Ana Arduengo, Adria Colome, Joan Lobo-Prat, Carme Torras

2018, pp. 1575-1599
Author(s): Eiichi Yoshida, Fumio Kanehiro, Jean-Paul Laumond

1994, Vol. 6(2), pp. 99-116
Author(s): M. W. Oram, D. I. Perrett

Cells have been found in the superior temporal polysensory area (STPa) of the macaque temporal cortex that are selectively responsive to the sight of particular whole-body movements (e.g., walking) under normal lighting. These cells typically discriminate the direction of walking and the view of the body (e.g., left profile walking left). We investigated the extent to which these cells are responsive under "biological motion" conditions, where the form of the body is defined only by the movement of light patches attached to the points of limb articulation. One-third of the cells (25/72) selective for the form and motion of walking bodies showed sensitivity to the moving light displays. Seven of these cells showed only partial sensitivity to form from motion, insofar as they responded more to moving light displays than to moving controls but failed to discriminate body view. These seven cells exhibited directional selectivity. Eighteen cells showed statistical discrimination for both direction of movement and body view under biological motion conditions. Most of these cells showed reduced responses to the impoverished moving light stimuli compared to full-light conditions. The 18 cells were thus sensitive to detailed form information (body view) from the pattern of articulating motion. Cellular processing of the global pattern of articulation was indicated by the observations that none of these cells were sensitive to movement of individual limbs and that jumbling the pattern of moving limbs reduced response magnitude. A further 10 cells were tested for sensitivity to moving light displays of whole-body actions other than walking. Of these, 5/10 showed selectivity for form displayed by biological motion stimuli that paralleled their selectivity under normal lighting conditions. The cell responses thus provide direct evidence for neural mechanisms computing form from nonrigid motion. The selectivity of the cells, for body view, specific direction, and specific type of body motion presented by moving light displays, is not predicted by many current computational approaches to the extraction of form from motion.


Author(s): Joseph Salini, Sébastien Barthélemy, Philippe Bidaud

2017, Vol. 118(4), pp. 2499-2506
Author(s): A. Pomante, L. P. J. Selen, W. P. Medendorp

The vestibular system provides information for spatial orientation. However, this information is ambiguous: because the otoliths sense the gravitoinertial force, they cannot distinguish gravitational and inertial components. As a consequence, prolonged linear acceleration of the head can be interpreted as tilt, referred to as the somatogravic effect. Previous modeling work suggests that the brain disambiguates the otolith signal according to the rules of Bayesian inference, combining noisy canal cues with the a priori assumption that prolonged linear accelerations are unlikely. Within this modeling framework, the noise of the vestibular signals affects the dynamic characteristics of the tilt percept during linear whole-body motion. To test this prediction, we devised a novel paradigm to psychometrically characterize the dynamic visual vertical, as a proxy for the tilt percept, during passive sinusoidal linear motion along the interaural axis (0.33 Hz motion frequency, 1.75 m/s² peak acceleration, 80 cm displacement). While subjects (n = 10) kept fixation on a central body-fixed light, a line was briefly flashed (5 ms) at different phases of the motion, the orientation of which had to be judged relative to gravity. Consistent with the model's prediction, subjects showed a phase-dependent modulation of the dynamic visual vertical, with a subject-specific phase shift with respect to the imposed acceleration signal. The magnitude of this modulation was smaller than predicted, suggesting a contribution of nonvestibular signals to the dynamic visual vertical. Despite their dampening effect, our findings may point to a link between the noise components in the vestibular system and the characteristics of the dynamic visual vertical.

NEW & NOTEWORTHY A fundamental question in neuroscience is how the brain processes vestibular signals to infer the orientation of the body and objects in space. We show that, under sinusoidal linear motion, systematic error patterns appear in the disambiguation of linear acceleration and spatial orientation. We discuss the dynamics of these illusory percepts in terms of a dynamic Bayesian model that combines uncertainty in the vestibular signals with priors based on the natural statistics of head motion.
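The Bayesian-disambiguation idea described above can be sketched in one dimension. This is an illustrative linear-Gaussian toy model with assumed noise and prior values, not the paper's actual model: the otoliths measure the gravitoinertial force f = g·theta + a (small-angle tilt theta, linear acceleration a), and a prior that accelerations are small pulls the tilt estimate toward f/g, producing the somatogravic effect.

```python
import numpy as np

# Assumed model parameters (illustrative only): Gaussian priors
# theta ~ N(0, s_t^2) on tilt and a ~ N(0, s_a^2) on acceleration,
# plus otolith measurement noise with SD s_n.
g = 9.81
s_t, s_a, s_n = 0.3, 0.5, 0.2

def tilt_estimate(f):
    """Posterior-mean tilt (rad) from one otolith sample f = g*theta + a + noise.

    For jointly Gaussian (theta, f), E[theta|f] = Cov(theta, f)/Var(f) * f.
    """
    k = g * s_t**2 / (g**2 * s_t**2 + s_a**2 + s_n**2)
    return k * f

# An upright observer under sustained acceleration of 1.75 m/s^2 (the
# paper's peak value) perceives a nonzero tilt even though theta = 0:
# the somatogravic effect. A tighter acceleration prior (smaller s_a)
# pushes the estimate toward the full illusion f/g.
print(tilt_estimate(1.75))
```

Shrinking `s_a` toward zero recovers the classic static somatogravic prediction, while larger vestibular noise `s_n` damps the estimate, which is the qualitative link between noise and the dynamic visual vertical that the abstract describes.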


Author(s): ChangHyun Sung, Takahiro Kagawa, Yoji Uno

Abstract: In this paper, we propose an effective method for planning whole-body motions of humanoid robots under a variety of task conditions. Motion planning must respect various constraints, such as the range of motion, and in particular the whole-body motion must maintain balance. Rapid planning is also essential if the method is to be useful in unpredictable environments. In this research, a via-point representation is used to assign sufficient conditions that handle the various constraints on the movement: the position, posture, and velocity of the robot are constrained as the state of a via-point. In our algorithm, feasible motions are planned by modifying the via-points. Furthermore, we formulate the motion planning problem as a simple iterative method built on a Linear Programming (LP) problem to make the planning efficient. We have applied the method to generate kicking motions for a HOAP-3 humanoid robot and confirmed that the robot can successfully score a goal along various courses corresponding to changes in the location of an obstacle. The computation time was less than two seconds. These results indicate that the proposed algorithm achieves efficient motion planning.
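A minimal sketch of the via-point-modification idea as a single LP, using `scipy.optimize.linprog`. This is an illustrative 1-D version with made-up numbers, not the paper's formulation: nominal via-point heights of a kick trajectory are modified as little as possible (in the L1 sense, via slack variables) while one via-point must clear an obstacle and all must stay within a range-of-motion bound.

```python
import numpy as np
from scipy.optimize import linprog

x0 = np.array([0.00, 0.05, 0.08, 0.05, 0.00])  # nominal via-point heights (m)
n = len(x0)
h_obs = 0.12                                   # required clearance (m, assumed)

# Decision vector z = [x, t]: modified via-points x and slacks t with
# |x - x0| <= t. Minimizing sum(t) minimizes the total L1 modification.
c = np.concatenate([np.zeros(n), np.ones(n)])
I = np.eye(n)
A_ub = np.vstack([
    np.hstack([I, -I]),    #  x - t <= x0
    np.hstack([-I, -I]),   # -x - t <= -x0
])
b_ub = np.concatenate([x0, -x0])

# Range-of-motion bounds on every via-point; obstacle clearance is
# imposed as a tightened lower bound on via-point 2.
bounds = [(0.0, 0.20)] * n + [(0.0, None)] * n
bounds[2] = (h_obs, 0.20)

res = linprog(c, A_ub=A_ub, b_ub=b_ub, bounds=bounds)
x = res.x[:n]
print(x)  # only via-point 2 moves, and only up to h_obs
```

In the paper's setting this LP would be one step of an iterative scheme over richer via-point states (position, posture, velocity) and balance constraints, but the pattern, linear constraints on via-point parameters plus an L1 modification objective, is the same.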

