Training of deep neural networks for the generation of dynamic movement primitives

2020, Vol. 127, pp. 121-131
Author(s): Rok Pahič, Barry Ridge, Andrej Gams, Jun Morimoto, Aleš Ude
Author(s): Weiyong Si, Ning Wang, Chenguang Yang

Abstract: In this paper, composite dynamic movement primitives (DMPs) based on radial basis function neural networks (RBFNNs) are investigated for robots' skill learning from human demonstrations. The composite DMPs encode position and orientation manipulation skills simultaneously for human-to-robot skill transfer. Because a robot manipulator is expected to perform tasks in unstructured and uncertain environments, it needs the adaptive ability to adjust its behaviour to new situations and environments. Since DMPs can adapt to uncertainties and perturbations and support spatial and temporal scaling, they have been successfully employed for tasks such as trajectory planning and obstacle avoidance. However, existing skill models mainly encode position or orientation separately, whereas practical tasks commonly constrain position and orientation at the same time. Moreover, DMP-based skill-learning models still generalise poorly to dynamic tasks, e.g., reaching a moving target or avoiding obstacles. In this paper, we propose a composite DMP-based framework that represents position and orientation simultaneously for robot skill acquisition, and neural networks are used to train the skill model. The effectiveness of the proposed approach is validated in simulation and experiments.
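To illustrate the kind of model the abstract builds on, below is a minimal sketch of a single-degree-of-freedom discrete DMP with a radial-basis-function forcing term, in the standard Ijspeert-style formulation. It is not the authors' composite (position and orientation) model; the class name, gains, and basis-function settings are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Minimal single-DoF discrete DMP with an RBF forcing term (standard
# Ijspeert-style formulation). All gains and basis settings are
# illustrative defaults, not values from the paper.
class DiscreteDMP:
    def __init__(self, n_basis=20, alpha_z=25.0, beta_z=6.25, alpha_x=4.0):
        self.alpha_z, self.beta_z, self.alpha_x = alpha_z, beta_z, alpha_x
        # Centres are spaced along the exponentially decaying phase variable;
        # widths are chosen so neighbouring basis functions overlap.
        self.c = np.exp(-alpha_x * np.linspace(0.0, 1.0, n_basis))
        d = np.diff(self.c)
        self.h = 1.0 / np.append(d, d[-1]) ** 2
        self.w = np.zeros(n_basis)

    def _forcing(self, x, scale):
        # Normalised weighted sum of RBFs, scaled by phase and goal offset.
        psi = np.exp(-self.h * (x - self.c) ** 2)
        return (psi @ self.w) / (psi.sum() + 1e-10) * x * scale

    def fit(self, y_demo, dt):
        """Learn forcing-term weights from one demonstrated trajectory."""
        y_demo = np.asarray(y_demo, dtype=float)
        T = len(y_demo)
        tau = (T - 1) * dt
        y0, g = y_demo[0], y_demo[-1]
        yd = np.gradient(y_demo, dt)
        ydd = np.gradient(yd, dt)
        x = np.exp(-self.alpha_x * np.arange(T) * dt / tau)
        # Invert the transformation system to obtain the target forcing term.
        f_target = tau ** 2 * ydd - self.alpha_z * (self.beta_z * (g - y_demo) - tau * yd)
        s = x * (g - y0)
        # Locally weighted regression: one weight per basis function.
        for i in range(len(self.w)):
            psi = np.exp(-self.h[i] * (x - self.c[i]) ** 2)
            self.w[i] = np.sum(s * psi * f_target) / (np.sum(s ** 2 * psi) + 1e-10)
        return self

    def rollout(self, y0, g, tau, dt):
        """Integrate the DMP with Euler steps to generate a trajectory toward g."""
        y, z, x = float(y0), 0.0, 1.0
        traj = []
        for _ in range(int(round(tau / dt))):
            f = self._forcing(x, g - y0)
            zd = (self.alpha_z * (self.beta_z * (g - y) - z) + f) / tau
            z += zd * dt
            y += (z / tau) * dt
            x += (-self.alpha_x * x / tau) * dt
            traj.append(y)
        return np.array(traj)
```

A typical use, assuming a demonstrated 1-D trajectory y_demo sampled at dt = 0.01 s, would be `dmp = DiscreteDMP().fit(y_demo, dt=0.01)` followed by `dmp.rollout(y_demo[0], new_goal, tau=1.0, dt=0.01)` to reproduce the demonstrated motion toward a new goal; a composite position-and-orientation model as described in the abstract would run one such system per position dimension plus an orientation formulation.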


2020, Vol. 53 (5), pp. 265-270
Author(s): Xian Li, Chenguang Yang, Ying Feng

2021
Author(s): Tiantian Wang, Liang Yan, Gang Wang, Xiaoshan Gao, Nannan Du, ...
