Recursive Identification for Fractional Order Hammerstein Model Based on ADELS

2021, Vol. 2021, pp. 1-16
Author(s): Qibing Jin, Youliang Ye, Wu Cai, Zeyu Wang

This paper deals with the identification of the fractional order Hammerstein model using the proposed adaptive differential evolution with local search strategy (ADELS) algorithm, together with the steepest descent method and the overparameterization-based auxiliary model recursive least squares (OAMRLS) algorithm. The parameters of the static nonlinear block and the dynamic linear block of the model are all unknown, including the fractional order. Initial parameter values are obtained with the proposed ADELS algorithm. The main innovation of ADELS is to generate the next population adaptively, using scoring rules based on the fitness values within the current population, and to apply a Chebyshev map to the newly generated individuals for local search. Starting from these initial values, the fractional order is identified with the steepest descent method, and the remaining parameters are estimated with the OAMRLS algorithm. With the initial values supplied by ADELS, the identification results are more accurate. Simulation results illustrate the effectiveness of the proposed algorithm.
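The abstract does not give the full update equations, but the core idea it describes for ADELS (fitness-based adaptive generation of the next population plus a Chebyshev-map local search) can be sketched generically. The Python sketch below is a minimal illustration under assumptions, not the authors' implementation: the rank-based scoring rule, the adaptation of F and CR, the perturbation scale, and all function names (`adels_like_de`, `chebyshev_map`) are hypothetical.

```python
import numpy as np

def chebyshev_map(x, order=4):
    """Chebyshev chaotic map on [-1, 1]: x_{k+1} = cos(order * arccos(x_k))."""
    return np.cos(order * np.arccos(np.clip(x, -1.0, 1.0)))

def adels_like_de(fitness, bounds, pop_size=30, generations=200, seed=0):
    """Differential evolution with fitness-rank-adapted F/CR and a
    Chebyshev-map local search (illustrative, not the paper's ADELS)."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds[:, 0], bounds[:, 1]
    dim = len(lo)
    pop = rng.uniform(lo, hi, size=(pop_size, dim))
    fit = np.array([fitness(p) for p in pop])

    for _ in range(generations):
        # Rank-based "scoring": 0 = best individual, 1 = worst.  Better
        # individuals exploit (small F, large CR); worse ones explore.
        rank = np.argsort(np.argsort(fit)) / (pop_size - 1)
        F = 0.4 + 0.5 * rank
        CR = 0.9 - 0.4 * rank

        for i in range(pop_size):
            a, b, c = pop[rng.choice(pop_size, 3, replace=False)]
            mutant = np.clip(a + F[i] * (b - c), lo, hi)
            cross = rng.random(dim) < CR[i]
            cross[rng.integers(dim)] = True          # keep at least one mutant gene
            trial = np.where(cross, mutant, pop[i])

            # Local search: perturb the trial vector with a short Chebyshev
            # chaotic sequence scaled to a fraction of the search range.
            z = rng.uniform(-1.0, 1.0, dim)
            for _ in range(3):
                z = chebyshev_map(z)
            local = np.clip(trial + 0.05 * (hi - lo) * z, lo, hi)

            # Greedy selection among parent, trial, and local-search candidate.
            for cand in (trial, local):
                f_cand = fitness(cand)
                if f_cand < fit[i]:
                    pop[i], fit[i] = cand, f_cand

    best_idx = int(np.argmin(fit))
    return pop[best_idx], float(fit[best_idx])

# In the identification setting, `fitness` would typically be the squared
# output error of candidate Hammerstein parameters (nonlinear coefficients,
# linear coefficients, fractional order) against recorded input/output data.
best, best_fit = adels_like_de(lambda p: np.sum((p - 1.5) ** 2),
                               bounds=np.array([[-5.0, 5.0]] * 3))
```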

1996, Vol. 3 (3), pp. 201-209
Author(s): Chinmoy Pal, Ichiro Hagiwara, Naoki Kayaba, Shin Morishita

A theoretical formulation of a fast learning method based on a pseudoinverse technique is presented. The efficiency and robustness of the method are verified on an exclusive-OR (XOR) problem and on dynamic system identification of a linear single-degree-of-freedom mass–spring system. Compared with the conventional backpropagation method, the proposed method shows a better convergence rate and higher learning accuracy at a lower equivalent learning coefficient. It is also found that, unlike the steepest descent method, whose learning capability depends on the value of the learning coefficient ν, the proposed pseudoinverse-based backpropagation algorithm is comparatively robust with respect to its equivalent variable learning coefficient. A combination of the pseudoinverse method and the steepest descent method is proposed for faster and more accurate learning.
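As a rough illustration of the combined scheme described above, the following sketch solves the output-layer weights of a tiny network in closed form with the Moore–Penrose pseudoinverse (`numpy.linalg.pinv`) while refining the hidden-layer weights by steepest descent on the XOR problem. The network size, activation, learning rate, and update schedule are assumptions, not the paper's formulation.

```python
import numpy as np

def train_xor_hybrid(epochs=2000, lr=0.5, seed=1):
    """Hybrid training sketch: hidden weights by steepest descent,
    output weights by a pseudoinverse (least-squares) solve each epoch."""
    rng = np.random.default_rng(seed)
    X = np.array([[0.0, 0.0], [0.0, 1.0], [1.0, 0.0], [1.0, 1.0]])
    T = np.array([[0.0], [1.0], [1.0], [0.0]])          # XOR targets

    W1 = rng.normal(size=(2, 2))                         # input -> hidden
    b1 = np.zeros(2)
    sigmoid = lambda z: 1.0 / (1.0 + np.exp(-z))

    for _ in range(epochs):
        H = sigmoid(X @ W1 + b1)                         # hidden activations
        Ha = np.hstack([H, np.ones((len(X), 1))])        # append bias column

        # Output weights: closed-form linear least squares on the hidden
        # features via the Moore-Penrose pseudoinverse (no gradient step).
        W2 = np.linalg.pinv(Ha) @ T
        Y = Ha @ W2

        # Hidden weights: one steepest-descent step on the squared error,
        # backpropagating through the sigmoid with W2 held fixed.
        dY = Y - T
        dH = (dY @ W2[:-1].T) * H * (1.0 - H)
        W1 -= lr * X.T @ dH
        b1 -= lr * dH.sum(axis=0)

    return Y.ravel()

# Outputs should move toward [0, 1, 1, 0]; the exact values depend on the
# random initialisation and the (assumed) learning rate and epoch count.
print(np.round(train_xor_hybrid(), 3))
```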

