Discrete-time Zhang neural network and numerical algorithm for time-varying quadratic minimization

Author(s): Yunong Zhang, Bingguo Mu, Huicheng Zheng
2019, Vol. 2019, pp. 1-12

Author(s): Min Sun, Jing Liu

This article presents a general six-step discrete-time Zhang neural network (ZNN) for time-varying tensor absolute value equations (TVTAVEs). First, based on Taylor expansion theory, we derive a general Zhang et al. discretization (ZeaD) formula, i.e., a general Taylor-type one-step-ahead numerical differentiation rule for first-order derivative approximation, which contains two free parameters. Using the bilinear transform and the Routh–Hurwitz stability criterion, we analyze the effective domain of the two free parameters that ensures the convergence of the general ZeaD formula. Second, based on the general ZeaD formula, we design a general six-step discrete-time ZNN (DTZNN) for TVTAVEs, whose steady-state residual error changes in a higher-order manner than that of the models presented in the literature. Meanwhile, the feasible region of its step size, which determines its convergence, is also studied. Finally, experimental results corroborate that the general six-step DTZNN model is quite efficient for TVTAVE solving.
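The abstract names two ingredients of the approach: a Taylor-type one-step-ahead ZeaD differentiation formula and a stability analysis of its free parameters via the bilinear transform and the Routh–Hurwitz criterion. The short Python sketch below is an illustration only, not the article's general two-parameter six-step formula: it assumes one well-known Taylor-type ZeaD instance, the three-instant-history rule ẋ_k ≈ (2x_{k+1} − 3x_k + 2x_{k−1} − x_{k−2})/(2τ), and checks its O(τ²) truncation error and its zero-stability by computing the characteristic roots directly, which is equivalent to the bilinear-transform-plus-Routh–Hurwitz route described above.

```python
# Minimal sketch (not the paper's general six-step model). It checks two
# properties of one well-known Taylor-type ZeaD formula,
#     x'_k ~= (2*x_{k+1} - 3*x_k + 2*x_{k-1} - x_{k-2}) / (2*tau),
# namely its O(tau^2) truncation error and its zero-stability. The article's
# general two-parameter formula is assumed to cover such instances for
# particular parameter choices.
import numpy as np

def zead_derivative(x_next, x_k, x_prev1, x_prev2, tau):
    """One-step-ahead ZeaD estimate of the derivative at time step k."""
    return (2.0 * x_next - 3.0 * x_k + 2.0 * x_prev1 - x_prev2) / (2.0 * tau)

# 1) Empirical truncation order on a smooth test signal x(t) = sin(t):
#    the error should shrink roughly like tau^2.
t0 = 1.0
for tau in (1e-1, 1e-2, 1e-3):
    est = zead_derivative(np.sin(t0 + tau), np.sin(t0),
                          np.sin(t0 - tau), np.sin(t0 - 2 * tau), tau)
    print(f"tau = {tau:.0e}  error = {abs(est - np.cos(t0)):.3e}")

# 2) Zero-stability: the roots of the characteristic polynomial
#    2 z^3 - 3 z^2 + 2 z - 1 = 0 must lie in the closed unit disk with the
#    root z = 1 simple. Checking |z| <= 1 directly is equivalent to mapping
#    the unit disk to the left half-plane with the bilinear transform and
#    applying the Routh-Hurwitz criterion.
roots = np.roots([2.0, -3.0, 2.0, -1.0])
print("characteristic roots:", roots)
print("moduli:", np.abs(roots))
```

Running the sketch shows the derivative error dropping by roughly two orders of magnitude per tenfold reduction of τ, and characteristic roots {1, (1 ± i√7)/4} with the complex pair of modulus about 0.707, consistent with a convergent one-step-ahead discretization of this type.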

