Convergence analysis of Zhang neural networks solving time-varying linear equations but without using time-derivative information

Author(s):  
Yunong Zhang ◽  
Yanyan Shi ◽  
Yiwen Yang ◽  
Zhende Ke

2019 ◽  
Vol 2019 (1) ◽  
Author(s):  
Sun Min ◽  
Liu Jing

Abstract In this paper, to solve time-varying Sylvester tensor equations (TVSTEs) with noise, we design three noise-tolerant continuous-time Zhang neural networks (NTCTZNNs), termed NTCTZNN1, NTCTZNN2, and NTCTZNN3. The most important characteristic of these neural networks is that they make full use of the time-derivative information of the TVSTEs' coefficients. Theoretical analyses show that, no matter how large the unknown noise is, the residual error generated by NTCTZNN2 converges globally to zero, while the residual errors generated by NTCTZNN1 and NTCTZNN3 can be made arbitrarily small provided the design parameter is large enough. For comparison, a gradient-based neural network (GNN) is also presented and analyzed for solving TVSTEs. Numerical examples and results demonstrate the efficacy and superiority of the proposed neural networks.
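The noise-tolerant design described above can be illustrated on a much simpler problem than the tensor equation: a time-varying linear system A(t)x(t) = b(t) under constant additive noise. The sketch below is our own simplified illustration, not the paper's exact NTCTZNN models; the function name `ntznn_solve` and all parameter values are assumptions. It uses the standard ZNN recipe (drive the residual E(t) = A(t)x(t) - b(t) to zero) augmented with an integral feedback term, which is the typical mechanism by which noise-tolerant ZNN variants reject constant noise, and it exploits the time derivatives A'(t), b'(t) of the coefficients, as the abstract emphasizes.

```python
import numpy as np

# Simplified sketch (our assumption, not the paper's NTCTZNN models):
# solve the time-varying system A(t) x(t) = b(t) under constant noise
# with ZNN dynamics plus an integral (PI-style) feedback term,
#   E'(t) = -gamma * E(t) - lam * Integral(E) + noise,
# where E(t) = A(t) x(t) - b(t), discretized by forward Euler.

def ntznn_solve(A, dA, b, db, gamma=10.0, lam=10.0, noise=0.5,
                T=5.0, dt=1e-3):
    """Integrate the noise-tolerant ZNN dynamics; return x at time T."""
    n = b(0.0).shape[0]
    x = np.zeros(n)            # initial state
    acc = np.zeros(n)          # running integral of the residual error
    t = 0.0
    for _ in range(int(T / dt)):
        At = A(t)
        e = At @ x - b(t)                    # residual error E(t)
        acc += e * dt                        # integral term
        # Solve A x' = -A' x + b' - gamma*E - lam*Int(E) + noise for x'.
        # Note the use of the coefficient derivatives dA, db.
        rhs = -dA(t) @ x + db(t) - gamma * e - lam * acc + noise
        x = x + dt * np.linalg.solve(At, rhs)
        t += dt
    return x

if __name__ == "__main__":
    # Toy coefficients: A(t) = diag(2+sin t, 2+cos t), b(t) = (sin t, cos t).
    A  = lambda t: np.diag([2 + np.sin(t), 2 + np.cos(t)])
    dA = lambda t: np.diag([np.cos(t), -np.sin(t)])
    b  = lambda t: np.array([np.sin(t), np.cos(t)])
    db = lambda t: np.array([np.cos(t), -np.sin(t)])
    x = ntznn_solve(A, dA, b, db)
    print("residual at t=5:", np.linalg.norm(A(5.0) @ x - b(5.0)))
```

Because the integral term absorbs the constant noise, the residual decays toward zero despite the perturbation; dropping `lam * acc` leaves a steady-state residual proportional to the noise level.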


2016 ◽  
Vol 28 (12) ◽  
pp. 2790-2824 ◽  
Author(s):  
Xue-Zhong Wang ◽  
Yimin Wei ◽  
Predrag S. Stanimirović

Two complex Zhang neural network (ZNN) models for computing the Drazin inverse of an arbitrary time-varying complex square matrix are presented. The design of these neural networks is based on matrix-valued error functions arising from limit representations of the Drazin inverse. Two types of activation functions, appropriate for handling complex matrices, are exploited to develop each of these networks. Theoretical convergence analysis is presented to establish the desirable properties of the proposed complex-valued ZNN models, and numerical results further demonstrate their effectiveness.
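The error functions above come from limit representations of the Drazin inverse, and one classical representation can be checked numerically on a static matrix. The snippet below is our own illustration of that representation, not the paper's ZNN models; the function name `drazin_limit` and the choice of ε are assumptions.

```python
import numpy as np

# Illustration (not the paper's ZNN model): one classical limit
# representation of the Drazin inverse,
#   A^D = lim_{eps -> 0} (A^(k+1) + eps*I)^(-1) A^k,  for k >= ind(A),
# evaluated at a small fixed eps. Works for real or complex matrices.

def drazin_limit(A, k, eps=1e-8):
    """Approximate the Drazin inverse of A via the limit representation."""
    n = A.shape[0]
    Ak = np.linalg.matrix_power(A, k)
    return np.linalg.solve(np.linalg.matrix_power(A, k + 1) + eps * np.eye(n), Ak)

if __name__ == "__main__":
    # Singular, index-1 example: A is idempotent (A @ A == A), so A^D = A.
    A = np.array([[1.0, 1.0],
                  [0.0, 0.0]])
    AD = drazin_limit(A, k=1)
    # AD satisfies the Drazin identities A @ AD == AD @ A, AD @ A @ AD == AD.
    print(AD)
```

For a nonsingular matrix the index is zero and the formula reduces to the ordinary inverse, which makes a convenient sanity check.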

