A Unified Framework for Finite-Time and Fixed-Time Stabilization of Neural Networks with General Activations and External Disturbances

2018 ◽ Vol 38 (3) ◽ pp. 1005-1022 ◽ Author(s): Nan Jiang, Xiaoyang Liu, Jinde Cao
2019 ◽ Vol 33 (28) ◽ pp. 1950343 ◽ Author(s): Zhilian Yan, Youmei Zhou, Xia Huang, Jianping Zhou

This paper addresses the issue of finite-time boundedness for time-delay neural networks with external disturbances via weight learning. With the aid of a group of inequalities combined with Lyapunov theory, weight-learning rules are devised that ensure the neural networks are finite-time bounded, for both the fixed connection weight matrix case and the fixed delayed connection weight matrix case. Sufficient conditions for the existence of the desired learning rules are presented in the form of linear matrix inequalities (LMIs), which can be verified easily with MATLAB. It is shown that the proposed learning rules also guarantee the finite-time stability of the time-delay neural networks. Finally, a numerical example demonstrates the applicability of the devised weight-learning rules.
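The finite-time boundedness notion above can be illustrated numerically. The following is a minimal sketch, not the paper's example: it simulates a small Hopfield-type time-delay network x'(t) = -Cx(t) + Af(x(t)) + Bf(x(t-τ)) + w(t) with forward Euler and checks that the quadratic functional x(t)ᵀRx(t) (here with R = I) stays below a bound c₂ over [0, T]. All matrices, bounds, and the disturbance are made-up placeholders, not values from the paper.

```python
import numpy as np

def simulate(T=5.0, dt=1e-3, tau=0.5):
    """Euler simulation of a 2-D time-delay network with a bounded
    disturbance; returns the peak of x(t)' x(t) over [0, T].
    All parameters are illustrative placeholders."""
    C = np.diag([1.0, 1.2])                    # self-feedback rates (assumed)
    A = np.array([[0.2, -0.1], [0.1, 0.3]])    # connection weights (assumed)
    B = np.array([[0.1, 0.05], [-0.05, 0.1]])  # delayed connection weights (assumed)
    f = np.tanh                                # activation function
    n_delay = int(tau / dt)
    hist = [np.array([0.3, -0.2])] * (n_delay + 1)  # constant history on [-tau, 0]
    peak = 0.0
    for k in range(int(T / dt)):
        x = hist[-1]
        xd = hist[-1 - n_delay]                          # x(t - tau)
        w = 0.05 * np.sin(0.1 * k * dt) * np.ones(2)     # bounded disturbance
        hist.append(x + dt * (-C @ x + A @ f(x) + B @ f(xd) + w))
        peak = max(peak, float(hist[-1] @ hist[-1]))     # track x' R x with R = I
    return peak

peak = simulate()
c2 = 1.0   # illustrative bound c2; the paper's LMIs would certify such a bound
print(peak < c2)
```

In the paper, such a bound is certified a priori by the LMI conditions rather than checked by simulation; the sketch only visualizes what the certificate guarantees.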


2014 ◽  
Vol 2014 ◽  
pp. 1-9 ◽  
Author(s):  
Chao Zhang ◽  
Qiang Guo ◽  
Jing Wang

This paper addresses the finite-time synchronization problem for a class of chaotic neural networks. In a real communication network, the parameters of the master system may be time-varying, and the system may be perturbed by external disturbances. A simple high-gain observer is designed to track all of the nonlinearities, unknown system functions, and disturbances. A dynamic active compensatory controller is then proposed; using singular perturbation theory, it is shown that the control method guarantees finite-time stability of the error system between the master and slave systems. Finally, two illustrative examples show the effectiveness and applicability of the proposed scheme.
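The high-gain observer idea above can be sketched for a scalar system. This is a generic extended high-gain observer, not the paper's specific design: for x' = f(x) + d(t) with measurement y = x, the observer estimates both the state and the lumped uncertainty σ = f(x) + d(t) via xhat' = σhat + (h₁/ε)(y - xhat), σhat' = (h₂/ε²)(y - xhat). The nonlinearity, disturbance, and gains h₁, h₂, ε below are all illustrative assumptions.

```python
import numpy as np

def run(T=2.0, dt=1e-4, eps=0.01, h1=2.0, h2=1.0):
    """Euler simulation of an extended high-gain observer tracking the
    lumped term sigma = f(x) + d(t); returns the final estimation error.
    All functions and gains are illustrative placeholders."""
    f = lambda x: -x + np.sin(x)        # assumed unknown nonlinearity
    d = lambda t: 0.3 * np.cos(2 * t)   # assumed bounded disturbance
    x, xhat, sighat, t = 0.5, 0.0, 0.0, 0.0
    for _ in range(int(T / dt)):
        sigma = f(x) + d(t)             # true lumped uncertainty
        x_new = x + dt * sigma          # plant step
        e = x - xhat                    # measured output error (y = x)
        xhat += dt * (sighat + (h1 / eps) * e)       # state estimate
        sighat += dt * (h2 / eps ** 2) * e           # lumped-term estimate
        x, t = x_new, t + dt
    return abs(sigma - sighat)          # tracking error of the lumped term

err = run()
print(err)
```

As ε shrinks, the observer bandwidth grows and σhat tracks the lumped term more tightly, which is the mechanism that lets a compensatory controller cancel the unknown dynamics.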
