Exponential stability of recurrent neural networks with time-varying discrete and distributed delays

Author(s):  
Liu Yonghua ◽  
Luo Wenguang
2014 ◽  
Vol 2014 ◽  
pp. 1-7 ◽  
Author(s):  
Lei Ding ◽  
Hong-Bing Zeng ◽  
Wei Wang ◽  
Fei Yu

This paper investigates the stability of static recurrent neural networks (SRNNs) with a time-varying delay. Based on the complete delay-decomposing approach and the quadratic separation framework, a novel Lyapunov-Krasovskii functional is constructed. By employing a reciprocally convex technique to account for the relationship between the time-varying delay and its variation interval, some improved delay-dependent stability conditions are presented in terms of linear matrix inequalities (LMIs). Finally, a numerical example is provided to demonstrate the merits and effectiveness of the proposed method.
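The delay-dependent conditions above are verified by solving LMIs, but the qualitative behavior they guarantee can be illustrated with a direct simulation. The sketch below is not the paper's LMI test; it simulates a static recurrent network x'(t) = -A x(t) + f(W x(t - d(t)) + u) under a bounded time-varying delay, with hypothetical matrices A and W chosen small enough that the state decays to the equilibrium.

```python
import numpy as np

# Illustrative simulation (not the paper's LMI criterion): a static recurrent
# neural network x'(t) = -A x(t) + f(W x(t - d(t)) + u) with a bounded
# time-varying delay d(t). A, W, u, and the delay profile are hypothetical
# values chosen so the zero equilibrium is exponentially stable.
def simulate_srnn(A, W, u, d_max, t_end=20.0, h=0.01):
    n = A.shape[0]
    steps = int(t_end / h)
    x = np.zeros((steps + 1, n))
    x[0] = np.array([1.0, -1.0])           # initial state
    f = np.tanh                            # bounded, 1-Lipschitz activation
    for k in range(steps):
        t = k * h
        d = 0.5 * d_max * (1 + np.sin(t))  # time-varying delay in [0, d_max]
        kd = max(0, k - int(d / h))        # index of the delayed state
        x[k + 1] = x[k] + h * (-A @ x[k] + f(W @ x[kd] + u))
    return x

A = np.diag([2.0, 2.0])                    # positive self-feedback rates
W = np.array([[0.3, -0.2],
              [0.1,  0.4]])                # small interconnection weights
u = np.zeros(2)
traj = simulate_srnn(A, W, u, d_max=1.0)
print(np.linalg.norm(traj[-1]))            # near zero: state has converged
```

Because the activation is 1-Lipschitz and the weight matrix is small relative to the self-feedback A, the trajectory decays regardless of the delay variation, which is the kind of behavior a feasible LMI certifies in advance.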


Complexity ◽  
2017 ◽  
Vol 2017 ◽  
pp. 1-12 ◽  
Author(s):  
Xiaohui Xu ◽  
Jiye Zhang ◽  
Quan Xu ◽  
Zilong Chen ◽  
Weifan Zheng

This paper studies the global exponential stability of a class of complex-valued Cohen-Grossberg neural networks subject to impulsive disturbances, with both time-varying delays and continuously distributed delays. First, the existence and uniqueness of the equilibrium point of the system are established by using the corresponding property of M-matrices and the homeomorphism mapping theorem. Second, the global exponential stability of the equilibrium point is proved by applying the vector Lyapunov function method together with mathematical induction. The resulting sufficient conditions show how both the delays and the impulsive strength affect the exponential convergence rate, and they are less conservative than some existing results. Finally, three numerical examples with simulation results are given to illustrate the correctness of the proposed results.
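The existence-and-uniqueness step above relies on an M-matrix property. A minimal sketch of that check follows, using one standard characterization: a Z-matrix (non-positive off-diagonal entries) is a nonsingular M-matrix iff all of its leading principal minors are positive. The matrices below are hypothetical illustrations, not taken from the paper.

```python
import numpy as np

# Sketch of the M-matrix test used in this style of stability analysis.
# A Z-matrix is a nonsingular M-matrix iff all leading principal minors
# are positive (one of several equivalent characterizations).
def is_nonsingular_m_matrix(M):
    M = np.asarray(M, dtype=float)
    n = M.shape[0]
    off_diag = M - np.diag(np.diag(M))
    if np.any(off_diag > 1e-12):           # must be a Z-matrix
        return False
    # leading principal minors det(M[:k, :k]) > 0 for k = 1..n
    return all(np.linalg.det(M[:k, :k]) > 0 for k in range(1, n + 1))

M1 = np.array([[ 3.0, -1.0],
               [-1.0,  2.0]])              # minors: 3, 5 -> M-matrix
M2 = np.array([[ 1.0, -2.0],
               [-2.0,  1.0]])              # minors: 1, -3 -> not an M-matrix
print(is_nonsingular_m_matrix(M1))         # True
print(is_nonsingular_m_matrix(M2))         # False
```

In the analysis, the tested matrix is typically built from the network's amplification bounds, self-feedback rates, connection-weight moduli, and activation Lipschitz constants; when it is a nonsingular M-matrix, the homeomorphism argument yields a unique equilibrium.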

