Improved delay-dependent stability conditions for recurrent neural networks with multiple time-varying delays

2014, Vol 78 (2), pp. 803-812
Author(s): Yonggang Chen, Shumin Fei, Yongmin Li

2014, Vol 2014, pp. 1-7
Author(s): Lei Ding, Hong-Bing Zeng, Wei Wang, Fei Yu

This paper investigates the stability of static recurrent neural networks (SRNNs) with a time-varying delay. Based on the complete delay-decomposing approach and the quadratic separation framework, a novel Lyapunov-Krasovskii functional is constructed. By employing a reciprocally convex technique to account for the relationship between the time-varying delay and its variation interval, improved delay-dependent stability conditions are presented in terms of linear matrix inequalities (LMIs). Finally, a numerical example is provided to show the merits and effectiveness of the proposed methods.
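To give a flavor of how such LMI-based conditions are verified in practice, the sketch below checks the simplest delay-free Lyapunov condition: a system matrix A is stable if the linear matrix equation AᵀP + PA = -Q admits a positive definite solution P. This is only an illustrative toy, not the paper's actual delay-dependent LMI; the matrix A and the SciPy-based check are assumptions of this example.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Hypothetical system matrix for x'(t) = A x(t); chosen to be Hurwitz.
A = np.array([[-2.0, 0.5],
              [0.3, -1.5]])

# Solve A^T P + P A = -Q with Q = I. A positive definite solution P
# certifies asymptotic stability (the LMI A^T P + P A < 0 is feasible).
Q = np.eye(2)
P = solve_continuous_lyapunov(A.T, -Q)

# P is a valid Lyapunov certificate iff its eigenvalues are all positive.
min_eig = np.linalg.eigvalsh((P + P.T) / 2).min()
print(min_eig > 0)
```

The delay-dependent conditions in the paper are feasibility problems of the same kind, but over larger block matrices built from the Lyapunov-Krasovskii functional; they are typically checked with a semidefinite programming solver rather than a direct Lyapunov-equation solve.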


2012 ◽  
Vol 2012 ◽  
pp. 1-14 ◽  
Author(s):  
Shu Lv ◽  
Junkang Tian ◽  
Shouming Zhong

This paper concerns the problem of delay-dependent stability criteria for recurrent neural networks with time-varying delays. By taking more information on the states and activation functions into augmented vectors, a new class of Lyapunov functionals is proposed. Then, less conservative stability criteria are obtained in terms of linear matrix inequalities (LMIs). Finally, two numerical examples are given to illustrate the effectiveness of the proposed method.
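The behavior these criteria certify can be seen in a direct simulation. The sketch below integrates a scalar delayed recurrent network x'(t) = -a·x(t) + w·tanh(x(t-d)) with forward Euler and a history buffer; with |w| < a the trajectory decays to the origin. All parameters here are illustrative assumptions, not taken from the paper's numerical examples.

```python
import numpy as np

# Hypothetical scalar delayed RNN: x'(t) = -a*x(t) + w*tanh(x(t - d)).
# Since |tanh| is 1-Lipschitz and |w| < a, the origin is globally stable.
a, w, d = 2.0, 0.5, 0.3      # decay rate, delayed weight, delay (s)
dt, T = 0.001, 10.0          # Euler step and simulation horizon (s)
lag = int(round(d / dt))     # delay expressed in time steps
steps = int(round(T / dt))

x = np.empty(lag + steps + 1)
x[:lag + 1] = 1.0            # constant initial history on [-d, 0]
for k in range(lag, lag + steps):
    x[k + 1] = x[k] + dt * (-a * x[k] + w * np.tanh(x[k - lag]))

print(abs(x[-1]))            # small: the state has decayed toward 0
```

A simulation like this only exhibits stability for one initial condition and one delay; the LMI criteria in the paper certify it for all admissible time-varying delays at once, which is why less conservative criteria (feasible for larger delay bounds) are the goal.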

