Improved Stability Criteria for Neural Networks with Two Additive Time-Varying Delay Components

2013 ◽ Vol 32 (4) ◽ pp. 1977-1990 ◽ Author(s): Huabin Chen

2009 ◽ Vol 373 (3) ◽ pp. 342-348 ◽ Author(s): Jian Sun ◽ G.P. Liu ◽ Jie Chen ◽ D. Rees

2009 ◽ Vol 2009 ◽ pp. 1-23 ◽ Author(s): Zixin Liu ◽ Shu Lv ◽ Shouming Zhong ◽ Mao Ye

The robust stability of uncertain discrete-time recurrent neural networks with a time-varying delay is investigated. By decomposing some of the connection weight matrices, new Lyapunov-Krasovskii functionals are constructed and a series of improved stability criteria are derived. These criteria are formulated as linear matrix inequalities (LMIs). Compared with some previous results, the new results are less conservative. Three numerical examples are provided to demonstrate the reduced conservatism and the effectiveness of the proposed method.
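
As a point of reference for the kind of construction the abstract describes (and not the specific functionals built in that paper), a minimal discrete-time Lyapunov-Krasovskii functional for a state x(k) with a time-varying delay d(k) in [d_m, d_M] typically takes a form such as:

```latex
V(k) = x^{T}(k) P x(k)
     + \sum_{i=k-d(k)}^{k-1} x^{T}(i) Q\, x(i)
     + \sum_{j=-d_{M}+1}^{-d_{m}} \; \sum_{i=k+j}^{k-1} x^{T}(i) R\, x(i),
\qquad P,\, Q,\, R \succ 0 .
```

Requiring the forward difference V(k+1) - V(k) to be negative along the trajectories of the delayed network, and bounding the resulting cross terms, is what yields stability conditions expressible as LMIs.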


2014 ◽ Vol 2014 ◽ pp. 1-7 ◽ Author(s): Lei Ding ◽ Hong-Bing Zeng ◽ Wei Wang ◽ Fei Yu

This paper investigates the stability of static recurrent neural networks (SRNNs) with a time-varying delay. Based on the complete delay-decomposing approach and the quadratic separation framework, a novel Lyapunov-Krasovskii functional is constructed. By employing a reciprocally convex technique to handle the relationship between the time-varying delay and its variation interval, some improved delay-dependent stability conditions are presented in terms of linear matrix inequalities (LMIs). Finally, a numerical example is provided to show the merits and the effectiveness of the proposed method.
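
Conditions of this kind are checked numerically as LMI feasibility problems. The sketch below is not the criterion of this paper; it only illustrates the general workflow on a much simpler, classical delay-independent test for a linear system with a constant delay, x'(t) = A x(t) + A_d x(t - d), with example matrices chosen purely for illustration.

```python
# Minimal sketch of how LMI-based stability conditions are verified numerically.
# This is a classical delay-independent criterion for x'(t) = A x(t) + Ad x(t - d),
# NOT the delay-dependent criteria of the cited paper; A and Ad are assumed examples.
import numpy as np
import cvxpy as cp

A = np.array([[-3.0, 0.0],
              [0.0, -3.0]])        # example system matrix (assumed)
Ad = np.array([[1.0, 0.5],
               [0.5, 1.0]])        # example delayed-state matrix (assumed)
n = A.shape[0]

P = cp.Variable((n, n), symmetric=True)
Q = cp.Variable((n, n), symmetric=True)
eps = 1e-6  # small margin to enforce strict inequalities

# Block LMI obtained from the functional V = x'Px + integral of x'Qx over [t-d, t]
lmi = cp.bmat([[A.T @ P + P @ A + Q, P @ Ad],
               [Ad.T @ P,            -Q]])

constraints = [P >> eps * np.eye(n),
               Q >> eps * np.eye(n),
               lmi << -eps * np.eye(2 * n)]

problem = cp.Problem(cp.Minimize(0), constraints)
problem.solve(solver=cp.SCS)

# Feasibility certifies asymptotic stability for every constant delay d >= 0.
print("LMI feasible:", problem.status == cp.OPTIMAL)
```

A delay-dependent criterion such as the one described in the abstract replaces this single block with a larger LMI whose entries also involve the delay bounds, but the feasibility check itself is carried out in the same way.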


2014 ◽ Vol 138 ◽ pp. 383-391 ◽ Author(s): Meng-Di Ji ◽ Yong He ◽ Chuan-Ke Zhang ◽ Min Wu
