Further Results on Exponential Robust Stability Analysis for Recurrent Neural Networks With Time-Varying Delay

2015 ◽  
Vol 137 (4) ◽  
Author(s):  
Pin-Lin Liu

In this paper, the problems of determining robust exponential stability and estimating the exponential convergence rate for recurrent neural networks (RNNs) with parametric uncertainties and time-varying delay are studied. The relationship among the time-varying delay, its upper bound, and their difference is taken into account. The stability conditions are derived via the integral inequality approach (IIA) and expressed in terms of linear matrix inequalities (LMIs), which can be checked easily by recently developed LMI-solving algorithms. Furthermore, the proposed stability conditions are less conservative than some recently reported ones in the literature, as demonstrated via four examples with simulations.
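The LMI feasibility test mentioned in the abstract can be sketched numerically. The example below is a minimal sketch, not taken from the paper: for a hypothetical two-dimensional delay-free system ẋ = Ax, it solves the Lyapunov equation AᵀP + PA = −Q by vectorization and checks that P is positive definite, which certifies exponential stability. Full delay-dependent LMIs of the kind derived in the paper are checked the same way, with semidefinite-programming solvers in place of the linear solve.

```python
import numpy as np

# Hypothetical stable system matrix (illustration only, not from the paper).
A = np.array([[-2.0, 0.5],
              [0.3, -1.5]])
n = A.shape[0]
Q = np.eye(n)  # any positive definite choice

# Vectorize A^T P + P A = -Q using column-stacking:
# vec(A^T P) = (I kron A^T) vec(P),  vec(P A) = (A^T kron I) vec(P).
M = np.kron(np.eye(n), A.T) + np.kron(A.T, np.eye(n))
p = np.linalg.solve(M, (-Q).flatten(order="F"))
P = p.reshape((n, n), order="F")
P = (P + P.T) / 2  # symmetrize against round-off

# P > 0 certifies exponential stability of x' = A x.
eigs = np.linalg.eigvalsh(P)
print("P positive definite:", bool(np.all(eigs > 0)))
```

Here feasibility (a positive definite P solving the matrix inequality) plays the same role as feasibility of the delay-dependent LMIs in the paper: its existence is the stability certificate.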

2014 ◽  
Vol 2014 ◽  
pp. 1-7 ◽  
Author(s):  
Lei Ding ◽  
Hong-Bing Zeng ◽  
Wei Wang ◽  
Fei Yu

This paper investigates the stability of static recurrent neural networks (SRNNs) with a time-varying delay. Based on the complete delay-decomposing approach and quadratic separation framework, a novel Lyapunov-Krasovskii functional is constructed. By employing a reciprocally convex technique to consider the relationship between the time-varying delay and its varying interval, some improved delay-dependent stability conditions are presented in terms of linear matrix inequalities (LMIs). Finally, a numerical example is provided to show the merits and the effectiveness of the proposed methods.


2010 ◽  
Vol 24 (04n05) ◽  
pp. 503-511 ◽  
Author(s):  
S. M. LEE

In this paper, we propose a new robust stability analysis method for uncertain cellular neural networks with time-varying delay. The proposed stability criterion is based on a Lyapunov function with sector-bounded nonlinear functions. A sufficient condition for stability is derived in terms of a linear matrix inequality (LMI). Numerical examples show the effectiveness of the proposed method.


2009 ◽  
Vol 2009 ◽  
pp. 1-14 ◽  
Author(s):  
Chien-Yu Lu ◽  
Chin-Wen Liao ◽  
Hsun-Heng Tsai

This paper examines the passivity analysis of a class of discrete-time recurrent neural networks (DRNNs) with norm-bounded time-varying parameter uncertainties and interval time-varying delay. The activation functions are assumed to be globally Lipschitz continuous. Based on an appropriate type of Lyapunov functional, sufficient passivity conditions for the DRNNs are derived in terms of a family of linear matrix inequalities (LMIs). Two numerical examples are given to illustrate the effectiveness and applicability of the proposed results.


2013 ◽  
Vol 2013 ◽  
pp. 1-7 ◽  
Author(s):  
Wenguang Luo ◽  
Xiuling Wang ◽  
Yonghua Liu ◽  
Hongli Lan

The problem of global exponential stability for recurrent neural networks with time-varying delay is investigated. By dividing the time-delay interval [0, τ(t)] into K+1 dynamical subintervals, a new Lyapunov-Krasovskii functional is introduced; then, a novel linear-matrix-inequality- (LMI-) based delay-dependent exponential stability criterion is derived, which is less conservative than some previous results (Zhang et al., 2005; He et al., 2006; Wu et al., 2008). An illustrative example is finally provided to show the effectiveness and the advantage of the proposed result.


2015 ◽  
Vol 2015 ◽  
pp. 1-7 ◽  
Author(s):  
Wei Wang ◽  
Hong-Bing Zeng

This paper focuses on the absolute stability of Lur'e systems with time-varying delay. Based on the quadratic separation framework, a complete delay-decomposing Lyapunov-Krasovskii functional is constructed. By considering the relationship between the time-varying delay and its varying interval, improved delay-dependent absolute stability conditions in terms of linear matrix inequalities (LMIs) are obtained. Moreover, the derived conditions are extended to systems with time-varying structured uncertainties. Finally, a numerical example is given to show the advantage over existing results.


2015 ◽  
Vol 742 ◽  
pp. 399-403 ◽  
Author(s):  
Ya Jun Li ◽  
Jing Zhao Li

This paper investigates the exponential stability problem for a class of stochastic neural networks with leakage delay. By employing a suitable Lyapunov functional and stochastic stability theory, sufficient conditions under which the stochastic neural network system is exponentially stable in mean square are proposed and proved. All results are expressed in terms of linear matrix inequalities (LMIs). An example and simulations are presented to show the effectiveness of the proposed method.
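Mean-square exponential stability of the kind established above can be illustrated by simulation. The sketch below uses a scalar test SDE with hypothetical parameters and no leakage delay (illustration only, not the system from the paper): an Euler-Maruyama scheme estimates E[x(t)²] over many paths and confirms it decays when 2a > σ².

```python
import numpy as np

# Hypothetical scalar test SDE  dx = -a*x dt + sigma*x dW  (no delay);
# it is mean-square exponentially stable iff 2*a > sigma**2.
a, sigma = 1.0, 0.5
x0, T, dt, paths = 1.0, 5.0, 0.01, 2000
steps = int(T / dt)

rng = np.random.default_rng(0)
x = np.full(paths, x0)
for _ in range(steps):
    dW = rng.normal(0.0, np.sqrt(dt), size=paths)
    x = x + (-a * x) * dt + sigma * x * dW  # Euler-Maruyama step

# Monte Carlo estimate of the second moment at time T;
# theory gives E[x(T)^2] = x0**2 * exp((sigma**2 - 2*a) * T), which decays.
ms_final = np.mean(x ** 2)
print("E[x(T)^2] estimate:", ms_final)
```

The Lyapunov/LMI conditions in the paper certify this decay analytically; the simulation is only a sanity check of the same property.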


2007 ◽  
Vol 17 (03) ◽  
pp. 207-218 ◽  
Author(s):  
BAOYONG ZHANG ◽  
SHENGYUAN XU ◽  
YONGMIN LI

This paper considers the problem of robust exponential stability for a class of recurrent neural networks with time-varying delays and parameter uncertainties. The time delays are not necessarily differentiable, and the uncertainties are assumed to be time-varying but norm-bounded. Sufficient conditions, which guarantee that the concerned uncertain delayed neural network is robustly globally exponentially stable for all admissible parameter uncertainties, are obtained under a weak assumption on the neuron activation functions. These conditions depend on the size of the time delay and are expressed in terms of linear matrix inequalities. Numerical examples are provided to demonstrate the effectiveness and reduced conservatism of the proposed stability results.

