ALMOST SURE EXPONENTIAL STABILITY OF STOCHASTIC RECURRENT NEURAL NETWORKS WITH TIME-VARYING DELAYS

2010 ◽  
Vol 20 (02) ◽  
pp. 539-544 ◽  
Author(s):  
LI WAN ◽  
QINGHUA ZHOU

The stability of stochastic recurrent neural networks with time-varying delays is investigated. A set of novel sufficient conditions for almost sure exponential stability is established. Two examples are given to illustrate the effectiveness of the results.
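For reference, almost sure exponential stability is commonly stated as follows (a standard formulation; the model written here is a generic stochastic delayed RNN and need not match the paper's exact system): the trivial solution of

\[
dx(t) = \bigl[-C x(t) + A f(x(t)) + B f(x(t-\tau(t)))\bigr]\,dt + \sigma\bigl(t, x(t), x(t-\tau(t))\bigr)\,dw(t)
\]

is said to be almost surely exponentially stable if

\[
\limsup_{t\to\infty} \frac{1}{t}\log\lvert x(t;\xi)\rvert < 0 \quad \text{a.s.}
\]

for every initial datum \(\xi\).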

2017 ◽  
Vol 10 (02) ◽  
pp. 1750027 ◽  
Author(s):  
Wei Zhang ◽  
Chuandong Li ◽  
Tingwen Huang

In this paper, the stability and periodicity of memristor-based neural networks with time-varying delays are studied. Based on differential inclusion theory and by constructing a proper Lyapunov functional, some sufficient conditions, expressed in terms of linear matrix inequalities, are obtained for the global exponential stability and the existence of periodic solutions of memristor-based neural networks. Finally, two illustrative examples are given to demonstrate the results.
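LMI-based analyses of this kind typically start from a Lyapunov-Krasovskii functional of the form below (an illustrative choice; the functional actually constructed in the paper may contain additional terms handling the memristive switching):

\[
V(t, x_t) = x^{\top}(t) P x(t) + \int_{t-\tau(t)}^{t} x^{\top}(s) Q x(s)\,ds + \int_{-\bar\tau}^{0}\!\int_{t+\theta}^{t} \dot x^{\top}(s) R\, \dot x(s)\,ds\,d\theta, \qquad P, Q, R \succ 0,
\]

where \(\bar\tau\) is an upper bound on the delay \(\tau(t)\).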


2007 ◽  
Vol 17 (03) ◽  
pp. 207-218 ◽  
Author(s):  
BAOYONG ZHANG ◽  
SHENGYUAN XU ◽  
YONGMIN LI

This paper considers the problem of robust exponential stability for a class of recurrent neural networks with time-varying delays and parameter uncertainties. The time delays are not necessarily differentiable, and the uncertainties are assumed to be time-varying but norm-bounded. Sufficient conditions guaranteeing that the uncertain delayed neural network is robustly globally exponentially stable for all admissible parameter uncertainties are obtained under a weak assumption on the neuron activation functions. These conditions are delay-dependent and expressed in terms of linear matrix inequalities. Numerical examples are provided to demonstrate the effectiveness and reduced conservatism of the proposed stability results.
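Norm-bounded, time-varying parameter uncertainty is usually parameterized as follows (a common structure, assumed here for illustration rather than quoted from the paper):

\[
A(t) = A + \Delta A(t), \quad B(t) = B + \Delta B(t), \qquad
[\Delta A(t)\;\; \Delta B(t)] = M F(t) [N_1 \;\; N_2], \quad F^{\top}(t) F(t) \le I,
\]

where \(M\), \(N_1\), \(N_2\) are known constant matrices and \(F(t)\) is an unknown time-varying matrix.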


2011 ◽  
Vol 2011 ◽  
pp. 1-16 ◽  
Author(s):  
Chuangxia Huang ◽  
Xinsong Yang ◽  
Yigang He ◽  
Lehua Huang

The stability of reaction-diffusion recurrent neural networks (RNNs) with continuously distributed delays and stochastic influence is considered. Some new sufficient conditions guaranteeing the almost sure exponential stability and the mean-square exponential stability of an equilibrium solution are obtained, respectively. The approach uses Lyapunov functionals, M-matrix properties, inequality techniques, and the nonnegative semimartingale convergence theorem. The obtained conclusions improve some published results.
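Since the conditions rest on M-matrix properties, they can be verified numerically by testing a standard M-matrix characterization: nonpositive off-diagonal entries and positive leading principal minors. The sketch below is illustrative only (the function name and the test matrices are not taken from the paper).

import numpy as np

def is_nonsingular_M_matrix(A, tol=1e-10):
    """Check whether A is a nonsingular M-matrix.

    A square matrix is a nonsingular M-matrix if its off-diagonal
    entries are nonpositive and all of its leading principal minors
    are positive (one of several equivalent characterizations).
    """
    A = np.asarray(A, dtype=float)
    n, m = A.shape
    if n != m:
        raise ValueError("A must be square")
    off_diag = A - np.diag(np.diag(A))
    if np.any(off_diag > tol):      # off-diagonal entries must be <= 0
        return False
    # all leading principal minors must be positive
    return all(np.linalg.det(A[:k, :k]) > tol for k in range(1, n + 1))

# Example: a diagonally dominant Z-matrix is an M-matrix
print(is_nonsingular_M_matrix([[2.0, -1.0], [-0.5, 1.0]]))   # True
print(is_nonsingular_M_matrix([[1.0, -2.0], [-2.0, 1.0]]))   # False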


2011 ◽  
Vol 2011 ◽  
pp. 1-23 ◽  
Author(s):  
R. Raja ◽  
R. Sakthivel ◽  
S. Marshal Anthoni

This paper deals with the stability analysis problem for a class of discrete-time stochastic BAM neural networks with discrete and distributed time-varying delays. By constructing a suitable Lyapunov-Krasovskii functional and employing M-matrix theory, we find some sufficient conditions ensuring the global exponential stability of the equilibrium point for stochastic BAM neural networks with time-varying delays. The conditions obtained here are expressed in terms of LMIs, whose feasibility can be easily checked with the MATLAB LMI Control Toolbox. A numerical example is presented to show the effectiveness of the derived LMI-based stability conditions.
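The paper checks LMI feasibility with the MATLAB LMI Control Toolbox; an equivalent check can be scripted in Python with CVXPY. The sketch below is only a minimal illustration on a nominal discrete-time system x(k+1) = A x(k); the paper's LMIs additionally involve delay and noise terms, and the matrix A here is made up for the example.

import cvxpy as cp
import numpy as np

# Illustrative LMI feasibility test: find P > 0 with A^T P A - P < 0,
# which certifies exponential stability of x(k+1) = A x(k).
A = np.array([[0.5, 0.1],
              [0.0, 0.4]])
n = A.shape[0]

P = cp.Variable((n, n), symmetric=True)
eps = 1e-6
constraints = [P >> eps * np.eye(n),
               A.T @ P @ A - P << -eps * np.eye(n)]

prob = cp.Problem(cp.Minimize(0), constraints)
prob.solve(solver=cp.SCS)
print("LMI feasible:", prob.status == cp.OPTIMAL)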


2011 ◽  
Vol 2011 ◽  
pp. 1-17 ◽  
Author(s):  
Chuangxia Huang ◽  
Jinde Cao

This paper is devoted to the study of the stochastic stability of a class of Cohen-Grossberg neural networks in which the interconnections and delays are time-varying. With the help of a Lyapunov function, the Burkholder-Davis-Gundy inequality, and the Borel-Cantelli lemma, a set of novel sufficient conditions on pth moment exponential stability and almost sure exponential stability of the trivial solution of the system is derived. Compared with previously published results, our method does not resort to the Razumikhin-type theorem or the semimartingale convergence theorem, and the results presented here are more general than those reported in some previously published papers. An illustrative example is also given to show the effectiveness of the obtained results.
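In the usual terminology (standard definitions, restated here for completeness), the trivial solution is exponentially stable in the pth moment if

\[
\limsup_{t\to\infty} \frac{1}{t}\log \mathbb{E}\lvert x(t;\xi)\rvert^{p} < 0,
\]

and almost surely exponentially stable if \(\limsup_{t\to\infty} \frac{1}{t}\log\lvert x(t;\xi)\rvert < 0\) with probability one; the case p = 2 is mean-square exponential stability.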


2013 ◽  
Vol 2013 ◽  
pp. 1-11 ◽  
Author(s):  
Shifang Kuang ◽  
Yunjian Peng ◽  
Feiqi Deng ◽  
Wenhua Gao

Exponential stability in mean square of stochastic delay recurrent neural networks is investigated in detail. By using Itô's formula and inequality techniques, sufficient conditions guaranteeing the exponential stability in mean square of an equilibrium are given. Under the conditions which guarantee the stability of the analytical solution, the Euler-Maruyama scheme and the split-step backward Euler scheme are proved to be mean-square stable. Finally, an example is given to demonstrate our results.
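A minimal sketch of the Euler-Maruyama scheme on a toy scalar stochastic delay equation is given below; the paper treats the vector-valued delayed recurrent network and also the split-step backward Euler scheme, so the model, parameter values, and function name here are illustrative assumptions only.

import numpy as np

def euler_maruyama_sdde(c, a, b, sigma, tau, dt, T, x0, seed=0):
    """Simulate one Euler-Maruyama path of the scalar stochastic delay equation
       dx(t) = (-c*x(t) + a*tanh(x(t)) + b*tanh(x(t - tau))) dt + sigma*x(t) dW(t)
    with constant initial history x(t) = x0 for t <= 0 (illustrative model only)."""
    rng = np.random.default_rng(seed)
    m = int(round(tau / dt))              # delay expressed in time steps
    n = int(round(T / dt))
    x = np.empty(n + 1)
    x[0] = x0
    for k in range(n):
        x_delayed = x[k - m] if k >= m else x0   # constant history before t = 0
        drift = -c * x[k] + a * np.tanh(x[k]) + b * np.tanh(x_delayed)
        dW = rng.normal(0.0, np.sqrt(dt))        # Brownian increment over dt
        x[k + 1] = x[k] + drift * dt + sigma * x[k] * dW
    return x

# Example: the path decays toward zero for a strongly stabilizing drift
path = euler_maruyama_sdde(c=2.0, a=0.3, b=0.3, sigma=0.1,
                           tau=0.5, dt=0.01, T=10.0, x0=1.0)
print(path[-1])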


Mathematics ◽  
2018 ◽  
Vol 6 (9) ◽  
pp. 144 ◽  
Author(s):  
Ravi Agarwal ◽  
Snezhana Hristova ◽  
Donal O’Regan ◽  
Peter Kopanov

The Cohen-Grossberg neural network model is studied in the case when the neurons are subject to impulsive state displacements at random, exponentially distributed moments. These impulses significantly change the behavior of the solutions, from a deterministic function to a stochastic process. We examine the stability of the equilibrium of the model. Some sufficient conditions for the mean-square exponential stability and the mean exponential stability of the equilibrium of general neural networks are obtained in the case of time-varying potential (or voltage) of the cells, with time-dependent amplification functions and behaved functions, as well as time-varying strengths of connectivity between cells and variable external bias or input from outside the network to the units. These sufficient conditions are expressed explicitly in terms of the parameters of the system and hence are easily verifiable. The theory relies on a modification of the direct Lyapunov method. We illustrate our theory on a particular nonlinear neural network.
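The underlying Cohen-Grossberg dynamics, written here in a generic time-varying form with notation chosen for illustration, are

\[
\dot x_i(t) = -a_i(x_i(t))\Bigl[b_i(t, x_i(t)) - \sum_{j=1}^{n} c_{ij}(t) f_j(x_j(t)) - I_i(t)\Bigr], \qquad i = 1,\dots,n,
\]

with impulsive state displacements \(x_i(\xi_k^{+}) = \Phi_{ik}(x_i(\xi_k^{-}))\) applied at random moments \(\xi_k\) whose inter-arrival times are exponentially distributed.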


2007 ◽  
Vol 17 (09) ◽  
pp. 3099-3108 ◽  
Author(s):  
QINGHUA ZHOU ◽  
LI WAN ◽  
JIANHUA SUN

The exponential stability of reaction–diffusion fuzzy recurrent neural networks (RDFRNNs) with time-varying delays is considered. By using the method of variational parameters, M-matrix properties, and inequality techniques, some delay-independent or delay-dependent sufficient conditions guaranteeing the exponential stability of an equilibrium solution are obtained. One example is given to demonstrate the theoretical results.


2007 ◽  
Vol 17 (09) ◽  
pp. 3219-3227 ◽  
Author(s):  
LI WAN ◽  
QINGHUA ZHOU ◽  
JIANHUA SUN

Stochastic effects on the stability of reaction–diffusion generalized Cohen–Grossberg neural networks (GDCGNNs) with time-varying delay are considered. By skillfully constructing suitable Lyapunov functionals and employing the method of variational parameters, inequality techniques, and stochastic analysis, delay-independent and easily verifiable sufficient conditions guaranteeing the mean-value exponential stability of an equilibrium solution associated with temporally uniform external inputs to the networks are obtained. One example is given to illustrate the theoretical results.

