Mean-Square Exponential Input-to-State Stability of Stochastic Fuzzy Recurrent Neural Networks with Multiproportional Delays and Distributed Delays

2018, Vol 2018, pp. 1-11
Author(s): Tianyu Wang, Quanxin Zhu, Jingwei Cai

We are interested in a class of stochastic fuzzy recurrent neural networks with multiproportional delays and distributed delays. By constructing suitable Lyapunov-Krasovskii functionals and applying stochastic analysis theory, Itô's formula, and Dynkin's formula, we derive novel sufficient conditions for the mean-square exponential input-to-state stability of the considered system. Some remarks and discussions are given to show that our results extend and improve some previous results in the literature. Finally, two examples and their simulations are provided to illustrate the effectiveness of the theoretical results.
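For reference, mean-square exponential input-to-state stability is usually stated in roughly the following form (the constants, norms, and class-K function below are generic and may differ from the authors' exact formulation): for every initial segment \xi and every bounded input u,

    \mathbb{E}\,|x(t;\xi,u)|^{2} \;\le\; C\,e^{-\lambda t}\,\mathbb{E}\,\|\xi\|^{2} \;+\; \gamma\!\Big(\sup_{0\le s\le t}|u(s)|\Big), \qquad t \ge 0,

with constants C, \lambda > 0 and a class-K_\infty function \gamma; the exponential term captures the mean-square decay of the delayed initial data, while \gamma bounds the persistent effect of the input.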

2019, Vol 2019, pp. 1-15
Author(s): Tianqing Yang, Zuoliang Xiong, Cuiping Yang

This paper is concerned with the mean-square exponential input-to-state stability problem for a class of stochastic Cohen-Grossberg neural networks. In contrast to prior works, both neutral terms and mixed delays are treated in our system. By employing the Lyapunov-Krasovskii functional method, the Itô formula, the Dynkin formula, and stochastic analysis theory, we obtain some novel sufficient conditions ensuring that the addressed system is mean-square exponentially input-to-state stable. Moreover, two numerical examples and their simulations are given to illustrate the correctness of the theoretical results.


2011, Vol 2011, pp. 1-16
Author(s): Chuangxia Huang, Xinsong Yang, Yigang He, Lehua Huang

The stability of reaction-diffusion recurrent neural networks (RNNs) with continuously distributed delays and stochastic influence is considered. Some new sufficient conditions guaranteeing the almost sure exponential stability and the mean square exponential stability of an equilibrium solution are obtained, respectively. The Lyapunov functional method, M-matrix properties, some inequality techniques, and the nonnegative semimartingale convergence theorem are used in our approach. The obtained conclusions improve some published results.


2008, Vol 18 (07), pp. 2029-2037
Author(s): Wei Wu, Bao Tong Cui, Zhigang Zeng

In this paper, the global exponential stability of recurrent neural networks with continuously distributed delays is investigated. New theoretical results are presented in the presence of external stimuli. It is shown that the recurrent neural network is globally exponentially stable, and an estimate of the location of the equilibrium point can be obtained. As typical representatives, the Hopfield neural network (HNN) and the cellular neural network (CNN) are examined in detail. A comparison with previous results shows that our results are an improvement.
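Models in this line of work are typically of the following general form, combining instantaneous connections, continuously distributed delays, and external stimuli u_i (the notation here is generic, not copied from the paper):

    \dot{x}_i(t) \;=\; -c_i x_i(t) \;+\; \sum_{j=1}^{n} a_{ij}\, f_j\big(x_j(t)\big) \;+\; \sum_{j=1}^{n} b_{ij} \int_{0}^{\infty} k_{ij}(s)\, f_j\big(x_j(t-s)\big)\, ds \;+\; u_i, \qquad i = 1, \dots, n,

where the delay kernels k_{ij} are nonnegative and integrable; Hopfield and cellular neural networks arise as particular choices of the activations f_j and the connection weights.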


2011, Vol 2011, pp. 1-13
Author(s): Shujie Yang, Bao Shi, Mo Li

Based on the Lyapunov-Krasovskii functional method and stochastic analysis theory, we obtain some new delay-dependent criteria ensuring the mean square stability of a class of impulsive stochastic delay equations. Numerical examples are given to illustrate the effectiveness of the theoretical results.
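A generic system of the kind studied here combines an Itô stochastic delay equation with state jumps at impulse times t_k (the template below is standard, not the authors' exact model):

    dx(t) = f\big(t, x(t), x(t-\tau)\big)\,dt + g\big(t, x(t), x(t-\tau)\big)\,dW(t), \qquad t \ne t_k,
    \Delta x(t_k) = x(t_k^{+}) - x(t_k^{-}) = I_k\big(x(t_k^{-})\big), \qquad k = 1, 2, \dots,

where W is a Brownian motion and the maps I_k describe the impulsive jumps; delay-dependent criteria constrain the admissible delay \tau together with the impulse strengths.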


2013, Vol 2013, pp. 1-11
Author(s): Shifang Kuang, Yunjian Peng, Feiqi Deng, Wenhua Gao

The exponential stability in mean square of stochastic delay recurrent neural networks is investigated in detail. By using Itô's formula and inequality techniques, sufficient conditions guaranteeing the exponential stability in mean square of an equilibrium are given. Under the conditions which guarantee the stability of the analytical solution, the Euler-Maruyama scheme and the split-step backward Euler scheme are proved to be mean-square stable. Finally, an example is given to demonstrate our results.
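To make the discretization concrete, the following is a minimal Euler-Maruyama sketch for a scalar stochastic delay equation dx(t) = [-a x(t) + b tanh(x(t - tau))] dt + sigma x(t - tau) dW(t); the model form, parameter values, and step size are illustrative assumptions, not the system analysed in the paper.

    import numpy as np

    # Euler-Maruyama for a scalar stochastic delay equation (illustrative only):
    #   dx(t) = [-a*x(t) + b*tanh(x(t - tau))] dt + sigma*x(t - tau) dW(t)
    a, b, sigma = 1.5, 0.5, 0.2   # hypothetical coefficients
    tau, T, h = 1.0, 20.0, 0.01   # delay, time horizon, step size (h divides tau)

    d = int(round(tau / h))       # number of steps spanned by the delay
    n = int(round(T / h))
    rng = np.random.default_rng(0)

    x = np.empty(n + d + 1)
    x[:d + 1] = 1.0               # constant initial history on [-tau, 0]

    for k in range(d, n + d):
        drift = -a * x[k] + b * np.tanh(x[k - d])
        diffusion = sigma * x[k - d]
        dW = rng.normal(0.0, np.sqrt(h))   # Brownian increment over one step
        x[k + 1] = x[k] + drift * h + diffusion * dW

    print("x(T) ≈", x[-1])

The split-step backward Euler scheme differs in that each step first solves an implicit equation for the drift part and only then adds the diffusion increment, which often gives better mean-square stability when the drift is stiff.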


2007, Vol 17 (09), pp. 3099-3108
Author(s): Qinghua Zhou, Li Wan, Jianhua Sun

The exponential stability of reaction-diffusion fuzzy recurrent neural networks (RDFRNNs) with time-varying delays is considered. By using the variation-of-parameters method, M-matrix properties, and inequality techniques, some delay-independent or delay-dependent sufficient conditions guaranteeing the exponential stability of an equilibrium solution are obtained. One example is given to demonstrate the theoretical results.


2021, Vol 5 (4), p. 260
Author(s): Xiao Liu, Kelin Li, Qiankun Song, Xujun Yang

In this paper, the quasi-projective synchronization of distributed-order recurrent neural networks is investigated. Firstly, based on the definition of the distributed-order derivative and metric space theory, two distributed-order differential inequalities are obtained. Then, by employing the Lyapunov method, the Laplace transform, the Laplace final value theorem, and some inequality techniques, sufficient conditions for the quasi-projective synchronization of distributed-order recurrent neural networks are established under feedback control and hybrid control schemes, respectively. Finally, two numerical examples are given to verify the effectiveness of the theoretical results.
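Quasi-projective synchronization is commonly understood as follows (generic notation, which may differ from the paper's): for a drive state x(t), a response state y(t), and a projective coefficient \Lambda, the controlled error is required to be ultimately bounded rather than to vanish,

    \limsup_{t\to\infty} \big\| y(t) - \Lambda\, x(t) \big\| \;\le\; \varepsilon,

for some prescribed bound \varepsilon > 0; full projective synchronization corresponds to \varepsilon = 0.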


2017, Vol 2017, pp. 1-10
Author(s): Changjian Wang, Zuoliang Xiong, Min Liang, Hongwei Yin

In this paper, we consider the input-to-state stability of a class of stochastic neutral-type memristive neural networks. Neutral terms and S-type distributed delays are taken into account in our system. Using stochastic analysis theory and the Itô formula, we obtain conditions for the mean-square exponential input-to-state stability of the system. A numerical example is given to illustrate the correctness of our conclusions.


2021, Vol 2021 (1)
Author(s): Lihua Dai, Yuanyuan Hou

In this paper, we first consider the stability problem for a class of stochastic quaternion-valued neural networks with time-varying delays. Since the quaternion-valued systems cannot be explicitly decomposed into equivalent real-valued systems, we instead use Lyapunov functionals and stochastic analysis techniques to obtain sufficient conditions for the mean-square exponential input-to-state stability of the quaternion-valued stochastic neural networks. Our results are completely new. Finally, a numerical example is given to illustrate the feasibility of our results.
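For readers less familiar with the setting, quaternion-valued states follow the non-commutative Hamilton product, which is one reason such models are often analysed directly rather than being split into four coupled real-valued systems. A minimal, purely illustrative sketch (not taken from the paper):

    import numpy as np

    # Hamilton product of quaternions p = a1 + b1*i + c1*j + d1*k and
    # q = a2 + b2*i + c2*j + d2*k, stored as [a, b, c, d].
    def qmul(p, q):
        a1, b1, c1, d1 = p
        a2, b2, c2, d2 = q
        return np.array([
            a1*a2 - b1*b2 - c1*c2 - d1*d2,
            a1*b2 + b1*a2 + c1*d2 - d1*c2,
            a1*c2 - b1*d2 + c1*a2 + d1*b2,
            a1*d2 + b1*c2 - c1*b2 + d1*a2,
        ])

    p = np.array([0.0, 1.0, 0.0, 0.0])  # the unit i
    q = np.array([0.0, 0.0, 1.0, 0.0])  # the unit j
    print(qmul(p, q))  # i*j =  k -> [0, 0, 0,  1]
    print(qmul(q, p))  # j*i = -k -> [0, 0, 0, -1], so multiplication is non-commutative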

