Mean-square exponential input-to-state stability of stochastic recurrent neural networks with multi-proportional delays

2017 ◽  
Vol 219 ◽  
pp. 396-403 ◽  
Author(s):  
Liqun Zhou ◽  
Xueting Liu
2018 ◽  
Vol 2018 ◽  
pp. 1-11 ◽  
Author(s):  
Tianyu Wang ◽  
Quanxin Zhu ◽  
Jingwei Cai

We are interested in a class of stochastic fuzzy recurrent neural networks with multi-proportional delays and distributed delays. By constructing suitable Lyapunov-Krasovskii functionals and applying stochastic analysis theory, Itô's formula and Dynkin's formula, we derive novel sufficient conditions for mean-square exponential input-to-state stability of the proposed system. Some remarks and discussions are given to show that our results extend and improve previous results in the literature. Finally, two examples and their simulations are provided to illustrate the effectiveness of the theoretical results.
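For reference, mean-square exponential input-to-state stability is commonly stated in the following form (this is one standard formulation; the constants M, λ, γ and the delay bound τ are generic placeholders, not the specific quantities derived in the paper):

```latex
% The trivial solution is mean-square exponentially input-to-state stable
% if there exist constants M \ge 1, \lambda > 0, \gamma > 0 such that,
% for every initial function \xi and every admissible input u,
\mathbb{E}\,|x(t;\xi,u)|^{2}
  \;\le\;
  M e^{-\lambda t}\,\sup_{s\in[-\tau,0]} \mathbb{E}\,|\xi(s)|^{2}
  \;+\;
  \gamma \,\sup_{s\in[0,t]} \mathbb{E}\,|u(s)|^{2},
  \qquad t \ge 0 .
```

When the input u vanishes, the bound reduces to ordinary mean-square exponential stability, which is why results of this type strictly generalize the input-free stability theorems they are compared against.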


2013 ◽  
Vol 2013 ◽  
pp. 1-11 ◽  
Author(s):  
Shifang Kuang ◽  
Yunjian Peng ◽  
Feiqi Deng ◽  
Wenhua Gao

Exponential stability in mean square of stochastic delay recurrent neural networks is investigated in detail. By using Itô's formula and inequality techniques, sufficient conditions guaranteeing the mean-square exponential stability of an equilibrium are given. Under the conditions that guarantee the stability of the analytical solution, the Euler-Maruyama scheme and the split-step backward Euler scheme are proved to be mean-square stable. Finally, an example is given to demonstrate our results.
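The mean-square stability of the Euler-Maruyama scheme discussed above can be checked empirically on a scalar test problem. The sketch below is illustrative only: the specific delay equation, coefficients, and the `euler_maruyama_sdde` helper are assumptions for the example, not the model or parameters used in the paper.

```python
import numpy as np

def euler_maruyama_sdde(a, c, sigma, tau, T, dt, n_paths, x0, rng):
    """Euler-Maruyama for the scalar stochastic delay equation
        dx(t) = (-a*x(t) + c*tanh(x(t - tau))) dt + sigma*x(t - tau) dW(t),
    with constant initial function x(s) = x0 on [-tau, 0]."""
    n_steps = int(round(T / dt))
    n_lag = int(round(tau / dt))
    # One row per sample path; the first n_lag+1 columns hold the history.
    x = np.full((n_paths, n_steps + n_lag + 1), float(x0))
    for k in range(n_lag, n_lag + n_steps):
        drift = -a * x[:, k] + c * np.tanh(x[:, k - n_lag])
        dW = rng.normal(0.0, np.sqrt(dt), n_paths)   # Brownian increments
        x[:, k + 1] = x[:, k] + drift * dt + sigma * x[:, k - n_lag] * dW
    return x[:, n_lag:]  # trajectory on [0, T]

rng = np.random.default_rng(0)
paths = euler_maruyama_sdde(a=2.0, c=0.5, sigma=0.3, tau=0.5,
                            T=10.0, dt=0.01, n_paths=2000, x0=1.0, rng=rng)
msq = np.mean(paths ** 2, axis=0)  # empirical E|x(t)|^2 across paths
print(f"E|x(0)|^2 = {msq[0]:.3f},  E|x(T)|^2 = {msq[-1]:.2e}")
```

With the drift coefficient a dominating both the delayed feedback c and the noise intensity sigma, the empirical second moment decays toward zero, which is the behavior the mean-square stability results guarantee for sufficiently small step sizes.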

