A New Globally Exponential Stability Criterion for Neural Networks with Discrete and Distributed Delays

2015, Vol. 2015, pp. 1-9
Author(s): Hao Chen, Shouming Zhong, Jinxiang Yang

This paper concerns the global exponential stability of neural networks with discrete and distributed delays. A novel criterion for global exponential stability is derived by employing Lyapunov stability theory, homomorphic mapping theory, and matrix theory. The proposed result improves previously reported global stability results. Finally, two illustrative numerical examples are given to show the effectiveness of our results.
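The kind of result summarized above can be illustrated numerically. The following is a minimal sketch (not the paper's model or criterion): an Euler simulation of a scalar Hopfield-type network with one discrete delay, where all parameter values are illustrative assumptions chosen so that the self-feedback dominates the delayed coupling.

```python
import numpy as np

# Hedged sketch (illustrative, not the paper's model): Euler simulation of
#   dx/dt = -a*x(t) + w1*tanh(x(t)) + w2*tanh(x(t - tau)).
def simulate(x0, a=2.0, w1=0.3, w2=0.3, tau=1.0, dt=0.01, T=20.0):
    delay_steps = int(tau / dt)
    hist = [x0] * (delay_steps + 1)   # constant initial history on [-tau, 0]
    for _ in range(int(T / dt)):
        x, x_del = hist[-1], hist[-1 - delay_steps]
        hist.append(x + dt * (-a * x + w1 * np.tanh(x) + w2 * np.tanh(x_del)))
    return np.array(hist[delay_steps:])

traj = simulate(x0=1.5)
# Since a > |w1| + |w2| and tanh is 1-Lipschitz, the origin is globally
# exponentially stable, so the trajectory contracts toward zero.
```

With these illustrative values the decay rate is roughly a - |w1| - |w2|, so the state is numerically negligible well before the simulation horizon.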

2013, Vol. 2013, pp. 1-10
Author(s): Yajun Li

An innovative stability analysis approach is developed for a class of discrete-time stochastic neural networks (DSNNs) with time-varying delays. By constructing a novel piecewise Lyapunov-Krasovskii functional candidate, a new sum inequality is presented that handles the sum terms without discarding any useful ones. Model transformation is no longer needed, and free-weighting matrices are introduced to reduce conservatism in the derivation, so improved computational efficiency can be expected. Numerical examples and simulations are given to show the effectiveness and reduced conservatism of the proposed criteria.
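A discrete-time stochastic network of this flavor can be sketched as follows. This is an illustrative toy, not the paper's DSNN model or its Lyapunov-Krasovskii analysis: a scalar map with a bounded time-varying delay and multiplicative noise, with parameters assumed so that the map is mean-square contractive.

```python
import numpy as np

# Hedged sketch (illustrative, not the paper's DSNN):
#   x(k+1) = a*x(k) + b*tanh(x(k - d(k))) + sigma*x(k)*w(k),
# where d(k) is drawn from {1, ..., d_max} and w(k) is white noise.
rng = np.random.default_rng(42)

def simulate(k_max=500, a=0.4, b=0.2, d_max=5, sigma=0.05):
    x = np.empty(k_max + d_max + 1)
    x[:d_max + 1] = 1.0                      # constant initial history
    for k in range(d_max, k_max + d_max):
        d = int(rng.integers(1, d_max + 1))  # time-varying delay in [1, d_max]
        noise = sigma * x[k] * rng.standard_normal()
        x[k + 1] = a * x[k] + b * np.tanh(x[k - d]) + noise
    return x

x = simulate()
# With |a| + |b| < 1 and small sigma, the state decays toward the origin
# despite the random delay and the multiplicative noise.
```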


2008, Vol. 18 (07), pp. 2029-2037
Author(s): Wei Wu, Bao Tong Cui, Zhigang Zeng

In this paper, the global exponential stability of recurrent neural networks with continuously distributed delays is investigated. New theoretical results are presented in the presence of external stimuli. It is shown that the recurrent neural network is globally exponentially stable, and an estimate of the location of the equilibrium point can be obtained. As typical representatives, the Hopfield neural network (HNN) and the cellular neural network (CNN) are examined in detail. A comparison with previous results demonstrates the improvement our results provide.
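The idea of estimating the equilibrium's location under an external stimulus can be sketched with a toy Hopfield-type network. The parameters below are illustrative assumptions, not from the paper: the equilibrium equation 0 = -a x* + W tanh(x*) + I is solved by a fixed-point iteration that contracts whenever ||W|| / a < 1.

```python
import numpy as np

# Hedged sketch (illustrative parameters, not the paper's): locate the
# equilibrium of a Hopfield-type network with constant external stimulus I by
# iterating x <- (W tanh(x) + I) / a, a contraction since ||W||_inf / a < 1
# and tanh is 1-Lipschitz.
a = 2.0
W = np.array([[0.3, -0.2],
              [0.1,  0.4]])
I = np.array([1.0, -0.5])     # external stimulus

x = np.zeros(2)
for _ in range(200):
    x = (W @ np.tanh(x) + I) / a

# x now approximates the unique equilibrium; the residual of the
# equilibrium equation should be near machine precision.
residual = np.max(np.abs(-a * x + W @ np.tanh(x) + I))
```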


Author(s): X. Liu, J. Cao

In this paper, anti-periodic solutions are considered for generalized neural networks with multiple discrete delays and distributed delays. Several new sufficient conditions ensuring the existence and exponential stability of anti-periodic solutions are established via the Lyapunov method and M-matrix theory. It is shown that, with the techniques developed here, the stability analysis for anti-periodic solutions differs from that for the familiar periodic ones. The obtained results generalize and improve earlier works. Two numerical examples are given to illustrate the effectiveness of the proposed theory.
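An anti-periodic solution satisfies x(t + T) = -x(t). The following toy simulation (illustrative, not the paper's generalized model) shows the mechanism: a scalar network driven by the anti-periodic input sin(t) (since sin(t + π) = -sin(t)) with an odd activation, so the attracting steady solution inherits anti-periodicity with half-period π once the transient decays.

```python
import numpy as np

# Hedged sketch (illustrative, not the paper's model):
#   dx/dt = -a*x + w*tanh(x) + sin(t),
# simulated by Euler's method; a > w makes the steady solution attracting.
def simulate(a=2.0, w=0.3, dt=0.001, T=30.0):
    n = int(T / dt)
    x = np.empty(n + 1)
    x[0] = 1.0
    for k in range(n):
        x[k + 1] = x[k] + dt * (-a * x[k] + w * np.tanh(x[k]) + np.sin(k * dt))
    return x

dt = 0.001
x = simulate(dt=dt)
half = int(np.pi / dt)                 # steps in one half-period
anti_gap = abs(x[-1] + x[-1 - half])   # x(t + pi) + x(t), near zero in the tail
```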


2010, Vol. 2010, pp. 1-14
Author(s): Choon Ki Ahn

A new robust training law, called an input/output-to-state stable training law (IOSSTL), is proposed for dynamic neural networks with external disturbance. Based on a linear matrix inequality (LMI) formulation, the IOSSTL not only guarantees exponential stability but also reduces the effect of an external disturbance. It is shown that the IOSSTL can be obtained by solving the LMI, which is easily done with standard numerical packages. Numerical examples demonstrate the validity of the proposed IOSSTL.
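The "solve the LMI with a standard numerical package" step can be illustrated with a minimal stand-in. The paper's IOSSTL LMI is not reproduced here; instead, a quadratic stability certificate is computed by solving the Lyapunov equation AᵀP + PA = -Q (a special case of the strict LMI AᵀP + PA ≺ 0) with SciPy, for an illustrative stable matrix A.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# Hedged sketch: a Lyapunov-equation stand-in for an LMI feasibility check.
# A is an illustrative stable system, not the paper's network.
A = np.array([[-2.0,  1.0],
              [ 0.0, -3.0]])
Q = np.eye(2)

# solve_continuous_lyapunov solves  M X + X M^H = C, so pass M = A^T, C = -Q
# to obtain P with  A^T P + P A = -Q.
P = solve_continuous_lyapunov(A.T, -Q)

# P should be symmetric positive definite, certifying V(x) = x^T P x
# as a quadratic Lyapunov function for dx/dt = A x.
eigs = np.linalg.eigvalsh((P + P.T) / 2)
```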

