Ultimate Boundedness of Discrete-Time Uncertain Neural Networks with Leakage and Time-Varying Delays

Author(s):  
Jiajun Hua ◽  
Danhua He

In this paper, by using general discrete Halanay inequalities together with other inequality techniques and related properties, we study the ultimate boundedness of a class of discrete-time uncertain neural networks with leakage and time-varying delays and obtain several sufficient conditions that ensure it. Finally, numerical examples are given to verify the correctness of the conclusions.
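
For orientation, a commonly cited discrete Halanay-type inequality (an illustrative form; the generalized version used in the paper may differ) says: if a nonnegative sequence $\{x_n\}$ satisfies
\[
x_{n+1} \le A\,x_n + B \max_{n-k \le j \le n} x_j + C, \qquad n \ge 0,
\]
with $A, B \ge 0$, $A + B < 1$ and $C \ge 0$, then
\[
x_n \le \lambda^{n} \max_{-k \le j \le 0} x_j + \frac{C}{1 - A - B},
\]
where $\lambda \in (0,1)$ is a root of $\lambda^{k+1} - A\lambda^{k} - B = 0$; in particular $\limsup_{n \to \infty} x_n \le C/(1-A-B)$, which is precisely an ultimate boundedness statement.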

2014 ◽  
Vol 2014 ◽  
pp. 1-10 ◽  
Author(s):  
O. M. Kwon ◽  
M. J. Park ◽  
Ju H. Park ◽  
S. M. Lee ◽  
E. J. Cha

The problem of passivity analysis for neural networks with time-varying delays and parameter uncertainties is considered. By constructing new Lyapunov-Krasovskii functionals, improved sufficient conditions guaranteeing the passivity of the concerned networks are proposed within the framework of linear matrix inequalities (LMIs), which can be solved easily by various efficient convex optimization algorithms. The enlargement of the feasible region of the proposed criteria is shown via two numerical examples by comparing maximum allowable delay bounds.
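
As a side note on the LMI framework (not the paper's passivity conditions; a minimal sketch assuming the CVXPY library and a toy Lyapunov LMI $A^{T}P + PA \prec 0$, $P \succ 0$), such feasibility problems can be checked numerically as follows:

import cvxpy as cp
import numpy as np

# Toy illustration: check the Lyapunov LMI  A^T P + P A < 0,  P > 0
# for a stable test matrix A. Passivity LMIs are assembled the same way,
# just with larger block matrices.
A = np.array([[-2.0, 1.0],
              [0.0, -3.0]])
n = A.shape[0]

P = cp.Variable((n, n), symmetric=True)
eps = 1e-6  # approximate strict inequalities by a small margin
constraints = [P >> eps * np.eye(n),
               A.T @ P + P @ A << -eps * np.eye(n)]

prob = cp.Problem(cp.Minimize(0), constraints)  # pure feasibility problem
prob.solve()
print("LMI feasible:", prob.status == cp.OPTIMAL)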


2016 ◽  
Vol 30 (18) ◽  
pp. 1650271 ◽  
Author(s):  
Xueli Cui ◽  
Yongguang Yu ◽  
Hu Wang ◽  
Wei Hu

In this paper, memristor-based fractional-order neural networks with time delay are analyzed. Based on the theories of set-valued maps, differential inclusions and Filippov solutions, some sufficient conditions for the asymptotic stability of this neural network model are obtained when the external inputs are constant. Besides, a uniform stability condition is derived when the external inputs are time-varying, and its attractive interval is estimated. Finally, numerical examples are given to verify our results.
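
A representative memristor-based fractional-order model of the kind analyzed here (an assumed generic form; the paper's exact system may differ) is
\[
{}^{C}\!D^{\alpha} x_i(t) = -c_i x_i(t) + \sum_{j=1}^{n} a_{ij}(x_i(t)) f_j(x_j(t)) + \sum_{j=1}^{n} b_{ij}(x_i(t)) f_j(x_j(t-\tau)) + I_i, \qquad 0 < \alpha < 1,
\]
where ${}^{C}\!D^{\alpha}$ is the Caputo fractional derivative and the connection weights $a_{ij}(\cdot)$, $b_{ij}(\cdot)$ switch between two values depending on the memristor state; this discontinuous right-hand side is why set-valued maps, differential inclusions and Filippov solutions are needed.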


2015 ◽  
Vol 2015 ◽  
pp. 1-11 ◽  
Author(s):  
M. J. Park ◽  
O. M. Kwon ◽  
E. J. Cha

This paper deals with the problem of stability analysis for generalized neural networks with time-varying delays. With a suitable Lyapunov-Krasovskii functional (LKF) and the Wirtinger-based integral inequality, sufficient conditions guaranteeing the asymptotic stability of the concerned networks are derived in terms of linear matrix inequalities (LMIs). By applying the proposed methods to two numerical examples that have been utilized in many works for checking the conservatism of stability criteria, it is shown that the obtained results are significantly improved compared with previous ones published in the literature.
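
For context, the Wirtinger-based integral inequality referred to here is usually stated as follows (the Seuret-Gouaisbaut form, quoted for reference): for a matrix $R \succ 0$ and a differentiable function $x : [a,b] \to \mathbb{R}^{n}$,
\[
\int_a^b \dot{x}^{T}(s) R \dot{x}(s)\,ds \;\ge\; \frac{1}{b-a}\Big[\big(x(b)-x(a)\big)^{T} R \big(x(b)-x(a)\big) + 3\,\Omega^{T} R\,\Omega\Big],
\qquad
\Omega = x(b) + x(a) - \frac{2}{b-a}\int_a^b x(s)\,ds,
\]
which tightens Jensen's inequality and is what yields the less conservative LMI conditions.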


2011 ◽  
Vol 204-210 ◽  
pp. 1549-1552 ◽
Author(s):  
Li Wan ◽  
Qing Hua Zhou

Although the ultimate boundedness of several classes of neural networks with constant delays has been studied by some researchers, the inherent randomness associated with signal transmission was not taken into account in those networks. At present, few authors have studied the ultimate boundedness of stochastic neural networks, and no related papers have been reported. In this paper, by using Lyapunov functionals and linear matrix inequalities, some sufficient conditions ensuring the ultimate boundedness of stochastic neural networks with time-varying delays are established. Our criteria are easily tested with the Matlab LMI Toolbox. One example is given to demonstrate our criteria.
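
As a reminder of the notion being established (a standard definition; the paper's precise formulation may vary), a stochastic delayed network is ultimately bounded in mean square if there exists $B > 0$ such that for every initial datum $\varphi$ there is a time $T(\varphi)$ with
\[
\mathbb{E}\,\|x(t;\varphi)\|^{2} \le B \qquad \text{for all } t \ge T(\varphi),
\]
i.e. every trajectory eventually enters and remains, in mean square, in a fixed ball whose radius does not depend on the initial data.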


2015 ◽  
Vol 2015 ◽  
pp. 1-13 ◽  
Author(s):  
Liyuan Hou ◽  
Hong Zhu

This paper investigates the stability of stochastic discrete-time neural networks (NNs) with discrete time-varying delays and leakage delay. Since partitioning of the time-varying and leakage delays is introduced into the discrete-time system, we construct a novel Lyapunov-Krasovskii functional based on stability theory. Furthermore, sufficient conditions are derived to guarantee the global asymptotic stability of the equilibrium point. A numerical example is given to demonstrate the effectiveness and applicability of the proposed method.
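
A representative model of the class described (an assumed generic form, not necessarily the exact system of the paper) is the stochastic discrete-time network
\[
x(k+1) = A\,x(k-\sigma) + B\,f\big(x(k)\big) + C\,g\big(x(k-\tau(k))\big) + \delta\big(k, x(k), x(k-\tau(k))\big)\,\omega(k),
\]
where $\sigma \ge 0$ is the leakage delay entering the self-feedback term $A\,x(k-\sigma)$, $\tau(k)$ is the discrete time-varying transmission delay, and $\omega(k)$ is the driving noise sequence.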


2009 ◽  
Vol 19 (04) ◽  
pp. 269-283 ◽  
Author(s):  
TAO LI ◽  
AIGUO SONG ◽  
SHUMIN FEI

This paper investigates robust exponential stability for discrete-time recurrent neural networks with both a time-varying delay (0 ≤ τm ≤ τ(k) ≤ τM) and a distributed one. By partitioning the delay intervals [0, τm] and [τm, τM], respectively, and choosing an augmented Lyapunov-Krasovskii functional, delay-dependent sufficient conditions are obtained by using free-weighting matrix and convex combination methods. These criteria are presented in terms of linear matrix inequalities (LMIs), and their feasibility can easily be checked with the Matlab LMI Toolbox in Ref. 1. The activation functions are not required to be differentiable or strictly monotonic, which generalizes earlier formulations. As an extension, we further consider the robust stability of discrete-time delayed Cohen-Grossberg neural networks. Finally, the effectiveness of the proposed results is illustrated by three numerical examples in comparison with reported ones.
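
Roughly speaking, the delay-partitioning idea used here splits each delay interval into equal subintervals, e.g.
\[
[0,\tau_m] = \bigcup_{i=1}^{N} \Big[\tfrac{(i-1)\tau_m}{N},\, \tfrac{i\,\tau_m}{N}\Big],
\qquad
[\tau_m,\tau_M] = \bigcup_{j=1}^{M} \Big[\tau_m + \tfrac{(j-1)(\tau_M-\tau_m)}{M},\; \tau_m + \tfrac{j\,(\tau_M-\tau_m)}{M}\Big],
\]
(the discrete-time version partitions the integer delay range analogously) and assigns separate Lyapunov-Krasovskii terms and free-weighting matrices to each subinterval, which reduces conservatism at the cost of larger LMIs.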


2009 ◽  
Vol 14 (3) ◽  
pp. 283-301 ◽  
Author(s):  
S. Abbas

In this paper we discuss the existence and uniqueness of a k-pseudo almost periodic sequence solution of a discrete-time neural network. We give several sufficient conditions for the exponential and global attractivity of the solution.


2013 ◽  
Vol 2013 ◽  
pp. 1-10 ◽  
Author(s):  
Yajun Li

An innovative stability analysis approach for a class of discrete-time stochastic neural networks (DSNNs) with time-varying delays is developed. By constructing a novel piecewise Lyapunov-Krasovskii functional candidate, a new sum inequality is presented to handle the summation terms without discarding any useful ones; model transformation is no longer needed, and free-weighting matrices are introduced to reduce the conservatism of the derived results, so improved computational efficiency can be expected. Numerical examples and simulations are given to show the effectiveness and reduced conservatism of the proposed criteria.
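
For reference, the baseline estimate that such refined sum inequalities sharpen is the discrete Jensen inequality: for $R \succ 0$ and an integer $\tau \ge 1$,
\[
-\tau \sum_{i=k-\tau}^{k-1} \eta^{T}(i)\, R\, \eta(i) \;\le\; -\Big(\sum_{i=k-\tau}^{k-1} \eta(i)\Big)^{T} R \Big(\sum_{i=k-\tau}^{k-1} \eta(i)\Big),
\]
which bounds the summation terms arising in the forward difference of the Lyapunov-Krasovskii functional without discarding them.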


2007 ◽  
Vol 17 (05) ◽  
pp. 407-417 ◽  
Author(s):  
QIANKUN SONG ◽  
JINDE CAO

In this paper, an impulsive Cohen-Grossberg neural network with unbounded discrete time-varying delays is considered. By using analysis methods and inequality techniques, several sufficient conditions are obtained that ensure the global exponential stability of the addressed neural network. These results generalize existing relevant stability results. Two examples with simulations are given to show the effectiveness of the obtained results.
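
A typical impulsive Cohen-Grossberg model of the kind addressed (a generic form stated for orientation; the paper's system may include further terms) is
\[
\dot{x}_i(t) = -a_i(x_i(t))\Big[b_i(x_i(t)) - \sum_{j=1}^{n} c_{ij} f_j\big(x_j(t)\big) - \sum_{j=1}^{n} d_{ij} g_j\big(x_j(t-\tau_{ij}(t))\big) - I_i\Big], \qquad t \ne t_k,
\]
\[
x_i(t_k^{+}) = x_i(t_k^{-}) + J_{ik}\big(x_i(t_k^{-})\big), \qquad k = 1, 2, \dots,
\]
where the $a_i$ are amplification functions, the $b_i$ self-inhibition functions, the $\tau_{ij}(t)$ unbounded discrete time-varying delays, and the $J_{ik}$ impulsive jumps at instants $t_k$.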

