Multistability of Delayed Recurrent Neural Networks with Mexican Hat Activation Functions

2017, Vol 29 (2), pp. 423-457
Author(s): Peng Liu, Zhigang Zeng, Jun Wang

This letter presents a multistability analysis of delayed recurrent neural networks with Mexican hat activation functions. Sufficient conditions are obtained to ensure that an [Formula: see text]-dimensional recurrent neural network can have [Formula: see text] equilibrium points with [Formula: see text], of which [Formula: see text] are locally exponentially stable. Furthermore, the attraction basins of these stable equilibrium points are estimated, and we show that they can be larger than their originally partitioned subsets. These results improve and extend existing stability results in the literature. Finally, a numerical example covering several cases is given to illustrate the theoretical results.
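The coexistence of stable equilibria described here can be illustrated with a scalar toy model. The sketch below uses an illustrative piecewise-linear nonmonotonic ("Mexican-hat-shaped") activation and Euler integration of a scalar delayed equation; the function's breakpoints, the weight, and the delay value are invented for the demonstration, not taken from the paper.

```python
def mexican_hat(x):
    """Illustrative piecewise-linear nonmonotonic activation
    (the paper's exact parameterization may differ)."""
    if x < -1.0:
        return -1.0
    if x <= 1.0:
        return x
    if x <= 3.0:
        return 2.0 - x
    return -1.0

def simulate(x0, w=2.0, tau=0.1, dt=0.01, T=20.0):
    """Euler integration of the scalar delayed model x'(t) = -x(t) + w*f(x(t - tau)),
    with constant initial history x(s) = x0 for s <= 0."""
    n_delay = int(round(tau / dt))
    hist = [x0] * (n_delay + 1)   # hist[0] holds x(t - tau)
    x = x0
    for _ in range(int(T / dt)):
        x_delayed = hist[0]
        x = x + dt * (-x + w * mexican_hat(x_delayed))
        hist = hist[1:] + [x]
    return x

# Initial states in different attraction basins converge to different
# locally exponentially stable equilibria (x* = -2 and x* = 4/3 here).
print(simulate(-0.5))   # ≈ -2.0
print(simulate(0.5))    # ≈ 1.333
```

For this choice of parameters the unstable equilibrium at the origin separates the two basins, which is the one-dimensional analogue of the partitioned-subset picture in the abstract.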

2007, Vol 17 (03), pp. 207-218
Author(s): Baoyong Zhang, Shengyuan Xu, Yongmin Li

This paper considers robust exponential stability for a class of recurrent neural networks with time-varying delays and parameter uncertainties. The time delays are not necessarily differentiable, and the uncertainties are assumed to be time-varying but norm-bounded. Under a weak assumption on the neuron activation functions, sufficient conditions are obtained which guarantee that the uncertain delayed neural network is robustly globally exponentially stable for all admissible parameter uncertainties. These conditions depend on the size of the time delay and are expressed in terms of linear matrix inequalities. Numerical examples demonstrate the effectiveness and reduced conservatism of the proposed stability results.
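Checking the LMI conditions themselves requires a semidefinite-programming solver, but a much coarser classical norm-type sufficient condition for global exponential stability of x'(t) = -C x(t) + A f(x(t - τ(t))) can be checked in a few lines. The matrices below are invented illustrations, and this test is far more conservative than the paper's delay-dependent LMIs.

```python
import numpy as np

def norm_condition_holds(C, A, L):
    """Coarse delay-independent sufficient condition for global exponential
    stability of x'(t) = -C x(t) + A f(x(t - tau(t))), with diagonal C > 0
    and activation Lipschitz constant L: require min_i c_i > L * ||A||_2.
    Far more conservative than the delay-dependent LMIs of the paper."""
    return float(np.min(np.diag(C))) > L * np.linalg.norm(A, 2)

# Invented example matrices.
C = np.diag([2.0, 3.0])
A = np.array([[0.3, -0.2],
              [0.1,  0.4]])
print(norm_condition_holds(C, A, L=1.0))        # True: stability guaranteed
print(norm_condition_holds(C, 10 * A, L=1.0))   # False: this coarse test is inconclusive
```

A failed check here does not imply instability; it only means the coarse bound is silent and a sharper (e.g., LMI-based) criterion is needed.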


1999, Vol 09 (02), pp. 95-98
Author(s): Anke Meyer-Bäse

This paper is concerned with the asymptotic hyperstability of recurrent neural networks. Based on the stability results, we derive necessary and sufficient conditions on the network parameters. The resulting conditions are more general than those based on Lyapunov methods: they impose milder constraints on the connection weights than the conventional results and do not assume symmetry of the weights.


2008, Vol 18 (07), pp. 2029-2037
Author(s): Wei Wu, Bao Tong Cui, Zhigang Zeng

In this paper, the global exponential stability of recurrent neural networks with continuously distributed delays is investigated. New theoretical results are presented in the presence of external stimuli. It is shown that the recurrent neural network is globally exponentially stable and that an estimate of the location of the equilibrium point can be obtained. As typical representatives, the Hopfield neural network (HNN) and the cellular neural network (CNN) are examined in detail. A comparison with previous results shows that ours are an improvement.


2006, Vol 16 (09), pp. 2737-2743
Author(s): Xiaofan Yang, Xiaofeng Liao, Yuanyan Tang, David J. Evans

This paper addresses qualitative properties of equilibrium points in a class of delayed neural networks. We derive a sufficient condition for the local exponential stability of equilibrium points and give an estimate of the domains of attraction of locally exponentially stable equilibrium points. Our condition and estimate are formulated in terms of the network parameters, the neurons' activation functions, and the associated equilibrium point; hence, they are easily checkable. Another advantage of our results is that they depend neither on monotonicity of the activation functions nor on symmetry of the interconnection matrix. Our work has practical importance in evaluating the performance of the related associative memory. To our knowledge, this is the first estimate of the domains of attraction of equilibrium points for delayed neural networks.
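A domain-of-attraction estimate can also be probed empirically by classifying initial conditions. The sketch below uses a scalar bistable model x' = -x + 2 tanh(x) as an illustrative stand-in (not the networks of the paper) and labels each initial state by the sign of the equilibrium its trajectory approaches.

```python
import math

def limit_state(x0, dt=0.01, T=30.0):
    """Euler-integrate the bistable model x' = -x + 2*tanh(x)
    and return the terminal state."""
    x = x0
    for _ in range(int(T / dt)):
        x = x + dt * (-x + 2.0 * math.tanh(x))
    return x

# The positive equilibrium x* solves x = 2 tanh(x) (x* ≈ 1.915); by symmetry
# -x* is also stable, and the unstable origin separates the two basins.
basins = {x0: (1 if limit_state(x0) > 0 else -1)
          for x0 in (-2.5, -1.0, -0.1, 0.1, 1.0, 2.5)}
print(basins)   # negative starts map to -1, positive starts to +1
```

Sampling like this only gives an empirical picture; the value of analytical estimates such as the paper's is that they certify a whole region at once.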


2013, Vol 427-429, pp. 2493-2496
Author(s): Qi Han, Qian Xiong, Chao Liu, Jun Peng, Le Peng Song, et al.

In this paper, the number of equilibrium points of each cell in cellular neural networks with negative-slope activation functions is studied via the relationships among the network parameters. Sufficient conditions are obtained using the relationships among the connection weights. From these conditions, together with the inputs and outputs of a CNN, the admissible regions of the parameter values can be determined. Numerical simulations are presented to support the theoretical analysis.
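Counting the equilibria of a single cell reduces to counting the zeros of g(x) = -x + w f(x) + I. The sketch below does this numerically for the standard piecewise-linear CNN output f(x) = (|x+1| - |x-1|)/2; the self-feedback weight and input are invented values, and the paper's negative-slope activation is not reproduced.

```python
import numpy as np

def cnn_output(x):
    """Standard piecewise-linear CNN output: f(x) = (|x+1| - |x-1|) / 2."""
    return 0.5 * (np.abs(x + 1.0) - np.abs(x - 1.0))

def count_equilibria(w, I, lo=-10.0, hi=10.0, n=200001):
    """Count zeros of g(x) = -x + w*f(x) + I via sign changes on a fine grid."""
    x = np.linspace(lo, hi, n)
    g = -x + w * cnn_output(x) + I
    s = np.sign(g)
    s[s == 0] = 1.0   # a grid point landing exactly on a root counts once
    return int(np.sum(s[:-1] != s[1:]))

# With self-feedback w = 2 and zero input, a cell has three equilibria
# (x = -2, 0, 2); with w = 0.5 the right-hand side is strictly decreasing
# and only one equilibrium remains.
print(count_equilibria(2.0, 0.0))   # 3
print(count_equilibria(0.5, 0.0))   # 1
```

Scanning w and I with this counter is a crude numerical analogue of partitioning the parameter space into regions with a fixed number of equilibria.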


2010, Vol 22 (6), pp. 1597-1614
Author(s): Pengsheng Zheng, Wansheng Tang, Jianxiong Zhang

A novel method based on m energy functions is adopted to analyze the retrieval properties of continuous-time asymmetric Hopfield neural networks. Sufficient conditions for the local and global asymptotic stability of the network are proposed. Moreover, an efficient systematic procedure for designing asymmetric networks is proposed, in which a given set of states can be assigned as locally asymptotically stable equilibrium points. Simulation examples show that the asymmetric network can act as an efficient associative memory and is almost free from the spurious memory problem.
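As background for the associative-memory role discussed above, the sketch below recalls a stored pattern in a classical discrete Hopfield network with symmetric Hebbian weights; it does not reproduce the paper's asymmetric design procedure, and the patterns are invented.

```python
import numpy as np

# Two orthogonal bipolar patterns to store (invented for illustration).
p1 = np.array([1, 1, 1, 1, -1, -1, -1, -1], dtype=float)
p2 = np.array([1, -1, 1, -1, 1, -1, 1, -1], dtype=float)

# Hebbian outer-product weights with zero self-connections.
W = (np.outer(p1, p1) + np.outer(p2, p2)) / len(p1)
np.fill_diagonal(W, 0.0)

def recall(state, steps=5):
    """Synchronous sign updates; for mild corruption the state settles
    on the nearest stored pattern."""
    s = state.copy()
    for _ in range(steps):
        s = np.sign(W @ s)
    return s

probe = p1.copy()
probe[0] = -probe[0]    # corrupt one bit of p1
print(recall(probe))    # recovers p1
```

In this symmetric setting the stored patterns are minima of the usual Hopfield energy; the paper's point is that carefully designed asymmetric weights can retain this retrieval behavior while suppressing spurious memories.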


2019, Vol 2019 (1)
Author(s): M. Iswarya, R. Raja, G. Rajchakit, J. Cao, J. Alzabut, et al.

In this work, the exponential stability problem of impulsive recurrent neural networks is investigated; discrete time delays, continuously distributed delays, and stochastic noise are taken into consideration simultaneously. Two distinct types of sufficient conditions guaranteeing the exponential stability of the considered recurrent neural networks are derived on the basis of a Lyapunov functional and the coefficients of the given system. In addition, to construct a Lyapunov function for a large-scale system, a novel graph-theoretic approach is considered, which combines Lyapunov functionals with graph theory. In this approach, a global Lyapunov functional is constructed that reflects the topological structure of the given system. A numerical example and simulation figures are presented to show the effectiveness of the proposed work.


2003, Vol 15 (8), pp. 1897-1929
Author(s): Barbara Hammer, Peter Tiňo

Recent experimental studies indicate that recurrent neural networks initialized with "small" weights are inherently biased toward definite memory machines (Tiňo, Čerňanský, & Beňušková, 2002a, 2002b). This article establishes a theoretical counterpart: the transition function of a recurrent network with small weights and a squashing activation function is a contraction. We prove that recurrent networks with contractive transition functions can be approximated arbitrarily well on input sequences of unbounded length by a definite memory machine. Conversely, every definite memory machine can be simulated by a recurrent network with a contractive transition function. Hence, initialization with small weights induces an architectural bias into learning with recurrent neural networks. This bias might have benefits from the point of view of statistical learning theory: it emphasizes one possible region of the weight space where generalization ability can be formally proved. It is well known that standard recurrent neural networks are not distribution-independent learnable in the probably approximately correct (PAC) sense if arbitrary precision and inputs are considered. We prove that recurrent networks with contractive transition functions with a fixed contraction parameter fulfill the so-called distribution-independent uniform convergence of empirical distances property and hence, unlike general recurrent networks, are distribution-independent PAC learnable.
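The contraction property is easy to verify numerically: since |tanh'| ≤ 1, the state map x ↦ tanh(Wx + Us + b) is Lipschitz in x with constant at most ‖W‖₂, so ‖W‖₂ < 1 forces a contraction. The sketch below estimates the Lipschitz constant by sampling pairs of states; the weights are random small values, an illustration rather than the article's construction.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 4, 3
W = rng.standard_normal((n, n))
W *= 0.5 / np.linalg.norm(W, 2)   # rescale recurrent weights: ||W||_2 = 0.5 < 1
U = rng.standard_normal((n, m))   # input weights
b = rng.standard_normal(n)

def transition(x, s):
    """One step of a squashing recurrent network: x_next = tanh(W x + U s + b)."""
    return np.tanh(W @ x + U @ s + b)

# Because |tanh'| <= 1, the map x -> transition(x, s) is Lipschitz with
# constant at most ||W||_2; estimate the constant by sampling state pairs.
s = rng.standard_normal(m)
lipschitz_est = max(
    np.linalg.norm(transition(x1, s) - transition(x2, s)) / np.linalg.norm(x1 - x2)
    for x1, x2 in (rng.standard_normal((2, n)) for _ in range(1000))
)
print(lipschitz_est < np.linalg.norm(W, 2) + 1e-9)   # True: a contraction
```

The sampled estimate stays below ‖W‖₂, consistent with the contraction argument; with large weights the same estimate would typically exceed 1 and the definite-memory approximation would no longer apply.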


2007, Vol 19 (8), pp. 2149-2182
Author(s): Zhigang Zeng, Jun Wang

In this letter, sufficient conditions are obtained to guarantee that recurrent neural networks with linear saturation activation functions and time-varying delays have multiple equilibria located in the saturation region and on its boundaries. These results on pattern characterization are used to analyze and design autoassociative memories based directly on the parameters of the neural networks. Moreover, a formula for the number of spurious equilibria is derived. Four design procedures for recurrent neural networks with linear saturation activation functions and time-varying delays are developed based on the stability results. Two of these procedures allow the neural network to learn and forget. Finally, simulation results demonstrate the validity and characteristics of the proposed approach.

