Local stability conditions for discrete-time cascade locally recurrent neural networks

Author(s):  
Krzysztof Patan

The paper deals with a specific kind of discrete-time recurrent neural network designed with dynamic neuron models. Dynamics are reproduced within each single neuron, hence the network considered is locally recurrent and globally feedforward. A crucial problem with neural networks of this dynamic type is stability, as well as stabilization during learning. The paper formulates local stability conditions for the analysed class of neural networks using Lyapunov's first method. Moreover, a stabilization problem is defined and solved as a constrained optimization task. To tackle this problem, a gradient projection method is adopted. The efficiency and usefulness of the proposed approach are demonstrated through a number of experiments.
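Lyapunov's first (indirect) method, as referenced in the abstract, assesses local stability through the linearization of the system at an equilibrium: for a discrete-time system x[k+1] = f(x[k]), the equilibrium is locally asymptotically stable when every eigenvalue of the Jacobian at that point lies strictly inside the unit circle. A minimal sketch of that check, using hypothetical linearization matrices rather than any model from the paper:

```python
import numpy as np

def locally_stable(jacobian: np.ndarray) -> bool:
    """Lyapunov's first method for discrete-time systems: the equilibrium is
    locally asymptotically stable if the spectral radius of the Jacobian of
    f at the equilibrium is strictly below 1."""
    return float(np.max(np.abs(np.linalg.eigvals(jacobian)))) < 1.0

# Illustrative (assumed) linearizations of two dynamic neuron models.
A_stable = np.array([[0.5, 0.1],
                     [0.0, 0.3]])
A_unstable = np.array([[1.2, 0.0],
                       [0.0, 0.3]])

print(locally_stable(A_stable))    # True
print(locally_stable(A_unstable))  # False
```

A stabilization step in the spirit of the gradient projection method mentioned above would then project updated weights back onto the set where this predicate holds; the predicate itself is what defines the constraint.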

2009 ◽  
pp. 104-122 ◽  
Author(s):  
Mitsuo Yoshida ◽  
Takehiro Mori

Global stability analysis for complex-valued artificial recurrent neural networks remains one of the yet-unchallenged topics in information science. This chapter presents global stability conditions for discrete-time and continuous-time complex-valued recurrent neural networks, which are regarded as nonlinear dynamical systems. Global asymptotic stability conditions for these networks are derived by way of suitable choices of activation functions. According to these stability conditions, there are classes of discrete-time and continuous-time complex-valued recurrent neural networks whose equilibrium point is globally asymptotically stable. Furthermore, the conditions are shown to be successfully applicable to solving convex programming problems, for which solution methods over the real field are generally tedious.
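One standard route to global asymptotic stability conditions of the kind the chapter derives is a contraction argument: for a discrete-time network z[k+1] = W f(z[k]) + b with an activation f that is Lipschitz with constant L, the sufficient condition L · ‖W‖ < 1 guarantees a unique, globally attracting equilibrium. A minimal sketch under that assumption, with an illustrative split-type complex activation and weight matrix (not the specific conditions from the chapter):

```python
import numpy as np

def activation(z: np.ndarray) -> np.ndarray:
    """Split-type complex activation: tanh applied separately to the real
    and imaginary parts (Lipschitz with constant 1 in each part)."""
    return np.tanh(z.real) + 1j * np.tanh(z.imag)

def is_contraction(W: np.ndarray, lipschitz: float = 1.0) -> bool:
    """Sufficient (not necessary) global stability check: the iteration map
    is a contraction when lipschitz * spectral_norm(W) < 1."""
    return lipschitz * float(np.linalg.norm(W, 2)) < 1.0

# Illustrative complex weight matrix with spectral norm 0.4.
W = 0.4 * np.exp(1j * 0.3) * np.eye(2)

print(is_contraction(W))        # True: iterates converge to a unique fixed point
print(is_contraction(3.0 * W))  # False: the sufficient condition fails
```

Because the fixed point of such an iteration can encode the optimum of a convex program, a globally stable network of this type converges to that optimum from any initial state, which is the application the abstract alludes to.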


1999 ◽  
Vol 10 (2) ◽  
pp. 253-271 ◽  
Author(s):  
P. Campolucci ◽  
A. Uncini ◽  
F. Piazza ◽  
B.D. Rao
