Dynamical behavior of recurrent neural networks with different external inputs

Author(s):  
Houssem Achouri ◽  
Chaouki Aouiti

The main aim of this paper is to study the dynamics of recurrent neural networks with different input currents in terms of asymptotic points. Under certain conditions, we study the existence and uniqueness of bounded solutions of the considered system with rectangular input currents, together with their homoclinic and heteroclinic motions. Moreover, we study the unpredictable behavior of continuous and discrete high-order recurrent neural networks. Our method relies primarily on Banach's fixed-point theorem, the topology of uniform convergence on compact sets, and the Gronwall inequality. To demonstrate the theoretical results, we give examples and their numerical simulations.
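As an illustrative sketch of the kind of system the abstract describes (not the authors' exact model), the following simulates a small recurrent network driven by a rectangular (square-wave) input current and shows that the trajectory remains bounded; all parameter values are assumptions chosen for demonstration.

```python
import numpy as np

# Two-neuron recurrent network  x' = -a*x + W*tanh(x) + I(t)
# with a rectangular input current, integrated by Euler's method.
# Parameters are illustrative assumptions, not taken from the paper.
a = 1.0                          # self-decay rate
W = np.array([[0.2, -0.3],       # recurrent connection weights
              [0.1,  0.2]])

def rect_input(t, amplitude=0.5, period=2.0):
    """Rectangular current: +amplitude on the first half-period, -amplitude on the second."""
    return amplitude if (t % period) < period / 2 else -amplitude

def simulate(x0, T=50.0, dt=0.01):
    steps = int(T / dt)
    x = np.array(x0, dtype=float)
    traj = np.empty((steps, 2))
    for k in range(steps):
        I = rect_input(k * dt) * np.ones(2)
        x = x + dt * (-a * x + W @ np.tanh(x) + I)
        traj[k] = x
    return traj

traj = simulate([1.5, -2.0])
print("max |x| over trajectory:", np.abs(traj).max())
```

Because the decay rate dominates the bounded recurrent and input terms, the solution stays in a bounded region after the transient, which is the qualitative setting in which existence of bounded solutions is studied.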

Fractals ◽  
2021 ◽  
pp. 2140029
Author(s):  
CHUAN-YUN GU ◽  
FENG-XIA ZHENG ◽  
BABAK SHIRI

A class of tempered fractional neural networks is proposed in this paper. Stability conditions for tempered fractional neural networks are provided by using the Banach fixed-point theorem, and attractivity and Mittag-Leffler stability results are established. To show the efficiency and convenience of the method, tempered fractional neural networks with and without delay are discussed. Furthermore, short-memory and variable-order tempered fractional neural networks are proposed under global conditions. Finally, two numerical examples demonstrate the theoretical results.
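A minimal sketch of the type of dynamics involved (not the paper's method or model): a scalar "tempered fractional neuron" discretized with Grünwald–Letnikov weights damped by the tempering factor `exp(-lam*k*h)`. The equation, parameters, and the explicit stepping rule below are all illustrative assumptions.

```python
import math

# Scalar tempered fractional equation  D^{alpha,lam} x = -a*x + b*tanh(x),
# solved with a tempered Grunwald-Letnikov scheme (explicit in the
# nonlinearity).  With a > b the origin attracts, so x decays toward 0.
alpha, lam = 0.8, 0.5     # fractional order and tempering parameter
a, b = 1.0, 0.3           # decay and feedback gains (a > b)
h, N = 0.01, 1200         # step size and number of steps

# Tempered GL weights: w_k = (-1)^k * C(alpha, k) * exp(-lam*k*h)
g = [1.0]
for k in range(1, N + 1):
    g.append(g[-1] * (1 - (alpha + 1) / k))
w = [gk * math.exp(-lam * k * h) for k, gk in enumerate(g)]

x = [1.0]                 # initial state
for n in range(1, N + 1):
    rhs = -a * x[-1] + b * math.tanh(x[-1])
    mem = sum(w[k] * x[n - k] for k in range(1, n + 1))
    x.append(h ** alpha * rhs - mem)

print("x(0) =", x[0], " x(T) =", x[-1])
```

The tempering factor truncates the heavy power-law memory of the plain fractional derivative, which is what makes short-memory variants natural in this setting.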


2018 ◽  
Vol 321 ◽  
pp. 296-307 ◽  
Author(s):  
Chaouki Aouiti ◽  
Mohammed Salah M’hamdi ◽  
Farouk Chérif ◽  
Adel M. Alimi

1994 ◽  
Vol 05 (03) ◽  
pp. 241-252 ◽  
Author(s):  
ERIC GOLES ◽  
MARTÍN MATAMALA

We present dynamical results concerning neural networks with high-order arguments. More precisely, we study the family of block-sequential iterations of neural networks with polynomial arguments. In this context, we prove that, under a symmetry hypothesis, the sequential iteration is the only member of this family to converge to fixed points. The other iteration modes present highly complex dynamical behavior: unbounded cycles and the simulation of arbitrary non-symmetric linear neural networks. We also study a high-order memory iteration scheme which admits an energy functional and whose cycles are bounded by the size of the memory steps.
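The contrast between sequential and parallel iteration can be seen already in a first-order (linear-argument) special case of the networks considered; the tiny symmetric threshold network below is an assumed toy example, not one from the paper. Parallel (synchronous) updating produces a period-2 cycle, while sequential (asynchronous) updating reaches a fixed point.

```python
import numpy as np

# Symmetric two-neuron threshold network with zero thresholds:
# state x_i in {-1, +1}, update rule x_i <- sign(sum_j W[i,j] * x_j).
W = np.array([[0, 1], [1, 0]])

def sgn(v):
    return np.where(v >= 0, 1, -1)

def parallel_step(x):
    """Update every neuron simultaneously."""
    return sgn(W @ x)

def sequential_step(x):
    """Update neurons one at a time, each seeing the latest states."""
    x = x.copy()
    for i in range(len(x)):
        x[i] = 1 if W[i] @ x >= 0 else -1
    return x

x = np.array([-1, 1])
p1 = parallel_step(x)        # -> [ 1, -1]
p2 = parallel_step(p1)       # -> [-1,  1]  back to the start: a 2-cycle
s1 = sequential_step(x)      # -> [ 1,  1]
s2 = sequential_step(s1)     # -> [ 1,  1]  a fixed point
```

This is the elementary version of the phenomenon the paper analyzes for polynomial (high-order) arguments under the symmetry hypothesis.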


2018 ◽  
Vol 2018 ◽  
pp. 1-11 ◽  
Author(s):  
Tianyu Wang ◽  
Quanxin Zhu ◽  
Jingwei Cai

We are interested in a class of stochastic fuzzy recurrent neural networks with multi-proportional delays and distributed delays. By constructing suitable Lyapunov–Krasovskii functionals and applying stochastic analysis theory, Itô's formula, and Dynkin's formula, we derive novel sufficient conditions for mean-square exponential input-to-state stability of the suggested system. Some remarks and discussions show that our results extend and improve previous results in the literature. Finally, two examples and their simulations illustrate the effectiveness of the theoretical results.
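To illustrate mean-square input-to-state behavior numerically (with a plain scalar stochastic delayed recurrent equation, not the fuzzy multi-proportional-delay model of the paper), the Euler–Maruyama simulation below averages the squared state over many sample paths; all coefficients are assumed for demonstration.

```python
import numpy as np

# dx = [-a*x(t) + b*tanh(x(t - tau)) + u(t)] dt + sigma*x(t) dW,
# simulated by Euler-Maruyama over many paths; we track E|x(t)|^2.
rng = np.random.default_rng(0)
a, b, sigma, tau = 2.0, 0.5, 0.3, 1.0   # illustrative coefficients
dt, T, n_paths = 0.01, 20.0, 200
delay_steps = int(tau / dt)
steps = int(T / dt)

def u(t):
    return 0.2 * np.sin(t)               # bounded external input

# Rolling window holding the constant initial history x(s) = 1, s <= 0.
x = np.full((n_paths, delay_steps + 1), 1.0)
mean_sq = []
for k in range(steps):
    x_now = x[:, -1]
    x_del = x[:, 0]                      # state tau seconds ago
    dW = rng.normal(0.0, np.sqrt(dt), n_paths)
    x_next = (x_now
              + dt * (-a * x_now + b * np.tanh(x_del) + u(k * dt))
              + sigma * x_now * dW)
    x = np.concatenate([x[:, 1:], x_next[:, None]], axis=1)
    mean_sq.append(np.mean(x_next ** 2))

print("final mean-square value:", mean_sq[-1])
```

With the decay rate dominating the delayed feedback, the noise intensity, and the bounded input, the empirical mean square settles into a bounded neighborhood of zero, which is the qualitative content of mean-square exponential input-to-state stability.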

