1-D MAPS, CHAOS AND NEURAL NETWORKS FOR INFORMATION PROCESSING

1996 ◽  
Vol 06 (04) ◽  
pp. 627-646 ◽  
Author(s):  
YU.V. ANDREYEV ◽  
A.S. DMITRIEV ◽  
D.A. KUMINOV ◽  
L.O. CHUA ◽  
C.W. WU

An application of complex dynamics and chaos in neural networks to information processing is studied. Mathematical models based on piecewise-linear maps implementing basic functions of information processing via complex dynamics and chaos are discussed. Realizations of these models by neural networks are presented. In contrast to other methods of using neural networks and associative memory to store information, the information is stored in dynamical attractors such as limit cycles rather than equilibrium points. Retrieval of information corresponds to driving the state into the basin of attraction of the corresponding attractor. We show that noise-corrupted or partial information is sufficient to drive the state into that basin, so these systems exhibit the property of associative memory.
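As a toy illustration of the storage principle described in this abstract (a deliberate simplification, not the authors' exact construction), a short sequence of values can be stored as a stable limit cycle of a 1-D piecewise-linear map; iterating from a noise-corrupted value then recovers the stored sequence, which is the associative-memory behavior mentioned above:

```python
# Store a 3-point sequence as a stable limit cycle of a 1-D piecewise-linear
# map. The cycle values, slope, and neighbourhood rule are illustrative.
CYCLE = [0.2, 0.5, 0.8]   # "information" to store, one value per cycle point
SLOPE = 0.4               # contraction factor < 1 makes the cycle attracting

def pwl_map(x):
    # Find the nearest stored point and map its neighbourhood linearly
    # onto the neighbourhood of the next point in the stored sequence.
    i = min(range(len(CYCLE)), key=lambda k: abs(x - CYCLE[k]))
    nxt = CYCLE[(i + 1) % len(CYCLE)]
    return nxt + SLOPE * (x - CYCLE[i])

# "Retrieval": start from a noise-corrupted version of a stored value.
x = CYCLE[0] + 0.05
for _ in range(30):
    x = pwl_map(x)
# The orbit contracts onto the stored cycle; after 30 steps (a multiple
# of the cycle length) the state is back at the first stored value.
```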

Complexity ◽  
2021 ◽  
Vol 2021 ◽  
pp. 1-18 ◽  
Author(s):  
Yélomè Judicaël Fernando Kpomahou ◽  
Laurent Amoussou Hinvi ◽  
Joseph Adébiyi Adéchinan ◽  
Clément Hodévèwan Miwadinou

In this paper, chaotic dynamics of a mixed Rayleigh–Liénard oscillator driven by parametric periodic damping and external excitations is investigated analytically and numerically. The equilibrium points and their stability evolutions are analytically analyzed, and the transitions of dynamical behaviors are explored in detail. Furthermore, from the Melnikov method, the analytical criterion for the appearance of homoclinic chaos is derived. The analytical prediction is tested against numerical simulations based on the basin of attraction of initial conditions. As a result, it is found that for ω = ν, the chaotic region decreases and disappears when the amplitude of the parametric periodic damping excitation increases. Moreover, increasing F₁ and F₀ provokes an erosion of the basin of attraction and a modification of the geometrical shape of the chaotic attractors. For ω ≠ ν and η = 0.8, the fractality of the basin of attraction increases as the amplitude of the external periodic excitation and constant term increase. Bifurcation structures of the system are computed with the fourth-order Runge–Kutta (ode45) algorithm. It is found that the system displays a remarkable route to chaos. It is also found that the system exhibits monostable and bistable oscillations as well as the phenomenon of coexistence of attractors.
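A simulation of the kind described here can be sketched with a hand-rolled fourth-order Runge–Kutta scheme. The equation form and every coefficient below are placeholders chosen for illustration (a Duffing-type restoring force with parametrically modulated damping and biased periodic forcing), not the paper's actual mixed Rayleigh–Liénard model or parameter values:

```python
import math

# Illustrative model: x'' + mu*(1 + eta*cos(nu*t))*x' + x + x^3 = F0 + F1*cos(w*t)
MU, ETA, NU, W, F0, F1 = 0.2, 0.8, 1.0, 1.0, 0.1, 0.3

def deriv(t, x, v):
    # Returns (dx/dt, dv/dt) for the second-order equation above.
    a = -MU * (1.0 + ETA * math.cos(NU * t)) * v - x - x**3 \
        + F0 + F1 * math.cos(W * t)
    return v, a

def rk4_step(t, x, v, h):
    # Classical fourth-order Runge-Kutta step for the (x, v) system.
    k1x, k1v = deriv(t, x, v)
    k2x, k2v = deriv(t + h/2, x + h/2*k1x, v + h/2*k1v)
    k3x, k3v = deriv(t + h/2, x + h/2*k2x, v + h/2*k2v)
    k4x, k4v = deriv(t + h, x + h*k3x, v + h*k3v)
    return (x + h/6*(k1x + 2*k2x + 2*k3x + k4x),
            v + h/6*(k1v + 2*k2v + 2*k3v + k4v))

t, x, v, h = 0.0, 0.1, 0.0, 0.01
for _ in range(5000):          # integrate to t = 50
    x, v = rk4_step(t, x, v, h)
    t += h
```

Because the modulated damping coefficient stays positive here (0.2·(1 ± 0.8) > 0) and the cubic stiffness is hardening, the trajectory remains bounded.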


2020 ◽  
Vol 30 (05) ◽  
pp. 2050072 ◽  
Author(s):  
Yingjuan Yang ◽  
Guoyuan Qi ◽  
Jianbing Hu ◽  
Philippe Faradja

A method for finding hidden chaotic attractors in the plasma system is presented. Using the Routh–Hurwitz criterion, the stability distribution associated with two parameters is identified to find the region around the equilibrium points of the stable nodes, stable focus-nodes, saddles and saddle-foci for the purpose of investigating hidden chaos. A physical interpretation is provided of the stability distribution for each type of equilibrium point. The basin of attraction and parameter region of hidden chaos are identified by excluding the self-excited chaotic attractors of all equilibrium points. Homotopy and numerical continuation are also employed to check whether the basin of chaotic attraction intersects with the neighborhood of a saddle equilibrium. Bifurcation analysis, phase portrait analysis, and the basins of attraction of the different dynamics are used as tools to distinguish visually the self-excited chaotic attractor from the hidden chaotic attractor. The Casimir power reflects the error power between the dissipative energy and the energy supplied by the whistler field. It explains physically, analytically, and numerically the conditions that generate the different dynamics, such as sinks, periodic orbits, and chaos.
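The equilibrium classification underlying this kind of analysis can be illustrated in the plane, where the Routh–Hurwitz conditions for a linearized system reduce to the familiar trace/determinant criteria. This is a generic 2-D sketch, not the paper's 3-D plasma system:

```python
def classify_equilibrium(J):
    """Classify the equilibrium of a planar linear system x' = J x by
    trace/determinant criteria (the 2-D specialisation of Routh-Hurwitz:
    stability requires tr J < 0 and det J > 0)."""
    (a, b), (c, d) = J
    tr, det = a + d, a * d - b * c
    if det < 0:
        return "saddle"                      # real eigenvalues of opposite sign
    disc = tr * tr - 4 * det                 # discriminant: real vs complex pair
    if disc >= 0:
        return "stable node" if tr < 0 else "unstable node"
    if tr == 0:
        return "center"
    return "stable focus" if tr < 0 else "unstable focus"
```

For example, `classify_equilibrium([[0, 1], [-1, -0.5]])` reports a stable focus, since the trace is negative and the eigenvalues form a complex pair.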


2000 ◽  
Vol 12 (2) ◽  
pp. 451-472 ◽  
Author(s):  
Fation Sevrani ◽  
Kennichi Abe

In this article we present techniques for designing associative memories to be implemented by a class of synchronous discrete-time neural networks based on a generalization of the brain-state-in-a-box neural model. First, we address the local qualitative properties and global qualitative aspects of the class of neural networks considered. Our approach to the stability analysis of the equilibrium points of the network gives insight into the extent of the domain of attraction for the patterns to be stored as asymptotically stable equilibrium points and is useful in the analysis of the retrieval performance of the network and also for design purposes. By making use of the analysis results as constraints, the design for associative memory is performed by solving a constrained optimization problem whereby each of the stored patterns is guaranteed a substantial domain of attraction. The performance of the designed network is illustrated by means of three specific examples.
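The brain-state-in-a-box iteration underlying this class of networks can be sketched as follows. The weight construction here is a simple Hebbian-style outer product for a single pattern, chosen only to make the dynamics visible; it is not the constrained-optimization design of the article:

```python
# Brain-state-in-a-box (BSB) iteration: x(k+1) = sat(x(k) + alpha * W x(k)),
# where sat clips each component to [-1, 1] (the "box").
P = [1.0, -1.0, 1.0, -1.0]          # pattern stored as a vertex of the box
N = len(P)
ALPHA = 1.0
W = [[P[i] * P[j] / N for j in range(N)] for i in range(N)]  # Hebbian weights

def sat(v):
    return [max(-1.0, min(1.0, u)) for u in v]

def bsb_step(x):
    wx = [sum(W[i][j] * x[j] for j in range(N)) for i in range(N)]
    return sat([x[i] + ALPHA * wx[i] for i in range(N)])

# Retrieval from a corrupted probe: the last component has the wrong sign.
x = [0.5, -0.5, 0.5, 0.5]
for _ in range(10):
    x = bsb_step(x)
# x converges to the stored vertex P, illustrating a pattern stored as an
# asymptotically stable equilibrium point with a usable domain of attraction.
```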


2011 ◽  
Vol 21 (01) ◽  
pp. 237-254 ◽  
Author(s):  
LINGHONG LU ◽  
RODERICK EDWARDS

Gene-regulatory networks are potentially capable of more complex behavior than convergence to a stationary state, or even cycling through a simple sequence of expression patterns. The analysis of qualitative dynamics for these networks is facilitated by using piecewise-linear equations and their state transition diagram (an n-dimensional hypercube, in the case of n genes with a single effective threshold for the protein product of each). Our previous work has dealt with cycles of states in the state transition diagram that allow periodic solutions. Here, we study a particular kind of figure-8 pattern in the state transition diagram and determine conditions that allow complex behavior. Previous studies of complex behavior, such as chaos, in such networks have dealt only with specific examples. Our approach allows an appreciation of the design principles that give rise to complex dynamics, which may have application in assessing the dynamical possibilities of gene networks with poorly known parameters, or for synthesis or control of gene networks with complex behavior.
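A state transition diagram of the kind described above can be built mechanically from the Boolean regulatory logic. The sketch below uses a made-up 2-gene negative feedback loop (gene 1 repressed by gene 2, gene 2 activated by gene 1), which yields a single cycle of states, not the figure-8 pattern analyzed in the paper:

```python
from itertools import product

def focal(s):
    # Target (focal) Boolean state for each vertex of the hypercube:
    # gene 1 is ON when gene 2 is OFF; gene 2 follows gene 1.
    g1, g2 = s
    return (1 - g2, g1)

def successors(s):
    """Edges of the state transition diagram: from vertex s, any coordinate
    that disagrees with the focal point may be the first to switch."""
    t = focal(s)
    return [tuple(1 - s[k] if k == i else s[k] for k in range(len(s)))
            for i in range(len(s)) if t[i] != s[i]]

# Full diagram over the 2-cube {0,1}^2.
diagram = {s: successors(s) for s in product((0, 1), repeat=2)}
```

Walking the unique successor from (0, 0) traverses the cycle 00 → 10 → 11 → 01 → 00, the state-space signature of a periodic solution.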


2021 ◽  
Vol 31 (16) ◽  
Author(s):  
M. D. Vijayakumar ◽  
Alireza Bahramian ◽  
Hayder Natiq ◽  
Karthikeyan Rajagopal ◽  
Iqtadar Hussain

Hidden attractors generated by the interactions of dynamical variables may have no equilibrium point in their basin of attraction, and they have attracted the attention of mathematicians who study strange attractors. Quadratic hyperjerk systems are of particular interest because of their elegant structure. In this paper, a quadratic hyperjerk system is introduced that can generate chaotic attractors. The dynamical behaviors of the oscillator are investigated by plotting its Lyapunov exponents and bifurcation diagrams. The multistability of the hyperjerk system is investigated using basins of attraction. It is revealed that the system is bistable when one of its attractors is hidden. The complexity of the system's attractors is also investigated using sample entropy as the complexity feature, showing how changing the parameters affects the complexity of the system's time series. In addition, one of the hyperjerk system's equilibrium points is stabilized using impulsive control; with this stabilizing method, all real initial conditions belong to the basin of attraction of the equilibrium point.
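The complexity feature mentioned here can be computed with a generic sample-entropy routine (the standard SampEn(m, r) definition; the m and r values below are illustrative, not the paper's settings). For demonstration it is applied to a chaotic logistic-map series rather than to the hyperjerk system itself:

```python
import math

def sample_entropy(x, m=2, r=0.2):
    """SampEn(m, r): negative log of the conditional probability that
    template pairs matching for m points (Chebyshev distance <= r,
    self-matches excluded) also match for m + 1 points."""
    n = len(x)
    a = b = 0
    for i in range(n - m):
        for j in range(i + 1, n - m):
            if max(abs(x[i + k] - x[j + k]) for k in range(m)) <= r:
                b += 1                      # length-m match
                if abs(x[i + m] - x[j + m]) <= r:
                    a += 1                  # match extends to length m + 1
    return 0.0 if b == 0 else (math.inf if a == 0 else -math.log(a / b))

# A chaotic logistic-map series has positive sample entropy; a constant
# (perfectly regular) series has zero.
series, xv = [], 0.3
for _ in range(200):
    xv = 4.0 * xv * (1.0 - xv)
    series.append(xv)
```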


2009 ◽  
Vol 21 (5) ◽  
pp. 1434-1458 ◽  
Author(s):  
Xuemei Li

This letter discusses the complete stability of discrete-time cellular neural networks with piecewise-linear output functions. Under the assumption of certain symmetry on the feedback matrix, a sufficient condition for complete stability is derived from the finiteness of trajectory length. Because the symmetry conditions are not robust, the complete stability of networks may be lost under arbitrarily small perturbations. Robust conditions for complete stability are also given for discrete-time cellular neural networks with multiple equilibrium points and with a unique equilibrium point. These complete stability results are robust and readily applicable.
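The setting can be illustrated with a toy two-cell discrete-time cellular neural network using the standard piecewise-linear output f(x) = 0.5(|x + 1| − |x − 1|) and a symmetric feedback matrix; the coefficients below are illustrative, not taken from the letter. With this symmetric A, every trajectory settles on an equilibrium point, which is what complete stability asserts:

```python
# Discrete-time CNN update: x(k+1) = A * f(x(k)) + bias, with f the
# standard piecewise-linear saturation onto [-1, 1].
A = [[1.2, 0.3], [0.3, 1.2]]   # symmetric feedback matrix (illustrative)
BIAS = [0.0, 0.0]

def f(u):
    return 0.5 * (abs(u + 1.0) - abs(u - 1.0))   # saturates to [-1, 1]

def step(x):
    y = [f(u) for u in x]
    return [sum(A[i][j] * y[j] for j in range(2)) + BIAS[i] for i in range(2)]

x = [0.3, -0.2]
for _ in range(50):
    x = step(x)
# Both outputs saturate at +1, so the state locks onto the equilibrium
# x = A @ [1, 1] = [1.5, 1.5] and stops changing.
```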


1997 ◽  
Vol 9 (2) ◽  
pp. 319-336 ◽  
Author(s):  
K. Pakdaman ◽  
C. P. Malta ◽  
C. Grotta-Ragazzo ◽  
J.-F. Vibert

Little attention has been paid in the past to the effects of interunit transmission delays (representing axonal and synaptic delays) on the boundary of the basin of attraction of stable equilibrium points in neural networks. As a first step toward a better understanding of the influence of delay, we study the dynamics of a single graded-response neuron with a delayed excitatory self-connection. The behavior of this system is representative of that of a family of networks composed of graded-response neurons in which most trajectories converge to stable equilibrium points for any delay value. It is shown that changing the delay modifies the “location” of the boundary of the basin of attraction of the stable equilibrium points without affecting the stability of the equilibria. The dynamics of trajectories on the boundary are also delay dependent and influence the transient regime of trajectories within the adjacent basins. Our results suggest that when dealing with networks with delay, it is important to study the effect of the delay not only on the asymptotic convergence of the system but also on the boundary of the basins of attraction of the equilibria.
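A single neuron with a delayed excitatory self-connection can be simulated with a simple Euler scheme and a history buffer. The equation form x′(t) = −x(t) + a·tanh(x(t − τ)) and all parameter values below are illustrative stand-ins, not necessarily the paper's exact model:

```python
import math

A, TAU, H = 2.0, 1.0, 0.01       # gain, delay, Euler step (illustrative)
LAG = int(TAU / H)               # delay expressed in integration steps

# Constant initial history on [-tau, 0]; a positive start selects the
# positive stable equilibrium of this bistable system.
history = [0.5] * (LAG + 1)
for _ in range(5000):            # integrate to t = 50
    x = history[-1]
    x_delayed = history[-1 - LAG]            # x(t - tau) from the buffer
    history.append(x + H * (-x + A * math.tanh(x_delayed)))

x_final = history[-1]
# With a = 2 the trajectory converges to the stable equilibrium x* solving
# x = 2*tanh(x), i.e. x* ~ 1.915; the delay changes the transient and the
# basin boundary, but not the equilibrium itself.
```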


1996 ◽  
Vol 07 (04) ◽  
pp. 463-483 ◽  
Author(s):  
E. KORUTCHEVA ◽  
K. KOROUTCHEV

In this paper a simple two-layer neural network model, similar to that studied by D. Amit and N. Brunel [11], is investigated within the mean-field approximation. The distributions of the local fields are derived analytically and compared to those obtained in Ref. 11. The dynamic properties are discussed and the basin of attraction in some parametric space is found. A procedure for driving the system into a basin of attraction by imposing a regulation on the network is proposed. The effect of an outer stimulus is shown to have a destructive influence on the attractor, forcing the latter to disappear if the distribution of the stimulus has high enough variance or if the stimulus has a spatial structure with sufficient contrast. The techniques used in this paper to obtain the analytical results can be applied to more complex topologies of linked recurrent neural networks.

