Dynamic Attractors
Recently Published Documents

Total documents: 17 (last five years: 5)
H-index: 5 (last five years: 2)

Micromachines, 2021, Vol. 12 (10), p. 1201
Author(s): Fedor Pavlovich Meshchaninov, Dmitry Alexeevich Zhevnenko, Vladislav Sergeevich Kozhevnikov, Evgeniy Sergeevich Shamin, Oleg Alexandrovich Telminov, ...

The use of low-dimensional materials is a promising approach to improving the key characteristics of memristors. The development process includes modeling, but it remains an open question whether the most common compact models can describe the characteristics of devices that incorporate low-dimensional materials. In this paper, a comparative analysis of linear-drift, nonlinear-drift, and threshold models was conducted. For this purpose, it was assumed that the result of optimizing the fit to the volt-ampere (I-V) characteristic loop reflects the descriptive ability of the model. A global random search algorithm was used to solve the optimization problem, and an error function with a regularizer was developed to capture the loop features. Synthetic I-V characteristic contours were built from characteristic features derived through meta-analysis, and the results of their approximation by the different models were compared. For every model, the quality of threshold-voltage estimation was evaluated, and the memristor potential functions and dynamic attractors associated with experimental contours on graphene oxide were calculated.
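The paper's models and error function are not reproduced in the abstract. As a minimal sketch of the fitting idea, the following fits the single lumped mobility parameter of a linear ion-drift compact model to a synthetic I-V loop by global random search; all parameter names, values, and the least-squares loss are illustrative assumptions, not the authors' formulation.

```python
import numpy as np

rng = np.random.default_rng(0)

def iv_loop(mobility, v_amp=1.0, freq=1.0, r_on=100.0, r_off=16e3,
            w0=0.5, steps=4000):
    """Linear ion-drift memristor current for a two-period sinusoidal drive
    (Euler integration; `mobility` lumps mu_v * R_on / D**2)."""
    dt = 2.0 / (freq * steps)
    t = dt * np.arange(steps)
    v = v_amp * np.sin(2.0 * np.pi * freq * t)
    w, i = w0, np.empty(steps)
    for k in range(steps):
        i[k] = v[k] / (r_on * w + r_off * (1.0 - w))
        w = min(max(w + dt * mobility * i[k], 0.0), 1.0)  # clamp state to [0, 1]
    return i

i_ref = iv_loop(1.0e4)                    # synthetic "measured" loop

def loss(mobility):                       # simple least-squares error function
    return float(np.mean((iv_loop(mobility) - i_ref) ** 2))

# global random search over four decades of the mobility parameter
candidates = 10.0 ** rng.uniform(2.0, 6.0, size=200)
best = min(candidates, key=loss)
```

The search recovers a mobility close to the value used to generate the synthetic loop; the paper additionally regularizes the error function to weight specific loop features, which this sketch omits.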


2021, pp. 1-43
Author(s): Alfred Rajakumar, John Rinzel, Zhe S. Chen

Abstract: Recurrent neural networks (RNNs) have been widely used to model the sequential neural dynamics ("neural sequences") of cortical circuits in cognitive and motor tasks. Incorporating biological constraints such as Dale's principle helps elucidate the neural representations and mechanisms of the underlying circuits. We trained an excitatory-inhibitory RNN to learn neural sequences in a supervised manner and studied the representations and dynamic attractors of the trained network. The trained RNN robustly triggered the sequence in response to various input signals and interpolated time-warped inputs for sequence representation. Interestingly, a learned sequence can repeat periodically when the RNN evolves beyond the duration of a single sequence. The eigenspectrum of the learned recurrent connectivity matrix, with its growing or damping modes, together with the RNN's nonlinearity, was adequate to generate a limit-cycle attractor. We further examined the stability of the dynamic attractors while training the RNN to learn two sequences. Together, our results provide a general framework for understanding neural sequence representation in excitatory-inhibitory RNNs.
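The trained network's connectivity is not available from the abstract, but the eigenspectrum argument can be sketched. The following builds a random connectivity matrix obeying Dale's principle and checks for growing modes, i.e., eigenvalues with real part above 1, which destabilize the fixed point of rate dynamics dx/dt = -x + W*phi(x) at linear order; the population sizes and gain are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n_e, n_i = 80, 20                  # illustrative 4:1 excitatory/inhibitory split
n, gain = n_e + n_i, 2.0           # gain chosen large enough to push modes past 1

# Dale's principle: all outgoing weights of a unit share one sign.
# Excitatory columns are non-negative, inhibitory columns non-positive,
# with inhibition scaled up so the input to each unit balances on average.
W = np.abs(rng.normal(0.0, gain / np.sqrt(n), size=(n, n)))
W[:, n_e:] *= -n_e / n_i

eig = np.linalg.eigvals(W)
# for dx/dt = -x + W @ phi(x), eigenvalues of W with real part > 1 are
# growing modes; a saturating phi can then shape the resulting unstable
# oscillation into a limit-cycle attractor
growing = eig[eig.real > 1.0]
```

In the trained network of the paper, such growing modes coexist with the learned sequence structure; this sketch only illustrates how a Dale-constrained spectrum can cross the instability threshold.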


Complexity, 2020, Vol. 2020, pp. 1-16
Author(s): Qiuzhen Wan, Zhaoteng Zhou, Wenkui Ji, Chunhua Wang, Fei Yu

In this paper, a novel no-equilibrium 5D memristive hyperchaotic system is proposed, obtained by introducing an ideal flux-controlled memristor model and two constant terms into an improved 4D self-excited hyperchaotic system. The dynamical characteristics of the proposed system, as they depend on the system parameters and on the memristor's initial conditions, are investigated in terms of phase portraits, Lyapunov exponent spectra, bifurcation diagrams, Poincaré maps, and time series. Hidden dynamic attractors, including periodic, quasiperiodic, chaotic, and hyperchaotic attractors, are found as the system parameters vary. Meanwhile, the striking phenomena of hidden extreme multistability, transient hyperchaotic behavior, and offset-boosting control are revealed for appropriate sets of the memristor and other initial conditions. Finally, a hardware electronic circuit is designed, and the experimental results agree well with the numerical simulations, demonstrating the feasibility of the novel 5D memristive hyperchaotic system.
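The paper's 5D equations are not given in the abstract. As a small illustrative sketch of its building block, the following implements the standard ideal flux-controlled memristor, with charge q(phi) = a*phi + b*phi^3 and memductance W(phi) = a + 3*b*phi^2, and drives it sinusoidally to produce the characteristic pinched hysteresis loop; the constants a and b are illustrative, not the paper's.

```python
import numpy as np

# ideal flux-controlled memristor: q(phi) = a*phi + b*phi**3,
# so the memductance is W(phi) = dq/dphi = a + 3*b*phi**2
a, b = 1.0, 0.1                              # illustrative constants

def memductance(phi):
    return a + 3.0 * b * phi ** 2

t = np.linspace(0.0, 4.0 * np.pi, 4001)      # two drive periods
v = np.sin(t)
phi = np.cumsum(v) * (t[1] - t[0])           # flux = time integral of voltage
i = memductance(phi) * v                     # memristor current

# the i-v curve is a hysteresis loop pinched at the origin:
# i vanishes exactly where v does, whatever the flux history
```

Embedding such an element into a 4D hyperchaotic system raises the dimension to 5 and, because the flux's initial value enters the memductance, makes the dynamics depend on the memristor's initial condition, which is the origin of the reported extreme multistability.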


2020, Vol. 99 (4), pp. 3169-3196
Author(s): Zubaer Ibna Mannan, Shyam Prasad Adhikari, Hyongsuk Kim, Leon Chua

2019, Vol. 10 (1)
Author(s): Lee Susman, Naama Brenner, Omri Barak

Abstract: What is the physiological basis of long-term memory? The prevailing view in neuroscience attributes memory acquisition to changes in synaptic efficacy, implying that stable memories correspond to stable connectivity patterns. However, an increasing body of experimental evidence points to significant, activity-independent fluctuations in synaptic strengths. How memories can survive these fluctuations, and the accompanying stabilizing homeostatic mechanisms, is a fundamental open question. Here we explore the possibility of memory storage within a global component of network connectivity, while individual connections fluctuate. We find that homeostatic stabilization of fluctuations differentially affects different aspects of network connectivity. Specifically, memories stored as time-varying attractors of neural dynamics are more resilient to erosion than fixed points. Such dynamic attractors can be learned by biologically plausible learning rules and support associative retrieval. Our results suggest a link between the properties of learning rules and those of network-level memory representations, and point to experimentally measurable signatures.
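The paper's learning rule is not specified in the abstract. A minimal, classical illustration of a time-varying attractor is the asymmetric Hebbian rule, which wires each stored pattern to its cyclic successor so that the retrieved memory is a periodic orbit rather than a fixed point; the network and pattern sizes below are illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)
n, p = 200, 4
xi = rng.choice([-1.0, 1.0], size=(p, n))        # random binary patterns

# asymmetric Hebbian rule: couple each pattern to its cyclic successor,
# so pattern mu drives the network toward pattern mu+1
W = sum(np.outer(xi[(mu + 1) % p], xi[mu]) for mu in range(p)) / n

s = xi[0].copy()                                  # cue with pattern 0
visited = []
for _ in range(8):                                # synchronous updates
    s = np.sign(W @ s)
    visited.append(int(np.argmax(xi @ s)))        # nearest stored pattern
# the state cycles 1, 2, 3, 0, 1, 2, 3, 0: a periodic (dynamic) attractor
```

Because the memory lives in the cyclic ordering of the patterns rather than in any single fixed point, per-synapse perturbations erode it more slowly, which is the intuition behind the paper's resilience result.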


2015, Vol. 82 (5)
Author(s): Itay Grinberg, Oleg V. Gendelman

Forced-damped essentially nonlinear oscillators can have a multitude of dynamic attractors. Generically, no analytic procedure is available to reveal all such attractors. For many practical and engineering applications, however, it might not be necessary to know all the attractors in detail; knowledge of the zone in the state space (or the space of initial conditions) in which all the attractors are situated may be sufficient. We demonstrate that this goal can be achieved by relatively simple means, even for systems with multiple and unknown attractors. More specifically, this paper suggests an analytic procedure to determine the zone in the space of initial conditions that contains all attractors of an essentially nonlinear forced-damped system for a given set of parameters. The suggested procedure extends the well-known Lyapunov function approach; here we use it to analyze the stability of nonautonomous systems with external forcing. Consequently, instead of the complete state space of the problem, we consider the space of initial conditions and define a bounded trapping region in it, such that for every initial condition outside this region, the dynamic flow will eventually enter it and never leave it. This approach is used to find a special closed curve on the plane of initial conditions for a forced-damped, strongly nonlinear, single-degree-of-freedom (single-DOF) oscillator. Solving the equations of motion is not required. The approach is illustrated by the important benchmark example of the x^(2n) potential, including the celebrated Ueda oscillator for n = 2. Another example is the well-known forced-damped oscillator with a double-well potential. We also demonstrate that the boundary curve obtained by analytic tools can be efficiently "tightened" numerically, yielding an even stricter estimate of the zone containing the existing attractors.
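The analytic construction of the trapping curve cannot be reconstructed from the abstract alone. As a numerical companion, the following integrates the Ueda oscillator (the x^(2n) benchmark with n = 2) and checks that, after transients, the trajectory stays inside a bounded zone of the (x, x') plane; the damping and forcing values are common benchmark choices used here for illustration.

```python
import numpy as np

def ueda(delta=0.05, force=7.5, dt=2e-3, t_end=150.0):
    """RK4 integration of the Ueda oscillator x'' + delta*x' + x**3 = force*cos(t)."""
    def f(t, y):
        x, v = y
        return np.array([v, -delta * v - x ** 3 + force * np.cos(t)])

    y, n = np.array([0.0, 0.0]), int(t_end / dt)
    out = np.empty((n, 2))
    for k in range(n):
        t = k * dt
        k1 = f(t, y)
        k2 = f(t + 0.5 * dt, y + 0.5 * dt * k1)
        k3 = f(t + 0.5 * dt, y + 0.5 * dt * k2)
        k4 = f(t + dt, y + dt * k3)
        y = y + (dt / 6.0) * (k1 + 2.0 * k2 + 2.0 * k3 + k4)
        out[k] = y
    return out

traj = ueda()
tail = traj[len(traj) // 2:]       # drop the transient
# whatever attractor is reached, it lies inside a bounded trapping zone
bound = np.abs(tail).max(axis=0)   # empirical extent in (x, x')
```

The analytic curve of the paper bounds this zone without integrating the equations of motion at all; the simulation only confirms that the long-time dynamics is indeed confined.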


2013, Vol. 2013, pp. 1-7
Author(s): Milan Guzan

The subject of research in this paper is the multiple-valued (MV) memory cell, in particular the morphology of the boundary surface of a five-valued memory. Once the values of parasitic accumulation elements on the chip are taken into account, a very complicated boundary-surface morphology emerges, separating the various attractors from each other. This is due to the occurrence of undesirable oscillations: stable limit cycles that make it impossible to control the memory. These dynamic attractors are so dominant that their regions of attraction surround even the regions of attraction of the static attractors, i.e., the required logic levels of the memory. Therefore, when realizing an MV memory on a chip, it is necessary to know the values of the parasitic elements, because their presence may cause the memory to malfunction. In this case, only the calculation and display of the boundary surface provides exact answers about the operation of the MV memory.
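The five-valued cell's boundary surface cannot be reconstructed from the abstract. As a minimal two-attractor analog of the same computation, the following labels which static attractor each initial condition of a damped double-well oscillator reaches; the curve separating the labeled regions plays the role of the boundary surface, and all parameters are illustrative.

```python
import numpy as np

def basin_labels(c=0.25, dt=0.01, steps=3000, grid=21):
    """For x'' + c*x' + x**3 - x = 0 (damped double-well), label which
    static attractor (x = -1 or x = +1) each initial condition reaches."""
    xs = np.linspace(-2.0, 2.0, grid)
    vs = np.linspace(-2.0, 2.0, grid)
    labels = np.empty((grid, grid), dtype=int)
    for ix, x0 in enumerate(xs):
        for iv, v0 in enumerate(vs):
            x, v = x0, v0
            for _ in range(steps):           # semi-implicit Euler steps
                v += dt * (-c * v - x ** 3 + x)
                x += dt * v
            labels[ix, iv] = 1 if x > 0.0 else -1
    return labels

labels = basin_labels()
# the curve separating the +1 and -1 regions is the 2D analog of the
# boundary surface that separates the attractors' regions of attraction
```

In the MV memory cell the same grid-of-initial-conditions computation runs in three state variables and must distinguish five logic levels plus the parasitic limit cycles, which is what makes the resulting surface morphology so complicated.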


2010, Vol. 20 (03), pp. 869-875
Author(s): J. J. Torres, S. de Franciscis, S. Johnson, J. Marro

Excitable media may be modeled as simple extensions of the Amari-Hopfield network with dynamic attractors. Some nodes, chosen at random, remain temporarily quiet; some of the edges are switched off to adjust the network connectivity; and the weights of the remaining edges vary with activity. We draw conclusions about the optimal wiring topology and describe nonequilibrium phases and criticality at the edge of irregular behavior.


2009, Vol. 19 (02), pp. 677-686
Author(s): J. J. Torres, J. Marro, S. de Franciscis

We discuss an attractor neural network in which only a fraction ρ of the nodes is updated simultaneously. In addition, the network has a heterogeneous distribution of connection weights and, depending on the current degree of order, connections are changed at random by a factor Φ on short time scales. The resulting dynamic attractors may become unstable in a certain range of Φ, inducing chaotic itinerancy that depends strongly on ρ. For intermediate values of ρ, we observe that the number of attractors visited increases with ρ, and that the trajectory may change from regular to chaotic and vice versa as ρ is modified. Statistical analysis of the time series shows power-law spectra under the conditions in which the attractor space is most efficiently explored.
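The heterogeneous, fluctuating couplings of the paper are not specified in the abstract. As a minimal sketch of partial updating alone, the following runs a standard Hebbian attractor network in which only a random fraction ρ of nodes updates per step, and shows that a noisy cue still converges to the stored pattern; the sizes and ρ are illustrative, and the weight fluctuations by Φ are omitted.

```python
import numpy as np

rng = np.random.default_rng(2)
n, p, rho = 400, 5, 0.7
xi = rng.choice([-1.0, 1.0], size=(p, n))     # stored patterns

W = (xi.T @ xi) / n                           # Hebbian couplings
np.fill_diagonal(W, 0.0)                      # no self-coupling

s = xi[0].copy()
s[rng.random(n) < 0.2] *= -1.0                # corrupt 20% of the cue
for _ in range(10):
    update = rng.random(n) < rho              # only a fraction rho updates
    s = np.where(update, np.sign(W @ s), s)

overlap = float(s @ xi[0]) / n                # near 1: the attractor is retrieved
```

With static couplings, partial updating only slows convergence to the fixed point; the itinerancy of the paper appears when the order-dependent weight fluctuations by Φ destabilize each attractor in turn.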

