Learning Rule of Homeostatic Synaptic Scaling: Presynaptic Dependent or Not

2011 ◽  
Vol 23 (12) ◽  
pp. 3145-3161 ◽  
Author(s):  
Jian K. Liu

It has been established that homeostatic synaptic scaling plasticity can maintain neural network activity in a stable regime. However, the underlying learning rule for this mechanism is still unclear, and whether it depends on the presynaptic site remains a topic of debate. Here we focus on two forms of learning rules: traditional synaptic scaling (SS) without presynaptic effect and presynaptic-dependent synaptic scaling (PSD). Analysis of the synaptic matrices reveals that the transition matrices between consecutive synaptic matrices are distinct: they are diagonal and linear in neural activity under SS, but become nondiagonal and nonlinear under PSD. These differences produce different dynamics in recurrent neural networks. Numerical simulations show that network dynamics are stable under PSD but not under SS, which suggests that PSD is a better form for describing homeostatic synaptic scaling plasticity. The matrix analysis used in this study may provide a novel way to examine the stability of learning dynamics.
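To illustrate why the two rules yield structurally different transition matrices, here is a minimal discrete-time sketch. The specific update forms below are illustrative assumptions, not the paper's exact equations: under SS each synapse is rescaled by a factor depending only on its postsynaptic rate, so the map from one synaptic matrix to the next is a diagonal rescaling, while under PSD the update is additionally weighted by the presynaptic rate.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 5
W = rng.uniform(0.1, 1.0, (N, N))   # synaptic matrix (row = postsynaptic unit)
a = rng.uniform(0.5, 2.0, N)        # steady-state firing rates
a_t = 1.0                           # homeostatic target rate
eta = 0.1                           # learning rate

# SS: each synapse is rescaled by a factor that depends only on its
# postsynaptic rate, so the whole update is W -> D @ W with D diagonal.
W_ss = W + eta * (a_t - a)[:, None] * W

# PSD: the same scaling signal is additionally weighted by the presynaptic
# rate a_j, so the update can no longer be written as a diagonal rescaling.
W_psd = W + eta * (a_t - a)[:, None] * a[None, :] * W

D = np.diag(1.0 + eta * (a_t - a))
assert np.allclose(W_ss, D @ W)       # SS: diagonal, linear transition
assert not np.allclose(W_psd, D @ W)  # PSD: nondiagonal in general
```

The assertions make the abstract's distinction concrete: the SS update factors exactly into a diagonal matrix acting on W, whereas the PSD update does not.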

2018 ◽  
Vol 8 (3) ◽  
pp. 237-249 ◽  
Author(s):  
Teijiro Isokawa ◽  
Hiroki Yamamoto ◽  
Haruhiko Nishimura ◽  
Takayuki Yumoto ◽  
Naotake Kamiura ◽  
...  

In this paper, we investigate the stability of patterns embedded as associative memories in the complex-valued Hopfield neural network, in which neuron states are encoded by phase values on the unit circle of the complex plane. As learning schemes for embedding patterns onto the network, the projection rule and the iterative learning rule are formally extended to the complex-valued case. The retrieval of patterns embedded by the iterative learning rule is demonstrated, and the stability of the embedded patterns is quantitatively investigated.
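A minimal sketch of the complex-valued projection rule, under common assumptions for phasor networks (states quantized to K phase levels; the paper's exact formulation may differ): with the patterns stacked as columns of S, the weight matrix W = S(SᴴS)⁻¹Sᴴ projects onto the pattern subspace, so each stored pattern is a fixed point of a synchronous update followed by phase quantization.

```python
import numpy as np

rng = np.random.default_rng(1)
K = 8                 # number of phase quantization levels
N, P = 16, 3          # neurons, stored patterns
phases = 2 * np.pi * rng.integers(0, K, (N, P)) / K
S = np.exp(1j * phases)             # patterns as unit phasors, one per column

# Projection rule, complex-valued case: W = S (S^H S)^{-1} S^H
W = S @ np.linalg.inv(S.conj().T @ S) @ S.conj().T

def quantize(z, K=K):
    # map each complex activation to the nearest of the K phase states
    k = np.round(np.angle(z) * K / (2 * np.pi)) % K
    return np.exp(2j * np.pi * k / K)

# each embedded pattern should be a fixed point of one synchronous update
for mu in range(P):
    s = S[:, mu]
    assert np.allclose(quantize(W @ s), s)
```

Since W is a projector onto the span of the stored patterns, W @ s returns s exactly for each stored pattern, and quantization leaves it unchanged.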


2014 ◽  
Vol 2014 ◽  
pp. 1-7 ◽  
Author(s):  
Lei Ding ◽  
Hong-Bing Zeng ◽  
Wei Wang ◽  
Fei Yu

This paper investigates the stability of static recurrent neural networks (SRNNs) with a time-varying delay. Based on the complete delay-decomposing approach and quadratic separation framework, a novel Lyapunov-Krasovskii functional is constructed. By employing a reciprocally convex technique to consider the relationship between the time-varying delay and its varying interval, some improved delay-dependent stability conditions are presented in terms of linear matrix inequalities (LMIs). Finally, a numerical example is provided to show the merits and the effectiveness of the proposed methods.
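For reference, the static recurrent neural network with a time-varying delay studied in this line of work is typically written in the following standard form (the paper's precise notation and assumptions may differ):

```latex
\dot{x}(t) = -A\,x(t) + f\bigl(W x(t - \tau(t)) + J\bigr),
\qquad 0 \le \tau(t) \le h, \quad \dot{\tau}(t) \le \mu,
```

where $A$ is a positive diagonal matrix of self-feedback rates, $W$ is the connection weight matrix, $f(\cdot)$ is a bounded activation function, $J$ is an external input, and $h$, $\mu$ bound the delay and its rate of change. The "static" label refers to the nonlinearity acting on the weighted state $Wx$, as opposed to the local-field model $W f(x)$; the LMI conditions then certify asymptotic stability for all delays satisfying these bounds.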


2008 ◽  
Vol 6 (37) ◽  
pp. 655-668 ◽  
Author(s):  
Cristina Savin ◽  
Jochen Triesch ◽  
Michael Meyer-Hermann

Homeostatic regulation of neuronal activity is fundamental for the stable functioning of the cerebral cortex. One form of homeostatic synaptic scaling has been recently shown to be mediated by glial cells that interact with neurons through the diffusible messenger tumour necrosis factor-α (TNF-α). Interestingly, TNF-α is also used by the immune system as a pro-inflammatory messenger, suggesting potential interactions between immune system signalling and the homeostatic regulation of neuronal activity. We present the first computational model of neuron–glia interaction in TNF-α-mediated synaptic scaling. The model shows how under normal conditions the homeostatic mechanism is effective in balancing network activity. After chronic immune activation or TNF-α overexpression by glia, however, the network develops seizure-like activity patterns. This may explain why under certain conditions brain inflammation increases the risk of seizures. Additionally, the model shows that TNF-α diffusion may be responsible for epileptogenesis after localized brain lesions.
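A toy scalar sketch of the feedback loop described above (this is an illustrative reduction I constructed for this summary, not the paper's model): glia release TNF-α in proportion to how far activity falls below target, and TNF-α above its baseline scales the synaptic weight up. Under normal conditions the loop restores the target rate; a chronic extra TNF-α source shifts the fixed point toward hyperactivity.

```python
import numpy as np

def simulate(extra_tnf=0.0, steps=5000, dt=0.01):
    # w: synaptic weight, I: input drive, r0: target rate,
    # T0: baseline TNF-a, k: glial release gain, eta: scaling rate
    w, I, r0, T0, k, eta = 0.2, 1.0, 0.5, 1.0, 4.0, 0.5
    for _ in range(steps):
        r = np.tanh(w * I)                  # network activity
        T = T0 + k * (r0 - r) + extra_tnf   # glial TNF-a release
        w += dt * eta * (T - T0)            # TNF-a-mediated synaptic scaling
    return np.tanh(w * I)

r_normal = simulate()        # homeostasis: activity settles at the target
r_inflamed = simulate(2.0)   # chronic TNF-a source: runaway hyperactivity
```

In this reduction the healthy fixed point satisfies r = r0, while the extra TNF-α term moves the fixed point to r = r0 + extra_tnf/k, a scalar caricature of the seizure-like regime; the paper's network model is needed for the actual spatiotemporal activity patterns.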


2021 ◽  
pp. 1-43
Author(s):  
Alfred Rajakumar ◽  
John Rinzel ◽  
Zhe S. Chen

Recurrent neural networks (RNNs) have been widely used to model the sequential neural dynamics (“neural sequences”) of cortical circuits in cognitive and motor tasks. Efforts to incorporate biological constraints and Dale's principle help elucidate the neural representations and mechanisms of the underlying circuits. We trained an excitatory-inhibitory RNN to learn neural sequences in a supervised manner and studied the representations and dynamic attractors of the trained network. The trained RNN robustly triggered the sequence in response to various input signals and interpolated time-warped inputs for sequence representation. Interestingly, a learned sequence can repeat periodically when the RNN evolves beyond the duration of a single sequence. The eigenspectrum of the learned recurrent connectivity matrix, with its growing or damping modes, together with the RNN's nonlinearity, was sufficient to generate a limit-cycle attractor. We further examined the stability of dynamic attractors while training the RNN to learn two sequences. Together, our results provide a general framework for understanding neural sequence representation in excitatory-inhibitory RNNs.
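One common way to build Dale's principle into an RNN, shown here as an illustrative parameterization rather than the authors' exact scheme, is to fix the sign of each presynaptic unit's outgoing weights: excitatory columns are constrained nonnegative and inhibitory columns nonpositive.

```python
import numpy as np

rng = np.random.default_rng(2)
N_e, N_i = 40, 10          # excitatory and inhibitory populations
N = N_e + N_i

# Dale's principle: all outgoing weights of a unit share one sign.
# Columns index the presynaptic unit, so the sign pattern applies columnwise.
sign = np.concatenate([np.ones(N_e), -np.ones(N_i)])
W = np.abs(rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))) * sign[None, :]

# excitatory columns are nonnegative, inhibitory columns nonpositive
assert (W[:, :N_e] >= 0).all() and (W[:, N_e:] <= 0).all()

# one step of rate dynamics with a tanh nonlinearity
x = rng.normal(size=N)
x_next = np.tanh(W @ x)
```

During gradient training, the same constraint is typically maintained by parameterizing W as a rectified magnitude matrix times the fixed sign pattern, so updates can never flip a unit's sign.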


2019 ◽  
Vol 862 ◽  
pp. 200-215 ◽  
Author(s):  
Minwoo Lee ◽  
Yuanhang Zhu ◽  
Larry K. B. Li ◽  
Vikrant Gupta

Low-density jets are central to many natural and industrial processes. Under certain conditions, they can develop global oscillations at a limit cycle, behaving as a prototypical example of a self-excited hydrodynamic oscillator. In this study, we perform system identification of a low-density jet using measurements of its noise-induced dynamics in the unconditionally stable regime, prior to both the Hopf and saddle-node points. We show that this approach can enable prediction of (i) the order of nonlinearity, (ii) the locations and types of the bifurcation points (and hence the stability boundaries) and (iii) the resulting limit-cycle oscillations. The only assumption made about the system is that it obeys a Stuart–Landau equation in the vicinity of the Hopf point, thus making the method applicable to a variety of hydrodynamic systems. This study constitutes the first experimental demonstration of system identification using the noise-induced dynamics in only the unconditionally stable regime, i.e. away from the regimes where limit-cycle oscillations may occur. This opens up new possibilities for the prediction and analysis of the stability and nonlinear behaviour of hydrodynamic systems.
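The modeling assumption can be written out explicitly. Near the Hopf point, the complex oscillation amplitude $A$ is assumed to obey a Stuart–Landau-type equation; for a subcritical bifurcation with a saddle-node point, a quintic truncation is the minimal form (the coefficients below are generic placeholders, not the paper's fitted values):

```latex
\frac{\mathrm{d}A}{\mathrm{d}t}
= (\epsilon + i\omega)\,A + c_3 \lvert A\rvert^{2} A + c_5 \lvert A\rvert^{4} A
+ \text{(noise forcing)},
```

where $\epsilon$ measures the distance from the Hopf point and $\omega$ is the angular frequency. With $\operatorname{Re}(c_3) > 0$ and $\operatorname{Re}(c_5) < 0$, the system exhibits a subcritical Hopf bifurcation together with a saddle-node of limit cycles, which is consistent with the bistable and unconditionally stable regimes discussed above; the system-identification step amounts to estimating these coefficients from the noise-induced fluctuations.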


2020 ◽  
Vol 117 (47) ◽  
pp. 29948-29958
Author(s):  
Maxwell Gillett ◽  
Ulises Pereira ◽  
Nicolas Brunel

Sequential activity has been observed in multiple neuronal circuits across species, neural structures, and behaviors. It has been hypothesized that sequences could arise from learning processes. However, it is still unclear whether biologically plausible synaptic plasticity rules can organize neuronal activity to form sequences whose statistics match experimental observations. Here, we investigate temporally asymmetric Hebbian rules in sparsely connected recurrent rate networks and develop a theory of the transient sequential activity observed after learning. These rules transform a sequence of random input patterns into synaptic weight updates. After learning, recalled sequential activity is reflected in the transient correlation of network activity with each of the stored input patterns. Using mean-field theory, we derive a low-dimensional description of the network dynamics and compute the storage capacity of these networks. Multiple temporal characteristics of the recalled sequential activity are consistent with experimental observations. We find that the degree of sparseness of the recalled sequences can be controlled by nonlinearities in the learning rule. Furthermore, sequences maintain robust decoding, but display highly labile dynamics, when synaptic connectivity is continuously modified due to noise or storage of other patterns, similar to recent observations in hippocampus and parietal cortex. Finally, we demonstrate that our results also hold in recurrent networks of spiking neurons with separate excitatory and inhibitory populations.
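A dense-network caricature of the temporally asymmetric Hebbian construction (a minimal sketch under simplifying assumptions: dense binary patterns, a bilinear rule, and discrete-time sigmoid rate updates, none of which match the paper's sparse mean-field setting exactly): activity in pattern μ potentiates synapses onto the units active in pattern μ+1, so initializing the network at the first pattern makes activity sweep through the stored sequence.

```python
import numpy as np

rng = np.random.default_rng(3)
N, P = 500, 8
xi = (rng.random((P, N)) < 0.5).astype(float)   # random binary patterns

# Temporally asymmetric Hebbian rule: pattern mu's (centered) activity
# potentiates synapses onto the units active in pattern mu + 1.
W = np.zeros((N, N))
for mu in range(P - 1):
    W += np.outer(xi[mu + 1] - 0.5, xi[mu] - 0.5)
W *= 4.0 / N

# Recall: start the rate network at the first stored pattern; each update
# moves the activity to (approximately) the next pattern in the sequence.
r = xi[0].copy()
overlaps = []
for step in range(1, 4):
    r = 1.0 / (1.0 + np.exp(-10.0 * (W @ r)))    # sigmoid rate update
    overlaps.append(np.corrcoef(r, xi[step])[0, 1])
```

Each entry of `overlaps` is the correlation of the current activity with the next stored pattern; high values at successive steps are the discrete-time analogue of the transient sequential correlations described in the abstract.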


2011 ◽  
Vol 121-126 ◽  
pp. 4764-4769
Author(s):  
Ying Cai Yuan ◽  
Yan Li ◽  
Yi Ming Wang ◽  
Qiang Guo

High velocity and stability are an inevitable development trend and requirement for mechanical systems, but joint clearances degrade the stability of a mechanical system, especially at high speed. For a folder mechanism with clearances operating at high velocity, a kinematic sensitivity analysis model is derived by the matrix analysis method, combining the definition of sensitivity with kinematic analysis. This sensitivity analysis model makes it easy to obtain the relationship between the design variables and the mechanism's robustness, which provides a basis for designing folder mechanisms for high-velocity operation.


1999 ◽  
Vol 09 (02) ◽  
pp. 95-98 ◽  
Author(s):  
ANKE MEYER-BÄSE

This paper is concerned with the asymptotic hyperstability of recurrent neural networks. Based on these stability results, we derive necessary and sufficient conditions on the network parameters. The results we achieve are more general than those based on Lyapunov methods, since they impose milder constraints on the connection weights than conventional results and do not assume symmetry of the weights.

