Characteristics of sequential activity in networks with temporally asymmetric Hebbian learning

2020 ◽  
Vol 117 (47) ◽  
pp. 29948-29958
Author(s):  
Maxwell Gillett ◽  
Ulises Pereira ◽  
Nicolas Brunel

Sequential activity has been observed in multiple neuronal circuits across species, neural structures, and behaviors. It has been hypothesized that sequences could arise from learning processes. However, it is still unclear whether biologically plausible synaptic plasticity rules can organize neuronal activity to form sequences whose statistics match experimental observations. Here, we investigate temporally asymmetric Hebbian rules in sparsely connected recurrent rate networks and develop a theory of the transient sequential activity observed after learning. These rules transform a sequence of random input patterns into synaptic weight updates. After learning, recalled sequential activity is reflected in the transient correlation of network activity with each of the stored input patterns. Using mean-field theory, we derive a low-dimensional description of the network dynamics and compute the storage capacity of these networks. Multiple temporal characteristics of the recalled sequential activity are consistent with experimental observations. We find that the degree of sparseness of the recalled sequences can be controlled by nonlinearities in the learning rule. Furthermore, sequences maintain robust decoding, but display highly labile dynamics, when synaptic connectivity is continuously modified due to noise or storage of other patterns, similar to recent observations in hippocampus and parietal cortex. Finally, we demonstrate that our results also hold in recurrent networks of spiking neurons with separate excitatory and inhibitory populations.
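The core mechanism can be sketched in a few lines: a temporally asymmetric Hebbian rule maps each pattern onto the next in the weight matrix, and recall is read out as the overlap of network activity with each stored pattern. This is a minimal rate-network illustration (bilinear rule, tanh transfer function, and arbitrary sizes), not the authors' full model:

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 500, 4                        # network size and sequence length (arbitrary)
patterns = rng.standard_normal((P, N))

# Temporally asymmetric Hebbian rule: pattern mu strengthens synapses from
# neurons active in pattern mu onto neurons active in pattern mu + 1.
W = np.zeros((N, N))
for mu in range(P - 1):
    W += np.outer(patterns[mu + 1], patterns[mu]) / N

# Recall: initialize the rates at the first pattern and iterate r <- phi(W r),
# tracking the overlap (correlation) of the activity with each stored pattern.
phi = np.tanh
r = patterns[0].copy()
overlaps = []
for step in range(P):
    overlaps.append(patterns @ r / N)
    r = phi(W @ r)

# Which pattern dominates the overlap at each recall step
peaks = [int(np.argmax(np.abs(m))) for m in overlaps]
```

Iterating from the first pattern, the dominant overlap advances through the sequence one pattern per step, which is the transient sequential recall described above.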


1991 ◽  
Vol 02 (04) ◽  
pp. 315-322
Author(s):  
Mats Bengtsson

We have investigated the storage capacity in the limit of large N (the network size) for a third-order recurrent artificial neural network with Hebbian learning. Numerical results for the relation between the overlap with stored patterns and the ratio of the number of stored patterns to N² (the m–α relation) agree well with replica-symmetric predictions. A comparative study is made of the m–α relation for third- and second-order networks. Large differences exist between these two models, usually in favour of the third-order network. This result stands in some contrast to previous investigations. The phase-transition temperature is investigated numerically and compared with mean-field theory predictions.
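In a third-order network the Hebbian couplings carry three indices, J_ijk ∝ Σ_μ ξ_i ξ_j ξ_k, and the local field contracts two copies of the network state, which is why the number of storable patterns scales with N² rather than N. A minimal retrieval check, with sizes and load chosen arbitrarily for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
N, P = 60, 120                     # P >> N is possible: capacity scales with N^2
xi = rng.choice([-1, 1], size=(P, N))

def third_order_field(s):
    # h_i = sum_mu xi_i^mu (xi^mu . s)^2 / N^2, the contracted form of the
    # third-order Hebbian couplings J_ijk ~ sum_mu xi_i^mu xi_j^mu xi_k^mu
    m = xi @ s / N
    return xi.T @ m**2

# One parallel update starting from a stored pattern: the overlap stays high
s = np.sign(third_order_field(xi[0]))
m0 = (xi[0] @ s) / N
```

Even though P exceeds N twofold here, which would be far beyond capacity for a second-order Hopfield network with this Hebbian load, the stored pattern is retrieved almost perfectly.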


1989 ◽  
Vol 03 (07) ◽  
pp. 555-560 ◽  
Author(s):  
M.V. TSODYKS

We consider the Hopfield model with the simplest form of the Hebbian learning rule, in which only simultaneous activity of pre- and postsynaptic neurons leads to modification of the synapse. An extra inhibition proportional to the total network activity is needed. Both symmetric nondiluted and asymmetric diluted networks are considered. The model performs well at extremely low activity levels p < K^(-1/2), where K is the mean number of synapses per neuron.
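A minimal sketch of this scheme with illustrative parameters: binary 0/1 neurons at activity level p = 0.02 (just below N^(-1/2) ≈ 0.022 for this fully connected toy network), a purely potentiating Hebbian matrix built from coincident activity, and a uniform inhibitory term proportional to the total activity. The threshold and inhibition strength are ad hoc choices, not values from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
N, P = 2000, 10
K = 40                               # active neurons per pattern (activity p = K/N = 0.02)
eta = np.zeros((P, N))
for mu in range(P):
    eta[mu, rng.choice(N, size=K, replace=False)] = 1.0

# Only simultaneous pre/post activity modifies the synapse (no depression term)
J = eta.T @ eta / K
np.fill_diagonal(J, 0.0)

def update(s, theta=0.5, g_inh=0.3 / K):
    # recurrent excitation minus global inhibition ~ total network activity
    h = J @ s - g_inh * s.sum()
    return (h > theta).astype(float)

# Retrieval: a stored sparse pattern is a fixed point of the dynamics
s = eta[0].copy()
for _ in range(5):
    s = update(s)
retrieval_overlap = (eta[0] @ s) / K
```

At this sparseness the overlaps between random patterns are tiny, so the purely excitatory Hebbian matrix plus global inhibition suffices to make each stored pattern a stable state.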


2006 ◽  
Vol 20 (19) ◽  
pp. 2624-2635
Author(s):  
KAREN HALLBERG

Since its inception, the DMRG method has been a very powerful tool for the calculation of physical properties of low-dimensional strongly correlated systems. It has been adapted to obtain dynamical properties and to treat finite temperature, time-dependent problems, bosonic degrees of freedom, classical problems, and non-equilibrium systems, among others. We briefly review the method and then concentrate on its latest developments, describing some recent successful applications. In particular, we show how the dynamical DMRG can be used together with dynamical mean-field theory (DMFT) to solve the associated impurity problem in the infinite-dimensional Hubbard model. This method is used to obtain spectral properties of strongly correlated systems. With this algorithm, more complex problems with a larger number of degrees of freedom can be considered, and finite-size effects can be minimized.
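The applications surveyed here cannot be compressed into a snippet, but the elementary step underlying DMRG can: truncating a block to the m most probable eigenstates of its reduced density matrix. In the toy illustration below, exact diagonalization of a 4-site spin-1/2 Heisenberg chain stands in for the sweeping procedure:

```python
import numpy as np

# Spin-1/2 operators and a helper to embed an operator at one site of a chain
sz = np.diag([0.5, -0.5])
sp = np.array([[0.0, 1.0], [0.0, 0.0]])   # S+
sm = sp.T                                  # S-

def embed(o, site, L):
    out = np.array([[1.0]])
    for i in range(L):
        out = np.kron(out, o if i == site else np.eye(2))
    return out

# Heisenberg Hamiltonian H = sum_i S_i . S_{i+1} on an open 4-site chain
L = 4
H = sum(embed(sz, i, L) @ embed(sz, i + 1, L)
        + 0.5 * (embed(sp, i, L) @ embed(sm, i + 1, L)
                 + embed(sm, i, L) @ embed(sp, i + 1, L))
        for i in range(L - 1))
E, V = np.linalg.eigh(H)
psi = V[:, 0].reshape(4, 4)    # ground state, split as (left 2 sites) x (right 2 sites)

# DMRG truncation step: eigen-decompose the block's reduced density matrix
# and measure the weight discarded when keeping only the m dominant states
rho_left = psi @ psi.T
w = np.sort(np.linalg.eigvalsh(rho_left))[::-1]
m = 2
truncation_error = 1.0 - w[:m].sum()
```

Even for this tiny chain the dominant density-matrix eigenstate carries about 93% of the weight, so keeping m = 2 of the 4 block states discards under 5%; DMRG applies the same truncation logic iteratively to much larger blocks.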


Author(s):  
Jun Li ◽  
Yuhong Huang ◽  
Hongkuan Yuan ◽  
Hong Chen

Two-dimensional (2D) magnetic semiconductors hold great promise for energy-efficient, ultracompact spintronics, owing to their combination of ferromagnetic and semiconducting behavior in low dimensions. Here, we predict hexagonal titanium nitride monolayer ($h$-TiN) to be a ferromagnetic semiconductor by investigating the stability, magnetism, and carrier transport of $h$-TiN with first-principles calculations. The thermodynamic stability of $h$-TiN is established by its phonon dispersion, molecular dynamics simulations, and formation energy. The band structure shows that $h$-TiN is a ferromagnetic semiconductor with moderate magnetic anisotropy, a magnetic moment of 1 $\mu_{B}$, and band gaps of 1.33 and 4.42 eV for the spin-up and spin-down channels, respectively. The Curie temperature of $h$-TiN is estimated by mean-field theory to be about 205 K, and is not enhanced by compressive or tensile strain. Its high carrier mobility, in-plane stiffness, and conductivity indicate that $h$-TiN has favorable transport performance. The ferromagnetic semiconducting behavior is robust against external strain, indicating that $h$-TiN could be a rare candidate for nanoscale spintronic devices.
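For readers unfamiliar with mean-field Curie-temperature estimates of the kind quoted above, the self-consistency logic can be sketched for a spin-1/2 Ising model. The exchange constant J below is a hypothetical value chosen only so the resulting Tc lands near the ~205 K scale of the abstract; it is not a computed parameter for $h$-TiN:

```python
import numpy as np

kB = 8.617e-5            # Boltzmann constant, eV/K
z, J = 3, 0.0059         # neighbors on a honeycomb-like lattice; J is HYPOTHETICAL

# Mean-field self-consistency: m = tanh(z*J*m / (kB*T)).
# A nonzero solution exists only below Tc = z*J / kB.
Tc = z * J / kB

def magnetization(T, iters=200):
    m = 1.0
    for _ in range(iters):
        m = np.tanh(z * J * m / (kB * T))
    return m
```

Below Tc the fixed-point iteration settles on a finite magnetization, while above Tc it decays to zero; real estimates extract J (and the lattice coordination z) from first-principles energy differences between magnetic configurations.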


2018 ◽  
Vol 29 (3) ◽  
pp. 937-951 ◽  
Author(s):  
Gabriel Koch Ocker ◽  
Brent Doiron

The synaptic connectivity of cortex is plastic, with experience shaping the ongoing interactions between neurons. Theoretical studies of spike timing-dependent plasticity (STDP) have focused on either just pairs of neurons or large-scale simulations. A simple analytic account for how fast spike time correlations affect both microscopic and macroscopic network structure is lacking. We develop a low-dimensional mean field theory for STDP in recurrent networks and show the emergence of assemblies of strongly coupled neurons with shared stimulus preferences. After training, this connectivity is actively reinforced by spike train correlations during the spontaneous dynamics. Furthermore, the stimulus coding by cell assemblies is actively maintained by these internally generated spiking correlations, suggesting a new role for noise correlations in neural coding. Assembly formation has often been associated with firing rate-based plasticity schemes; our theory provides an alternative and complementary framework, where fine temporal correlations and STDP form and actively maintain learned structure in cortical networks.
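As a toy illustration of how fine temporal correlations plus STDP can carve out assemblies, the sketch below drives two groups of Poisson-like units in alternating epochs and applies a pair-based STDP rule with exponential spike traces. Group sizes, rates, time constants, and the slight potentiation bias are all illustrative assumptions, not the paper's model:

```python
import numpy as np

rng = np.random.default_rng(3)
N, T = 20, 20000                   # neurons, 1 ms time bins
# Two putative assemblies: neurons 0-9 and 10-19 are driven in alternating
# 200 ms epochs, giving strong within-group spike-time correlations.
epoch = (np.arange(T) // 200) % 2
rate = np.empty((N, T))
rate[:10] = np.where(epoch == 0, 0.05, 0.01)
rate[10:] = np.where(epoch == 1, 0.05, 0.01)
spikes = rng.random((N, T)) < rate

# Pair-based STDP with exponential traces; a slight potentiation bias
# (A_plus > A_minus) is assumed so that correlated pairs net-potentiate.
tau, A_plus, A_minus = 20.0, 1.2e-3, 1.0e-3
W = np.zeros((N, N))               # W[i, j]: synapse from j (pre) to i (post)
x = np.zeros(N)                    # exponential traces of each neuron's spiking
for t in range(T):
    s = spikes[:, t].astype(float)
    x += -x / tau + s
    # potentiate on post spike * pre trace, depress on post trace * pre spike
    W += A_plus * np.outer(s, x) - A_minus * np.outer(x, s)
np.fill_diagonal(W, 0.0)

# Mean weight within each group vs. between groups (off-diagonal pairs only)
within = (W[:10, :10].sum() + W[10:, 10:].sum()) / (2 * 10 * 9)
between = (W[:10, 10:].sum() + W[10:, :10].sum()) / (2 * 10 * 10)
```

Units with shared stimulus preference end up more strongly coupled to each other than to the other group, which is the assembly structure the mean-field theory above describes analytically.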


2007 ◽  
Vol 19 (12) ◽  
pp. 3262-3292 ◽  
Author(s):  
Hédi Soula ◽  
Carson C. Chow

We present a simple Markov model of spiking neural dynamics that can be analytically solved to characterize the stochastic dynamics of a finite-size spiking neural network. We give closed-form estimates for the equilibrium distribution, mean rate, variance, and autocorrelation function of the network activity. The model is applicable to any network where the probability of firing of a neuron depends only on the number of neurons that fired in a previous temporal epoch. Networks with statistically homogeneous connectivity and membrane and synaptic time constants that are not excessively long could satisfy these conditions. Our model completely accounts for the size of the network and correlations in the firing activity. It also allows us to examine how the network dynamics can deviate from mean field theory. We show that the model and solutions are applicable to spiking neural networks in biophysically plausible parameter regimes.
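This model class admits a compact simulation: the active count k is itself a Markov chain, k_{t+1} ~ Binomial(N, f(k_t)). The sketch below, with an illustrative sigmoid gain f, compares the empirical mean activity with the mean-field fixed point k* = N f(k*):

```python
import numpy as np

rng = np.random.default_rng(4)
N, T = 100, 50000

def p_fire(k):
    # Probability a neuron fires this epoch, given only the count k that fired
    # last epoch (illustrative sigmoid; slope chosen so one stable state exists)
    return 1.0 / (1.0 + np.exp(-(k - 50.0) / 40.0))

# Finite-size stochastic dynamics: k_{t+1} ~ Binomial(N, p_fire(k_t))
k = N // 2
counts = np.empty(T, dtype=int)
for t in range(T):
    k = rng.binomial(N, p_fire(k))
    counts[t] = k

# Mean-field prediction: fixed point of k* = N * p_fire(k*)
ks = float(N)
for _ in range(200):
    ks = N * p_fire(ks)

mean_k = counts[1000:].mean()      # fluctuates around the mean-field value
```

The empirical mean tracks the mean-field fixed point, while the finite-size fluctuations around it are exactly the deviations from mean field that the analytic solution characterizes.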


