Unsupervised Learning of Persistent and Sequential Activity

2018 ◽  
Author(s):  
Ulises Pereira ◽  
Nicolas Brunel

Abstract
Two strikingly distinct types of activity have been observed in various brain structures during delay periods of delayed response tasks: Persistent activity (PA), in which a sub-population of neurons maintains an elevated firing rate throughout an entire delay period; and Sequential activity (SA), in which sub-populations of neurons are activated sequentially in time. It has been hypothesized that both types of dynamics can be ‘learned’ by the relevant networks from the statistics of their inputs, thanks to mechanisms of synaptic plasticity. However, the necessary conditions for a synaptic plasticity rule and input statistics to learn these two types of dynamics in a stable fashion are still unclear. In particular, it is unclear whether a single learning rule is able to learn both types of activity patterns, depending on the statistics of the inputs driving the network. Here, we first characterize the complete bifurcation diagram of a firing rate model of multiple excitatory populations with an inhibitory mechanism, as a function of the parameters characterizing its connectivity. We then investigate how an unsupervised temporally asymmetric Hebbian plasticity rule shapes the dynamics of the network. Consistent with previous studies, we find that for stable learning of PA and SA, an additional stabilization mechanism, such as multiplicative homeostatic plasticity, is necessary. Using the bifurcation diagram derived for fixed connectivity, we study analytically the temporal evolution and the steady state of the learned recurrent architecture as a function of parameters characterizing the external inputs. Slowly changing stimuli lead to PA, while rapidly changing stimuli lead to SA. Our model shows how a network with plastic synapses can stably and flexibly learn PA and SA in an unsupervised manner.
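The core mechanism, a temporally asymmetric Hebbian rule whose outcome depends on how fast the input changes, can be caricatured in a few lines. This is a toy sketch, not the authors' firing rate model: the one-hot patterns, learning rate, and the multiplicative row rescaling standing in for homeostatic plasticity are all assumptions.

```python
import numpy as np

N = 4
P = np.eye(N)  # four non-overlapping one-unit patterns

def learn(seq, eta=1.0):
    """Temporally asymmetric Hebb: W[i, j] grows when unit j is active one
    step before unit i. Each row is then rescaled multiplicatively, a crude
    stand-in for homeostatic plasticity."""
    w = np.zeros((N, N))
    for post, pre in zip(seq[1:], seq[:-1]):
        w += eta * np.outer(post, pre)
    row = w.sum(axis=1, keepdims=True)
    return w / np.maximum(row, 1e-12)

slow = [P[0]] * 6                 # one pattern held across many steps
fast = [P[i] for i in range(N)]   # a new pattern on every step

w_slow = learn(slow)
w_fast = learn(fast)
print(np.diag(w_slow))        # strong self-connection: attractor -> PA
print(np.diag(w_fast, k=-1))  # feedforward chain 0->1->2->3: sequence -> SA
```

A held (slowly changing) input potentiates recurrent self-connections, supporting a persistent attractor; a rapidly changing input potentiates only cross-pattern connections, building a feedforward chain that replays as a sequence.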

2020 ◽  
Author(s):  
Basile Confavreux ◽  
Everton J. Agnes ◽  
Friedemann Zenke ◽  
Timothy Lillicrap ◽  
Tim P. Vogels

Abstract
The search for biologically faithful synaptic plasticity rules has resulted in a large body of models. They are usually inspired by – and fitted to – experimental data, but they rarely produce neural dynamics that serve complex functions. These failures suggest that current plasticity models are still under-constrained by existing data. Here, we present an alternative approach that uses meta-learning to discover plausible synaptic plasticity rules. Instead of experimental data, the rules are constrained by the functions they implement and the structure they are meant to produce. Briefly, we parameterize synaptic plasticity rules by a Volterra expansion and then use supervised learning methods (gradient descent or evolutionary strategies) to minimize a problem-dependent loss function that quantifies how effectively a candidate plasticity rule transforms an initially random network into one with the desired function. We first validate our approach by re-discovering previously described plasticity rules, starting at the single-neuron level and “Oja’s rule”, a simple Hebbian plasticity rule that captures the direction of most variability of inputs to a neuron (i.e., the first principal component). We expand the problem to the network level and ask the framework to find Oja’s rule together with an anti-Hebbian rule such that an initially random two-layer firing-rate network will recover several principal components of the input space after learning. Next, we move to networks of integrate-and-fire neurons with plastic inhibitory afferents. We train for rules that achieve a target firing rate by countering tuned excitation. Our algorithm discovers a specific subset of the manifold of rules that can solve this task. Our work is a proof of principle of an automated and unbiased approach to unveil synaptic plasticity rules that obey biological constraints and can solve complex functions.
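The Oja's-rule target that the framework re-discovers can be stated concretely. Below is a minimal NumPy sketch of the rule itself (the learning rate and input statistics are illustrative, not taken from the paper): a linear neuron's weight vector converges to the first principal component of its input stream.

```python
import numpy as np

def oja_step(w, x, eta=0.005):
    """One update of Oja's rule: Hebbian growth y*x with an implicit
    decay term -y^2*w that keeps the weight vector normalised."""
    y = w @ x
    return w + eta * y * (x - y * w)

rng = np.random.default_rng(0)
# Toy input stream whose first principal component lies along [1, 1]/sqrt(2)
pc1 = np.array([1.0, 1.0]) / np.sqrt(2)
pc2 = np.array([1.0, -1.0]) / np.sqrt(2)
z = rng.normal(size=(20000, 2)) * np.array([2.0, 0.5])  # std 2.0 vs 0.5
data = z @ np.stack([pc1, pc2])

w = rng.normal(size=2)
for x in data:
    w = oja_step(w, x)

print(np.linalg.norm(w))  # ~1.0: the rule self-normalises
print(abs(w @ pc1))       # ~1.0: w aligns with the first principal component
```

In the paper's setting this closed-form rule is the known solution; the meta-learning search instead recovers it from a Volterra parameterization by optimizing a loss over the network's behavior.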


2006 ◽  
Vol 18 (12) ◽  
pp. 2959-2993 ◽  
Author(s):  
Eduardo Ros ◽  
Richard Carrillo ◽  
Eva M. Ortigosa ◽  
Boris Barbour ◽  
Rodrigo Agís

Nearly all neuronal information processing and interneuronal communication in the brain involves action potentials, or spikes, which drive the short-term synaptic dynamics of neurons, but also their long-term dynamics, via synaptic plasticity. In many brain structures, action potential activity is considered to be sparse. This sparseness of activity has been exploited to reduce the computational cost of large-scale network simulations, through the development of event-driven simulation schemes. However, existing event-driven simulation schemes use extremely simplified neuronal models. Here, we implement and critically evaluate an event-driven algorithm (ED-LUT) that uses precalculated look-up tables to characterize synaptic and neuronal dynamics. This approach enables the use of more complex (and realistic) neuronal models or data in representing the neurons, while retaining the advantage of high-speed simulation. We demonstrate the method's application for neurons containing exponential synaptic conductances, thereby implementing shunting inhibition, a phenomenon that is critical to cellular computation. We also introduce an improved two-stage event-queue algorithm, which allows the simulations to scale efficiently to highly connected networks with arbitrary propagation delays. Finally, the scheme readily accommodates implementation of synaptic plasticity mechanisms that depend on spike timing, enabling future simulations to explore issues of long-term learning and adaptation in large-scale networks.
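The flavour of an event-driven scheme can be conveyed in a short sketch. This is not ED-LUT itself: it substitutes a closed-form membrane decay for the look-up tables, and all parameters are illustrative. The key property is shared, though: neurons are touched only when an event reaches them, and a single priority queue orders spike deliveries across arbitrary propagation delays, so cost scales with the number of spikes rather than with time steps.

```python
import heapq
import math

TAU, V_TH, V_RESET = 20.0, 1.0, 0.0  # ms, arbitrary units

class Neuron:
    def __init__(self):
        self.v = 0.0
        self.last_t = 0.0

    def receive(self, t, w):
        # Closed-form membrane decay since the last event, then the
        # synaptic jump; a table-driven scheme would index a precomputed
        # look-up table here instead of calling exp().
        self.v = self.v * math.exp(-(t - self.last_t) / TAU) + w
        self.last_t = t
        if self.v >= V_TH:
            self.v = V_RESET
            return True
        return False

def simulate(events, connections, t_max=100.0):
    """events: list of (time, target, weight) external spikes.
    connections[i]: list of (target, weight, delay) efferents of neuron i."""
    queue = list(events)
    heapq.heapify(queue)
    neurons, spikes = {}, []
    while queue:
        t, tgt, w = heapq.heappop(queue)
        if t > t_max:
            break
        if neurons.setdefault(tgt, Neuron()).receive(t, w):
            spikes.append((t, tgt))
            for j, wj, d in connections.get(tgt, []):
                heapq.heappush(queue, (t + d, j, wj))
    return spikes

# Two external spikes drive neuron 0 over threshold; its spike reaches
# neuron 1 after a 2 ms delay and fires it in turn.
conns = {0: [(1, 1.2, 2.0)]}
ext = [(1.0, 0, 0.6), (2.0, 0, 0.6)]
print(simulate(ext, conns))  # -> [(2.0, 0), (4.0, 1)]
```

The abstract's improved two-stage event queue further reduces queue operations in highly connected networks; this sketch keeps a single flat heap for brevity.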


2020 ◽  
Author(s):  
Kramay Patel ◽  
Chaim N. Katz ◽  
Suneil K. Kalia ◽  
Milos R. Popovic ◽  
Taufik A. Valiante

Abstract
Can the human brain, a complex interconnected structure of over 80 billion neurons, learn to control itself at the most elemental scale – a single neuron? We directly linked the firing rate of a single (direct) neuron to the position of a box on a screen, which participants tried to control. Remarkably, all subjects upregulated the firing rate of the direct neuron in memory structures of their brain. Learning was accompanied by improved performance over trials, simultaneous decorrelation of the direct neuron from local neurons, and phase-locking of the direct neuron to beta frequency oscillations. Such previously unexplored neuroprosthetic skill learning within memory-related brain structures, together with the associated beta frequency phase-locking, implicates the ventral striatum. Our demonstration that humans can volitionally control neuronal activity in mnemonic structures may provide new ways of probing the function and plasticity of human memory without exogenous stimulation.


2018 ◽  
Author(s):  
Damien Drix ◽  
Verena V. Hafner ◽  
Michael Schmuker

Abstract
Cortical neurons are silent most of the time. This sparse activity is energy efficient, and the resulting neural code has favourable properties for associative learning. Most neural models of sparse coding use some form of homeostasis to ensure that each neuron fires infrequently. But homeostatic plasticity acting on a fast timescale may not be biologically plausible, and could lead to catastrophic forgetting in embodied agents that learn continuously. We set out to explore whether inhibitory plasticity could play that role instead, regulating both the population sparseness and the average firing rates. We put the idea to the test in a hybrid network where rate-based dendritic compartments integrate the feedforward input, while spiking somas compete through recurrent inhibition. A somato-dendritic learning rule allows somatic inhibition to modulate nonlinear Hebbian learning in the dendrites. Trained on MNIST digits and natural images, the network discovers independent components that form a sparse encoding of the input and support linear decoding. These findings confirm that intrinsic plasticity is not strictly required for regulating sparseness: inhibitory plasticity can have the same effect, although that mechanism comes with its own stability-plasticity dilemma. Going beyond point neuron models, the network illustrates how a learning rule can make use of dendrites and compartmentalised inputs; it also suggests a functional interpretation for clustered somatic inhibition in cortical neurons.
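As a cartoon of how inhibitory plasticity can regulate average firing rates, consider a rule in which the inhibitory weight potentiates whenever the postsynaptic rate exceeds a target and depresses when it falls below. This is not the paper's somato-dendritic rule; the target rate, drive, and learning rate are invented for illustration.

```python
RHO0 = 5.0  # target postsynaptic firing rate (Hz); an assumed value

def inhibitory_step(w, pre, post, eta=0.005):
    """Inhibitory weight grows when the post rate exceeds the target and
    shrinks (clipped at zero) when it falls below, steering the neuron
    toward RHO0 without any intrinsic homeostasis."""
    return max(w + eta * pre * (post - RHO0), 0.0)

# One neuron with a fixed excitatory drive and one plastic inhibitory
# afferent; inhibition adapts until the firing rate settles at the target.
exc_drive, pre_rate = 20.0, 10.0
w_inh, rate = 0.0, 0.0
for _ in range(1000):
    rate = max(exc_drive - w_inh * pre_rate, 0.0)
    w_inh = inhibitory_step(w_inh, pre_rate, rate)
print(round(rate, 2))  # -> 5.0
```

At the population level, many such plastic inhibitory afferents acting through recurrent competition can enforce sparseness without per-neuron intrinsic thresholds, which is the substitution the abstract argues for.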


2002 ◽  
Vol 14 (10) ◽  
pp. 2353-2370 ◽  
Author(s):  
Terry Elliott ◽  
Jörg Kramer

We couple a previously studied, biologically inspired neurotrophic model of activity-dependent competitive synaptic plasticity and neuronal development to a neuromorphic retina chip. Using this system, we examine the development and refinement of a topographic mapping between an array of afferent neurons (the retinal ganglion cells) and an array of target neurons. We find that the plasticity model can indeed drive topographic refinement in the presence of afferent activity patterns generated by a real-world device. We examine the resilience of the developing system to the presence of high levels of noise by adjusting the spontaneous firing rate of the silicon neurons.


2017 ◽  
Vol 237 ◽  
pp. 193-199 ◽  
Author(s):  
D. Negrov ◽  
I. Karandashev ◽  
V. Shakirov ◽  
Yu. Matveyev ◽  
W. Dunin-Barkowski ◽  
...  

1992 ◽  
Vol 263 (3) ◽  
pp. R679-R684
Author(s):  
J. B. Dean ◽  
J. A. Boulant

Thermoregulatory responses may be delayed in onset and offset by several minutes after changes in hypothalamic temperature. Our preceding study found neurons that displayed delayed firing rate responses during clamped thermal stimulation in remote regions of rat diencephalic tissue slices. The present study looked for similar delayed firing rate responses during clamped (1.5-10 min) changes in each neuron's local temperature. Of 26 neurons tested with clamped thermal stimulation, six (i.e., 23%) showed delayed responses, with on-latencies of 1.0-7.8 min. These neurons rarely showed off-latencies, and the delayed response was not eliminated by synaptic blockade. The on-latencies and ranges of local thermosensitivity were similar to delayed neuronal responses to remote temperature; however, remote-sensitive neurons displayed off-latencies, higher firing rates at 37 degrees C, and greater sensitivity to thermal stimulation. Our findings suggest that delayed thermosensitivity is an intrinsic property of certain neurons and may initiate more elaborate or prolonged delayed responses in synaptically connected diencephalic networks. These networks could explain the delayed thermoregulatory responses observed during hypothalamic thermal stimulation.


2016 ◽  
Vol 2016 ◽  
pp. 1-19 ◽  
Author(s):  
Sung-Soo Jang ◽  
Hee Jung Chung

Alzheimer’s disease (AD) is an irreversible brain disorder characterized by progressive cognitive decline and neurodegeneration of brain regions that are crucial for learning and memory. Although intracellular neurofibrillary tangles and extracellular senile plaques, composed of insoluble amyloid-β (Aβ) peptides, have been the hallmarks of postmortem AD brains, memory impairment in early AD correlates better with pathological accumulation of soluble Aβ oligomers and persistent weakening of excitatory synaptic strength, which is demonstrated by inhibition of long-term potentiation, enhancement of long-term depression, and loss of synapses. However, currently approved interventions aiming to reduce Aβ levels have failed to retard disease progression; this has led to a pressing need to identify and target alternative pathogenic mechanisms of AD. Recently, it has been suggested that the disruption of Hebbian synaptic plasticity in AD is due to aberrant metaplasticity, which is a form of homeostatic plasticity that tunes the magnitude and direction of future synaptic plasticity based on previous neuronal or synaptic activity. This review examines emerging evidence for aberrant metaplasticity in AD. Putative mechanisms underlying aberrant metaplasticity in AD will also be discussed. We hope this review inspires future studies to test the extent to which these mechanisms contribute to the etiology of AD and offer therapeutic targets.


2020 ◽  
Vol 123 (1) ◽  
pp. 134-148
Author(s):  
Boris Gourévitch ◽  
Elena J. Mahrt ◽  
Warren Bakay ◽  
Cameron Elde ◽  
Christine V. Portfors

Speech is our most important form of communication, yet we have a poor understanding of how communication sounds are processed by the brain. Mice make great model organisms to study neural processing of communication sounds because of their rich repertoire of social vocalizations and because they have brain structures analogous to humans, such as the auditory midbrain nucleus inferior colliculus (IC). Although the combined roles of GABAergic and glycinergic inhibition on vocalization selectivity in the IC have been studied to a limited degree, the discrete contributions of GABAergic inhibition have only rarely been examined. In this study, we examined how GABAergic inhibition contributes to shaping responses to pure tones as well as selectivity to complex sounds in the IC of awake mice. In our set of long-latency neurons, we found that GABAergic inhibition extends the evoked firing rate range of IC neurons by lowering the baseline firing rate but maintaining the highest probability of firing rate. GABAergic inhibition also prevented IC neurons from bursting in a spontaneous state. Finally, we found that although GABAergic inhibition shaped the spectrotemporal response to vocalizations in a nonlinear fashion, it did not affect the neural code needed to discriminate vocalizations, based either on spiking patterns or on firing rate. Overall, our results emphasize that even if GABAergic inhibition generally decreases the firing rate, it does so while maintaining or extending the abilities of neurons in the IC to code the wide variety of sounds that mammals are exposed to in their daily lives. NEW & NOTEWORTHY GABAergic inhibition adds nonlinearity to neuronal response curves. This increases the neuronal range of evoked firing rate by reducing baseline firing. GABAergic inhibition prevents bursting responses from neurons in a spontaneous state, reducing noise in the temporal coding of the neuron. This could result in improved signal transmission to the cortex.

