On the Information Storage Capacity of Local Learning Rules

1992 ◽  
Vol 4 (5) ◽  
pp. 703-711 ◽  
Author(s):  
Günther Palm

A simple relation between the storage capacity A for autoassociation and H for heteroassociation with a local learning rule is demonstrated: H = 2A. Both values are bounded by local learning bounds: A ≤ L_A and H ≤ L_H. The bound L_H = 2L_A is evaluated numerically.
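As an illustration of the setting (not Palm's numerical evaluation), the sketch below stores sparse binary pattern pairs with a clipped Hebbian rule, which is local in the sense above: each weight depends only on the activity of its own pre- and postsynaptic units. Setting Y = X turns the same code into an autoassociator. All sizes and sparsity values are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m, k, P = 256, 256, 8, 40  # units, pattern sparsity, number of pairs (assumed)

# Sparse binary patterns: x^mu -> y^mu (heteroassociation); y^mu = x^mu gives autoassociation.
X = np.zeros((P, n), dtype=int)
Y = np.zeros((P, m), dtype=int)
for mu in range(P):
    X[mu, rng.choice(n, k, replace=False)] = 1
    Y[mu, rng.choice(m, k, replace=False)] = 1

# Local (clipped Hebbian) learning rule: w_ij depends only on activity at i and j.
W = np.clip(Y.T @ X, 0, 1)

# Retrieval: threshold the dendritic sums at the number of active input units.
def recall(x, theta):
    return (W @ x >= theta).astype(int)

errors = sum(np.sum(recall(X[mu], k) != Y[mu]) for mu in range(P))
print(f"total bit errors over {P} stored pairs: {errors}")
```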

1992 ◽  
Vol 4 (5) ◽  
pp. 666-681 ◽  
Author(s):  
Peter König ◽  
Bernd Janosch ◽  
Thomas B. Schillen

A temporal structure of neuronal activity has been suggested as a potential mechanism for defining cell assemblies in the brain. This concept has recently gained support from the observation of stimulus-dependent oscillatory activity in the visual cortex of the cat. Furthermore, experimental evidence has been found showing the formation and segregation of synchronously oscillating cell assemblies in response to various stimulus conditions. In previous work, we have demonstrated that a network of neuronal oscillators coupled by synchronizing and desynchronizing delay connections can exhibit a temporal structure of responses that closely resembles experimental observations. In this paper, we investigate the self-organization of synchronizing and desynchronizing coupling connections by local learning rules. Based on recent experimental observations, we modify synchronizing connections according to a two-threshold learning rule involving synaptic potentiation and depression. This rule is generalized to its functional inverse for weight changes of desynchronizing connections. We show that, after training, the resulting network exhibits stimulus-dependent formation and segregation of oscillatory assemblies in agreement with the experimental data. These results indicate that local learning rules during ontogenesis can suffice to develop a connectivity pattern that supports the observed temporal structure of stimulus responses in cat visual cortex.
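A hedged sketch of what such a two-threshold update could look like: potentiation when postsynaptic activation exceeds an upper threshold, depression between the two thresholds, no change below, with the desynchronizing connections receiving the sign-flipped change as one possible reading of the "functional inverse". The thresholds, learning rates, and gating by presynaptic activity are illustrative assumptions, not the paper's equations.

```python
import numpy as np

def delta_two_threshold(pre, post, theta_minus=0.3, theta_plus=0.7,
                        lr_pot=0.01, lr_dep=0.005):
    """Two-threshold rule (illustrative parameters): potentiate where
    postsynaptic activation exceeds theta_plus, depress where it lies
    between theta_minus and theta_plus, leave the weight unchanged below."""
    dw_post = np.where(post > theta_plus, lr_pot,
              np.where(post > theta_minus, -lr_dep, 0.0))
    return np.outer(dw_post, pre)   # weight changes gated by presynaptic activity

def update_weights(w_sync, w_desync, pre, post):
    """Synchronizing connections follow the rule; desynchronizing connections
    follow its functional inverse, here taken as the sign-flipped change."""
    dw = delta_two_threshold(pre, post)
    return np.clip(w_sync + dw, 0, 1), np.clip(w_desync - dw, 0, 1)
```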


2020 ◽  
Author(s):  
Francesca Schönsberg ◽  
Yasser Roudi ◽  
Alessandro Treves

We show that associative networks of threshold linear units endowed with Hebbian learning can operate closer to the Gardner optimal storage capacity than their binary counterparts and even surpass this bound. This is largely achieved through a sparsification of the retrieved patterns, which we analyze for theoretical and empirical distributions of activity. As reaching the optimal capacity via non-local learning rules like back-propagation requires slow and neurally implausible training procedures, our results indicate that one-shot self-organized Hebbian learning can be just as efficient.
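The following is a minimal sketch of the general setting (not the authors' analysis): one-shot Hebbian covariance storage in a threshold-linear network, with an assumed threshold and renormalization standing in for the inhibition that sparsifies the retrieved pattern. Sizes, sparsity, and threshold are assumptions.

```python
import numpy as np

rng = np.random.default_rng(1)
N, P, a, theta = 500, 20, 0.1, 0.05   # units, patterns, sparsity, threshold (assumed)

# Sparse nonnegative activity patterns for a threshold-linear network.
xi = (rng.random((P, N)) < a) * rng.exponential(1.0, size=(P, N))

# One-shot Hebbian (covariance) rule, local in pre- and postsynaptic activity.
xim = xi - xi.mean()
J = xim.T @ xim / N
np.fill_diagonal(J, 0.0)

def relu(x):
    return np.maximum(x, 0.0)

# Retrieve from a degraded cue; renormalization stands in for global inhibition.
v = xi[0] * (rng.random(N) < 0.8)
for _ in range(50):
    v = relu(J @ v - theta)
    if np.linalg.norm(v) > 0:
        v *= np.linalg.norm(xi[0]) / np.linalg.norm(v)

overlap = v @ xi[0] / (np.linalg.norm(v) * np.linalg.norm(xi[0]) + 1e-12)
print(f"cosine overlap with the stored pattern: {overlap:.3f}")
```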


2020 ◽  
Vol 11 (1) ◽  
Author(s):  
Jung Min Lee ◽  
Mo Beom Koo ◽  
Seul Woo Lee ◽  
Heelim Lee ◽  
Junho Kwon ◽  
...  

Synthesis of a polymer composed of a large discrete number of chemically distinct monomers in an absolutely defined aperiodic sequence remains a challenge in polymer chemistry. Such syntheses have largely been restricted to oligomers with a small number of repeating units, owing to the difficulty of reaching high molecular weights through the step-by-step addition of individual monomers. Here we report copolymers of α-hydroxy acids, poly(phenyllactic-co-lactic acid) (PcL), built via the cross-convergent method from four dyads of monomers as constituent units. Our proposed method allows scalable synthesis of sequence-defined PcL in a minimal number of coupling steps from stoichiometric amounts of reagents. Digital information can be stored in an aperiodic sequence of PcL and fully retrieved as binary code by mass spectrometry sequencing. The information storage density (bit/Da) of PcL is 50% higher than that of DNA, and the storage capacity of PcL can be increased further by raising the molecular weight (~38 kDa).
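The 50% figure can be sanity-checked with back-of-envelope arithmetic, assuming the four dyads encode two bits each and using repeat-unit masses computed from the molecular formulas; these are assumptions for illustration, not values taken from the paper.

```python
# Back-of-envelope check of the bit/Da claim. Repeat-unit masses are
# computed from the repeat-unit formulas (assumed), not quoted from the paper.
m_lactic = 72.06         # lactic acid repeat unit, C3H4O2 (Da)
m_phenyllactic = 148.16  # phenyllactic acid repeat unit, C9H8O2 (Da)

# Four dyads (LL, LP, PL, PP) encode 2 bits each; average over a uniform sequence.
dyads = [m_lactic + m_lactic, m_lactic + m_phenyllactic,
         m_phenyllactic + m_lactic, m_phenyllactic + m_phenyllactic]
pcl_density = 2.0 / (sum(dyads) / len(dyads))   # bits per dalton

# Single-stranded DNA: 2 bits per nucleotide, ~330 Da average nucleotide mass.
dna_density = 2.0 / 330.0

print(f"PcL : {pcl_density:.5f} bit/Da")
print(f"DNA : {dna_density:.5f} bit/Da")
print(f"ratio: {pcl_density / dna_density:.2f}")  # ~1.5, i.e. ~50% higher
```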


2019 ◽  
Author(s):  
Michael E. Rule ◽  
Adrianna R. Loback ◽  
Dhruva V. Raman ◽  
Laura Driscoll ◽  
Christopher D. Harvey ◽  
...  

Over days and weeks, neural activity representing an animal’s position and movement in sensorimotor cortex has been found to continually reconfigure, or ‘drift’, during repeated trials of learned tasks, with no obvious change in behavior. This challenges classical theories, which assume that stable engrams underlie stable behavior. However, it is not known whether this drift occurs systematically, allowing downstream circuits to extract consistent information. We show that drift is systematically constrained far above chance, facilitating a linear weighted readout of behavioral variables. However, a significant component of drift continually degrades a fixed readout, implying that drift is not confined to a null coding space. We calculate the amount of plasticity required to compensate for drift independently of any learning rule, and find that this is within physiologically achievable bounds. We demonstrate that a simple, biologically plausible local learning rule can achieve these bounds, accurately decoding behavior over many days.
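A hedged sketch of the closing claim: a linear readout updated by a simple local delta rule keeps decoding a variable while the encoding slowly reconfigures, whereas a copy of the readout frozen midway degrades. The drift model, noise level, and rates are all assumptions for illustration.

```python
import numpy as np

rng = np.random.default_rng(2)
N, T, eta, drift = 200, 6000, 0.02, 0.02   # units, steps, rates (assumed)

enc = rng.normal(size=N)
enc /= np.linalg.norm(enc)   # encoding direction of a scalar behavioral variable
w = np.zeros(N)              # downstream linear readout weights

w_frozen = None
err_frozen, err_plastic = [], []
for t in range(T):
    s = rng.normal()                                # behavioral variable
    r = enc * s + 0.1 * rng.normal(size=N)          # noisy population activity
    # Local delta rule: each weight uses only its own input and the global error.
    w += eta * (s - w @ r) * r
    if t == T // 2:
        w_frozen = w.copy()                         # freeze a second readout here
    if w_frozen is not None:
        err_frozen.append((s - w_frozen @ r) ** 2)
        err_plastic.append((s - w @ r) ** 2)
    # Representational drift: the code slowly reconfigures, behavior unchanged.
    enc += drift * rng.normal(size=N) / np.sqrt(N)
    enc /= np.linalg.norm(enc)                      # keep the code's overall gain

print(f"MSE, frozen readout : {np.mean(err_frozen):.3f}")
print(f"MSE, plastic readout: {np.mean(err_plastic):.3f}")
```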


2021 ◽  
Vol 118 (45) ◽  
pp. e2024890118 ◽  
Author(s):  
Shu Ho ◽  
Rebecca Lajaunie ◽  
Marion Lerat ◽  
Mickaël Le ◽  
Valérie Crépel ◽  
...  

Cerebellar Purkinje neurons integrate information transmitted at excitatory synapses formed by granule cells. Although these synapses are considered essential sites for learning, most of them appear not to transmit any detectable electrical information and have been defined as silent. It has been proposed that silent synapses are required to maximize information storage capacity and ensure its reliability, and hence to optimize cerebellar operation. Such optimization is expected to occur once the cerebellar circuitry is in place, during its maturation and the natural, steady improvement of animal agility. We therefore investigated whether the proportion of silent synapses varies over this period, from the third to the sixth postnatal week in mice. Selective expression of a calcium indicator in granule cells enabled quantitative mapping of presynaptic activity, while postsynaptic responses were recorded by patch clamp in acute slices. Through this approach and the assessment of two anatomical features (the distance separating adjacent planar Purkinje dendritic trees and the synapse density), we determined the average excitatory postsynaptic potential per synapse. Its value was four to eight times smaller than the responses of detectable connections in paired recordings, consistent with over 70% of synapses being silent. These figures remained remarkably stable across maturation stages. Given the proposed role of silent synapses, our results suggest that information storage capacity and reliability are optimized early during cerebellar maturation. Alternatively, silent synapses may have roles other than adjusting information storage capacity and reliability.
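The silent fraction follows from simple arithmetic: if a fraction s of synapses is silent, the mean EPSP per anatomical synapse is (1 − s) times the mean EPSP of detectable connections, so a four- to eightfold reduction implies s between 75% and 87.5%.

```python
# If a fraction s of synapses is silent, the average EPSP per anatomical synapse
# is (1 - s) times the mean EPSP of detectable (paired-recording) connections.
for ratio in (1 / 4, 1 / 8):            # measured four- to eightfold reduction
    s = 1.0 - ratio
    print(f"EPSP ratio {ratio:.3f} -> silent fraction {s:.1%}")
# prints 75.0% and 87.5%, consistent with "over 70% of synapses being silent"
```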


2020 ◽  
Vol 117 (47) ◽  
pp. 29948-29958 ◽  
Author(s):  
Maxwell Gillett ◽  
Ulises Pereira ◽  
Nicolas Brunel

Sequential activity has been observed in multiple neuronal circuits across species, neural structures, and behaviors. It has been hypothesized that sequences could arise from learning processes. However, it is still unclear whether biologically plausible synaptic plasticity rules can organize neuronal activity to form sequences whose statistics match experimental observations. Here, we investigate temporally asymmetric Hebbian rules in sparsely connected recurrent rate networks and develop a theory of the transient sequential activity observed after learning. These rules transform a sequence of random input patterns into synaptic weight updates. After learning, recalled sequential activity is reflected in the transient correlation of network activity with each of the stored input patterns. Using mean-field theory, we derive a low-dimensional description of the network dynamics and compute the storage capacity of these networks. Multiple temporal characteristics of the recalled sequential activity are consistent with experimental observations. We find that the degree of sparseness of the recalled sequences can be controlled by nonlinearities in the learning rule. Furthermore, sequences maintain robust decoding, but display highly labile dynamics, when synaptic connectivity is continuously modified due to noise or storage of other patterns, similar to recent observations in hippocampus and parietal cortex. Finally, we demonstrate that our results also hold in recurrent networks of spiking neurons with separate excitatory and inhibitory populations.
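A minimal sketch of a temporally asymmetric Hebbian rule of this general kind: each input pattern strengthens connections from itself to its successor, so cueing the first pattern drives a transient sweep of overlaps through the stored sequence. Network size, connection sparsity, gain, and the threshold-linear transfer function are assumptions, not the paper's parameters.

```python
import numpy as np

rng = np.random.default_rng(3)
N, P, c, g = 2000, 16, 0.25, 2.0  # neurons, sequence length, sparsity, gain (assumed)

xi = rng.normal(size=(P, N))      # random input patterns defining the sequence

# Temporally asymmetric Hebbian rule on a sparse random graph: presynaptic
# activity in pattern mu is paired with postsynaptic activity in pattern mu+1.
mask = rng.random((N, N)) < c
J = (xi[1:].T @ xi[:-1]) * mask / (c * N)

def phi(x):                       # threshold-linear transfer function (assumed)
    return np.maximum(x, 0.0)

# Cue the first pattern; recall appears as a moving peak in the overlaps
# of network activity with each of the stored input patterns.
r = phi(xi[0])
for t in range(P):
    overlaps = xi @ r / N
    print(t, int(np.argmax(overlaps)))   # index of the currently recalled pattern
    r = phi(g * (J @ r))
```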


2004 ◽  
Vol 14 (01) ◽  
pp. 1-8 ◽  
Author(s):  
RALF MÖLLER

The paper reviews single-neuron learning rules for minor component analysis and suggests a novel minor component learning rule. In this rule, the weight vector length is self-stabilizing, i.e., moving towards unit length in each learning step. In simulations with low- and medium-dimensional data, the performance of the novel learning rule is compared with previously suggested rules.
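Möller's self-stabilizing rule itself is not reproduced here; the sketch below extracts a minor component with a generic anti-Hebbian (sign-reversed Oja) update plus an explicit renormalization step, the step that self-stabilizing rules are designed to make unnecessary. Dimensions, scales, and learning rate are assumptions.

```python
import numpy as np

rng = np.random.default_rng(4)

# Data with anisotropic covariance; the minor component is the last basis axis.
d = 5
scales = np.array([3.0, 2.5, 2.0, 1.5, 0.5])     # per-axis standard deviations
Q, _ = np.linalg.qr(rng.normal(size=(d, d)))     # random orthonormal basis

w = rng.normal(size=d)
w /= np.linalg.norm(w)
eta = 0.01

for _ in range(20000):
    x = Q @ (scales * rng.normal(size=d))        # sample a data point
    y = w @ x
    w -= eta * y * (x - y * w)                   # anti-Hebbian (reversed Oja) step
    w /= np.linalg.norm(w)                       # explicit renormalization

# The learned direction should align with the true minor eigenvector Q[:, -1].
print(f"|cos angle to minor component|: {abs(w @ Q[:, -1]):.3f}")
```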


2011 ◽  
Vol 23 (12) ◽  
pp. 3145-3161 ◽  
Author(s):  
Jian K. Liu

It has been established that homeostatic synaptic scaling plasticity can maintain neural network activity in a stable regime. However, the underlying learning rule for this mechanism is still unclear, and whether it depends on the presynaptic site remains a topic of debate. Here we focus on two forms of learning rules: traditional synaptic scaling (SS), without a presynaptic effect, and presynaptic-dependent synaptic scaling (PSD). Analysis of the synaptic matrices reveals that the transition matrices between consecutive synaptic matrices are distinct under the two rules: they are diagonal and linear in neural activity under SS, but become nondiagonal and nonlinear under PSD. These differences produce different dynamics in recurrent neural networks. Numerical simulations show that network dynamics are stable under PSD but not SS, which suggests that PSD is the better description of homeostatic synaptic scaling plasticity. The matrix analysis used in this study may provide a novel way to examine the stability of learning dynamics.
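A hedged sketch of the two update forms as contrasted above: SS rescales each neuron's incoming weights by a factor depending only on postsynaptic activity (a diagonal, activity-linear transition), while PSD weights that factor by presynaptic activity, making the transition nondiagonal and nonlinear in the rates. The exact functional forms are illustrative, not the paper's equations.

```python
import numpy as np

def ss_update(W, rates, target, eta=0.01):
    """Traditional synaptic scaling: each neuron scales all of its incoming
    weights by the same postsynaptic error term (diagonal, linear in activity)."""
    factor = 1.0 + eta * (target - rates)        # one factor per postsynaptic neuron
    return factor[:, None] * W

def psd_update(W, rates, target, eta=0.01):
    """Presynaptic-dependent synaptic scaling: the postsynaptic error term is
    weighted by presynaptic activity, so the transition is nondiagonal and
    nonlinear in the rates (illustrative form)."""
    return W * (1.0 + eta * np.outer(target - rates, rates))
```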

