Novel local learning rule for neural adaptation fits Hopfield memory networks efficiently and optimally

2013 ◽  
Vol 14 (S1) ◽  
Author(s):  
Chris Hillar ◽  
Jascha Sohl-Dickstein ◽  
Kilian Koepsell


2019 ◽  
Author(s):  
Michael E. Rule ◽  
Adrianna R. Loback ◽  
Dhruva V. Raman ◽  
Laura Driscoll ◽  
Christopher D. Harvey ◽  
...  

Over days and weeks, neural activity representing an animal’s position and movement in sensorimotor cortex has been found to continually reconfigure or ‘drift’ during repeated trials of learned tasks, with no obvious change in behaviour. This challenges classical theories which assume stable engrams underlie stable behaviour. However, it is not known whether this drift occurs systematically, allowing downstream circuits to extract consistent information. We show that drift is systematically constrained far above chance, facilitating a linear weighted readout of behavioural variables. However, a significant component of drift continually degrades a fixed readout, implying that drift is not confined to a null coding space. We calculate the amount of plasticity required to compensate for drift independently of any learning rule, and find that this is within physiologically achievable bounds. We demonstrate that a simple, biologically plausible local learning rule can achieve these bounds, accurately decoding behaviour over many days.
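The contrast between a fixed and a plastic readout can be illustrated with a toy simulation (a minimal sketch; the population size, drift model, noise levels, and learning rate below are all invented for illustration, not taken from the paper). A linear readout updated by a local delta rule tracks a slowly drifting population code, while a readout frozen after the first day degrades:

```python
import numpy as np

rng = np.random.default_rng(0)
n, days, trials = 50, 20, 200
lam = 0.1          # per-day drift rate of the encoding (assumption)
eta = 0.002        # local learning rate of the readout (assumption)

enc = rng.normal(size=n)   # encoding vector: population response r = enc * x + noise
w = np.zeros(n)            # plastic linear readout
w_fixed = None             # copy of the readout frozen after day 0

mse_fixed = mse_plastic = 0.0
for day in range(days):
    # variance-preserving random drift of the population code
    enc = np.sqrt(1 - lam) * enc + np.sqrt(lam) * rng.normal(size=n)
    errs_f, errs_p = [], []
    for _ in range(trials):
        x = rng.uniform(-1, 1)                   # behavioural variable
        r = enc * x + 0.1 * rng.normal(size=n)   # noisy population response
        y = w @ r
        # local delta rule: each weight sees only its own input and the error
        w += eta * (x - y) * r
        errs_p.append((x - y) ** 2)
        if w_fixed is not None:
            errs_f.append((x - w_fixed @ r) ** 2)
    if day == 0:
        w_fixed = w.copy()
    mse_plastic = float(np.mean(errs_p))
    mse_fixed = float(np.mean(errs_f)) if errs_f else 0.0

print(f"final-day MSE, fixed readout:   {mse_fixed:.3f}")
print(f"final-day MSE, plastic readout: {mse_plastic:.3f}")
```

With these made-up parameters the frozen readout's error grows as the code decorrelates from day 0, while the continually updated readout stays near its noise floor.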


1997 ◽  
Vol 9 (8) ◽  
pp. 1661-1665 ◽  
Author(s):  
Ralph Linsker

This note presents a local learning rule that enables a network to maximize the mutual information between input and output vectors. The network's output units may be nonlinear, and the distribution of input vectors is arbitrary. The local algorithm also serves to compute the inverse C⁻¹ of an arbitrary square connection weight matrix.
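The inverse-computation claim can be illustrated with a rough fixed-point sketch (this iteration is an assumption for illustration, not Linsker's actual rule): the update W ← W + η(I − WC) has W = C⁻¹ as its unique fixed point and converges for a positive-definite C when η is small enough.

```python
import numpy as np

rng = np.random.default_rng(1)
# a well-conditioned symmetric positive-definite "connection weight" matrix
A = rng.normal(size=(4, 4))
C = A @ A.T + 4 * np.eye(4)

W = np.eye(4) / np.trace(C)   # small initial estimate
eta = 1.0 / np.trace(C)       # step size below 2 / lambda_max(C), so the iteration contracts

for _ in range(2000):
    # fixed point of this update is W = C^{-1}, since the correction vanishes there
    W += eta * (np.eye(4) - W @ C)

err = np.max(np.abs(W @ C - np.eye(4)))
print(f"max |WC - I| entry: {err:.2e}")
```

Because the error obeys E ← E(I − ηC), it shrinks geometrically, and WC approaches the identity to machine precision.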


1992 ◽  
Vol 4 (5) ◽  
pp. 703-711 ◽  
Author(s):  
Günther Palm

A simple relation between the storage capacity A for autoassociation and H for heteroassociation with a local learning rule is demonstrated: H = 2A. Both values are bounded by local learning bounds: A ≤ L_A and H ≤ L_H, with L_H = 2L_A, evaluated numerically.


2000 ◽  
Vol 12 (3) ◽  
pp. 519-529 ◽  
Author(s):  
Manuel A. Sánchez-Montañés ◽  
Paul F. M. J. Verschure ◽  
Peter König

Mechanisms influencing learning in neural networks are usually investigated on either a local or a global scale. The former relates to synaptic processes, the latter to unspecific modulatory systems. Here we study the interaction of a local learning rule that evaluates coincidences of pre- and postsynaptic action potentials and a global modulatory mechanism, such as the action of the basal forebrain onto cortical neurons. The simulations demonstrate that the interaction of these mechanisms leads to a learning rule supporting fast learning rates, stability, and flexibility. Furthermore, the simulations generate two experimentally testable predictions: on the dependence of backpropagating action potentials on basal forebrain activity, and on the relative timing of the activity of inhibitory and excitatory neurons in the neocortex.
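The gating idea can be sketched in a few lines (a toy model; the spike rates, decay term, and modulator schedule are invented, not the paper's simulations): a coincidence-based Hebbian term is multiplied by a global modulatory signal, so learning proceeds only while the modulator is active, and only the input that reliably coincides with postsynaptic spikes is strengthened.

```python
import numpy as np

rng = np.random.default_rng(2)
T, eta = 5000, 0.01

# binary spike trains: input 0 often coincides with the postsynaptic spike,
# input 1 fires independently at a similar overall rate
post = (rng.random(T) < 0.2).astype(float)
pre0 = np.where(post == 1, rng.random(T) < 0.8, rng.random(T) < 0.05).astype(float)
pre1 = (rng.random(T) < 0.2).astype(float)
mod = (np.arange(T) < T // 2).astype(float)   # global modulator: on, then off

w = np.zeros(2)
w_half = None
for t in range(T):
    pre = np.array([pre0[t], pre1[t]])
    # local coincidence term (with a small decay) gated by the global modulator
    w += eta * mod[t] * (pre * post[t] - 0.05 * pre)
    if t == T // 2:
        w_half = w.copy()

print("weights at modulator offset:", w_half)
print("final weights:              ", w)
```

The coincident input ends up far stronger than the independent one, and the weights stop changing the moment the modulator switches off.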


1997 ◽  
Vol 9 (3) ◽  
pp. 595-606 ◽  
Author(s):  
Marc M. Van Hulle

This article introduces an extremely simple and local learning rule for topographic map formation. The rule, called the maximum entropy learning rule (MER), maximizes the unconditional entropy of the map's output for any type of input distribution. The aim of this article is to show that MER is a viable strategy for building topographic maps that maximize the average mutual information of the output responses to noiseless input signals when only input noise and noise-added input signals are available.
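The entropy-maximization principle behind MER can be illustrated with a one-dimensional sketch (this is not MER itself, only the underlying idea under assumed parameters): a scalar quantizer whose region boundaries each track a quantile of the input using a purely local comparison, so that every output unit ends up active equally often, which maximizes the unconditional output entropy.

```python
import numpy as np

rng = np.random.default_rng(3)
n_units, eta = 5, 0.002

# strongly skewed input distribution: x = u^2, u ~ Uniform(0, 1)
def sample(size):
    return rng.random(size) ** 2

# boundaries between the n quantization regions, initially equispaced
b = np.linspace(0, 1, n_units + 1)[1:-1].copy()

for x in sample(100_000):
    # boundary j drifts until it sits at the (j+1)/n quantile of the input:
    # the update uses only the local comparison (x < b_j)
    b += eta * (np.arange(1, n_units) / n_units - (x < b))

# check that the units now win with near-equal probability
test = sample(50_000)
freq = np.bincount(np.searchsorted(b, test), minlength=n_units) / test.size
print(np.round(freq, 3))
```

Despite the skewed density, the adapted boundaries equalize the firing probabilities of the units, the equiprobable (maximum-entropy) configuration.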


1992 ◽  
Vol 4 (5) ◽  
pp. 666-681 ◽  
Author(s):  
Peter König ◽  
Bernd Janosch ◽  
Thomas B. Schillen

A temporal structure of neuronal activity has been suggested as a potential mechanism for defining cell assemblies in the brain. This concept has recently gained support by the observation of stimulus-dependent oscillatory activity in the visual cortex of the cat. Furthermore, experimental evidence has been found showing the formation and segregation of synchronously oscillating cell assemblies in response to various stimulus conditions. In previous work, we have demonstrated that a network of neuronal oscillators coupled by synchronizing and desynchronizing delay connections can exhibit a temporal structure of responses, which closely resembles experimental observations. In this paper, we investigate the self-organization of synchronizing and desynchronizing coupling connections by local learning rules. Based on recent experimental observations, we modify synchronizing connections according to a two-threshold learning rule, involving synaptic potentiation and depression. This rule is generalized to its functional inverse for weight changes of desynchronizing connections. We show that after training, the resulting network exhibits stimulus-dependent formation and segregation of oscillatory assemblies in agreement with the experimental data. These results indicate that local learning rules during ontogenesis can suffice to develop a connectivity pattern in support of the observed temporal structure of stimulus responses in cat visual cortex.
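The two-threshold rule and its functional inverse can be sketched as weight-change functions of the pre/post coincidence level (the threshold positions and magnitudes below are invented for illustration; only the three-regime structure, no change, depression, potentiation, follows the description in the abstract):

```python
import numpy as np

def two_threshold(c, theta_low=0.2, theta_high=0.6, lr=0.1):
    """Weight change for a synchronizing connection as a function of the
    coincidence level c: no change below theta_low, depression between the
    two thresholds, potentiation above theta_high."""
    return np.where(c > theta_high, lr * (c - theta_high),
           np.where(c > theta_low, -0.5 * lr * (theta_high - c), 0.0))

def two_threshold_inverse(c, **kw):
    """Functional inverse used for desynchronizing connections:
    potentiation where the synchronizing rule depresses, and vice versa."""
    return -two_threshold(c, **kw)

c = np.linspace(0, 1, 11)
print("sync :", np.round(two_threshold(c), 3))
print("desync:", np.round(two_threshold_inverse(c), 3))
```

Synchronizing connections are thus strengthened only between strongly coincident oscillators, while desynchronizing connections are strengthened in exactly the complementary regime.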

