ASSOCIATIVE MEMORY IN NEURAL NETWORKS WITH THE HEBBIAN LEARNING RULE

1989 ◽  
Vol 03 (07) ◽  
pp. 555-560 ◽  
Author(s):  
M.V. TSODYKS

We consider the Hopfield model with the simplest form of the Hebbian learning rule, in which only simultaneous activity of pre- and post-synaptic neurons leads to modification of a synapse. An extra inhibition proportional to the total network activity is needed. Both symmetric non-diluted and asymmetric diluted networks are considered. The model performs well at extremely low levels of activity p < K^(-1/2), where K is the mean number of synapses per neuron.
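A minimal sketch of this kind of low-activity Hebbian storage with a global inhibition term is given below. It is illustrative only, not the paper's exact formulation; the network size, number of patterns, activity level, and threshold are assumptions.

```python
import numpy as np

# Illustrative sketch: sparse 0/1 patterns stored with a purely potentiating
# Hebbian term, plus inhibition proportional to total network activity.
# N, P, p, and the threshold are assumptions, not the paper's parameters.
N, P, p = 1000, 20, 0.02                      # neurons, patterns, activity level
rng = np.random.default_rng(0)
patterns = (rng.random((P, N)) < p).astype(float)

# Potentiate only when pre- and post-synaptic units are both active.
W = patterns.T @ patterns / N
np.fill_diagonal(W, 0.0)

def update(state, inhibition=p, theta=0.0):
    """One parallel update; inhibition scales with the total network activity."""
    h = W @ state - inhibition * state.sum() / N
    return (h > theta).astype(float)

cue = patterns[0].copy()
cue[rng.random(N) < 0.05] = 0.0               # corrupt the cue slightly
recalled = update(cue)
print("overlap with stored pattern:", recalled @ patterns[0] / patterns[0].sum())
```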

1991 ◽  
Vol 3 (2) ◽  
pp. 201-212 ◽  
Author(s):  
Peter J. B. Hancock ◽  
Leslie S. Smith ◽  
William A. Phillips

We show that a form of synaptic plasticity recently discovered in slices of the rat visual cortex (Artola et al. 1990) can support an error-correcting learning rule. The rule increases weights when both pre- and postsynaptic units are highly active, and decreases them when presynaptic activity is high and postsynaptic activation is below the threshold for weight increment but above a lower threshold. We show that this rule corrects false-positive outputs in a feedforward associative memory, that in an appropriate opponent-unit architecture it corrects misses, and that it performs better than the optimal Hebbian learning rule reported by Willshaw and Dayan (1990).
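A hedged sketch of such a two-threshold rule follows. The specific threshold values, learning rates, and the binary pre/post encoding are illustrative assumptions, not the authors' parameters.

```python
import numpy as np

# Two-threshold plasticity in the spirit described above: potentiate when pre-
# and postsynaptic activity are both high; depress when presynaptic activity is
# high and postsynaptic activation lies between a lower threshold and the
# potentiation threshold. All constants below are illustrative assumptions.
THETA_PLUS, THETA_MINUS = 0.8, 0.3
ETA_PLUS, ETA_MINUS = 0.1, 0.05

def weight_change(pre, post):
    """Return the weight update for one pre/post activity pair."""
    if pre > 0.5 and post >= THETA_PLUS:
        return ETA_PLUS * pre                          # potentiation
    if pre > 0.5 and THETA_MINUS < post < THETA_PLUS:
        return -ETA_MINUS * pre                        # depression (error correction)
    return 0.0                                         # no change otherwise

print(weight_change(1.0, 0.9))   #  0.1  -> strengthen
print(weight_change(1.0, 0.5))   # -0.05 -> weaken a false-positive response
print(weight_change(1.0, 0.1))   #  0.0  -> below the lower threshold
```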


2009 ◽  
Vol 72 (10-12) ◽  
pp. 2477-2482 ◽  
Author(s):  
Alexander Goltsev ◽  
Vladimir Gritsenko

2006 ◽  
Vol 02 (03) ◽  
pp. 237-253 ◽  
Author(s):  
AMMAR BELATRECHE ◽  
LIAM P. MAGUIRE ◽  
MARTIN MCGINNITY ◽  
QING XIANG WU

Unlike traditional artificial neural networks (ANNs), which use a high abstraction of real neurons, spiking neural networks (SNNs) offer a biologically plausible model of realistic neurons. They differ from classical ANNs in that SNNs handle and communicate information through the timing of individual pulses, an important feature of neuronal systems that is ignored by models based on rate-coding schemes. However, in order to make the most of these realistic neuronal models, good training algorithms are required. Most existing learning paradigms tune the synaptic weights in an unsupervised way using an adaptation of the well-known Hebbian learning rule, which is based on the correlation between pre- and post-synaptic neuron activity. Nonetheless, supervised learning is more appropriate when prior knowledge about the desired outcome of the network is available. In this paper, a new approach for supervised training is presented with a biologically plausible architecture. An adapted evolutionary strategy (ES) is used for adjusting the synaptic strengths and delays, which underlie the learning and memory processes in the nervous system. The algorithm is applied to complex non-linearly separable problems, and the results show that the network is able to learn successfully by means of temporal encoding of the presented patterns.
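The sketch below shows the general shape of a (mu, lambda) evolutionary strategy tuning a vector of synaptic weights and delays. The fitness function is a stand-in; in the paper's setting it would simulate the SNN and score its output spike timing. Population sizes, mutation strength, and the placeholder objective are assumptions.

```python
import numpy as np

# Minimal (mu, lambda) evolutionary-strategy sketch for tuning synaptic weights
# and delays; the fitness function is a placeholder for an SNN simulation.
rng = np.random.default_rng(1)
N_SYNAPSES, MU, LAMBDA, SIGMA = 10, 5, 20, 0.1

def fitness(individual):
    weights, delays = individual[:N_SYNAPSES], individual[N_SYNAPSES:]
    # Placeholder objective with assumed target values; replace with a run of
    # the spiking network scored against the desired output spike times.
    return -np.sum((weights - 0.5) ** 2) - np.sum((delays - 2.0) ** 2)

population = rng.normal(0.5, 0.5, size=(MU, 2 * N_SYNAPSES))
for generation in range(100):
    parents = population[rng.integers(0, MU, size=LAMBDA)]
    offspring = parents + rng.normal(0.0, SIGMA, size=parents.shape)  # mutate
    scores = np.array([fitness(ind) for ind in offspring])
    population = offspring[np.argsort(scores)[-MU:]]                  # keep best mu

best = population[np.argmax([fitness(ind) for ind in population])]
print("best fitness:", fitness(best))
```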


2011 ◽  
Vol 225-226 ◽  
pp. 479-482
Author(s):  
Min Xia ◽  
Ying Cao Zhang ◽  
Xiao Ling Ye

Nonlinear function constitution and dynamic synapses are proposed to counter spurious states in the Hopfield neural network. The model of the dynamical connection weights and the updating scheme for the neuron states are given. The nonlinear function constitution improves on the conventional Hebbian learning rule, which uses a linear outer-product method. Simulation results show that both the nonlinear function constitution and dynamic synapses effectively increase error tolerance; furthermore, associative memory with the new method both enlarges the basins of attraction and increases the storage capacity.
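For orientation, the sketch below contrasts the conventional linear outer-product storage with one possible nonlinear variant. The tanh saturation is only an illustrative stand-in, not the specific construction or the dynamic-synapse scheme used in the paper; all sizes and noise levels are assumptions.

```python
import numpy as np

# Conventional outer-product Hebbian storage vs. a saturating nonlinearity
# applied to the weights; illustrative only, not the paper's construction.
N, P = 200, 30
rng = np.random.default_rng(2)
patterns = rng.choice([-1.0, 1.0], size=(P, N))

W_linear = patterns.T @ patterns / N                 # linear outer-product rule
W_nonlinear = np.tanh(patterns.T @ patterns / N)     # assumed nonlinear variant
np.fill_diagonal(W_linear, 0.0)
np.fill_diagonal(W_nonlinear, 0.0)

def recall(W, cue, steps=10):
    s = cue.copy()
    for _ in range(steps):
        s = np.sign(W @ s)
    return s

cue = patterns[0] * np.where(rng.random(N) < 0.1, -1.0, 1.0)   # 10% flipped bits
print("linear overlap:   ", recall(W_linear, cue) @ patterns[0] / N)
print("nonlinear overlap:", recall(W_nonlinear, cue) @ patterns[0] / N)
```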


2017 ◽  
Vol 7 (4) ◽  
pp. 257-264 ◽  
Author(s):  
Toshifumi Minemoto ◽  
Teijiro Isokawa ◽  
Haruhiko Nishimura ◽  
Nobuyuki Matsui

The Hebbian learning rule is well known as a memory-storing scheme for associative memory models. This scheme is simple and fast; however, its performance degrades when the memory patterns are not orthogonal to each other. Pseudo-orthogonalization is a decorrelating method for memory patterns that uses XNOR masking between the memory patterns and randomly generated patterns. By combining this method with the Hebbian learning rule, the storage capacity of an associative memory for non-orthogonal patterns is improved without high computational cost. The memory patterns can also be retrieved by a simulated-annealing method using an external stimulus pattern. By utilizing complex numbers and quaternions, the pseudo-orthogonalization can be extended to complex-valued and quaternionic Hopfield neural networks. In this paper, the extended pseudo-orthogonalization methods for associative memories based on complex numbers and quaternions are examined from the viewpoint of correlations in the memory patterns. We show that the method has stable recall performance on highly correlated memory patterns compared to the conventional real-valued method.
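A hedged sketch of the real-valued case is given below: a fraction of each bipolar pattern's sites is combined with an independent random pattern (elementwise product, the +1/-1 analogue of XNOR), which decorrelates the stored patterns. The mixing ratio, pattern correlation, and sizes are assumptions.

```python
import numpy as np

# Pseudo-orthogonalization sketch for bipolar (+1/-1) patterns: XNOR-like mixing
# of a random fraction of sites with an independent random pattern reduces the
# correlations between stored patterns. All parameters here are assumptions.
N, P, mix_ratio = 400, 10, 0.5
rng = np.random.default_rng(3)

base = rng.choice([-1.0, 1.0], size=N)
# Deliberately correlated memory patterns: each is a noisy copy of `base`.
patterns = np.array([base * np.where(rng.random(N) < 0.1, -1.0, 1.0) for _ in range(P)])

masks = rng.choice([-1.0, 1.0], size=(P, N))          # independent random patterns
mix = rng.random((P, N)) < mix_ratio                  # sites chosen for decorrelation
pseudo = np.where(mix, patterns * masks, patterns)    # XNOR-like mixing in +-1 coding

def mean_abs_correlation(X):
    C = X @ X.T / N
    return np.abs(C[~np.eye(P, dtype=bool)]).mean()

print("mean |correlation| before:", mean_abs_correlation(patterns))
print("mean |correlation| after: ", mean_abs_correlation(pseudo))
```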


2008 ◽  
Vol 20 (12) ◽  
pp. 2937-2966 ◽  
Author(s):  
Benoît Siri ◽  
Hugues Berry ◽  
Bruno Cessac ◽  
Bruno Delord ◽  
Mathias Quoy

We present a mathematical analysis of the effects of Hebbian learning in random recurrent neural networks, with a generic Hebbian learning rule including passive forgetting and different timescales for neuronal activity and learning dynamics. Previous numerical work has reported that Hebbian learning drives the system from chaos to a steady state through a sequence of bifurcations. Here, we interpret these results mathematically and show that these effects, involving a complex coupling between neuronal dynamics and synaptic graph structure, can be analyzed using Jacobian matrices, which introduce both a structural and a dynamical point of view on neural network evolution. Furthermore, we show that sensitivity to a learned pattern is maximal when the largest Lyapunov exponent is close to 0. We discuss how neural networks may take advantage of this regime of high functional interest.
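The sketch below illustrates the basic ingredients of such a rule: fast rate dynamics alternating with a slow synaptic update consisting of a passive decay plus a Hebbian outer-product term. The decay constant, learning rate, and timescale separation are assumptions, not the paper's values.

```python
import numpy as np

# Generic Hebbian update with passive forgetting in a random recurrent rate
# network; the synaptic update runs on a slower timescale than the activity.
N, steps_fast, epochs = 100, 50, 200
LAMBDA, ALPHA = 0.99, 0.01                     # passive forgetting, learning rate
rng = np.random.default_rng(4)
W = rng.normal(0.0, 1.0 / np.sqrt(N), size=(N, N))
x = rng.random(N)

for _ in range(epochs):
    for _ in range(steps_fast):                # fast neuronal dynamics
        x = np.tanh(W @ x)
    # slow synaptic dynamics: decay plus Hebbian (outer-product) term
    W = LAMBDA * W + (ALPHA / N) * np.outer(x, x)

print("spectral radius after learning:", np.max(np.abs(np.linalg.eigvals(W))))
```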


Author(s):  
Enrique Mérida-Casermeiro ◽  
Domingo López-Rodríguez ◽  
J.M. Ortiz-de-Lazcano-Lobato

In this chapter, two important issues concerning associative memory in neural networks are studied: a new model of Hebbian learning, and the effect of the network capacity when retrieving patterns and performing clustering tasks. In particular, an explanation of the energy function is given when the capacity is exceeded: the limitation in pattern storage implies that similar patterns will be identified by the network, thereby forming different clusters. This ability can be understood as unsupervised learning of pattern clusters, with one major advantage over most clustering algorithms: the number of data classes is learned automatically, as confirmed by the experiments. Two methods to reinforce learning are proposed to improve the quality of the clustering by enhancing the learning of pattern relationships. As a related issue, a study of the network capacity, depending on the number of neurons and possible outputs, is presented, and some interesting conclusions are drawn.


2020 ◽  
Vol 117 (47) ◽  
pp. 29948-29958
Author(s):  
Maxwell Gillett ◽  
Ulises Pereira ◽  
Nicolas Brunel

Sequential activity has been observed in multiple neuronal circuits across species, neural structures, and behaviors. It has been hypothesized that sequences could arise from learning processes. However, it is still unclear whether biologically plausible synaptic plasticity rules can organize neuronal activity to form sequences whose statistics match experimental observations. Here, we investigate temporally asymmetric Hebbian rules in sparsely connected recurrent rate networks and develop a theory of the transient sequential activity observed after learning. These rules transform a sequence of random input patterns into synaptic weight updates. After learning, recalled sequential activity is reflected in the transient correlation of network activity with each of the stored input patterns. Using mean-field theory, we derive a low-dimensional description of the network dynamics and compute the storage capacity of these networks. Multiple temporal characteristics of the recalled sequential activity are consistent with experimental observations. We find that the degree of sparseness of the recalled sequences can be controlled by nonlinearities in the learning rule. Furthermore, sequences maintain robust decoding, but display highly labile dynamics, when synaptic connectivity is continuously modified due to noise or storage of other patterns, similar to recent observations in hippocampus and parietal cortex. Finally, we demonstrate that our results also hold in recurrent networks of spiking neurons with separate excitatory and inhibitory populations.
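A hedged sketch of a temporally asymmetric Hebbian rule of this kind follows: each weight update couples the next pattern in the sequence on the postsynaptic side to the current pattern on the presynaptic side, so that recall unfolds in time. The connectivity level, nonlinearities f and g, and normalization are assumptions.

```python
import numpy as np

# Temporally asymmetric Hebbian storage in a sparsely connected rate network:
# W accumulates outer products f(xi^{mu+1}) g(xi^{mu})^T, so a cue at pattern 0
# drives transient overlaps with the later patterns. Parameters are assumptions.
N, P, c = 500, 8, 0.2
rng = np.random.default_rng(5)
patterns = rng.normal(0.0, 1.0, size=(P, N))
C = (rng.random((N, N)) < c).astype(float)         # sparse connectivity mask

f = g = np.tanh                                    # post-/presynaptic nonlinearities
W = np.zeros((N, N))
for mu in range(P - 1):
    W += np.outer(f(patterns[mu + 1]), g(patterns[mu]))   # asymmetric: mu -> mu+1
W *= C / (c * N)

r = patterns[0].copy()
for t in range(1, P):                              # recall the stored sequence
    r = np.tanh(W @ r)
    overlap = r @ patterns[t] / N
    print(f"step {t}: overlap with pattern {t} = {overlap:.3f}")
```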

