LEARNING RULES FOR ASSOCIATIVE MEMORIES

1991 ◽  
Vol 05 (30) ◽  
pp. 1963-1972 ◽  
Author(s):  
Eduardo G. Vergini ◽  
Marcelo G. Blatt

We discuss some of the most popular learning rules that can be used to construct neural networks that act as associative memories. Hebb's rule, perceptron-type algorithms, and the projector rule, together with its local versions, are included.
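To illustrate the first of the rules surveyed above, here is a minimal sketch of Hebb's rule for an associative memory, using a Hopfield-style network with ±1 units. The network size, number of patterns, and update schedule are illustrative choices, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 64, 3                       # neurons and stored patterns (example sizes)
patterns = rng.choice([-1, 1], size=(P, N))

# Hebb's rule: sum of outer products of the stored patterns, no self-coupling
W = (patterns.T @ patterns) / N
np.fill_diagonal(W, 0.0)

def recall(state, steps=10):
    """Synchronous sign updates until a fixed point (or the step limit)."""
    s = state.copy()
    for _ in range(steps):
        nxt = np.where(W @ s >= 0, 1, -1)
        if np.array_equal(nxt, s):
            break
        s = nxt
    return s

# Corrupt a stored pattern by flipping a few bits, then retrieve it
probe = patterns[0].copy()
flip = rng.choice(N, size=5, replace=False)
probe[flip] *= -1
retrieved = recall(probe)
```

With only a few stored patterns relative to the network size, the corrupted probe is attracted back to the stored pattern; the perceptron-type and projector rules mentioned in the abstract replace the outer-product weights with iteratively or exactly computed ones.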

1995 ◽  
Vol 7 (3) ◽  
pp. 507-517 ◽  
Author(s):  
Marco Idiart ◽  
Barry Berk ◽  
L. F. Abbott

Model neural networks can perform dimensional reductions of input data sets using correlation-based learning rules to adjust their weights. Simple Hebbian learning rules lead to an optimal reduction at the single unit level but result in highly redundant network representations. More complex rules designed to reduce or remove this redundancy can develop optimal principal component representations, but they are not very compelling from a biological perspective. Neurons in biological networks have restricted receptive fields limiting their access to the input data space. We find that, within this restricted receptive field architecture, simple correlation-based learning rules can produce surprisingly efficient reduced representations. When noise is present, the size of the receptive fields can be optimally tuned to maximize the accuracy of reconstructions of input data from a reduced representation.
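The single-unit optimal reduction described above is usually illustrated with Oja's rule, a normalized Hebbian rule whose weight vector converges to the leading principal component of the input data. The following sketch (with made-up 2-D correlated inputs, not the authors' model) compares the learned weights against the leading eigenvector of the input covariance:

```python
import numpy as np

rng = np.random.default_rng(1)

# Correlated 2-D inputs: variance is largest along the 45-degree direction
n_samples = 5000
z = rng.normal(size=(n_samples, 2)) * np.array([3.0, 0.5])
R = np.array([[np.cos(np.pi / 4), -np.sin(np.pi / 4)],
              [np.sin(np.pi / 4),  np.cos(np.pi / 4)]])
X = z @ R.T

# Oja's rule: Hebbian growth with an implicit weight-normalising decay term
w = rng.normal(size=2)
eta = 0.01
for x in X:
    y = w @ x                      # unit output
    w += eta * y * (x - y * w)     # Hebbian term minus normalisation

# Compare the learned direction with the leading covariance eigenvector
cov = X.T @ X / n_samples
evals, evecs = np.linalg.eigh(cov)
v1 = evecs[:, np.argmax(evals)]
alignment = abs(w @ v1) / np.linalg.norm(w)
```

After training, `alignment` is close to 1, i.e. the single unit performs the optimal one-dimensional reduction; the redundancy issue raised in the abstract arises only when many such units see the same inputs.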


2019 ◽  
Vol 33 (28) ◽  
pp. 1950343 ◽  
Author(s):  
Zhilian Yan ◽  
Youmei Zhou ◽  
Xia Huang ◽  
Jianping Zhou

This paper addresses the issue of finite-time boundedness for time-delay neural networks with external disturbances via weight learning. With the help of a group of inequalities combined with Lyapunov theory, weight learning rules are devised to ensure that the neural networks are finite-time bounded for the fixed connection weight matrix case and the fixed delayed connection weight matrix case, respectively. Sufficient conditions for the existence of the desired learning rules are presented in the form of linear matrix inequalities, which can easily be verified with MATLAB. It is shown that the proposed learning rules also guarantee the finite-time stability of the time-delay neural networks. Finally, a numerical example is employed to show the applicability of the devised weight learning rules.


1991 ◽  
Vol 22 (11) ◽  
pp. 71-90 ◽  
Author(s):  
James E. Moore ◽  
Moon Kim ◽  
Jong-Gook Seo ◽  
Ying Wu ◽  
Robert Kalaba
