Saturation Level of the Hopfield Model for Neural Network

1986 ◽ Vol 2 (4) ◽ pp. 337-341 ◽ Author(s): A. Crisanti, D. J. Amit, H. Gutfreund

2007 ◽ Vol 19 (4) ◽ pp. 956-973 ◽ Author(s): D. Dominguez, K. Koroutchev, E. Serrano, F. B. Rodríguez

A wide range of networks, including those with small-world topology, can be modeled by the connectivity ratio and the randomness of their links. Both the learning and attractor abilities of a neural network can be measured by the mutual information (MI), expressed as a function of the load and of the overlap between the patterns and the retrieval states. In this letter, we use the MI to search for the topology of an Amari-Hopfield model that is optimal with respect to its storage and attractor properties. We find that while optimal storage implies an extremely diluted topology, a large basin of attraction requires moderate levels of connectivity. The optimal topology is related to the clustering and path length of the network. We also build a phase diagram of the dynamics for random or local initial overlaps and show that very diluted networks lose their attractor ability.
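
For unbiased binary patterns, the per-neuron MI between a stored pattern and the retrieval state reduces to i = 1 − H₂((1 + m)/2), where m is the overlap and H₂ is the binary entropy. The sketch below is a minimal illustration of this measure, assuming symmetric random dilution, parallel zero-temperature dynamics, and illustrative values of the size N, load P, and connectivity ratio gamma; it is not the authors' exact model, which also covers biased patterns and local/small-world link rewiring.

```python
import numpy as np

rng = np.random.default_rng(0)

def hopfield_mi_sketch(N=1000, P=50, gamma=0.2, flip_frac=0.2, steps=20):
    """Diluted Hebbian Hopfield net; returns the overlap m and the
    per-neuron mutual information i = 1 - H2((1 + m) / 2)."""
    xi = rng.choice([-1, 1], size=(P, N))        # unbiased random patterns
    mask = rng.random((N, N)) < gamma            # keep a fraction gamma of links
    np.fill_diagonal(mask, False)                # no self-couplings
    J = (xi.T @ xi) / N * mask                   # diluted Hebbian couplings

    s = xi[0].copy()                             # start near pattern 0 ...
    s[rng.random(N) < flip_frac] *= -1           # ... with some bits flipped
    for _ in range(steps):                       # parallel zero-T dynamics
        s = np.where(J @ s >= 0, 1, -1)

    m = float(np.mean(xi[0] * s))                # overlap with pattern 0
    p = (1.0 + abs(m)) / 2.0                     # P(unit matches pattern bit)
    h2 = 0.0 if p in (0.0, 1.0) else -(p * np.log2(p) + (1 - p) * np.log2(1 - p))
    return m, 1.0 - h2                           # MI in bits per neuron

m, mi = hopfield_mi_sketch()
print(f"overlap m = {m:.3f}, MI ≈ {mi:.3f} bits/neuron")
```

Sweeping gamma in such a sketch is one way to reproduce the qualitative trade-off described above: the MI per link peaks at strong dilution, while retrieval from noisy initial states favours moderate connectivity.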


1992 ◽ Vol 03 (supp01) ◽ pp. 303-308 ◽ Author(s): Giuseppe Barbagli, Guido Castellini, Gregorio Landi, Stefano Vettori

We have investigated the problem of track finding with a recurrent neural network algorithm based on the Hopfield model and considered the possibility of a hardware implementation with DSPs. Starting from a set of signal points, we define track segments and impose a cut on their length to keep the size of the network reasonable. The segments surviving the cut are associated with neurons. A geometric coupling between neighbouring segments is used to select smooth combinations of them. Given random initial conditions, the network converges to a solution. The method may be applied to a variety of curves.
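
One common realization of such a segment network is sketched below. It is a minimal illustration under assumed geometry: the hit list, the length cut max_len, the cubic angular coupling, and the bias and temperature values are all illustrative choices, not the parameters used in the paper. Segments surviving the cut become analog neurons, smoothly aligned head-to-tail segments excite each other, and mean-field updates relax the network until aligned chains switch on.

```python
import numpy as np
from itertools import combinations

rng = np.random.default_rng(1)

def track_finding_sketch(hits, max_len=2.0, steps=200, temp=0.5, bias=0.1):
    """Segment-based Hopfield track finder: hit pairs shorter than the
    length cut become neurons; aligned head-to-tail segments couple."""
    segs = [(a, b) for a, b in combinations(range(len(hits)), 2)
            if np.linalg.norm(hits[a] - hits[b]) < max_len]
    n = len(segs)
    W = np.zeros((n, n))
    for i, (a, b) in enumerate(segs):
        for j, (c, d) in enumerate(segs):
            if i != j and b == c:                        # head-to-tail pair
                u, v = hits[b] - hits[a], hits[d] - hits[c]
                cos = u @ v / (np.linalg.norm(u) * np.linalg.norm(v))
                W[i, j] = W[j, i] = max(cos, 0.0) ** 3   # reward smoothness
    V = rng.random(n)                                    # random initial state
    for _ in range(steps):                               # mean-field relaxation
        V = 0.5 * (1.0 + np.tanh((W @ V - bias) / temp))
    return [segs[i] for i in np.flatnonzero(V > 0.5)]   # active segments

hits = np.array([[0.0, 0.0], [1.0, 0.1], [2.0, 0.15], [0.0, 1.0], [1.0, 1.6]])
print(track_finding_sketch(hits))
```

Because each update is a dense matrix-vector product followed by a pointwise nonlinearity, this relaxation maps naturally onto DSP hardware, which is what motivates the implementation question raised in the abstract.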


1999 ◽ Vol 28 (2) ◽ pp. 89-96 ◽ Author(s): R. Ramachandran, N. Gunasekaran

2001 ◽ Vol 6 (2) ◽ pp. 129-136 ◽ Author(s): Jiyang Dong, Shenchu Xu, Zhenxiang Chen, Boxi Wu

The discrete Hopfield neural network (DHNN) is studied by performing permutation operations on the synaptic weight matrix. We study the set of patterns that can be stored with the Hebbian learning algorithm without loss of memories, and propose a condition ensuring that all patterns in this storable set have the same basin size of attraction. The permutation symmetries of the network are then analysed in relation to the stored pattern set. A construction of a storable pattern set satisfying the condition is obtained by requiring its invariance under a point group.
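
A minimal sketch of the symmetry argument follows, using a cyclic shift group as an illustrative stand-in for the authors' point-group construction (the group choice and sizes here are assumptions, not their exact setup). The stored set is generated as the orbit of a seed pattern under the permutation; since the set is invariant under the group, conjugating the Hebbian weight matrix by the generating permutation leaves it unchanged, which forces all orbit patterns to have basins of identical size.

```python
import numpy as np

def hebbian(patterns):
    """Hebbian weight matrix of a discrete Hopfield network (DHNN)."""
    P, N = patterns.shape
    W = patterns.T @ patterns / N
    np.fill_diagonal(W, 0.0)                     # no self-couplings
    return W

# Build the stored set as the orbit of one seed pattern under a cyclic
# shift; the shift plays the role of an (illustrative) group generator.
N, shift = 12, 3
rng = np.random.default_rng(2)
seed = rng.choice([-1, 1], size=N)
orbit = np.array([np.roll(seed, k * shift) for k in range(N // shift)])

W = hebbian(orbit)

# Conjugating W by the generating permutation leaves it invariant, so
# every pattern in the orbit sits in a basin of the same size.
P_shift = np.roll(np.eye(N), shift, axis=0)      # permutation matrix
print(np.allclose(P_shift @ W @ P_shift.T, W))   # -> True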

