Implementation of an Associative Memory using a Restricted Hopfield Network

Author(s):  
Tet Yeap

A trainable analog Restricted Hopfield Network is presented in this paper. It consists of two layers of nodes, visible and hidden, connected by weighted directional paths forming a bipartite graph with no intralayer connections. An energy (Lyapunov) function is derived to show that the proposed network converges to stable states. The network can be trained using either a modified SPSA algorithm or BPTT, with all weights constrained to be symmetric. Simulation results show that the presence of hidden nodes increases the network's memory capacity. Using EXOR as an example, the network can be trained to act as a dynamic classifier. Trained on the characters A, U, T, and S, the network serves as an associative memory. Simulation results show that the network can perfectly re-create images from noisy inputs, with higher noise tolerance than the standard Hopfield Network and the Restricted Boltzmann Machine. The results also illustrate the importance of feedback iteration when an associative memory reconstructs images from noisy inputs.
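
As a rough illustration of the bipartite dynamics described above, the minimal Python sketch below alternates sign updates between a visible and a hidden layer that share a single weight matrix W (used in both directions, so the connections are effectively symmetric), and evaluates a Lyapunov-style energy E = -v^T W h. The layer sizes and random weights are placeholder assumptions; the paper's SPSA/BPTT-trained weights are not reproduced.

```python
# Minimal sketch of a bipartite ("restricted") Hopfield network: visible and
# hidden layers linked by one shared weight matrix, no intralayer connections.
# Weights here are random placeholders, not the paper's trained weights.
import numpy as np

rng = np.random.default_rng(0)
n_visible, n_hidden = 8, 4
W = rng.normal(scale=0.1, size=(n_visible, n_hidden))  # one matrix, both directions

def sgn(x):
    # Bipolar sign with the convention sgn(0) = +1.
    return np.where(x >= 0, 1.0, -1.0)

def energy(v, h):
    # Lyapunov-style energy of the bipartite network: E = -v^T W h.
    return -v @ W @ h

def recall(v, n_iter=20):
    # Alternate visible/hidden sign updates until the state settles.
    h = sgn(W.T @ v)
    for _ in range(n_iter):
        v_new = sgn(W @ h)
        h_new = sgn(W.T @ v_new)
        if np.array_equal(v_new, v) and np.array_equal(h_new, h):
            break
        v, h = v_new, h_new
    return v, h

v0 = rng.choice([-1.0, 1.0], size=n_visible)  # a noisy probe pattern
v_star, h_star = recall(v0)
print(energy(v0, sgn(W.T @ v0)), energy(v_star, h_star))  # energy never rises
```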

1999 ◽  
Vol 09 (04) ◽  
pp. 351-370 ◽  
Author(s):  
M. SREENIVASA RAO ◽  
ARUN K. PUJARI

A new paradigm of neural network architecture is proposed that works as an associative memory with capabilities of pruning and order-sensitive learning. The network has a composite structure in which each node is itself a Hopfield network. Each Hopfield network employs an order-sensitive learning technique and converges to user-specified stable states without any spurious states; this follows from the geometrical structure of the network and of its energy function. The network is designed to allow pruning in binary order as it progressively carries out associative memory retrieval. The capacity of the network is 2^n, where n is the number of basic nodes in the network. The capabilities of the network are demonstrated by experiments in three different application areas, namely a Library Database, a Protein Structure Database, and Natural Language Understanding.
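
The abstract does not spell out the composite node structure, so it is not reconstructed here. As a classical point of comparison for the "user-specified stable states" property, the sketch below uses the standard projection (pseudo-inverse) learning rule, which makes any set of linearly independent bipolar patterns exact fixed points of a Hopfield network; this is textbook material, not the authors' construction.

```python
# Projection (pseudo-inverse) rule: W = X^T (X X^T)^{-1} X, which satisfies
# W x = x exactly for every stored pattern x, so each one is a fixed point.
# Requires the P stored patterns to be linearly independent (P <= N).
import numpy as np

rng = np.random.default_rng(1)
N, P = 12, 4
X = rng.choice([-1.0, 1.0], size=(P, N))  # user-specified patterns (rows)

W = X.T @ np.linalg.inv(X @ X.T) @ X

for x in X:
    assert np.array_equal(np.where(W @ x >= 0, 1.0, -1.0), x)
print("all stored patterns are stable states")
```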


2016 ◽  
pp. 57-86
Author(s):  
Hiromi Miyajima ◽  
Shuji Yatsuki ◽  
Noritaka Shigei ◽  
Hirofumi Miyajima

Higher order neural networks (HONNs) have been proposed as new systems. In this paper, we show some theoretical results on the associative capability of HONNs. Among them, the memory capacity of HONNs is much larger than that of conventional neural networks. Further, we show theoretical results on homogeneous higher order neural networks (HHONNs), in which each neuron has identical weights. HHONNs can realize shift-invariant associative memory; that is, they can associate not only a memorized pattern but also its shifted versions.
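
As a generic illustration of a second-order HONN acting as an associative memory, the sketch below uses the usual Hebbian-style generalization w_ijk = sum_p x_i^p x_j^p x_k^p with updates s_i = sgn(sum_jk w_ijk s_j s_k). The pattern set and sizes are arbitrary assumptions, and the HHONN shift-invariant construction is not reproduced.

```python
# Second-order HONN associative memory with a Hebbian-style weight tensor.
# O(N^3) memory for the tensor, so keep N small in this toy example.
import numpy as np

rng = np.random.default_rng(2)
N, P = 16, 6
patterns = rng.choice([-1.0, 1.0], size=(P, N))

# w[i,j,k] = sum over patterns p of x[p,i] * x[p,j] * x[p,k]
W = np.einsum('pi,pj,pk->ijk', patterns, patterns, patterns)

def sgn(x):
    return np.where(x >= 0, 1.0, -1.0)

def recall(s, n_iter=10):
    # s_i <- sgn( sum_{j,k} w[i,j,k] s_j s_k ), iterated to a fixed point.
    for _ in range(n_iter):
        s_new = sgn(np.einsum('ijk,j,k->i', W, s, s))
        if np.array_equal(s_new, s):
            break
        s = s_new
    return s

# Flip 3 bits of the first pattern and try to recover it.
probe = patterns[0].copy()
probe[:3] *= -1
print(np.array_equal(recall(probe), patterns[0]))  # True in the typical case
```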


2014 ◽  
Vol 1079-1080 ◽  
pp. 207-211
Author(s):  
Min He ◽  
Rui Guang Hu ◽  
Shi Le ◽  
Liang Chen

In this paper, a four-grade ideal evaluation corresponding to the level of the health state of bridges is established according to the ten most important evaluation indicators. Combined with the associative memory capacity of discrete Hopfield neural networks, a new health-state evaluation of bridges is presented. Five bridges are evaluated by the model; the network connection weights are obtained by iterative learning using the outer product method. The simulation results show that the health evaluation model can evaluate the health state of bridges quickly, accurately, and intuitively.
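
A minimal sketch of the outer-product (Hebbian) storage the abstract refers to is given below: each ideal grade is stored as a bipolar prototype over the evaluation indicators, and a measured indicator vector is classified by letting the discrete Hopfield dynamics settle onto a stored grade. The four random prototypes over ten indicators are placeholders, not the paper's bridge data.

```python
# Discrete Hopfield classifier: grades stored by the outer product rule,
# noisy indicator vectors recalled to the nearest stored grade.
import numpy as np

rng = np.random.default_rng(3)
n_indicators, n_grades = 10, 4
grades = rng.choice([-1.0, 1.0], size=(n_grades, n_indicators))  # ideal grades

# Outer-product storage with zero diagonal (standard discrete Hopfield rule).
W = grades.T @ grades
np.fill_diagonal(W, 0.0)

def classify(x, n_iter=20):
    s = x.copy()
    for _ in range(n_iter):
        s_new = np.where(W @ s >= 0, 1.0, -1.0)
        if np.array_equal(s_new, s):
            break
        s = s_new
    # Report which stored grade the network settled on (None if spurious).
    matches = [np.array_equal(s, g) for g in grades]
    return matches.index(True) if any(matches) else None

noisy = grades[1].copy()
noisy[0] *= -1          # one corrupted indicator
print(classify(noisy))  # ideally 1; with so few units, crosstalk may interfere
```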


2013 ◽  
Vol 341-342 ◽  
pp. 333-336
Author(s):  
Ming Zhen Hu ◽  
Bo Zeng Wu ◽  
Jin Quan Chen ◽  
Ji Shu Zeng

To study the flotation characteristics of the low-tin complex sulfide ore of the Guangxi Dachang mine, the computational fluid dynamics software FLUENT was applied to simulate the turbulence intensity of the slurry in the flotation machine at different inflation pressures, yielding the flow-field characteristics of the flotation machine. Simulation results show that the best inflation pressure is 120,000 Pa (120 kPa).


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Nathan Eli Miller ◽  
Saibal Mukhopadhyay

In this work, we present a Quantum Hopfield Associative Memory (QHAM) and demonstrate its capabilities in simulation and hardware using the IBM Quantum Experience. The QHAM is based on a quantum neuron design that can be utilized for many different machine learning applications and can be implemented on real quantum hardware without requiring mid-circuit measurement or reset operations. We analyze the accuracy of the neuron and of the full QHAM under hardware errors, both via simulation with hardware noise models and via implementation on the 15-qubit ibmq_16_melbourne device. The quantum neuron and the QHAM are shown to be resilient to noise and to require low qubit overhead and gate complexity. We benchmark the QHAM by testing its effective memory capacity and demonstrate its capabilities in the NISQ era of quantum hardware. This demonstration of the first functional QHAM implemented on NISQ-era quantum hardware is a significant step for machine learning at the leading edge of quantum computing.
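
The QHAM's specific neuron circuit is not reproduced here. As a loose, hypothetical illustration of the general idea behind rotation-based quantum neurons, the Qiskit sketch below accumulates a weighted input sum as an RY rotation on a single qubit and reads the |1> probability as the activation; every name and parameter in it is an assumption for illustration only, not the paper's QHAM design.

```python
# Hypothetical rotation-based quantum neuron sketch (generic, NOT the QHAM):
# the weighted input sum sets an RY angle, and P(|1>) acts as the activation.
import numpy as np
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

def neuron_activation(inputs, weights):
    theta = float(np.dot(inputs, weights))  # weighted sum of bipolar inputs
    qc = QuantumCircuit(1)
    qc.ry(2 * theta, 0)                     # encode the sum as a rotation
    probs = Statevector.from_instruction(qc).probabilities()
    return probs[1]                         # P(|1>) = sin^2(theta)

print(neuron_activation([1, -1, 1], [0.4, 0.3, 0.5]))
```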


Entropy ◽  
2019 ◽  
Vol 21 (8) ◽  
pp. 726 ◽  
Author(s):  
Giorgio Gosti ◽  
Viola Folli ◽  
Marco Leonetti ◽  
Giancarlo Ruocco

In a neural network, an autapse is a particular kind of synapse that links a neuron onto itself. Autapses are almost never allowed, in either artificial or biological neural networks. Moreover, redundant or similar stored states tend to interact destructively. This paper shows how autapses, together with stable-state redundancy, can improve the storage capacity of a recurrent neural network. Recent research shows that in an N-node Hopfield neural network with autapses, the number of stored patterns P is not limited to the well-known bound 0.14N, as it is for networks without autapses. More precisely, as the number of stored patterns increases well beyond the 0.14N threshold, for P much greater than N, the retrieval error asymptotically approaches a value below unity. This reduction of retrieval errors allows a number of stored memories that largely exceeds what was previously considered possible. Unfortunately, subsequent results showed that, in the thermodynamic limit, a network with autapses in this high-storage regime has basins of attraction that shrink to a single state: for each stable state associated with a stored memory, even a single bit error in the initial pattern leads the system to a stationary state associated with a different memory. This limits the potential use of this kind of Hopfield network as an associative memory. This paper presents a strategy to overcome this limitation by improving the error-correcting characteristics of the Hopfield neural network. The proposed strategy forms what we call an absorbing neighborhood of states surrounding each stored memory: a set defined by a Hamming distance around a network state, absorbing because, in the long-time limit, states inside it are drawn to the stable states in the set. We show that this strategy allows the network to store an exponential number of memory patterns, each surrounded by an absorbing neighborhood of exponentially growing size.
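
A toy comparison of retrieval with and without autapses (i.e., keeping or zeroing the diagonal of the Hebbian weight matrix) is sketched below, at a load P = 0.3N above the classical 0.14N bound. This only illustrates the setting discussed above; the paper's absorbing-neighborhood construction is not reproduced.

```python
# Hopfield retrieval error with vs. without autapses (diagonal self-couplings).
import numpy as np

rng = np.random.default_rng(4)
N, P = 100, 30  # P = 0.3N, above the classical 0.14N bound

patterns = rng.choice([-1.0, 1.0], size=(P, N))
W = patterns.T @ patterns / N          # Hebbian rule, diagonal kept (autapses)

def retrieval_error(W, n_iter=50):
    # Start from each stored pattern and measure the mean bit error
    # of the state the synchronous dynamics settle on.
    errs = []
    for xi in patterns:
        s = xi.copy()
        for _ in range(n_iter):
            s_new = np.where(W @ s >= 0, 1.0, -1.0)
            if np.array_equal(s_new, s):
                break
            s = s_new
        errs.append(np.mean(s != xi))
    return np.mean(errs)

W_no_autapse = W.copy()
np.fill_diagonal(W_no_autapse, 0.0)
print("with autapses:   ", retrieval_error(W))
print("without autapses:", retrieval_error(W_no_autapse))
```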

