M-ary Hopfield Neural Network Based Associative Memory Formulation: Limit-Cycle Based Sequence Storage and Retrieval

2021 ◽  
pp. 420-432
Author(s):  
Vandana M. Ladwani ◽  
V. Ramasubramanian
2021 ◽  
pp. 1-30
Author(s):  
Asieh Abolpour Mofrad ◽  
Samaneh Abolpour Mofrad ◽  
Anis Yazidi ◽  
Matthew Geoffrey Parker

Associative memories enjoy many interesting properties in terms of error-correction capability, robustness to noise, storage capacity, and retrieval performance, and their usage spans a large set of applications. In this letter, we investigate and extend tournament-based neural networks, originally proposed by Jiang, Gripon, Berrou, and Rabbat (2016): a novel sequence-storage associative memory architecture with high memory efficiency and accurate sequence retrieval. We propose a more general method for learning the sequences, which we call feedback tournament-based neural networks. The retrieval process is also extended to both directions, forward and backward; in other words, any large-enough segment of a sequence can reproduce the whole sequence. Furthermore, two retrieval algorithms, cache-winner and explore-winner, are introduced to increase retrieval performance. Through simulation results, we shed light on the strengths and weaknesses of each algorithm.
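The abstract's forward/backward retrieval idea can be illustrated with a minimal sketch. The sketch below assumes a ring of clusters with one winning neuron per cluster and winner-take-all retrieval; all names (`TournamentMemory`, `store`, `recall_forward`) and the cluster layout are illustrative, not taken from the paper.

```python
import numpy as np

class TournamentMemory:
    """Sketch of tournament-style sequence storage over a ring of clusters."""

    def __init__(self, clusters, neurons_per_cluster):
        self.C = clusters
        self.l = neurons_per_cluster
        # Directed forward/backward links between consecutive clusters:
        # fwd[c, i, j] links neuron i of cluster c to neuron j of cluster (c+1) % C.
        self.fwd = np.zeros((clusters, neurons_per_cluster, neurons_per_cluster), dtype=bool)
        self.bwd = np.zeros_like(self.fwd)

    def store(self, seq):
        # seq: list of neuron indices, one per visited cluster, in ring order.
        for t in range(len(seq) - 1):
            c = t % self.C
            self.fwd[c, seq[t], seq[t + 1]] = True
            self.bwd[c, seq[t + 1], seq[t]] = True  # feedback links enable backward recall

    def recall_forward(self, start_cluster, start_neuron, steps):
        out, c, n = [start_neuron], start_cluster, start_neuron
        for _ in range(steps):
            scores = self.fwd[c, n]        # support received by each candidate
            n = int(np.argmax(scores))     # winner-take-all in the next cluster
            out.append(n)
            c = (c + 1) % self.C
        return out
        # Backward recall would mirror this loop using `bwd` and decrementing c.
```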


2012 ◽  
Vol 18 (3) ◽  
pp. 279-296 ◽  
Author(s):  
Emad I Abdul Kareem ◽  
Wafaa A.H Ali Alsalihy ◽  
Aman Jantan

2013 ◽  
Vol 3 (1) ◽  
pp. 83-93
Author(s):  
Rajesh Lavania ◽  
Manu Pratap Singh

In this paper we evaluate the Hopfield neural network as an associative memory for recalling memorized patterns of handwritten Hindi SWARS using a sub-optimal genetic algorithm. In this process, the genetic algorithm is employed in a sub-optimal form to recall memorized patterns corresponding to presented noisy prototype input patterns. The sub-optimal form of the GA refers to a non-random initial population or solution: rather than a random start, the GA explores from the sum of correlated weight matrices for the input patterns of the training set. The objective of this study is to determine the optimal weight matrix for correct recall corresponding to an approximate prototype input pattern of a Hindi SWAR. The performance of the neural network is evaluated in terms of the rate of success in recalling memorized Hindi SWARS for presented approximate prototype input patterns with the GA in two aspects: the first reflects the random nature of the GA, and the second exhibits its sub-optimal nature of exploration. The simulated results demonstrate the better performance of the network in recalling the memorized Hindi SWARS when the genetic algorithm evolves the population of weights from the sub-optimal weight matrix.
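A minimal sketch of the non-random starting point described above: the Hebbian weight matrix (sum of correlated outer products) from which a GA could seed its population, together with standard Hopfield recall dynamics. Function names and the asynchronous update schedule are assumptions, not the authors' exact procedure.

```python
import numpy as np

def hebbian_weights(patterns):
    # patterns: array of shape (P, N) with bipolar (+1/-1) entries.
    P, N = patterns.shape
    W = patterns.T @ patterns / N      # sum of correlated outer products
    np.fill_diagonal(W, 0.0)           # no self-connections
    return W

def hopfield_recall(W, probe, steps=10):
    s = probe.copy()
    for _ in range(steps):
        for i in np.random.permutation(len(s)):   # asynchronous updates
            s[i] = 1 if W[i] @ s >= 0 else -1
    return s

# A GA over weights could then mutate small perturbations of W and keep the
# candidates that correctly recall more of the training patterns.
```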


2018 ◽  
Vol 3 (01) ◽  
Author(s):  
Sandeep Kumar ◽  
Manu Pratap Singh

Neural networks are among the most important models studied by researchers over the past decades. The Hopfield model, proposed by J. J. Hopfield, describes an organization of neurons that functions as an associative memory, also called a content-addressable memory. It is a recurrent network similar to the recurrent layer of the Hamming network, but it can effectively perform the operation of both layers of the Hamming network. The design of recurrent networks has always posed interesting research problems, and much work continues on present applications. In this paper we discuss the design of Hopfield neural networks (HNNs), bidirectional associative memories (BAMs), and multidirectional associative memories (MAMs) for handwritten character recognition; the recognized characters are Hindi alphabets.
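Of the three architectures named above, the BAM is the simplest to sketch. Below is a minimal bipolar BAM using the standard correlation-matrix storage rule; the helper names are illustrative.

```python
import numpy as np

def sgn(v):
    return np.where(v >= 0, 1, -1)   # bipolar sign, mapping 0 to +1

def bam_weights(X, Y):
    # X: (P, n) bipolar input patterns, Y: (P, m) bipolar associated patterns.
    return X.T @ Y                   # correlation matrix of shape (n, m)

def bam_recall(W, x, iters=5):
    y = sgn(x @ W)
    for _ in range(iters):           # bounce between layers until a stable pair
        x = sgn(W @ y)
        y = sgn(x @ W)
    return x, y
```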


Author(s):  
Vishwanathan Mohan ◽  
Yashwant V. Joshi ◽  
Anand Itagi ◽  
Garipelli Gangadhar

It is argued that weight adaptation even during the retrieval phase can greatly enhance the performance of a neurodynamic associative memory. Our simulations with an electronic implementation of an associative memory showed that extending the Hopfield dynamics with an appropriate adaptive law in the retrieval phase yields significant improvements in storage capacity and computational reliability. The weights, which encode the information stored in the Hopfield neural network, are usually held constant once training/storage is complete. In our case, the weights also change during retrieval, losing information in the process but yielding much better retrieval of the stored patterns. We describe and characterize the functional elements comprising the network and the learning system, and include experimental results from applying the network to character recognition under various noisy conditions. Stability issues emerging as a consequence of retrieval-phase weight adaptation, and the implications of weights being used as transitory, intermediary variables, are briefly discussed.
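A minimal sketch of retrieval-phase weight adaptation, assuming a simple Hebbian reinforcement of the evolving state; the adaptive law and the learning rate `eta` are illustrative, not the authors' exact rule.

```python
import numpy as np

def adaptive_recall(W, probe, eta=0.01, steps=20):
    s = probe.astype(float)
    W = W.copy()
    for _ in range(steps):
        s = np.where(W @ s >= 0, 1.0, -1.0)   # usual Hopfield update
        W = W + eta * np.outer(s, s)          # adapt weights toward the current state
        np.fill_diagonal(W, 0.0)
    # The weights are transitory here: on return they reflect the retrieved
    # pattern rather than the full stored set.
    return s, W
```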


2008 ◽  
Vol 18 (02) ◽  
pp. 135-145 ◽  
Author(s):  
TEIJIRO ISOKAWA ◽  
HARUHIKO NISHIMURA ◽  
NAOTAKE KAMIURA ◽  
NOBUYUKI MATSUI

Associative memory networks based on the quaternionic Hopfield neural network are investigated in this paper. These networks are composed of quaternionic neurons: the inputs, outputs, thresholds, and connection weights are all represented as quaternions, a class of hypercomplex number systems. The energy function of the network and the Hebbian rule for embedding patterns are introduced. The stable states and their basins are explored for networks with three and four neurons. It is clarified that there exist at most 16 stable states, called multiplet components, as the degenerated stored patterns, and each of these states has its own basin in the quaternionic networks.
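The quaternionic Hebbian rule mentioned above can be sketched with quaternions stored as 4-vectors (w, x, y, z). The rule W[i, j] = Σ_p q_i^p (q_j^p)* follows the usual Hebbian form; the helpers and any scaling are assumptions for illustration.

```python
import numpy as np

def hamilton(a, b):
    # Hamilton product of two quaternions given as length-4 arrays.
    w1, x1, y1, z1 = a
    w2, x2, y2, z2 = b
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def conj(q):
    return q * np.array([1, -1, -1, -1])   # quaternionic conjugate

def quaternionic_hebbian(patterns):
    # patterns: (P, N, 4) array of quaternionic neuron states.
    P, N, _ = patterns.shape
    W = np.zeros((N, N, 4))
    for p in range(P):
        for i in range(N):
            for j in range(N):
                if i != j:
                    W[i, j] += hamilton(patterns[p, i], conj(patterns[p, j]))
    return W
```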


IEEE Access ◽  
2019 ◽  
Vol 7 ◽  
pp. 58876-58882 ◽  
Author(s):  
Deyu Kong ◽  
Shaogang Hu ◽  
Junjie Wang ◽  
Zhen Liu ◽  
Tupei Chen ◽  
...  

2015 ◽  
Vol 6 (1) ◽  
Author(s):  
S.G. Hu ◽  
Y. Liu ◽  
Z Liu ◽  
T.P. Chen ◽  
J.J. Wang ◽  
...  

2011 ◽  
Vol 225-226 ◽  
pp. 479-482
Author(s):  
Min Xia ◽  
Ying Cao Zhang ◽  
Xiao Ling Ye

Nonlinear function constitution and dynamic synapses are proposed for the Hopfield neural network to counteract spurious states. The model of the dynamical connection weights and the updating scheme for the neuron states are given. Nonlinear function constitution improves on the conventional Hebbian learning rule with its linear outer-product method. Simulation results show that both nonlinear function constitution and dynamic synapses effectively increase error tolerance; furthermore, an associative memory network using the new method both enlarges the attraction basins and increases storage capacity.
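A minimal sketch of the contrast the abstract draws: applying a saturating nonlinearity to the summed correlations instead of using them linearly. The choice of `tanh` is an assumption; the paper's exact function constitution may differ.

```python
import numpy as np

def nonlinear_hebbian(patterns, phi=np.tanh):
    # patterns: (P, N) bipolar patterns; phi is the assumed saturating function.
    P, N = patterns.shape
    W = phi(patterns.T @ patterns / N)   # nonlinear function of the linear outer-product rule
    np.fill_diagonal(W, 0.0)
    return W

# Dynamic synapses would additionally rescale W during retrieval as a
# function of the evolving network state.
```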

