Neural network associative memories with local coding

2016 ◽  
Vol 28 (8) ◽  
pp. 1553-1573 ◽  
Author(s):  
Asieh Abolpour Mofrad ◽  
Zahra Ferdosi ◽  
Matthew G. Parker ◽  
Mohammad H. Tadayon

Techniques from coding theory can improve the efficiency of neuro-inspired and neural associative memories by imposing structural constraints on the network. In this letter, the approach is to embed coding techniques into neural associative memory in order to increase its performance in the presence of partial erasures. The motivation comes from recent work by Gripon, Berrou, and coauthors, which revisited Willshaw networks and presented a neural network of interacting neurons partitioned into clusters. The model they introduced stores patterns as small cliques that can be retrieved in spite of partial erasures. We focus on improving the success of retrieval by applying two techniques: local coding within each cluster and a precoding step. We use a slightly different decoding scheme, which is appropriate for partial erasures and converges faster. Although the ideas of local coding and precoding are not new, the way we apply them is different. Simulations show an increase in the pattern retrieval capacity for both techniques. Moreover, we use self-dual additive codes over the field GF(4), which have very interesting properties and a simple graph representation.
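The clustered, clique-based storage this letter builds on can be illustrated with a minimal sketch in Python (an illustrative toy, not the authors' exact construction; the cluster count C, cluster size L, and winner-take-all retrieval rule are simplifying assumptions):

```python
import numpy as np

# Toy Gripon-Berrou-style clique memory (illustrative only).
C, L = 4, 16          # hypothetical: C clusters of L neurons each
rng = np.random.default_rng(0)

# Binary adjacency over all C*L neurons; a stored pattern selects one
# neuron per cluster and adds the clique connecting those neurons.
W = np.zeros((C * L, C * L), dtype=bool)

def store(pattern):
    """pattern[c] = index of the active neuron in cluster c."""
    idx = [c * L + p for c, p in enumerate(pattern)]
    for a in idx:
        for b in idx:
            if a != b:
                W[a, b] = True

def retrieve(partial):
    """partial[c] = neuron index in cluster c, or None if erased."""
    known = [c * L + p for c, p in enumerate(partial) if p is not None]
    out = list(partial)
    for c, p in enumerate(partial):
        if p is None:
            # Winner-take-all: pick the neuron in the erased cluster
            # with the most connections to the known neurons.
            scores = [W[c * L + j, known].sum() for j in range(L)]
            out[c] = int(np.argmax(scores))
    return out

patterns = [list(rng.integers(0, L, size=C)) for _ in range(20)]
for p in patterns:
    store(p)

probe = list(patterns[0])
probe[2] = None                    # partially erase the pattern
print(retrieve(probe), patterns[0])
```

With few stored patterns the erased cluster is recovered reliably; as more cliques share edges, retrieval errors appear, which is the regime the letter's local coding and precoding are designed to improve.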


2005 ◽  
Vol 15 (03) ◽  
pp. 181-196 ◽  
Author(s):  
CHEOLHWAN OH ◽  
STANISLAW H. ŻAK ◽  
GUISHENG ZHAI

A class of interconnected neural networks composed of generalized Brain-State-in-a-Box (gBSB) neural subnetworks is considered. Interconnected gBSB neural network architectures are proposed along with their stability conditions. The design of the interconnected networks is reduced to solving linear matrix inequalities (LMIs) to determine the interconnection parameters. A method for solving the LMIs is devised that generates solutions which are, in general, farther from zero than the corresponding solutions obtained with MATLAB's LMI toolbox, thus yielding stronger interconnections between the subnetworks. The proposed architectures are then used to construct neural associative memories. Simulations are performed to illustrate the results.
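For readers unfamiliar with the LMI step, here is a minimal feasibility sketch in Python using cvxpy (a generic Lyapunov-type LMI, not the paper's specific interconnection conditions; the matrix A and tolerance eps are arbitrary assumptions):

```python
import cvxpy as cp
import numpy as np

# Generic LMI feasibility problem: find P = P^T > 0 such that
# A^T P + P A < 0 for a given stable matrix A (illustrative only).
A = np.array([[-1.0, 0.5],
              [0.0, -2.0]])

P = cp.Variable((2, 2), symmetric=True)
eps = 1e-6  # strict inequalities enforced up to a small margin
constraints = [P >> eps * np.eye(2),
               A.T @ P + P @ A << -eps * np.eye(2)]

prob = cp.Problem(cp.Minimize(0), constraints)  # pure feasibility
prob.solve()
print(prob.status)
print(P.value)
```

In the paper's setting the decision variables encode interconnection parameters, and the authors' solver is tuned to push solutions away from zero; a generic solver like this one makes no such effort.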


Author(s):  
Donghoun Lee ◽  
Sehyun Tak ◽  
Sungjin Park ◽  
Hwasoo Yeo

In the intelligent transportation systems field, there has been growing interest in developing collision warning systems based on artificial neural network (ANN) techniques, in an effort to address several issues associated with parametric approaches. Previous ANN-based collision warning algorithms were generally based on associative memories determined before driving. However, because collision risk is strongly related to the current traffic situation, such as a traffic state transition from free flow to congestion, the associative memory should be updated in real time. To further improve the performance of the warning system, a systemic architecture is proposed to implement a multilayer perceptron neural network-based rear-end collision warning system (MCWS), which updates the associative memory with vehicle distance sensor and smartphone data in a cloud computing environment. For practical use of the proposed MCWS, its collision warning accuracy is evaluated with respect to various time intervals for updating the associative memories and various market penetration rates. Results show that the MCWS exhibits a steady improvement in warning performance as the update interval decreases, and it works more efficiently as the sampling ratio increases. When the sampling ratio reaches 50%, the MCWS shows particularly stable warning accuracy, regardless of the update interval. These findings suggest that the MCWS has great potential to provide an acceptable level of warning accuracy for practical use, as it can obtain well-trained associative memories that reflect current traffic situations by using information from widespread smartphones.
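The real-time updating idea can be sketched with a small online-learning loop in Python (synthetic data and hypothetical features; scikit-learn's MLPClassifier.partial_fit stands in for the cloud-based memory update and is not the actual MCWS architecture):

```python
import numpy as np
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)

def synthetic_batch(n=200):
    # Hypothetical features: [gap distance (m), relative speed (m/s)]
    X = np.column_stack([rng.uniform(2, 80, n), rng.uniform(-15, 15, n)])
    # Hypothetical label: warn when the gap closes quickly at short range
    y = ((X[:, 0] < 20) & (X[:, 1] < -5)).astype(int)
    return X, y

clf = MLPClassifier(hidden_layer_sizes=(16,))
X0, y0 = synthetic_batch()
clf.partial_fit(X0, y0, classes=[0, 1])   # initial associative memory

for _ in range(10):                       # one pass per update interval
    Xt, yt = synthetic_batch()            # freshly collected probe data
    clf.partial_fit(Xt, yt)               # update the memory online

print(clf.predict([[10.0, -8.0], [60.0, 5.0]]))  # risky vs. safe probe
```

Shortening the update interval corresponds to running the inner loop more often on smaller, more recent batches, which is the trade-off the paper's evaluation sweeps over.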


Author(s):  
Christian O. Pritz ◽  
Eyal Itskovits ◽  
Eduard Bokman ◽  
Rotem Ruach ◽  
Vladimir Gritsenko ◽  
...  

Summary: A major goal in neuroscience is to elucidate the principles by which memories are stored in a neural network. Here, we have systematically studied how the four types of associative memories (short- and long-term memories, each formed using positive and negative associations) are encoded within the compact neural network of C. elegans worms. Interestingly, short-term, but not long-term, memories are evident in the sensory system. Long-term memories are relegated to inner layers of the network, allowing the sensory system to resume innate functionality. Furthermore, a small set of sensory neurons is allocated for coding short-term memories, a design that can increase memory capacity and limit non-innate behavioral responses. Notably, individual sensory neurons may code for the conditioned stimulus or the experience valence. Interneurons integrate this information to modulate animal behavior upon memory reactivation. This comprehensive study reveals basic principles by which memories are encoded within a neural network and highlights the central roles of sensory neurons in memory formation.


2013 ◽  
Vol 40 (1) ◽  
pp. 93-102 ◽  
Author(s):  
Masoud Baghelani ◽  
Afshin Ebrahimi ◽  
Habib Badri Ghavifekr

2017 ◽  
Vol 29 (6) ◽  
pp. 1681-1695 ◽  
Author(s):  
Asieh Abolpour Mofrad ◽  
Matthew G. Parker

Clique-based neural associative memories, introduced by Gripon and Berrou (GB), have been shown to perform well, and in our previous work we improved the learning capacity and retrieval rate through local coding and precoding in the presence of partial erasures. We now take a step forward and consider nested-clique graph structures for the network. The GB model stores patterns as small cliques, and here we replace these with nested cliques. Simulation results show that the nested-clique structure enhances the clique-based model.
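As a generic picture of what a two-level nested-clique structure can look like (purely illustrative; the group count K, inner-clique size m, and representative-based outer clique are assumptions, not the authors' construction), one can expand each vertex of an outer clique into an inner clique:

```python
import networkx as nx

# Illustrative two-level nested-clique graph: K groups, each expanded
# into an inner clique of size m; group representatives (s, 0) form
# the outer clique. Not the authors' network.
K, m = 4, 3   # hypothetical sizes

G = nx.Graph()
for s in range(K):
    inner = [(s, i) for i in range(m)]          # nodes of group s
    G.add_edges_from((a, b) for a in inner for b in inner if a < b)

for s in range(K):
    for t in range(s + 1, K):
        G.add_edge((s, 0), (t, 0))              # outer clique edges

print(G.number_of_nodes(), G.number_of_edges())  # 12 nodes, 18 edges
```

The appeal of such hierarchies is that they use far fewer edges than a flat clique on all K*m nodes (18 versus 66 here) while retaining dense local connectivity within groups.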

