Associative Models for Storing and Retrieving Concept Lattices

2010 ◽  
Vol 2010 ◽  
pp. 1-27 ◽  
Author(s):  
María Elena Acevedo ◽  
Cornelio Yáñez-Márquez ◽  
Marco Antonio Acevedo

Alpha-beta bidirectional associative memories are implemented for storing concept lattices. We use Lindig's algorithm to construct the concept lattice of a particular formal context; this structure is stored in an associative memory in the same way a human being stores information, namely by associating patterns. The bidirectionality and perfect recall of the Alpha-Beta associative model make it a natural tool for storing a concept lattice. In the learning phase, the objects and attributes obtained from Lindig's algorithm are associated by the Alpha-Beta bidirectional associative memory; this is where the data is stored. In the recall phase, the associative model retrieves objects from attributes, or vice versa. Our model guarantees the recall of every learned concept.
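As a rough illustration of the object–attribute structure being stored, here is a naive enumeration of the formal concepts of a toy context (hypothetical data; Lindig's actual algorithm computes the same lattice far more efficiently):

```python
from itertools import combinations

# Toy formal context (hypothetical data): each object maps to the set of
# attributes it has. A formal concept is a pair (extent, intent) where the
# extent is exactly the set of objects sharing the intent, and vice versa.
context = {
    "o1": {"a", "b"},
    "o2": {"a", "c"},
    "o3": {"b", "c"},
}

def intent(objs):
    """Attributes shared by every object in `objs` (all attributes if empty)."""
    attrs = set.union(*context.values())
    for o in objs:
        attrs &= context[o]
    return attrs

def extent(attrs):
    """Objects possessing every attribute in `attrs`."""
    return {o for o, have in context.items() if attrs <= have}

# Naive enumeration: close every subset of objects. Lindig's algorithm
# avoids this exponential sweep, but yields the same set of concepts.
concepts = set()
for r in range(len(context) + 1):
    for objs in combinations(context, r):
        a = intent(set(objs))
        concepts.add((frozenset(extent(a)), frozenset(a)))
```

On this 3x3 context the enumeration produces eight concepts, ordered by extent inclusion into the concept lattice that the associative memory then stores.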

2000 ◽  
Vol 12 (10) ◽  
pp. 2279-2290 ◽  
Author(s):  
Radim Bělohlávek

This article presents a concept interpretation of patterns for bidirectional associative memories (BAMs) and a representation of hierarchical structures of concepts (concept lattices) by BAMs. A constructive representation theorem provides a storing rule for a training set that admits a concept interpretation. Examples demonstrating the theorems are presented.


2009 ◽  
Vol 2009 ◽  
pp. 1-14 ◽  
Author(s):  
María Elena Acevedo ◽  
Marco Antonio Acevedo ◽  
Federico Felipe

Bidirectional Associative Memories (BAMs) based on the first model proposed by Kosko do not exhibit perfect recall of the training set, and their recall algorithm must iterate until it reaches a stable state. In this work we use the Alpha-Beta BAM model to automatically classify cancer recurrence in female patients who have previously undergone breast cancer surgery. The Alpha-Beta BAM shows perfect recall of all training patterns and has a one-shot learning algorithm; these advantages make it a suitable tool for classification. We use data from the Haberman database and apply the leave-one-out procedure to analyze the performance of our model as a classifier, obtaining a classification accuracy of 99.98%.
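For contrast with the one-shot Alpha-Beta model, a minimal sketch of the original Kosko-style BAM, whose recall must iterate toward a stable state (the bipolar patterns are hypothetical):

```python
# Minimal Kosko-style BAM sketch (hypothetical bipolar {-1,+1} patterns).
X = [[1, -1, 1, -1], [1, 1, -1, -1]]   # x-layer patterns
Y = [[1, 1, -1], [-1, 1, 1]]           # associated y-layer patterns

n, m = len(X[0]), len(Y[0])
# Correlation (Hebbian) storage: W[i][j] = sum_k x_k[i] * y_k[j]
W = [[sum(x[i] * y[j] for x, y in zip(X, Y)) for j in range(m)]
     for i in range(n)]

sgn = lambda v: 1 if v >= 0 else -1

def recall(x, max_iters=20):
    """Bounce activity x -> y -> x ... until the pair stabilizes."""
    y = [sgn(sum(x[i] * W[i][j] for i in range(n))) for j in range(m)]
    for _ in range(max_iters):
        x2 = [sgn(sum(y[j] * W[i][j] for j in range(m))) for i in range(n)]
        y2 = [sgn(sum(x2[i] * W[i][j] for i in range(n))) for j in range(m)]
        if (x2, y2) == (x, y):
            break
        x, y = x2, y2
    return x, y

xr, yr = recall(X[0])
```

With many or correlated patterns the iteration can settle on a spurious fixed point, which is exactly the imperfect recall the abstract contrasts with the Alpha-Beta model.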


2018 ◽  
Vol 3 (01) ◽  
Author(s):  
Sandeep Kumar ◽  
Manu Pratap Singh

Neural networks are among the most important models studied by researchers over the past decades. The Hopfield model, proposed by J. J. Hopfield, describes an organization of neurons that functions as an associative memory, also called a content-addressable memory. It is a recurrent network similar to the recurrent layer of the Hamming network, but one that can effectively perform the operation of both layers of the Hamming network. The design of recurrent networks has long been an interesting research problem, and much work continues on present applications. In this paper we discuss the design of Hopfield Neural Networks (HNNs), bidirectional associative memories (BAMs), and multidirectional associative memories (MAMs) for handwritten character recognition; the recognized characters are Hindi alphabets.
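A minimal content-addressable-memory sketch of the Hopfield dynamics the paper builds on (hypothetical 6-bit bipolar vectors stand in for character bitmaps):

```python
# Hopfield net as a content-addressable memory (sketch; the two
# hypothetical bipolar vectors stand in for character bitmaps).
patterns = [[1, -1, 1, -1, 1, -1], [1, 1, -1, -1, 1, 1]]
n = len(patterns[0])

# Hebbian storage with a zeroed diagonal (no self-connections).
W = [[0 if i == j else sum(p[i] * p[j] for p in patterns)
      for j in range(n)] for i in range(n)]

def recall(state, max_iters=10):
    """Synchronous sign updates until the state stops changing."""
    for _ in range(max_iters):
        new = [1 if sum(W[i][j] * state[j] for j in range(n)) >= 0 else -1
               for i in range(n)]
        if new == state:
            break
        state = new
    return state

# Flip one bit of the first stored pattern and let the net repair it.
noisy = patterns[0][:]
noisy[0] = -noisy[0]
restored = recall(noisy)
```

The corrupted input falls back into the basin of attraction of the stored pattern, which is the content-addressable behavior exploited for character recognition.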


2005 ◽  
Vol 17 (10) ◽  
pp. 2291-2300 ◽  
Author(s):  
Rohana Rajapakse ◽  
Michael Denham

Radim Bělohlávek has shown that bidirectional associative memories (BAMs) are capable of precisely learning concept lattice structures. The focus of this letter is to show that a BAM storing a concept lattice, with connection weights set according to the rule proposed by Bělohlávek, always returns the most specific (or most generic) concept containing a given set of objects (or attributes) when that set is presented as input to the object (or attribute) layer. A proof of this property is given here, together with an example and a brief application.
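The property can be illustrated with the plain formal-concept-analysis closure that such a BAM effectively computes (toy context; the weight rule itself is in Bělohlávek's paper):

```python
# Toy formal context (hypothetical data): object -> attributes it has.
context = {"o1": {"a", "b"}, "o2": {"a", "c"}, "o3": {"b", "c"}}

def intent(objs):
    """Attributes shared by every object in `objs`."""
    attrs = set.union(*context.values())
    for o in objs:
        attrs &= context[o]
    return attrs

def extent(attrs):
    """Objects possessing every attribute in `attrs`."""
    return {o for o, have in context.items() if attrs <= have}

def most_specific_concept(objs):
    """Smallest concept whose extent contains `objs`: the pair (O'', O')."""
    a = intent(objs)        # attributes all the given objects share
    return extent(a), a     # closing the extent yields a formal concept
```

Presenting `{"o1"}` yields the concept `({"o1"}, {"a", "b"})`, and presenting `{"o1", "o2"}` yields `({"o1", "o2"}, {"a"})` — in each case the most specific concept containing the input objects, mirroring what the letter proves for the BAM's object layer.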


1997 ◽  
Vol 9 (2) ◽  
pp. 385-401 ◽  
Author(s):  
Chi Sing Leung ◽  
Lai Wan Chan

Forgetting learning is an incremental learning rule for associative memories: recent items are encoded, while older items are gradually forgotten. In this article, we analyze the storage behavior of the bidirectional associative memory (BAM) under forgetting learning; that is, can each of the most recent k learning items be stored as a fixed point? We also discuss how to choose the forgetting constant so that the BAM correctly stores as many of the most recent learning items as possible. Simulations verify the theoretical analysis.
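A sketch of the forgetting rule under discussion, with an illustrative (not optimized) forgetting constant: each presentation scales the old weights down before adding the new pair's correlation, so an item presented t steps ago survives only with weight λ^t:

```python
# Forgetting learning for a BAM (hypothetical pairs; `lam` is illustrative).
lam = 0.8  # forgetting constant: old correlations decay by this factor
pairs = [([1, -1], [1, 1]),
         ([-1, 1], [1, -1]),
         ([1, 1], [-1, 1])]
n, m = 2, 2

W = [[0.0] * m for _ in range(n)]
for x, y in pairs:                      # present pairs in temporal order
    for i in range(n):
        for j in range(m):
            W[i][j] = lam * W[i][j] + x[i] * y[j]

# One-shot recall of the most recently presented pair.
sgn = lambda v: 1 if v >= 0 else -1
x_recent = pairs[-1][0]
y_recalled = [sgn(sum(x_recent[i] * W[i][j] for i in range(n)))
              for j in range(m)]
```

The most recent pair dominates the decayed weight matrix and is recalled as a fixed point; choosing λ then trades off how many of the most recent k items stay stable, which is the question the article analyzes.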


2021 ◽  
Author(s):  
Yingying Li ◽  
Junrui Li ◽  
Jie Li ◽  
Shukai Duan ◽  
Lidan Wang ◽  
...  
