AN IMPROVED ARCHITECTURE FOR COMPETITIVE AND COOPERATIVE NEURONS (CCNS) IN NEURAL NETWORKS

2014 ◽  
pp. 8-15
Author(s):  
M. Kamrul Islam

In neural networks, an associative memory is one in which applying an input pattern leads to the response of a corresponding stored pattern. During the learning phase the memory is fed with a number of input vectors, and in the recall phase, when some known input is presented, the network recalls and reproduces the corresponding output vector. Here, we improve and increase the storage ability of the memory model proposed in [1]. We show that there are certain instances where their algorithm cannot produce the desired performance by retrieving exactly the correct vector. That is, in their algorithm, a number of output vectors can become activated by the stimulus of an input vector while the desired output is just a single vector. Our proposed solution overcomes this and uniquely determines the output vector when an input vector is applied. Thus we provide a more general scenario of this neural network memory model consisting of Competitive Cooperative Neurons (CCNs).
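A minimal sketch (not the authors' CCN architecture, only an illustration of the underlying idea): a hetero-associative memory trained with a Hebbian outer-product rule, with a winner-take-all competition at recall time so that exactly one stored output vector is activated per input.

```python
import numpy as np

# Sketch only: Hebbian hetero-associative memory with a winner-take-all
# recall step, so a single stored output vector wins the competition.

def train(inputs, outputs):
    """Accumulate the Hebbian weight matrix W = sum_k y_k x_k^T."""
    W = np.zeros((outputs.shape[1], inputs.shape[1]))
    for x, y in zip(inputs, outputs):
        W += np.outer(y, x)
    return W

def recall(W, x, stored_outputs):
    """Project the input and let the stored outputs compete for activation."""
    response = W @ x
    scores = stored_outputs @ response      # similarity of each stored output
    winner = int(np.argmax(scores))         # competition: exactly one winner
    return stored_outputs[winner]

if __name__ == "__main__":
    X = np.array([[1, -1, 1, -1], [1, 1, -1, -1]])  # bipolar input patterns
    Y = np.array([[1, -1, -1], [-1, 1, -1]])        # associated output patterns
    W = train(X, Y)
    print(recall(W, X[0], Y))                       # -> [ 1 -1 -1]
```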

Author(s):  
Roberto A. Vazquez ◽  
Humberto Sossa

An associative memory (AM) is a special kind of neural network that recalls an output pattern given an input pattern used as a key, which may be altered by some kind of noise (additive, subtractive, or mixed). Most of these models have several constraints that limit their applicability in complex problems such as face recognition (FR) and 3D object recognition (3DOR). Despite the power of these approaches, they cannot reach their full potential without new mechanisms based on current and future study of biological neural networks. In this direction, we present a brief summary of a new associative model based on some neurobiological aspects of the human brain. In addition, we describe how this dynamic associative memory (DAM), combined with some aspects of the infant vision system, can be applied to two of the most important problems in pattern recognition: FR and 3DOR.


Author(s):  
Luis F. de Mingo ◽  
Nuria Gómez ◽  
Fernando Arroyo ◽  
Juan Castellanos

This article presents a neural network model that makes it possible to build a conceptual hierarchy to approximate functions over a given interval. Bio-inspired axo-axonic connections are used: in these connections, the signal weight between two neurons is computed by the output of another neuron. Such an architecture can generate polynomial expressions with linear activation functions, so the network can approximate any pattern set with a polynomial equation. The system classifies an input pattern as an element of one of the categories it holds, until an exhaustive classification is obtained. The proposed model is not a hierarchy of neural networks; rather, it establishes relationships among all the different neural networks in order to propagate the activation. Each neural network is in charge of recognizing the input pattern against its prototyped category, and also of transmitting the activation to other neural networks so that the approximation can continue.
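A minimal sketch of the axo-axonic idea, under my own simplified assumptions (the parameters and function names are hypothetical): the effective weight on a link is the output of a third, modulating neuron, so with linear activations the overall output becomes a polynomial in the inputs.

```python
import numpy as np

# Sketch only (hypothetical parameters): an axo-axonic connection in which
# the output of neuron B acts as the weight on the signal from neuron A, so
# with linear activations the overall output is a polynomial in the input x.

def linear_neuron(w, b, x):
    """A neuron with an identity (linear) activation: w . x + b."""
    return float(np.dot(w, x) + b)

def axo_axonic_output(wA, bA, wB, bB, x):
    """Neuron B's output gates (multiplies) neuron A's signal."""
    a = linear_neuron(wA, bA, x)   # pre-synaptic signal from A
    g = linear_neuron(wB, bB, x)   # modulating neuron B supplies the weight
    return g * a                   # a degree-2 polynomial in the inputs

if __name__ == "__main__":
    x = np.array([2.0])
    # (1*x + 0) * (1*x + 1) = x^2 + x, which is 6 at x = 2
    print(axo_axonic_output(np.array([1.0]), 0.0, np.array([1.0]), 1.0, x))
```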


2016 ◽  
Vol 186 ◽  
pp. 44-53 ◽  
Author(s):  
Caigen Zhou ◽  
Xiaoqin Zeng ◽  
Jianjiang Yu ◽  
Haibo Jiang

2014 ◽  
pp. 32-37
Author(s):  
Akira Imada

We explore a weight-configuration space in search of solutions that make our neural network of spiking neurons perform certain tasks. For the task of simulating an associative memory model, we already know one such solution — a weight configuration that has learned a set of patterns using Hebb's rule — and we conjecture that many others exist which we have not found so far. In searching for such solutions, we observed that the so-called fitness landscape is almost everywhere a completely flat plain of altitude zero, in which the Hebbian weight configuration is the only peak; in addition, the sidewall of that peak offers no gradient at all. In such circumstances, how could we search for the other peaks? This paper is a call for challenges to this problem.
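A minimal sketch of the landscape observation, using a binary Hopfield-style stand-in rather than the paper's spiking network: the Hebbian weight configuration makes the stored patterns fixed points, while random weight configurations almost surely score zero on the same fitness measure, which is the flat zero-altitude plain described above.

```python
import numpy as np

# Sketch only, with a binary Hopfield-style stand-in for the spiking network:
# the Hebbian configuration makes the stored patterns fixed points, while a
# random weight configuration almost surely scores zero, illustrating the
# flat zero-altitude landscape around the single Hebbian peak.

rng = np.random.default_rng(0)
N, P = 64, 5
patterns = rng.choice([-1, 1], size=(P, N))

def hebbian_weights(patterns):
    """Outer-product (Hebb) rule: W = (1/N) * sum_mu x_mu x_mu^T, zero diagonal."""
    W = patterns.T @ patterns / patterns.shape[1]
    np.fill_diagonal(W, 0.0)
    return W

def fitness(W, patterns):
    """Fraction of stored patterns that are exact fixed points of one update."""
    recalled = np.sign(patterns @ W.T)
    recalled[recalled == 0] = 1
    return float(np.mean(np.all(recalled == patterns, axis=1)))

print(fitness(hebbian_weights(patterns), patterns))   # the known peak (~1.0)
print(fitness(rng.normal(size=(N, N)), patterns))     # flatland: almost surely 0.0
```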


2013 ◽  
Vol 3 (1) ◽  
pp. 83-93
Author(s):  
Rajesh Lavania ◽  
Manu Pratap Singh

In this paper we evaluate the Hopfield neural network as an associative memory for recalling memorized patterns, using a sub-optimal genetic algorithm, for handwritten SWARS (vowels) of the Hindi language. In this process the genetic algorithm is employed in a sub-optimal form for recalling memorized patterns corresponding to presented noisy prototype input patterns. The sub-optimal form of the GA means a non-random initial population or solution: rather than a random start, the GA explores from the sum of correlated weight matrices for the input patterns of the training set. The objective of this study is to determine the optimal weight matrix for correct recall corresponding to an approximate prototype input pattern of the Hindi SWARS. The performance of the neural network is evaluated in terms of the rate of success in recalling memorized Hindi SWARS for presented approximate prototype input patterns with the GA in two aspects: the first reflects the random nature of the GA, and the second exhibits its sub-optimal nature of exploration. The simulated results demonstrate the better performance of the network for recalling the memorized Hindi SWARS using a genetic algorithm that evolves the population of weights from the sub-optimal weight matrix.
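A minimal sketch of the sub-optimal initialization idea, under my own simplifying assumptions (not the paper's full procedure, and the helper names are hypothetical): the GA's initial population is seeded from the Hebbian sum-of-outer-products weight matrix rather than drawn at random, and candidates are scored by their recall success on noisy probes.

```python
import numpy as np

# Sketch only (my simplification of the idea): seed the GA's initial
# population from the Hebbian sum-of-outer-products matrix ("sub-optimal"
# start) instead of random weights, and score candidates by recall success.

rng = np.random.default_rng(1)

def hebbian_matrix(patterns):
    """Sum of correlation (outer-product) matrices of the training patterns."""
    W = sum(np.outer(p, p) for p in patterns).astype(float)
    np.fill_diagonal(W, 0.0)
    return W

def recall_rate(W, patterns, probes):
    """Fraction of noisy probes that settle back to their memorized pattern."""
    hits = 0
    for target, probe in zip(patterns, probes):
        s = probe.copy()
        for _ in range(10):                      # a few synchronous updates
            s = np.where(W @ s >= 0, 1, -1)
        hits += int(np.array_equal(s, target))
    return hits / len(patterns)

def seeded_population(W0, size, sigma=0.05):
    """Non-random initial population: small perturbations of the Hebbian matrix."""
    return [W0 + sigma * rng.normal(size=W0.shape) for _ in range(size)]

if __name__ == "__main__":
    N, P = 32, 3
    patterns = rng.choice([-1, 1], size=(P, N))
    probes = patterns * np.where(rng.random((P, N)) < 0.1, -1, 1)  # 10% bit flips
    population = seeded_population(hebbian_matrix(patterns), size=20)
    best = max(population, key=lambda W: recall_rate(W, patterns, probes))
    print(recall_rate(best, patterns, probes))   # selection step of a GA loop
```

A full GA would then recombine and mutate the fittest matrices; the point of the sketch is only the non-random, correlation-based starting population.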


2021 ◽  
Vol 2021 ◽  
pp. 1-11
Author(s):  
Hui Chu

As a human brain-like computational model that can reflect the cognitive function of the brain, the dynamic analysis of associative memory neural networks has attracted the attention of scholars. This paper combines associative memory neural networks with enterprise financial management risks. It studies the synchronization control and stability analysis of unidirectional associative-memory amnestic neural networks with perturbation and mixed time-varying delays, proposes a bidirectional associative-memory stochastic amnestic neural network model with mixed time-varying delays, and designs a discrete-time sampled control strategy based on the model. Building on recent research into financial risk early warning and on the associative memory neural network method, we propose reconstructing the risk categories, including improving the enterprise risk management system, raising awareness of financial risk management from top to bottom, strengthening the core competitiveness of the enterprise itself, and establishing control measures for financing, investment, operation, and cash-flow risks.


1992 ◽  
Vol 03 (04) ◽  
pp. 389-393 ◽  
Author(s):  
MÁRCIA M. OCHI ◽  
O.L.T. DE MENEZES

We study the associative memory properties of a particular neural network structure. We assume low connectivity between two highly connected network regions. A process of common learning correlates the subpatterns stored in each region. The retrieval of the global pattern is studied as a function of the overlap between the associated subpatterns.
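A minimal sketch of the two-region setup, as I read it (not the authors' exact model; the coupling strength and connectivity fraction are invented for illustration): dense Hebbian weights inside a region, a sparse weak coupling between regions created by common learning, and retrieval tracked by the overlap of each region's state with its stored subpattern.

```python
import numpy as np

# Sketch only (not the paper's model): two regions A and B, with region B
# densely self-connected, a weak sparse coupling from A to B, and retrieval
# of the global pattern tracked through the overlap with the subpatterns.

rng = np.random.default_rng(2)
N = 50                                    # neurons per region
sub_a = rng.choice([-1, 1], size=N)       # subpattern stored in region A
sub_b = rng.choice([-1, 1], size=N)       # associated subpattern in region B

W_bb = np.outer(sub_b, sub_b) / N         # dense intra-region weights in B
mask = rng.random((N, N)) < 0.05          # ~5% inter-region connectivity
W_ab = 0.2 * np.outer(sub_a, sub_b) / N * mask   # weak "common learning" link

def overlap(state, pattern):
    """Normalized overlap m = (1/N) * state . pattern."""
    return float(state @ pattern) / len(pattern)

# Region A is clamped to its subpattern; region B starts from a corrupted
# copy and is driven by its own weights plus the sparse input from A.
state_a = sub_a.copy()
state_b = sub_b * np.where(rng.random(N) < 0.3, -1, 1)   # 30% bits flipped
for _ in range(10):
    state_b = np.where(W_bb @ state_b + W_ab.T @ state_a >= 0, 1, -1)
print(overlap(state_a, sub_a), overlap(state_b, sub_b))
```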


1988 ◽  
Vol 13 (1) ◽  
pp. 74 ◽  
Author(s):  
Sang-Hoon Oh ◽  
Tae-Hoon Yoon ◽  
Jae Chang Kim
