A new memory model for humanoid robots - introduction of associative memory using chaotic neural network

Author(s): K. Itoh ◽ H. Miwa ◽ Y. Nukariya ◽ H. Takanobu ◽ A. Takanishi
2005 ◽ Vol 18 (5-6) ◽ pp. 666-673 ◽ Author(s): Kazuko Itoh ◽ Hiroyasu Miwa ◽ Hideaki Takanobu ◽ Atsuo Takanishi

2012 ◽ Vol 2012 ◽ pp. 1-19 ◽ Author(s): Xiaofang Hu ◽ Shukai Duan ◽ Lidan Wang

The chaotic neural network (CNN) exhibits rich dynamical behaviors that can be harnessed in promising engineering applications. However, because of its complex synaptic learning rules and network structure, it is difficult to update its synaptic weights quickly or to implement the network as a large-scale physical circuit. This paper presents an implementation scheme for a novel CNN with memristive neural synapses that may provide a feasible path toward further development of CNNs. The memristor, widely regarded as the fourth fundamental circuit element, was theoretically predicted by Chua in 1971 and first physically realized in 2008 by researchers at Hewlett-Packard Laboratories. Memristor-based hybrid nanoscale/CMOS technology is expected to revolutionize digital and neuromorphic computation. The proposed memristive CNN has four significant features: (1) nanoscale memristors greatly simplify the synaptic circuit and make the synaptic weights easy to update; (2) it can separate stored patterns from a superimposed input; (3) it can perform one-to-many associative memory; (4) it can perform many-to-many associative memory. Simulation results are provided to illustrate the effectiveness of the proposed scheme.
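To make the associative-memory behavior described above concrete, the following minimal Python sketch runs a chaotic neural network of the Adachi-Aihara type with Hebbian synaptic weights. It is only a software analogue of the idea: the memristive synapse circuits of the paper are replaced by an ordinary weight matrix, and all parameter values (feedback decay, refractory decay, sigmoid steepness, network size) are illustrative assumptions rather than values taken from the paper.

import numpy as np

# Minimal software sketch of a chaotic neural network (CNN) for associative
# memory, in the spirit of the Adachi-Aihara chaotic neuron model. The
# memristive synapses described in the abstract are replaced by an ordinary
# Hebbian weight matrix; all parameter values are illustrative assumptions.

def hebbian_weights(patterns):
    """Store binary (0/1) patterns via the Hebbian rule on their bipolar versions."""
    bipolar = 2.0 * patterns - 1.0
    w = bipolar.T @ bipolar / patterns.shape[0]
    np.fill_diagonal(w, 0.0)
    return w

def chaotic_step(x, eta, zeta, w, k_f=0.2, k_r=0.9, alpha=10.0, a=2.0, eps=0.015):
    """One synchronous update: decaying feedback, decaying refractoriness, sigmoid output."""
    eta = k_f * eta + w @ x            # feedback input from the other neurons
    zeta = k_r * zeta - alpha * x + a  # refractoriness (self-inhibition) term
    x = 1.0 / (1.0 + np.exp(-(eta + zeta) / eps))
    return x, eta, zeta

# Store two random 100-bit patterns, then let the network run: its chaotic
# itinerancy visits the stored patterns (and their reverses) among its states.
rng = np.random.default_rng(0)
patterns = rng.integers(0, 2, size=(2, 100)).astype(float)
w = hebbian_weights(patterns)
x, eta, zeta = rng.random(100), np.zeros(100), np.zeros(100)
for _ in range(200):
    x, eta, zeta = chaotic_step(x, eta, zeta, w)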


1999 ◽ Vol 16 (2) ◽ pp. 130-137 ◽ Author(s): Yifeng Zhang ◽ Luxi Yang ◽ Zhenya He

2008 ◽ Vol 71 (13-15) ◽ pp. 2794-2805 ◽ Author(s): Guoguang He ◽ Luonan Chen ◽ Kazuyuki Aihara

2014 ◽ pp. 32-37 ◽ Author(s): Akira Imada

We explore a weight-configuration space in search of solutions that make a neural network of spiking neurons perform a given task. For the task of simulating an associative memory model, one such solution is already known: a weight configuration that has learned a set of patterns using Hebb's rule, and we conjecture that many other solutions exist that we have not yet found. While searching for them, we observed that the so-called fitness landscape is almost everywhere a completely flat plain of altitude zero, in which the Hebbian weight configuration is the only peak, and, in addition, the sidewall of that peak has no gradient at all. Under such circumstances, how could we search for the other peaks? This paper is a call for challenges to this problem.
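The following toy sketch illustrates the "single peak in a flat landscape" picture. Instead of the spiking-neuron simulation used in the paper, it scores a weight matrix by how many stored bipolar patterns remain fixed points of a simple Hopfield-style update; the network size, pattern count, and number of random samples are illustrative assumptions.

import numpy as np

# Toy illustration of the "flat fitness landscape with a single Hebbian peak".
# A weight matrix is scored by the fraction of stored +1/-1 patterns that are
# fixed points of sign(w @ p); the spiking-neuron dynamics of the paper are
# not reproduced here, and all sizes and sample counts are assumptions.

def hebb(patterns):
    """Hebbian weight matrix storing the given +1/-1 patterns."""
    w = patterns.T @ patterns / patterns.shape[1]
    np.fill_diagonal(w, 0.0)
    return w

def fitness(w, patterns):
    """Fraction of stored patterns that the weight matrix recalls as fixed points."""
    recalled = np.sign(patterns @ w.T)
    recalled[recalled == 0] = 1.0
    return np.mean(np.all(recalled == patterns, axis=1))

rng = np.random.default_rng(0)
patterns = rng.choice([-1.0, 1.0], size=(5, 64))
print("Hebbian peak :", fitness(hebb(patterns), patterns))   # typically 1.0

# Random points in weight space almost always score zero -- the flat plain.
samples = [fitness(rng.normal(size=(64, 64)), patterns) for _ in range(100)]
print("random mean  :", np.mean(samples))                    # close to 0.0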

