On Capacity with Incremental Learning by Simplified Chaotic Neural Network

Author(s):  
Toshinori Deguchi ◽  
Toshiki Takahashi ◽  
Naohiro Ishii

2014 ◽  
Vol 2 (4) ◽  
pp. 72-84

Incremental learning is a method for composing an associative memory with a chaotic neural network; it provides larger capacity than correlative learning at the cost of a large amount of computation. A chaotic neuron performs spatiotemporal summation, and the temporal summation makes the learning robust to input noise. When the inputs are noise-free, however, the neuron may not need temporal summation. In this paper, to reduce the computation, a simplified network without temporal summation is introduced and investigated through computer simulations, comparing it with the conventional network, here called the usual network. It turns out that the simplified network has the same capacity as the usual network and learns faster, but that it loses the learning ability for noisy inputs. To restore this ability, the parameters of the chaotic neural network are adjusted.
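The contrast between the two networks can be illustrated with a minimal sketch of an Aihara-style chaotic neuron, which the abstract's model is based on. This is an assumption-laden illustration, not the paper's implementation: the decay parameters `k_e`, `k_f`, `k_r`, the refractory scaling `alpha`, and the steepness `eps` are conventional names with illustrative values. The "usual" neuron keeps an exponentially decayed history of its external input (temporal summation); the simplified variant replaces that history with the current input alone, which is what removes the extra computation.

```python
import math

def logistic(u, eps=0.015):
    """Sigmoidal output function; the argument is clamped to avoid overflow."""
    z = max(min(u / eps, 500.0), -500.0)
    return 1.0 / (1.0 + math.exp(-z))

class ChaoticNeuron:
    """Sketch of a chaotic neuron with temporal summation of its input.

    Internal state is split into three decayed sums: xi (external input),
    eta (feedback from other neurons), zeta (refractoriness).
    Parameter values are illustrative, not taken from the paper.
    """
    def __init__(self, k_e=0.9, k_f=0.2, k_r=0.9, alpha=1.0):
        self.k_e, self.k_f, self.k_r, self.alpha = k_e, k_f, k_r, alpha
        self.xi = 0.0    # temporal summation of external input
        self.eta = 0.0   # decayed feedback term
        self.zeta = 0.0  # decayed refractory term
        self.y = 0.0     # previous output

    def step(self, ext_input, feedback):
        # Temporal summation: the input history decays with factor k_e.
        self.xi = self.k_e * self.xi + ext_input
        self.eta = self.k_f * self.eta + feedback
        self.zeta = self.k_r * self.zeta - self.alpha * self.y
        self.y = logistic(self.xi + self.eta + self.zeta)
        return self.y

class SimplifiedNeuron(ChaoticNeuron):
    """Simplified variant: no temporal summation of the external input.

    Only the current input enters the internal state, so the xi history
    need not be stored or updated, reducing computation per step.
    """
    def step(self, ext_input, feedback):
        self.xi = ext_input  # current input only, no decayed history
        self.eta = self.k_f * self.eta + feedback
        self.zeta = self.k_r * self.zeta - self.alpha * self.y
        self.y = logistic(self.xi + self.eta + self.zeta)
        return self.y
```

Driving both neurons with the same constant input shows the difference directly: the usual neuron's `xi` accumulates past inputs (which is what averages out noise), while the simplified neuron's `xi` always equals the latest input.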


