chaotic neuron
Recently Published Documents


TOTAL DOCUMENTS

64
(FIVE YEARS 3)

H-INDEX

11
(FIVE YEARS 0)

2021 ◽  
Vol 31 (01) ◽  
pp. 2130003
Author(s):  
Natsuhiro Ichinose

A model of quasiperiodic-chaotic neural networks is proposed on the basis of chaotic neural networks. A quasiperiodic-chaotic neuron exhibits quasiperiodic dynamics that the original chaotic neuron lacks. Quasiperiodic and chaotic solutions occupy mutually exclusive regions of the parameter space, and the chaotic domain can be identified by the presence of a folding structure in an invariant closed curve. Exploiting the property that the influence of a perturbation is conserved along a quasiperiodic solution, we demonstrate short-term visual memory in which colors are represented by real numbers. The quasiperiodic solution is sensitive to dynamical noise during image restoration, but quasiperiodic synchronization among neurons reduces the influence of that noise. Short-term analog memory based on quasiperiodicity is valuable in that it stores analog quantities directly. The quasiperiodic-chaotic neural networks are shown to work as large-scale analog storage arrays, with potential applications to analog computation such as deep learning.
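The chaotic neuron that such models build on is typically of the Aihara type, a one-dimensional map with exponential refractoriness. The quasiperiodic extension itself is not reproduced here; the following is only a minimal sketch of an Aihara-style chaotic neuron map, with illustrative parameter values (k, alpha, a, eps are assumptions, not taken from the paper):

```python
import math

def sigmoid(y, eps=0.02):
    """Steep sigmoid output function; the argument is clamped to avoid overflow."""
    z = max(min(y / eps, 500.0), -500.0)
    return 1.0 / (1.0 + math.exp(-z))

def chaotic_neuron(steps, k=0.7, alpha=1.0, a=0.5, y0=0.1):
    """Iterate an Aihara-type chaotic neuron:

        y(t+1) = k*y(t) - alpha*f(y(t)) + a

    where k is the decay of the internal state, alpha scales the
    refractory (self-inhibition) term, and a is a constant input.
    Parameter values are illustrative, not those of the paper.
    Returns the sequence of outputs x(t) = f(y(t)).
    """
    y = y0
    outputs = []
    for _ in range(steps):
        y = k * y - alpha * sigmoid(y) + a
        outputs.append(sigmoid(y))
    return outputs

xs = chaotic_neuron(200)
# The sigmoid keeps every output strictly inside (0, 1).
assert all(0.0 < x < 1.0 for x in xs)
```

Depending on k, alpha, and a, this map settles on periodic orbits or wanders chaotically; the quasiperiodic-chaotic neuron of the abstract adds a dynamical regime this basic map does not have.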


2017 ◽  
Vol 227 ◽  
pp. 108-112 ◽  
Author(s):  
Guowei Xu ◽  
Jixiang Xu ◽  
Chunbo Xiu ◽  
Fengnan Liu ◽  
Yakun Zang

2016 ◽  
Vol 136 (10) ◽  
pp. 1424-1430 ◽  
Author(s):  
Yoshiki Sasaki ◽  
Katsutoshi Saeki ◽  
Yoshifumi Sekine

2015 ◽  
Vol 109 (2) ◽  
pp. 20002 ◽  
Author(s):  
R. Barrio ◽  
M. Lefranc ◽  
M. A. Martínez ◽  
S. Serrano
2015 ◽  
Vol 64 (6) ◽  
pp. 060504
Author(s):  
Xiu Chun-Bo ◽  
Liu Chang ◽  
Guo Fu-Hui ◽  
Cheng Yi ◽  
Luo Jing

2014 ◽  
Vol 2 (4) ◽  
pp. 72-84 ◽  
Author(s):  
Toshinori Deguchi ◽  
Toshiki Takahashi ◽  
Naohiro Ishii

Incremental learning is a method for composing an associative memory with a chaotic neural network; it provides larger capacity than correlative learning at the cost of a large amount of computation. A chaotic neuron performs spatiotemporal summation, and the temporal summation makes learning robust to input noise. When the input is noise-free, the neuron may not need temporal summation. In this paper, to reduce computation, a simplified network without temporal summation is introduced and investigated through computer simulations, in comparison with the original network, which is here called the usual network. It turns out that the simplified network has the same capacity as the usual network and learns faster, but it loses the ability to learn from noisy inputs. To restore that ability, the parameters of the chaotic neural network are adjusted.
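The temporal summation at issue can be sketched as an exponentially decayed sum of past inputs. The following minimal illustration (the function name and decay value k are assumptions, not the paper's notation) shows why it buffers a one-step input dropout, while the simplified neuron (no temporal summation, k = 0) sees only the instantaneous input:

```python
def temporal_sum(inputs, k=0.5):
    """Exponentially decayed temporal summation of an input sequence:

        s(t) = x(t) + k * s(t-1)  =  sum_d k**d * x(t-d)

    With k = 0 this reduces to the instantaneous input x(t), i.e. the
    simplified neuron without temporal summation.
    Returns the sequence of summed states s(t).
    """
    s = 0.0
    history = []
    for x in inputs:
        s = x + k * s
        history.append(s)
    return history

clean = [1.0, 1.0, 1.0, 1.0]
noisy = [1.0, 0.0, 1.0, 1.0]  # one input corrupted to 0 by noise

# With temporal summation, the state at the corrupted step still
# carries half of the previous input (0.0 + 0.5 * 1.0 = 0.5).
assert temporal_sum(noisy, k=0.5)[1] == 0.5
# Without it, the corrupted step is indistinguishable from a true 0.
assert temporal_sum(noisy, k=0.0)[1] == 0.0
```

This is the trade-off the abstract describes: dropping the temporal summation removes per-step state updates (less computation) but also removes the smoothing that made learning tolerant of noisy inputs.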

