Towards development of self-learning and self-modification spiking neural network as model of brain

Author(s): Aleksandr A. Maliavko, Andrey V. Gavrilov
2020, Vol 14
Author(s): Sergey A. Lobov, Alexey N. Mikhaylov, Maxim Shamshin, Valeri A. Makarov, Victor B. Kazantsev

Author(s): Yevgeniy Bodyanskiy, Artem Dolotov

A Multilayered Self-Learning Spiking Neural Network and its Learning Algorithm Based on 'Winner-Takes-More' Rule in Hierarchical Clustering

This paper introduces an architecture of a multilayered self-learning spiking neural network for hierarchical data clustering. It consists of a population-coding layer and several layers of spiking neurons. Unlike the originally suggested multilayered spiking neural network, the proposed one does not require a separate learning algorithm for the lateral connections. The capability of detecting irregular clusters is achieved by improving the temporal Hebbian learning algorithm: it is generalized by replacing the 'Winner-Takes-All' rule with a 'Winner-Takes-More' rule. It is shown that the layer of receptive neurons can be treated as a fuzzification layer, where a pool of receptive neurons corresponds to a linguistic variable and a receptive neuron within a pool corresponds to a linguistic term. The network architecture is designed in terms of control systems theory. Using the Laplace transform, the spiking neuron synapse is represented as a second-order critically damped response unit, and the spiking neuron soma is modeled, on the basis of bang-bang control systems theory, as a threshold detection system. A simulation experiment confirms that the proposed architecture is effective in detecting irregular clusters.
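To make the described mechanisms concrete, below is a minimal Python sketch of the main building blocks: a Gaussian receptive-field (population-coding) layer that maps an input to firing times, a synapse modeled as a second-order critically damped response unit, a threshold-detecting (bang-bang) soma, and a temporal Hebbian update governed by the 'Winner-Takes-More' rule. All function names, the Gaussian neighbourhood function, and the numeric parameters (tau, sigma, learning rate, coding interval) are illustrative assumptions, not values taken from the paper.

```python
import numpy as np

# All parameter values below (T_CODE, tau, sigma, eta, radius, threshold)
# are illustrative assumptions for the sketch, not taken from the paper.

T_CODE = 10.0  # length of the temporal coding interval, ms (assumed)

def population_coding(x, centers, sigma=0.15):
    """'Fuzzification' layer: a pool of Gaussian receptive neurons turns the
    scalar input x into firing times -- the better a neuron's receptive field
    matches x, the earlier that neuron fires."""
    activation = np.exp(-((x - centers) ** 2) / (2 * sigma ** 2))
    return T_CODE * (1.0 - activation)            # one firing time per receptive neuron

def synaptic_kernel(t, tau=3.0):
    """Impulse response of a second-order critically damped unit,
    1 / (tau*s + 1)^2 in the Laplace domain, normalised to peak at 1."""
    return np.where(t > 0.0, (t / tau) * np.exp(1.0 - t / tau), 0.0)

def first_spike_times(weights, in_times, threshold=1.0, dt=0.1):
    """Bang-bang soma: integrate weighted postsynaptic potentials and report
    the first moment each output neuron's potential crosses the threshold."""
    times = np.arange(0.0, 2 * T_CODE, dt)
    psp = synaptic_kernel(times[:, None] - in_times[None, :])   # (steps, inputs)
    potentials = psp @ weights.T                                # (steps, outputs)
    fired = potentials >= threshold
    return np.where(fired.any(axis=0),
                    times[np.argmax(fired, axis=0)], np.inf)

def wtm_hebbian_step(weights, in_times, out_times, eta=0.05, radius=1.0):
    """Temporal Hebbian update with the 'Winner-Takes-More' rule: not only the
    earliest-firing (winner) neuron learns -- its neighbours are updated too,
    scaled by a Gaussian neighbourhood function of their distance to the winner."""
    winner = int(np.argmin(out_times))
    target = synaptic_kernel(T_CODE - in_times)   # earlier inputs -> larger target weight
    for j in range(weights.shape[0]):
        h = np.exp(-((j - winner) ** 2) / (2 * radius ** 2))
        weights[j] += eta * h * (target - weights[j])
    return weights, winner

# Usage: self-learning clustering of scalar samples drawn from two groups.
rng = np.random.default_rng(0)
centers = np.linspace(0.0, 1.0, 6)                       # receptive-field centres
weights = rng.uniform(0.2, 0.8, size=(2, centers.size))  # 2 output (cluster) neurons
for x in np.concatenate([rng.normal(0.2, 0.03, 50), rng.normal(0.8, 0.03, 50)]):
    in_times = population_coding(x, centers)
    out_times = first_spike_times(weights, in_times)
    weights, winner = wtm_hebbian_step(weights, in_times, out_times)
```

In this sketch the 'Winner-Takes-More' generalization appears as the Gaussian neighbourhood factor h: as the radius shrinks towards zero only the winner's weights are updated, recovering the classical 'Winner-Takes-All' behaviour.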


2018, Vol 145, pp. 488-494
Author(s): Aleksandr Sboev, Alexey Serenko, Roman Rybka, Danila Vlasov, Andrey Filchenkov

2021, Vol 1914 (1), pp. 012036
Author(s): Li Wei, Zhu Wei-gang, Pang Hong-feng, Zhao Hong-yu
