A fast learning algorithm of self-learning spiking neural network

Author(s):  
Yevgeniy Bodyanskiy ◽  
Artem Dolotov ◽  
Iryna Pliss ◽  
Mykola Malyar
Author(s):  
Yevgeniy Bodyanskiy ◽  
Artem Dolotov

A Multilayered Self-Learning Spiking Neural Network and its Learning Algorithm Based on ‘Winner-Takes-More’ Rule in Hierarchical Clustering

This paper introduces the architecture of a multilayered self-learning spiking neural network for hierarchical data clustering. It consists of a population coding layer and several layers of spiking neurons. Contrary to the originally suggested multilayered spiking neural network, the proposed one does not require a separate learning algorithm for lateral connections. The capability of detecting irregular clusters is achieved by improving the temporal Hebbian learning algorithm, which is generalized by replacing the ‘Winner-Takes-All’ rule with a ‘Winner-Takes-More’ one. It is shown that the layer of receptive neurons can be treated as a fuzzification layer, where a pool of receptive neurons is a linguistic variable and a receptive neuron within a pool is a linguistic term. The network architecture is designed in terms of control systems theory. Using the Laplace transform notion, the spiking neuron synapse is presented as a second-order critically damped response unit. The spiking neuron soma is modeled, on the basis of bang-bang control systems theory, as a threshold detection system. A simulation experiment confirms that the proposed architecture is effective in detecting irregular clusters.
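To make the control-theoretic description concrete: a second-order critically damped unit with transfer function H(s) = 1/(τs + 1)² has the peak-normalized impulse response ε(t) = (t/τ)e^(1−t/τ), which serves as the postsynaptic potential, and the soma fires at the first instant the weighted sum of such potentials reaches its threshold. The Python sketch below pairs that kernel and threshold soma (both stated in the abstract) with a ‘Winner-Takes-More’ step; the particular learning window, the Gaussian neighborhood, and the constants eta, sigma, and theta are illustrative assumptions, not the paper's actual rule.

import numpy as np

def synapse_kernel(t, tau=10.0):
    # Peak-normalized impulse response of the second-order critically
    # damped unit H(s) = 1/(tau*s + 1)^2: eps(t) = (t/tau) * exp(1 - t/tau).
    t = np.asarray(t, dtype=float)
    return np.where(t > 0.0, (t / tau) * np.exp(1.0 - t / tau), 0.0)

def firing_time(t_grid, w, in_spikes, theta=1.0, tau=10.0):
    # Threshold-detection soma: a spike is emitted at the first instant
    # the weighted sum of postsynaptic potentials reaches theta.
    u = sum(wi * synapse_kernel(t_grid - ti, tau) for wi, ti in zip(w, in_spikes))
    hit = np.nonzero(u >= theta)[0]
    return t_grid[hit[0]] if hit.size else np.inf

def learning_window(dt, beta=0.2, c=-2.0, nu=5.0):
    # Temporal Hebbian window (illustrative shape): synapses whose input
    # spikes slightly precede the output spike are strengthened; the rest
    # are weakened.
    return (1.0 + beta) * np.exp(-(dt - c) ** 2 / (2.0 * nu ** 2)) - beta

def wtm_update(W, out_times, in_spikes, eta=0.1, sigma=3.0):
    # 'Winner-Takes-More': unlike 'Winner-Takes-All', every neuron is
    # updated, scaled by a Gaussian neighborhood of its firing-time lag
    # behind the winner (the earliest-firing neuron).
    lag = out_times - np.min(out_times)
    g = np.exp(-(lag ** 2) / (2.0 * sigma ** 2))
    for j in range(W.shape[0]):
        W[j] += eta * g[j] * learning_window(in_spikes - out_times[j])
    return W

# Toy step: three encoded inputs, two competing spiking neurons.
t_grid = np.arange(0.0, 60.0, 0.1)
in_spikes = np.array([2.0, 5.0, 9.0])
W = np.array([[0.5, 0.6, 0.4], [0.3, 0.2, 0.7]])
out = np.array([firing_time(t_grid, w, in_spikes) for w in W])
W = wtm_update(W, out, in_spikes)

With a narrow neighborhood (small sigma) the update degenerates to the classical ‘Winner-Takes-All’ case, which is the sense in which the WTM rule generalizes it.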


Sensors ◽  
2021 ◽  
Vol 21 (9) ◽  
pp. 3276
Author(s):  
Szymon Szczęsny ◽  
Damian Huderek ◽  
Łukasz Przyborowski

The paper describes the architecture of a Spiking Neural Network (SNN) for time waveform analysis using edge computing. The network model was based on the principles of signal preprocessing in the diencephalon and uses the tonic spiking and inhibition-induced spiking models typical of the thalamus area. The research focused on significantly reducing the complexity of the SNN algorithm by eliminating most synaptic connections and ensuring zero dispersion of weight values across connections between neuron layers. The paper describes a network mapping and learning algorithm in which the number of variables in the learning process depends linearly on the size of the patterns. The work included testing the stability of the accuracy parameter for various network sizes. The described approach exploits the ability of spiking neurons to process currents of less than 100 pA, typical of amperometric techniques. An example of a practical application is the analysis of vesicle fusion signals using an amperometric system based on Carbon NanoTube (CNT) sensors. The paper concludes with a discussion of the costs of implementing the network as a semiconductor structure.
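The two firing regimes named above, tonic spiking and inhibition-induced spiking, belong to Izhikevich's taxonomy of spiking behaviors, and the Izhikevich model is a common way to reproduce both. The sketch below assumes that model (the abstract does not specify the paper's actual neuron implementation); the tonic-spiking parameters are the published ones, while the step current driving the inhibition-induced case is an assumption.

def izhikevich(a, b, c, d, current, t_ms=400.0, dt=0.25, v0=-65.0):
    # Izhikevich model: v' = 0.04 v^2 + 5 v + 140 - u + I, u' = a (b v - u),
    # with reset v <- c, u <- u + d whenever v reaches 30 mV.
    # `current` maps time in ms to the injected current I(t).
    v, u, spikes = v0, b * v0, []
    for step in range(int(t_ms / dt)):
        t = step * dt
        v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + current(t))
        u += dt * a * (b * v - u)
        if v >= 30.0:
            spikes.append(t)
            v, u = c, u + d
    return spikes

# Tonic spiking: regular firing under a constant depolarizing current.
tonic = izhikevich(a=0.02, b=0.2, c=-65.0, d=6.0, current=lambda t: 14.0)

# Inhibition-induced spiking: with a < 0 and b < 0 the neuron fires while
# the injected current is stepped *down* (assumed step values).
inhib = izhikevich(a=-0.02, b=-1.0, c=-60.0, d=8.0, v0=-63.8,
                   current=lambda t: 75.0 if 50.0 <= t <= 250.0 else 80.0)
print(len(tonic), len(inhib))

The model's appeal in an edge-computing setting is its cost: two coupled first-order equations and a reset per neuron per time step.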


IEEE Access ◽  
2020 ◽  
Vol 8 ◽  
pp. 216922-216932
Author(s):  
Giseok Kim ◽  
Kiryong Kim ◽  
Sara Choi ◽  
Hyo Jung Jang ◽  
Seong-Ook Jung

Author(s):  
Qingsong Xu

Extreme learning machine (ELM) is a learning algorithm for single-hidden-layer feedforward neural networks. In theory, this algorithm is able to provide good generalization capability at extremely fast learning speed. Comparative studies on benchmark function approximation problems revealed that ELM can learn thousands of times faster than a conventional neural network (NN) and can produce good generalization performance in most cases. Unfortunately, research on damage localization using ELM is limited in the literature. In this chapter, ELM is extended to the domain of damage localization of plate structures. Its effectiveness in comparison with typical neural networks such as the back-propagation neural network (BPNN) and the least squares support vector machine (LSSVM) is illustrated through experimental studies. Comparative investigations in terms of learning time and localization accuracy are carried out in detail. It is shown that ELM paves a new way in the domain of plate structural health monitoring. Both advantages and disadvantages of using ELM are discussed.
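For context, the speed claim follows from how ELM trains: hidden-layer weights and biases are drawn at random and frozen, so learning reduces to a single linear solve for the output weights via the Moore-Penrose pseudoinverse. The sketch below shows the generic procedure on a toy regression problem; it does not reproduce the damage-localization features or data used in the chapter.

import numpy as np

def elm_train(X, T, n_hidden, seed=0):
    # ELM: random, frozen hidden layer; output weights solved in one step
    # with the Moore-Penrose pseudoinverse (no iterative training).
    rng = np.random.default_rng(seed)
    W = rng.uniform(-1.0, 1.0, size=(X.shape[1], n_hidden))  # input weights
    b = rng.uniform(-1.0, 1.0, size=n_hidden)                # hidden biases
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))                   # sigmoid activations
    beta = np.linalg.pinv(H) @ T                             # output weights
    return W, b, beta

def elm_predict(X, W, b, beta):
    H = 1.0 / (1.0 + np.exp(-(X @ W + b)))
    return H @ beta

# Toy usage: approximate a 1-D function with 50 hidden neurons.
X = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
T = np.sin(2.0 * np.pi * X)
W, b, beta = elm_train(X, T, n_hidden=50)
mse = np.mean((elm_predict(X, W, b, beta) - T) ** 2)
print(f"training MSE: {mse:.2e}")

Since nothing is iterated, training cost is dominated by one pseudoinverse of the hidden-layer output matrix, which is what makes ELM orders of magnitude faster than back-propagation on comparable networks.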

