A Low-Power Hardware Architecture for On-Line Supervised Learning in Multi-Layer Spiking Neural Networks

Author(s): Nan Zheng, Pinaki Mazumder
2014, Vol 144, pp. 526-536

Author(s): Jinling Wang, Ammar Belatreche, Liam Maguire, Thomas Martin McGinnity
2021, Vol 2021, pp. 1-16

Author(s): Xianghong Lin, Mengwei Zhang, Xiangwen Wang

As a new brain-inspired computational model of artificial neural networks, spiking neural networks transmit and process information via precisely timed spike trains, and constructing efficient learning methods for them is a significant research field. In this paper, we present a supervised learning algorithm for multilayer feedforward spiking neural networks in which neurons in all layers can fire multiple spikes. The feedforward network consists of spiking neurons governed by a biologically plausible long-term memory spike response model, in which the effect of earlier spikes on refractoriness is not neglected, so that adaptation effects are incorporated. The gradient descent method is employed to derive the synaptic weight update rule for learning spike trains. The proposed algorithm is tested and verified on spatiotemporal pattern learning problems, including a set of spike train learning tasks and nonlinear pattern classification problems on four UCI datasets. Simulation results indicate that the proposed algorithm achieves higher learning accuracy than other supervised learning algorithms.
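As a rough illustration of the ingredients named in the abstract, the sketch below implements a spike response model neuron whose refractory term sums over all earlier output spikes (the "long-term memory" property), together with a single gradient-descent weight update. The kernel shapes, time constants, and the surrogate error (driving the membrane potential at a desired spike time toward the firing threshold) are assumptions for illustration, not the paper's exact formulation.

```python
import numpy as np

def eps(s, tau=4.0):
    # Postsynaptic spike response kernel (assumed alpha-like shape);
    # zero for s <= 0 (causality).
    return np.where(s > 0, (s / tau) * np.exp(1.0 - s / tau), 0.0)

def eta(s, theta=1.0, tau_r=8.0):
    # Refractory kernel: negative after-potential following an output spike.
    return np.where(s > 0, -theta * np.exp(-s / tau_r), 0.0)

def membrane_potential(t, in_spikes, w, out_spikes):
    # SRM potential: weighted sum of input spike responses plus the
    # refractory contributions of ALL earlier output spikes
    # (long-term memory: earlier spikes are not neglected).
    u = sum(w[i] * eps(t - np.asarray(tf)).sum()
            for i, tf in enumerate(in_spikes))
    u += eta(t - np.asarray(out_spikes)).sum()
    return float(u)

def gradient_update(w, in_spikes, t_d, u_d, lr=0.01, threshold=1.0):
    # One gradient-descent step on a simplified surrogate error:
    # push the potential at a desired spike time t_d toward threshold.
    # (Hypothetical stand-in for the paper's spike-train error.)
    e = u_d - threshold
    grad = np.array([eps(t_d - np.asarray(tf)).sum() for tf in in_spikes])
    return w - lr * e * grad
```

For example, with a single input spike at t = 0 and weight 0.5, the potential at t = 4 sits below threshold, so one update step increases the weight to encourage a desired output spike there.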

