Order symmetry breaking and broad distribution of events in spiking neural networks with continuous membrane potential

2021 ◽  
Vol 147 ◽  
pp. 110946
Author(s):  
Marco Stucchi ◽  
Fabrizio Pittorino ◽  
Matteo di Volo ◽  
Alessandro Vezzani ◽  
Raffaella Burioni
Author(s):  
Tielin Zhang ◽  
Yi Zeng ◽  
Dongcheng Zhao ◽  
Bo Xu

Due to the nature of Spiking Neural Networks (SNNs), training them with biologically plausible learning principles is challenging. Multi-layered SNNs contain non-differentiable neurons and temporally-centric synapses, which make them nearly impossible to tune directly by back-propagation. Here we propose an alternative, biologically inspired balanced-tuning approach to train SNNs. The approach draws three main inspirations from the brain: first, biological networks are usually trained towards a state in which the temporal updates of variables (e.g., the membrane potential) are in equilibrium; second, specific proportions of excitatory and inhibitory neurons usually contribute to stable representations; third, short-term plasticity (STP) is a general principle for keeping the input and output of synapses balanced, which aids learning convergence. With these inspirations, we train SNNs in three steps: first, the SNN model is trained with the three brain-inspired principles; then weakly supervised learning is used to tune the membrane potential in the final layer for network classification; finally, the learned information is consolidated from membrane potentials into synaptic weights by Spike-Timing-Dependent Plasticity (STDP). The proposed approach is verified on the MNIST hand-written digit recognition dataset, and the performance (an accuracy of 98.64%) indicates that balancing states can indeed improve the learning ability of SNNs, demonstrating the power of the proposed brain-inspired approach for tuning biologically plausible SNNs.
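The STDP consolidation step mentioned in the abstract can be illustrated with a minimal pair-based STDP rule. This is a generic sketch, not the authors' exact implementation: the time constant `tau`, the learning rates `a_plus`/`a_minus`, and the weight clipping range are illustrative assumptions.

```python
import numpy as np

def stdp_update(w, t_pre, t_post, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP: potentiate the synapse when the presynaptic spike
    precedes the postsynaptic spike, depress it otherwise. Times in ms."""
    dt = t_post - t_pre
    if dt > 0:
        # Pre fired before post -> causal pairing -> potentiation.
        dw = a_plus * np.exp(-dt / tau)
    else:
        # Post fired before (or with) pre -> anti-causal -> depression.
        dw = -a_minus * np.exp(dt / tau)
    # Keep the weight in a bounded range, as is common for SNN synapses.
    return float(np.clip(w + dw, 0.0, 1.0))
```

For example, a pre-spike at 10 ms followed by a post-spike at 15 ms strengthens the synapse, while the reverse ordering weakens it; the change decays exponentially with the spike-time difference.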


2020 ◽  
Vol 10 (1) ◽  
Author(s):  
Sungmin Hwang ◽  
Jeesoo Chang ◽  
Min-Hye Oh ◽  
Jong-Ho Lee ◽  
Byung-Gook Park

2021 ◽  
Vol 15 ◽  
Author(s):  
Sungmin Hwang ◽  
Jeesoo Chang ◽  
Min-Hye Oh ◽  
Kyung Kyu Min ◽  
Taejin Jang ◽  
...  

Spiking neural networks (SNNs) have attracted many researchers' interest due to their biological plausibility and event-driven characteristics. In particular, many recent studies have reported high-performance SNNs comparable to conventional analog-valued neural networks (ANNs), obtained by converting weights trained in ANNs into SNNs. However, unlike ANNs, SNNs have an inherent latency before reaching their best performance because of differences in neuron operation. In SNNs, temporal integration occurs in addition to spatial integration, and information is encoded by spike trains rather than by the real values used in ANNs. Therefore, it takes time for SNN performance to reach a steady state. The latency is worse in deep networks and must be reduced for practical applications. In this work, we propose a pre-charged membrane potential (PCMP) scheme for latency reduction in SNNs. A variety of neural network applications (e.g., classification and autoencoders using the MNIST and CIFAR-10 datasets) are trained and converted to SNNs to demonstrate the effect of the proposed approach. The latency of the SNNs is successfully reduced without accuracy loss. In addition, we propose a delayed evaluation (DE) method, in which the errors during the initial transient are discarded. DE removes the error spikes occurring in the initial transient, resulting in a further latency reduction, and it can be used in combination with PCMP. Finally, we also show that the proposed methods reduce the number of spikes required to reach steady-state performance in SNNs, enabling energy-efficient computing.
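The intuition behind pre-charging can be sketched with a simple integrate-and-fire neuron: starting the membrane potential closer to threshold shortens the time to the first output spike. This is a minimal toy model, assuming a unit threshold and a constant weighted input of 0.2 per step; it is not the paper's actual network.

```python
def first_spike_time(inputs, v_init=0.0, threshold=1.0):
    """Integrate-and-fire neuron: return the time step of the first
    output spike, or None if the threshold is never reached."""
    v = v_init
    for t, x in enumerate(inputs):
        v += x  # temporal integration of weighted input current
        if v >= threshold:
            return t
    return None

# Constant weighted input current of 0.2 per time step.
inputs = [0.2] * 10
t_rest = first_spike_time(inputs, v_init=0.0)  # membrane starts at rest
t_pcmp = first_spike_time(inputs, v_init=0.5)  # pre-charged at half threshold
```

With the pre-charged initial potential the neuron crosses threshold in fewer time steps, which is the latency reduction PCMP exploits at the network scale.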


2012 ◽  
Vol 35 (12) ◽  
pp. 2633 ◽  
Author(s):  
Xiang-Hong LIN ◽  
Tian-Wen ZHANG ◽  
Gui-Cang ZHANG

2020 ◽  
Vol 121 ◽  
pp. 88-100 ◽  
Author(s):  
Jesus L. Lobo ◽  
Javier Del Ser ◽  
Albert Bifet ◽  
Nikola Kasabov
