Quantized Weight Transfer Method Using Spike-Timing-Dependent Plasticity for Hardware Spiking Neural Network

2021 · Vol 11 (5) · pp. 2059
Author(s): Sungmin Hwang, Hyungjin Kim, Byung-Gook Park

A hardware-based spiking neural network (SNN) has attracted many researchers' attention due to its energy efficiency. When implementing a hardware-based SNN, offline training is most commonly used, in which weights trained by a software-based artificial neural network (ANN) are transferred to synaptic devices. However, mapping all the synaptic weights becomes time-consuming as the scale of the neural network increases. In this paper, we propose a quantized weight transfer method using spike-timing-dependent plasticity (STDP) for hardware-based SNNs. STDP is an online learning algorithm for SNNs, but here we utilize it as a weight transfer method. First, we train an SNN on the Modified National Institute of Standards and Technology (MNIST) dataset and quantize the weights. Next, the quantized weights are mapped to the synaptic devices using STDP, whereby all the synaptic weights connected to a neuron are transferred simultaneously, reducing the number of pulse steps. The performance of the proposed method is confirmed: accuracy degrades little above a certain level of quantization, while the number of pulse steps for weight transfer decreases substantially. In addition, the effect of device variation is verified.
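To make the transfer scheme concrete, a minimal sketch follows (not the authors' implementation): software-trained weights are quantized to a fixed number of conductance levels, and each neuron's column of synaptic devices is programmed with shared pulse steps, so the pulse count scales with the number of levels rather than the number of synapses. The function names, the 16-level quantization, and the ideal-device response are illustrative assumptions.

```python
import numpy as np

def quantize(weights, n_levels):
    """Quantize weights in [w_min, w_max] to n_levels discrete conductance levels."""
    w_min, w_max = weights.min(), weights.max()
    step = (w_max - w_min) / (n_levels - 1)
    return np.round((weights - w_min) / step).astype(int)  # level index per synapse

def stdp_transfer(level_idx, n_levels):
    """Program one neuron's device column with pulse steps shared by all its synapses.

    One potentiation step per quantization level is broadcast to the column; each
    device responds only while its target level has not been reached, so the pulse
    count scales with n_levels, not with the number of synapses.
    """
    device = np.zeros_like(level_idx)
    pulse_steps = 0
    for level in range(1, n_levels):
        mask = level_idx >= level          # synapses that still need potentiation
        device[mask] += 1                  # pre/post spike pair potentiates them
        pulse_steps += 1
    return device, pulse_steps

rng = np.random.default_rng(0)
w = rng.normal(size=(784, 10))             # e.g. software-trained MNIST weights, 784 inputs x 10 neurons
levels = quantize(w, n_levels=16)
programmed, steps = stdp_transfer(levels[:, 0], n_levels=16)
print(steps, np.array_equal(programmed, levels[:, 0]))  # 15 pulse steps, exact transfer
```

Under these assumptions, programming a 784-input neuron takes 15 pulse steps instead of one programming sequence per synapse, which is the saving the abstract refers to.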

2021 · Vol 15
Author(s): Biswadeep Chakraborty, Saibal Mukhopadhyay

A Spiking Neural Network (SNN) can be trained with Spike Timing Dependent Plasticity (STDP), a neuro-inspired unsupervised learning method used in various machine learning applications. This paper studies the generalizability properties of the STDP learning process using the Hausdorff dimension of the trajectories of the learning algorithm. It analyzes the effects of STDP learning models and their associated hyper-parameters on the generalizability of an SNN, and uses this analysis to develop a Bayesian optimization approach that tunes the hyper-parameters of an STDP model to improve the generalizability of an SNN.
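As a rough illustration of the kind of hyper-parameter search involved, the sketch below defines a pair-based STDP update with the usual hyper-parameters (A_plus, A_minus, tau_plus, tau_minus) and runs a simple random search over them. The paper's Hausdorff-dimension analysis and Bayesian optimization are not reproduced here; the objective function is a stand-in stub and all parameter ranges are assumptions.

```python
import numpy as np

def stdp_dw(dt, a_plus, a_minus, tau_plus, tau_minus):
    """Weight change for a pre/post spike pair separated by dt = t_post - t_pre (ms)."""
    if dt >= 0:                                  # pre before post -> potentiation
        return a_plus * np.exp(-dt / tau_plus)
    return -a_minus * np.exp(dt / tau_minus)     # post before pre -> depression

def generalization_gap(params, rng):
    """Illustrative objective: train/validation gap (stubbed; a real study trains an SNN here)."""
    a_plus, a_minus, tau_plus, tau_minus = params
    return abs(a_plus - a_minus) + 0.01 * abs(tau_plus - tau_minus) + 0.05 * rng.random()

rng = np.random.default_rng(1)
best = None
for _ in range(50):                              # random search standing in for Bayesian optimization
    params = (rng.uniform(0.001, 0.1), rng.uniform(0.001, 0.1),
              rng.uniform(5.0, 40.0), rng.uniform(5.0, 40.0))
    gap = generalization_gap(params, rng)
    if best is None or gap < best[0]:
        best = (gap, params)

print("best hyper-parameters:", best[1])
print("dW for dt = +5 ms under best parameters:", stdp_dw(5.0, *best[1]))
```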


2021 · Vol 12 (03) · pp. 25-33
Author(s): Mario Antoine Aoun

We compare the number of states of a Spiking Neural Network (SNN) composed of chaotic spiking neurons with the number of states of an SNN composed of regular spiking neurons, where both SNNs implement a Spike Timing Dependent Plasticity (STDP) rule that we created. We find that this STDP rule favors chaotic spiking, since the number of states is larger in the chaotic SNN than in the regular SNN. This chaotic favorability is not general; it is exclusive to this STDP rule. This research falls under our long-term investigation of STDP and chaos theory.
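The abstract does not specify the neuron models or the custom STDP rule, so the sketch below only illustrates the state-counting idea: simulate a small binary recurrent network and count the unique spike patterns it visits. The network model, noise level, and threshold are assumptions, not the authors' setup.

```python
import numpy as np

def count_states(weight_matrix, steps=200, seed=0):
    """Simulate a binary recurrent network and count unique spike patterns visited."""
    rng = np.random.default_rng(seed)
    n = weight_matrix.shape[0]
    state = rng.integers(0, 2, size=n)           # initial spike pattern
    seen = set()
    for _ in range(steps):
        drive = weight_matrix @ state + rng.normal(0.0, 0.1, size=n)
        state = (drive > 0.5).astype(int)        # threshold spiking
        seen.add(tuple(state))                   # record the network state
    return len(seen)

rng = np.random.default_rng(2)
w = rng.normal(0.0, 1.0, size=(20, 20))          # assumed recurrent weights
print("distinct states visited:", count_states(w))
```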

