DeepTempo: A Hardware-Friendly Direct Feedback Alignment Multi-Layer Tempotron Learning Rule for Deep Spiking Neural Networks

Author(s):  
Cong Shi ◽  
Tengxiao Wang ◽  
Junxian He ◽  
Jianghao Zhang ◽  
Liyuan Liu ◽  
...  
2021 ◽  

Author(s):  
Ceca Kraišniković ◽  
Wolfgang Maass ◽  
Robert Legenstein

The brain uses recurrent spiking neural networks for higher cognitive functions such as symbolic computations, in particular, mathematical computations. We review the current state of research on spike-based symbolic computations of this type. In addition, we present new results which show that surprisingly small spiking neural networks can perform symbolic computations on bit sequences and numbers and even learn such computations using a biologically plausible learning rule. The resulting networks operate in a rather low firing rate regime, where they could not simply emulate artificial neural networks by encoding continuous values through firing rates. Thus, we propose here a new paradigm for symbolic computation in neural networks that provides concrete hypotheses about the organization of symbolic computations in the brain. The employed spike-based network models are the basis for drastically more energy-efficient computer hardware – neuromorphic hardware. Hence, our results can be seen as creating a bridge from symbolic artificial intelligence to energy-efficient implementation in spike-based neuromorphic hardware.


2020 ◽  
Vol 34 (02) ◽  
pp. 1316-1323
Author(s):  
Zuozhu Liu ◽  
Thiparat Chotibut ◽  
Christopher Hillar ◽  
Shaowei Lin

Motivated by the celebrated discrete-time model of nervous activity outlined by McCulloch and Pitts in 1943, we propose a novel continuous-time model, the McCulloch-Pitts network (MPN), for sequence learning in spiking neural networks. Our model has a local learning rule, such that the synaptic weight updates depend only on the information directly accessible by the synapse. By exploiting asymmetry in the connections between binary neurons, we show that the MPN can be trained to robustly memorize multiple spatiotemporal patterns of binary vectors, generalizing the ability of the symmetric Hopfield network to memorize static spatial patterns. In addition, we demonstrate that the model can efficiently learn sequences of binary pictures as well as generative models for experimental neural spike-train data. Our learning rule is consistent with spike-timing-dependent plasticity (STDP), thus providing theoretical grounding for the systematic design of biologically inspired networks with large and robust long-range sequence storage capacity.
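The core sequence-storage idea above, asymmetric connections that map each stored pattern onto its successor rather than back onto itself, can be illustrated with a classic asymmetric-Hebbian sketch. This is a discrete-time simplification, not the authors' continuous-time MPN; the network size and sequence length are arbitrary choices:

```python
import numpy as np

rng = np.random.default_rng(0)
N, P = 200, 5                                  # neurons, sequence length (arbitrary)
xi = rng.choice([-1.0, 1.0], size=(P, N))      # random binary (+/-1) patterns

# Asymmetric Hebbian weights: pattern mu drives its successor mu+1,
# in contrast to the symmetric Hopfield rule, which stores static attractors.
W = sum(np.outer(xi[(mu + 1) % P], xi[mu]) for mu in range(P)) / N

# Start at the first pattern; each synchronous update recalls the next one.
s = xi[0].copy()
overlaps = []
for step in range(1, P):
    s = np.sign(W @ s)
    overlaps.append(float(s @ xi[step]) / N)   # overlap near 1.0 means correct recall
```

At this low loading (5 patterns, 200 neurons) the crosstalk between patterns is small, so each update lands essentially exactly on the next stored pattern.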


2020 ◽  
Vol 2020 ◽  
pp. 1-11
Author(s):  
Xiumin Li ◽  
Hao Yi ◽  
Shengyuan Luo

Electrophysiological studies have shown that the mammalian primary visual cortex is selective for the orientations of visual stimuli. Inspired by this mechanism, we propose a hierarchical spiking neural network (SNN) for image classification. Grayscale input images are fed through a feed-forward network of orientation-selective neurons, which then projects to a layer of downstream classifier neurons trained with the spike-based supervised tempotron learning rule. Based on the orientation-selective mechanism of the visual cortex and the tempotron learning rule, the network effectively classifies images from the extensively studied MNIST database of handwritten digits, achieving 96% classification accuracy with only 2000 training samples (the standard training set contains 60000). Compared with other classification methods, our model not only preserves biological plausibility and classification accuracy but also substantially reduces the number of training samples required. Given that the most commonly used deep neural networks demand large training sets and high power consumption for image recognition, this brain-inspired model, based on the layer-by-layer hierarchical image-processing mechanism of the visual cortex, may provide a basis for the wide application of spiking neural networks in intelligent computing.
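The supervised tempotron rule used by this paper (and named in the DeepTempo title above) can be sketched in a few lines. This is a minimal single-neuron illustration, not the authors' full hierarchical network: the double-exponential PSP kernel is the standard form, and the time constants, threshold, and learning rate are assumed values.

```python
import numpy as np

TAU, TAU_S = 15.0, 3.75                 # assumed membrane / synaptic time constants (ms)

def psp_kernel(t):
    """Standard double-exponential PSP kernel, normalized so its peak equals 1."""
    t = np.asarray(t, dtype=float)
    t_peak = TAU * TAU_S / (TAU - TAU_S) * np.log(TAU / TAU_S)
    v0 = 1.0 / (np.exp(-t_peak / TAU) - np.exp(-t_peak / TAU_S))
    return np.where(t >= 0, v0 * (np.exp(-t / TAU) - np.exp(-t / TAU_S)), 0.0)

def voltage(w, spikes, t_grid):
    """Membrane potential V(t) = sum_i w_i * sum over spikes t_i of K(t - t_i)."""
    v = np.zeros_like(t_grid)
    for wi, times in zip(w, spikes):
        for ti in times:
            v += wi * psp_kernel(t_grid - ti)
    return v

def tempotron_update(w, spikes, label, theta=1.0, lr=0.05):
    """One tempotron step: if the binary decision (did V cross theta?) is wrong,
    shift each weight by the PSP it contributed at the voltage maximum."""
    t_grid = np.linspace(0.0, 100.0, 1001)
    v = voltage(w, spikes, t_grid)
    if (v.max() >= theta) == bool(label):
        return w                                  # correct decision: no update
    t_star = t_grid[np.argmax(v)]
    sign = 1.0 if label else -1.0                 # potentiate misses, depress false alarms
    grad = np.array([sum(psp_kernel(t_star - ti) for ti in times) for times in spikes])
    return w + sign * lr * grad
```

The update is local in the sense the abstracts emphasize: each weight changes in proportion to the postsynaptic potential its own afferent contributed at the time of maximal voltage.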


2017 ◽  
Vol 241 ◽  
pp. 152-163 ◽  
Author(s):  
Xiurui Xie ◽  
Hong Qu ◽  
Guisong Liu ◽  
Malu Zhang

2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Julian Büchel ◽  
Dmitrii Zendrikov ◽  
Sergio Solinas ◽  
Giacomo Indiveri ◽  
Dylan R. Muir

Mixed-signal analog/digital circuits emulate spiking neurons and synapses with extremely high energy efficiency, an approach known as “neuromorphic engineering”. However, analog circuits are sensitive to process-induced variation among transistors in a chip (“device mismatch”). For neuromorphic implementation of Spiking Neural Networks (SNNs), mismatch causes parameter variation between identically-configured neurons and synapses. Each chip exhibits a different distribution of neural parameters, causing deployed networks to respond differently between chips. Current solutions to mitigate mismatch based on per-chip calibration or on-chip learning entail increased design complexity, area and cost, making deployment of neuromorphic devices expensive and difficult. Here we present a supervised learning approach that produces SNNs with high robustness to mismatch and other common sources of noise. Our method trains SNNs to perform temporal classification tasks by mimicking a pre-trained dynamical system, using a local learning rule from non-linear control theory. We demonstrate our method on two tasks requiring temporal memory, and measure the robustness of our approach to several forms of noise and mismatch. We show that our approach is more robust than common alternatives for training SNNs. Our method provides robust deployment of pre-trained networks on mixed-signal neuromorphic hardware, without requiring per-device training or calibration.
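The mismatch effect described here (identically configured neurons behaving differently because of transistor variation) is commonly mimicked in simulation by perturbing neuron parameters. The sketch below is an illustrative assumption, not the paper's method: it applies a roughly 20% lognormal spread to the membrane time constant of a simple leaky integrate-and-fire neuron and shows that nominally identical neurons produce different spike counts.

```python
import numpy as np

rng = np.random.default_rng(1)

def lif_response(tau_mem, i_in=1.5, v_th=1.0, dt=1e-3, t_sim=0.5):
    """Spike count of a leaky integrate-and-fire neuron under constant drive."""
    v, spikes = 0.0, 0
    for _ in range(int(t_sim / dt)):
        v += dt / tau_mem * (i_in - v)    # forward-Euler membrane update
        if v >= v_th:
            v, spikes = 0.0, spikes + 1   # reset on threshold crossing
    return spikes

tau_nominal = 0.02                        # 20 ms membrane time constant (assumed)
# Device mismatch modelled as a lognormal spread around the nominal value.
taus = tau_nominal * rng.lognormal(mean=0.0, sigma=0.2, size=10)
counts = [lif_response(t) for t in taus]  # same configuration, different behavior
```

Even this crude model makes the deployment problem concrete: the same configured parameter yields a different firing rate on every simulated "device", which is what per-chip calibration or mismatch-robust training must compensate for.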


IEEE Access ◽  
2020 ◽  
Vol 8 ◽  
pp. 101309-101319 ◽  
Author(s):  
Qian Zhou ◽  
Yan Shi ◽  
Zhenghua Xu ◽  
Ruowei Qu ◽  
Guizhi Xu
