Spatio-Temporal Pruning and Quantization for Low-latency Spiking Neural Networks

Author(s):  
Sayeed Shafayet Chowdhury ◽  
Isha Garg ◽  
Kaushik Roy
2017 ◽  
Vol 10 (1) ◽  
pp. 35-48 ◽  
Author(s):  
Zohreh Gholami Doborjeh ◽  
Maryam G. Doborjeh ◽  
Nikola Kasabov

2020 ◽  
Author(s):  
Khadeer Ahmed

The brain is a very efficient computing system. It performs very complex tasks while occupying about 2 liters of volume and consuming very little energy. Computation is performed by specialized cells in the brain called neurons. They compute using electrical pulses and exchange information through chemicals called neurotransmitters. With this as inspiration, several compute models exist today that try to exploit the inherent efficiencies demonstrated by nature. Compute models representing spiking neural networks (SNNs) are biologically plausible and are therefore used to study and understand the workings of the brain and nervous system. More importantly, they are used to solve a wide variety of problems in the field of artificial intelligence (AI), and they are uniquely suited to modeling temporal and spatio-temporal data. This chapter explores the fundamental concepts of SNNs, a few of the popular neuron models, how information is represented, learning methodologies, and state-of-the-art platforms for implementing and evaluating SNNs, along with a discussion of their applications and broader role in the field of AI and data networks.
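Among the popular neuron models a chapter like this surveys, the leaky integrate-and-fire (LIF) neuron is one of the simplest: the membrane potential leaks toward rest, integrates incoming current, and emits a spike (then resets) when it crosses a threshold. A minimal sketch, with illustrative parameter values rather than anything prescribed by the chapter:

```python
def lif_simulate(input_current, v_rest=0.0, v_thresh=1.0, v_reset=0.0,
                 tau=10.0, dt=1.0):
    """Simulate one leaky integrate-and-fire neuron over a list of input currents.

    Returns the spike train (0/1 per time step) and the membrane-potential trace.
    """
    v = v_rest
    spikes, trace = [], []
    for i in input_current:
        # Leaky integration: exponential decay toward rest plus input current.
        v += (dt / tau) * (v_rest - v) + dt * i
        if v >= v_thresh:
            spikes.append(1)
            v = v_reset          # reset after firing
        else:
            spikes.append(0)
        trace.append(v)
    return spikes, trace

# Constant input drives the neuron to fire at a regular rate.
spikes, trace = lif_simulate([0.3] * 50)
print(sum(spikes))  # number of spikes fired during the 50-step window
```

The 0/1 spike train produced here is also the natural carrier for the temporal information coding that SNNs rely on: what matters is when the neuron fires, not a continuous activation value.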


2021 ◽  
Vol 5 (4) ◽  
pp. 67
Author(s):  
Shirin Dora ◽  
Nikola Kasabov

Deep neural networks with rate-based neurons have exhibited tremendous progress in the last decade. However, the same level of progress has not been observed in research on spiking neural networks (SNNs), despite their ability to handle temporal data, their energy efficiency, and their low latency. This could be because the benchmarking techniques for SNNs are based on the methods used for evaluating deep neural networks, which do not provide a clear evaluation of the capabilities of SNNs. In particular, benchmarking SNN approaches with regard to energy efficiency and latency requires realization in suitable hardware, which imposes additional temporal and resource constraints on ongoing projects. This review aims to provide an overview of the current real-world applications of SNNs and identifies steps to accelerate research involving SNNs in the future.


Author(s):  
Pengjie Gu ◽  
Rong Xiao ◽  
Gang Pan ◽  
Huajin Tang

The temporal credit assignment problem, which aims to discover the predictive features hidden in distracting background streams with delayed feedback, remains a core challenge in biological and machine learning. To address this issue, we propose a novel spatio-temporal credit assignment algorithm called STCA for training deep spiking neural networks (DSNNs). We present a new spatio-temporal error backpropagation policy based on a temporal loss function, which is able to credit the network losses to the spatial and temporal domains simultaneously. Experimental results on the MNIST dataset and a music dataset (MedleyDB) demonstrate that STCA achieves performance comparable to other state-of-the-art algorithms while using simpler architectures. Furthermore, STCA successfully discovers predictive sensory features and shows the highest performance on unsegmented sensory event detection tasks.
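The general idea of crediting losses to both the spatial and the temporal domains can be illustrated with a generic surrogate-gradient sketch: the spike threshold is non-differentiable, so training substitutes a smooth surrogate derivative, letting a loss summed over time steps assign credit across the whole spike train. This is not the authors' STCA implementation; the one-neuron toy model, the sigmoid-derivative surrogate, and all parameter values are illustrative assumptions:

```python
import math

def spike(v, v_thresh=1.0):
    """Forward pass: hard threshold (non-differentiable)."""
    return 1.0 if v >= v_thresh else 0.0

def surrogate_grad(v, v_thresh=1.0, beta=5.0):
    """Backward pass: derivative of a sigmoid centred on the threshold,
    used in place of the threshold's true (zero-almost-everywhere) gradient."""
    s = 1.0 / (1.0 + math.exp(-beta * (v - v_thresh)))
    return beta * s * (1.0 - s)

def train_step(w, xs, targets, lr=0.01, decay=0.9):
    """One gradient step for a single leaky neuron driven by inputs xs.

    The squared error is summed over every time step, so each step
    contributes temporal credit to dL/dw (the recurrent membrane-to-membrane
    term of full backpropagation-through-time is omitted for brevity)."""
    v, grad, loss = 0.0, 0.0, 0.0
    for x, y in zip(xs, targets):
        v = decay * v + w * x            # leaky integration
        s = spike(v)
        err = s - y
        loss += 0.5 * err * err
        grad += err * surrogate_grad(v) * x   # per-time-step credit
        if s:
            v = 0.0                      # reset after firing
    return w - lr * grad, loss

# Teach the neuron to fire every fourth step under constant input.
xs, targets = [1.0] * 20, [0.0, 0.0, 0.0, 1.0] * 5
w, losses = 0.1, []
for _ in range(300):
    w, loss = train_step(w, xs, targets)
    losses.append(loss)
print(losses[0], min(losses))  # loss drops as credit is assigned across time
```

The surrogate keeps gradients nonzero near the threshold, which is what allows errors observed at one time step to adjust weights that shaped the membrane potential at earlier steps.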


2014 ◽  
Vol 134 ◽  
pp. 269-279 ◽  
Author(s):  
Nikola Kasabov ◽  
Valery Feigin ◽  
Zeng-Guang Hou ◽  
Yixiong Chen ◽  
Linda Liang ◽  
...  
