Simulation-Based Ultralow Energy and High-Speed LIF Neuron Using Silicon Bipolar Impact Ionization MOSFET for Spiking Neural Networks

2020 ◽  
Vol 67 (6) ◽  
pp. 2600-2606
Author(s):  
Alok Kumar Kamal ◽  
Jawar Singh
2021 ◽  
Vol 15 ◽  
Author(s):  
Yihan Lin ◽  
Wei Ding ◽  
Shaohua Qiang ◽  
Lei Deng ◽  
Guoqi Li

As event-driven algorithms, especially spiking neural networks (SNNs), drive continuous improvement in neuromorphic vision processing, a more challenging event-stream (ES) dataset is urgently needed. However, creating an ES dataset with neuromorphic cameras such as dynamic vision sensors (DVS) is a time-consuming and costly task. In this work, we propose a fast and effective algorithm termed Omnidirectional Discrete Gradient (ODG) to convert the popular computer vision dataset ILSVRC2012 into an event-stream version, turning about 1,300,000 frame-based images into ES samples across 1,000 categories. The result is an ES dataset called ES-ImageNet, which is dozens of times larger than existing neuromorphic classification datasets and is generated entirely in software. The ODG algorithm applies simulated image motion to produce local value changes carrying discrete gradient information in different directions, providing a low-cost, high-speed method for converting frame-based images into event streams; a companion Edge-Integral method reconstructs high-quality images from the event streams. Furthermore, we analyze the statistics of ES-ImageNet in multiple ways and provide a performance benchmark for the dataset using both well-known deep neural network algorithms and spiking neural network algorithms. We believe this work provides a new large-scale benchmark dataset for SNNs and neuromorphic vision.
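
The Python sketch below illustrates the general idea of deriving ON/OFF events from discrete gradients under simulated image motion. The function name, threshold value, and set of shift directions are illustrative assumptions, not the authors' exact ODG procedure.

import numpy as np

def image_to_events(img, threshold=0.1):
    # Emit (y, x, direction, polarity) events from discrete gradients.
    # img is a 2-D float array in [0, 1]; each unit shift stands in for
    # a small camera motion in one of eight directions.
    directions = [(-1, 0), (1, 0), (0, -1), (0, 1),
                  (-1, -1), (-1, 1), (1, -1), (1, 1)]
    events = []
    for d, (dy, dx) in enumerate(directions):
        shifted = np.roll(np.roll(img, dy, axis=0), dx, axis=1)
        diff = shifted - img                        # local value change
        ys, xs = np.nonzero(np.abs(diff) > threshold)
        for y, x in zip(ys, xs):
            polarity = 1 if diff[y, x] > 0 else 0   # ON / OFF event
            events.append((y, x, d, polarity))
    return events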


2018 ◽  
Vol 39 (12) ◽  
pp. 1832-1835 ◽  
Author(s):  
B. Das ◽  
J. Schulze ◽  
U. Ganguly

2012 ◽  
Vol 22 (04) ◽  
pp. 1250014 ◽  
Author(s):  
JOSEP L. ROSSELLÓ ◽  
VINCENT CANALS ◽  
ANTONI MORRO ◽  
ANTONI OLIVER

Spiking neural networks, the latest generation of artificial neural networks, are characterized by their bio-inspired nature and by a higher computational capacity than other neural models. In real biological neurons, stochastic processes are an important mechanism of neural behavior and are responsible for their special arithmetic capabilities. In this work we present a simple hardware implementation of spiking neurons that accounts for this probabilistic nature. The advantage of the proposed implementation is that it is fully digital and can therefore be implemented massively in field-programmable gate arrays (FPGAs). The high computational capabilities of the proposed model are demonstrated on both feed-forward and recurrent networks, which implement high-speed signal filtering and solve complex systems of linear equations.
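
As a behavioral illustration of the probabilistic firing idea (the paper's design is a fully digital FPGA circuit; all parameters below are assumptions), a stochastic leaky integrate-and-fire neuron can be sketched in Python as follows.

import numpy as np

rng = np.random.default_rng(0)

def stochastic_lif(inputs, leak=0.95, gain=5.0, threshold=1.0):
    # Leaky integration with a probabilistic firing rule: the closer the
    # membrane potential is to the threshold, the more likely a spike.
    v, spikes = 0.0, []
    for x in inputs:
        v = leak * v + x                            # leaky integration
        p_fire = 1.0 / (1.0 + np.exp(-gain * (v - threshold)))
        if rng.random() < p_fire:                   # stochastic threshold
            spikes.append(1)
            v = 0.0                                 # reset after a spike
        else:
            spikes.append(0)
    return spikes

print(stochastic_lif([0.3] * 20))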


Author(s):  
Xiangyu Chen ◽  
Takeaki Yajima ◽  
Isao H. Inoue ◽  
Tetsuya Iizuka

Spiking neural networks (SNNs) inspired by biological neurons enable a more realistic mimicry of the human brain. To realize SNNs on the scale of large biological networks, neuron circuits with high area efficiency are essential. In this paper, we propose a compact leaky integrate-and-fire (LIF) neuron circuit with a long, tunable time constant, consisting of a capacitor and two pseudo resistors (PRs). The prototype chip was fabricated in TSMC 65 nm CMOS technology and occupies a die area of 1392 μm². The fabricated LIF neuron has a power consumption of 6 μW and a leak time constant of up to 1.2 ms (the PR resistance reaches up to 600 MΩ). In addition, the time constant is tunable by changing the bias voltage of the PRs. Overall, the proposed neuron circuit facilitates very-large-scale integration (VLSI) of adaptive SNNs, which is crucial for implementing bio-scale brain-inspired computing.
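
As a quick sanity check on the quoted numbers, the leak time constant of a first-order RC leak follows tau = R * C. The short Python sketch below infers the implied membrane capacitance (an assumption, not stated in the abstract) and shows how tuning the pseudo-resistor value changes the time constant.

# Back-of-the-envelope check of tau = R * C for a capacitor/pseudo-resistor
# LIF neuron; the ~2 pF capacitance is inferred from the quoted 600 MOhm
# and 1.2 ms, not stated in the abstract.
R = 600e6                       # maximum pseudo-resistor resistance, ohms
tau = 1.2e-3                    # quoted leak time constant, seconds
C = tau / R                     # implied membrane capacitance
print(f"implied C = {C * 1e12:.1f} pF")   # -> 2.0 pF

# A smaller PR bias (lower R) shortens the time constant proportionally.
for R_tuned in (150e6, 300e6, 600e6):
    print(f"R = {R_tuned / 1e6:.0f} MOhm -> tau = {C * R_tuned * 1e3:.2f} ms")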


2012 ◽  
Vol 35 (12) ◽  
pp. 2633 ◽  
Author(s):  
Xiang-Hong LIN ◽  
Tian-Wen ZHANG ◽  
Gui-Cang ZHANG

2020 ◽  
Vol 121 ◽  
pp. 88-100 ◽  
Author(s):  
Jesus L. Lobo ◽  
Javier Del Ser ◽  
Albert Bifet ◽  
Nikola Kasabov
