An ultra-compact leaky integrate-and-fire neuron with long and tunable time constant utilizing pseudo resistors for spiking neural networks

Author(s): Xiangyu Chen, Takeaki Yajima, Isao H. Inoue, Tetsuya Iizuka

Abstract: Spiking neural networks (SNNs) inspired by biological neurons enable a more realistic mimicry of the human brain. To realize SNNs on the scale of large biological networks, neuron circuits with high area efficiency are essential. In this paper, we propose a compact leaky integrate-and-fire (LIF) neuron circuit with a long and tunable time constant, which consists of a capacitor and two pseudo resistors (PRs). The prototype chip was fabricated in TSMC 65 nm CMOS technology and occupies a die area of 1392 μm². The fabricated LIF neuron has a power consumption of 6 μW and a leak time constant of up to 1.2 ms (the resistance of a PR reaches up to 600 MΩ). In addition, the time constants are tunable by changing the bias voltage of the PRs. Overall, the proposed neuron circuit facilitates the very-large-scale integration (VLSI) of adaptive SNNs, which is crucial for the implementation of bio-scale brain-inspired computing.
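
As a behavioural illustration of the LIF dynamics described in this abstract, the sketch below integrates C·dV/dt = -V/R + I(t) with the reported 600 MΩ leak resistance. The 2 pF membrane capacitance is an assumed value chosen so that τ = RC matches the reported 1.2 ms, and the threshold, reset, and input current are likewise illustrative, not values from the paper.

```python
import numpy as np

# Minimal behavioural sketch of the LIF dynamics described above.
# The abstract gives R_leak up to ~600 MOhm and tau up to ~1.2 ms;
# the membrane capacitance C ~ 2 pF below is inferred from tau = R*C
# and is an assumption, not a value reported in the paper.
R_LEAK = 600e6            # pseudo-resistor leak resistance [Ohm]
C_MEM  = 2e-12            # assumed membrane capacitance [F]
TAU    = R_LEAK * C_MEM   # leak time constant, ~1.2 ms
V_TH   = 0.5              # firing threshold [V] (illustrative)
V_RST  = 0.0              # reset potential [V]

def simulate_lif(i_in, dt=1e-6, v0=0.0):
    """Forward-Euler integration of C*dV/dt = -V/R + I(t)."""
    v, spikes = v0, []
    trace = np.empty(len(i_in))
    for k, i_k in enumerate(i_in):
        v += dt * (-v / TAU + i_k / C_MEM)   # leak + integration
        if v >= V_TH:                        # fire and reset
            spikes.append(k * dt)
            v = V_RST
        trace[k] = v
    return trace, spikes

# Example: 5 ms of a constant 1 nA input current sampled at 1 us.
trace, spikes = simulate_lif(np.full(5000, 1e-9))
print(f"tau = {TAU * 1e3:.2f} ms, spike times = {spikes}")
```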

2012 · Vol 35 (12) · pp. 2633
Author(s): Xiang-Hong LIN, Tian-Wen ZHANG, Gui-Cang ZHANG

2007 · Vol 19 (12) · pp. 3226-3238
Author(s): Arnaud Tonnelier, Hana Belmabrouk, Dominique Martinez

Event-driven strategies have been used to simulate spiking neural networks exactly. Previous work is limited to linear integrate-and-fire neurons. In this note, we extend event-driven schemes to a class of nonlinear integrate-and-fire models. Results are presented for the quadratic integrate-and-fire model with instantaneous or exponential synaptic currents. Extensions to conductance-based currents and exponential integrate-and-fire neurons are discussed.
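
A minimal sketch of the exact event-driven idea for the quadratic integrate-and-fire case is given below: between synaptic events the normalized dynamics dV/dt = V² are solved in closed form, so threshold-crossing times can be computed exactly instead of by fixed-step integration. The dimensionless parametrization, threshold, reset, and synaptic weights are illustrative assumptions, not the paper's exact formulation.

```python
import heapq

# Sketch of exact event-driven simulation for a single quadratic
# integrate-and-fire (QIF) neuron with instantaneous synaptic currents.
# The normalized form dV/dt = V^2 with kicks V -> V + w is illustrative;
# threshold, reset, and weights below are made up.
V_TH, V_RESET = 10.0, -1.0

def advance(v, dt):
    """Closed-form solution of dV/dt = V^2 over an interval dt."""
    return v / (1.0 - v * dt)

def time_to_threshold(v):
    """Exact time for V to reach V_TH under dV/dt = V^2 (None if never)."""
    if v <= 0.0:
        return None                            # trajectory decays toward 0
    return max(0.0, 1.0 / v - 1.0 / V_TH)      # from V(t) = v / (1 - v t)

def run(events, t_end):
    """events: iterable of (arrival_time, weight) synaptic impulses."""
    queue = list(events)
    heapq.heapify(queue)
    t, v, out_spikes = 0.0, V_RESET, []
    while t < t_end:
        t_next, w = heapq.heappop(queue) if queue else (t_end, 0.0)
        # Fire exactly as many times as the closed-form solution allows
        # before the next input event arrives.
        t_fire = time_to_threshold(v)
        while t_fire is not None and t + t_fire < t_next:
            t += t_fire
            out_spikes.append(t)
            v = V_RESET
            t_fire = time_to_threshold(v)
        v = advance(v, t_next - t) + w         # exact update, then apply kick
        t = t_next
    return out_spikes

print(run([(0.1, 1.5), (0.2, 1.0), (0.4, 2.0)], t_end=1.0))
```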


Materials · 2019 · Vol 12 (17) · pp. 2745
Author(s): Luis Camuñas-Mesa, Bernabé Linares-Barranco, Teresa Serrano-Gotarredona

Inspired by biology, neuromorphic systems have been trying to emulate the human brain for decades, taking advantage of its massive parallelism and sparse information coding. Recently, several large-scale hardware projects have demonstrated the outstanding capabilities of this paradigm for applications related to sensory information processing. These systems allow for the implementation of massive neural networks with millions of neurons and billions of synapses. However, realizing learning strategies in these systems consumes a significant share of resources in terms of area and power. The recent development of nanoscale memristors that can be integrated with Complementary Metal–Oxide–Semiconductor (CMOS) technology offers a very promising way to emulate the behavior of biological synapses. Therefore, hybrid memristor-CMOS approaches have been proposed to implement large-scale neural networks with learning capabilities, offering a scalable and lower-cost alternative to existing CMOS systems.
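
To make the synapse-as-memristor idea concrete, the sketch below models a crossbar in which each device stores a weight as a conductance and column currents implement the weighted sum of presynaptic voltage pulses via Ohm's and Kirchhoff's laws. The array size, conductance range, and toy potentiation rule are hypothetical, not taken from any particular memristor-CMOS design.

```python
import numpy as np

# Illustrative sketch of the synapse-as-conductance idea behind hybrid
# memristor-CMOS arrays: each memristor stores a weight as a conductance
# G[i, j], and the crossbar computes the weighted sum I = G @ V.
# All values and the update rule are hypothetical.
rng = np.random.default_rng(0)

N_PRE, N_POST = 8, 4
G = rng.uniform(1e-6, 1e-4, size=(N_POST, N_PRE))   # conductances [S]

def crossbar_currents(v_in):
    """Column currents produced by presynaptic voltage pulses v_in [V]."""
    return G @ v_in                                  # Ohm + Kirchhoff

def potentiate(pre_idx, post_idx, dG=1e-6, g_max=1e-4):
    """Toy weight update: nudge one device's conductance upward."""
    G[post_idx, pre_idx] = min(G[post_idx, pre_idx] + dG, g_max)

# Example: a sparse pattern of 0.2 V presynaptic pulses, then one update.
v_spikes = (rng.random(N_PRE) < 0.3).astype(float) * 0.2
print(crossbar_currents(v_spikes))
potentiate(pre_idx=2, post_idx=1)
```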

