Concept of LIF Neuron Circuit for Rate Coding in Spike Neural Networks

2020 ◽  
Vol 67 (12) ◽  
pp. 3477-3481 ◽  
Author(s):  
Andrei Velichko ◽  
Petr Boriskov


Author(s):
Xiangyu Chen ◽  
Takeaki Yajima ◽  
Isao H. Inoue ◽  
Tetsuya Iizuka

Abstract Spiking neural networks (SNNs) inspired by biological neurons enable a more realistic mimicry of the human brain. To realize SNNs similar to large-scale biological networks, neuron circuits with high area efficiency are essential. In this paper, we propose a compact leaky integrate-and-fire (LIF) neuron circuit with a long and tunable time constant, which consists of a capacitor and two pseudo resistors (PRs). The prototype chip was fabricated with TSMC 65 nm CMOS technology, and it occupies a die area of 1392 μm². The fabricated LIF neuron has a power consumption of 6 μW and a leak time constant of up to 1.2 ms (the resistance of PR is up to 600 MΩ). In addition, the time constants are tunable by changing the bias voltage of PRs. Overall, this proposed neuron circuit facilitates the very-large-scale integration (VLSI) of adaptive SNNs, which is crucial for the implementation of bio-scale brain-inspired computing.
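For intuition, the leak behaviour described above follows a first-order RC model: the time constant is τ = R·C, so a 1.2 ms constant at 600 MΩ implies a capacitance on the order of 2 pF. The following is a minimal simulation sketch under that RC assumption; the threshold, reset, and input-current values are illustrative placeholders, not measured chip parameters.

```python
import numpy as np

# Inferred from the abstract: tau = R * C, so tau = 1.2 ms at
# R = 600 MOhm implies C ~ 2 pF. Threshold, reset, and input
# current are illustrative placeholders, not chip values.
C = 2e-12          # membrane capacitance (F), inferred
V_TH = 0.5         # firing threshold (V), assumed
V_RESET = 0.0      # reset potential (V), assumed
DT = 1e-6          # simulation time step (s)

def simulate_lif(i_in, r_leak, t_end=5e-3):
    """Forward-Euler simulation of a first-order LIF neuron."""
    tau = r_leak * C
    v, spike_times = V_RESET, []
    for step in range(int(t_end / DT)):
        v += DT * (-v / tau + i_in / C)   # leaky integration: dV/dt = -V/tau + I/C
        if v >= V_TH:
            spike_times.append(step * DT)
            v = V_RESET                   # fire and reset
    return spike_times

# Sweeping the leak resistance changes tau = R * C, mirroring how the
# pseudo resistors' bias voltage tunes the leak time constant.
for r in (300e6, 600e6):
    n = len(simulate_lif(2e-9, r))
    print(f"R = {r/1e6:.0f} MOhm: tau = {r*C*1e3:.2f} ms, {n} spikes in 5 ms")
```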


2019 ◽  
Vol 116 (45) ◽  
pp. 22811-22820 ◽  
Author(s):  
Robert Kim ◽  
Yinghao Li ◽  
Terrence J. Sejnowski

Cortical microcircuits exhibit complex recurrent architectures that possess dynamically rich properties. The neurons that make up these microcircuits communicate mainly via discrete spikes, and it is not clear how spikes give rise to dynamics that can be used to perform computationally challenging tasks. In contrast, continuous models of rate-coding neurons can be trained to perform complex tasks. Here, we present a simple framework to construct biologically realistic spiking recurrent neural networks (RNNs) capable of learning a wide range of tasks. Our framework involves training a continuous-variable rate RNN with important biophysical constraints and transferring the learned dynamics and constraints to a spiking RNN in a one-to-one manner. The proposed framework introduces only one additional parameter to establish the equivalence between rate and spiking RNN models. We also study other model parameters related to the rate and spiking networks to optimize the one-to-one mapping. By establishing a close relationship between rate and spiking models, we demonstrate that spiking RNNs can be constructed to achieve performance similar to that of their continuous-rate counterparts.
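The transfer step lends itself to a compact sketch. Below, random weights stand in for a trained rate RNN, and a single scaling constant maps them onto a spiking LIF network; the constant's value and all dynamics parameters are illustrative assumptions, not the paper's calibrated ones, and the constrained training stage is elided.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 100                                          # same size for both networks
W_rate = rng.normal(0, 1 / np.sqrt(N), (N, N))   # stand-in for trained weights

# One-to-one transfer: the spiking net reuses the rate net's connectivity,
# rescaled by a single constant (illustrative value, not from the paper).
LAMBDA = 0.02
W_spike = LAMBDA * W_rate

# Minimal LIF dynamics driven by the transferred weights.
DT, TAU_M, TAU_SYN, V_TH = 1e-3, 10e-3, 20e-3, 1.0
v = np.zeros(N)                                  # membrane potentials
syn = np.zeros(N)                                # filtered synaptic currents
for _ in range(1000):
    spiked = v >= V_TH
    v[spiked] = 0.0                                      # reset after a spike
    syn = syn * (1 - DT / TAU_SYN) + W_spike @ spiked    # exponential synapse
    ext = 0.1 * rng.standard_normal(N)                   # placeholder drive
    v += DT * (-v + syn + ext) / TAU_M                   # leaky integration
```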


Author(s):  
Lei Zhang ◽  
Shengyuan Zhou ◽  
Tian Zhi ◽  
Zidong Du ◽  
Yunji Chen

Continuous-valued deep convolutional networks (DNNs) can be converted into accurate rate-coding based spike neural networks (SNNs). However, the substantial computational and energy costs caused by multiple spikes limit their use in mobile and embedded applications. Recent works have shown that the newly emerged temporal-coding based SNNs converted from DNNs can reduce the computational load effectively. In this paper, we propose a novel method to convert DNNs to temporal-coding SNNs, called TDSNN. Combined with the characteristics of the leaky integrate-and-fire (LIF) neural model, we put forward a new coding principle, Reverse Coding, and design a novel Ticking Neuron mechanism. According to our evaluation, our proposed method achieves a 42% reduction in total operations on average in large networks compared with DNNs, with no more than 0.5% accuracy loss. The evaluation shows that TDSNN may prove to be one of the key enablers to make the adoption of SNNs widespread.
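Reverse Coding and the Ticking Neuron are specific to TDSNN; as a generic illustration of why temporal coding cuts the operation count, here is a plain time-to-first-spike encoder in which each neuron fires at most once, with larger activations firing earlier. The window length, normalisation, and rounding are assumptions, not the paper's scheme.

```python
import numpy as np

T_MAX = 100  # coding window in time steps, assumed

def ttfs_encode(activations, t_max=T_MAX):
    """Generic time-to-first-spike encoding (not TDSNN's exact scheme):
    larger activations spike earlier, each neuron spikes at most once,
    which is what reduces operations relative to rate coding."""
    a = np.clip(np.asarray(activations, dtype=float), 0.0, None)
    peak = a.max()
    if peak == 0:
        return np.full(a.shape, np.inf)        # silent neurons never spike
    t = np.round((1.0 - a / peak) * (t_max - 1))
    return np.where(a > 0, t, np.inf)          # zero activation -> no spike

print(ttfs_encode([0.9, 0.1, 0.0, 0.5]))       # -> [ 0. 88. inf 44.]
```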


2022 ◽  
Vol 5 (1) ◽  
Author(s):  
Takuya Isomura ◽  
Hideaki Shimazaki ◽  
Karl J. Friston

Abstract This work considers a class of canonical neural networks comprising rate coding models, wherein neural activity and plasticity minimise a common cost function—and plasticity is modulated with a certain delay. We show that such neural networks implicitly perform active inference and learning to minimise the risk associated with future outcomes. Mathematical analyses demonstrate that this biological optimisation can be cast as maximisation of model evidence, or equivalently minimisation of variational free energy, under the well-known form of a partially observed Markov decision process model. This equivalence indicates that the delayed modulation of Hebbian plasticity—accompanied with adaptation of firing thresholds—is a sufficient neuronal substrate to attain Bayes optimal inference and control. We corroborated this proposition using numerical analyses of maze tasks. This theory offers a universal characterisation of canonical neural networks in terms of Bayesian belief updating and provides insight into the neuronal mechanisms underlying planning and adaptive behavioural control.
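For reference, the variational free energy in question takes its standard form: for a generative model p(o, s) over outcomes o and latent states s, and an approximate posterior q(s),

```latex
F = \mathbb{E}_{q(s)}\!\left[\ln q(s) - \ln p(o, s)\right]
  = D_{\mathrm{KL}}\!\left[\,q(s) \,\big\|\, p(s \mid o)\,\right] - \ln p(o)
```

Since the KL term is non-negative, minimising F simultaneously maximises the model evidence ln p(o) and drives q(s) toward the exact posterior, which is the sense in which the networks' cost minimisation amounts to Bayesian inference.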


1996 ◽  
pp. 561-565
Author(s):  
Nikolaos Doulamis ◽  
Athanasios Tsiodras ◽  
Anastasios Doulamis ◽  
Stefanos Kollias
