Research on Virtual-Switch Characteristics of Noise-Enhanced Stochastic Resonance Using Leaky Integrate-and-Fire Model

2010 ◽  
Vol 439-440 ◽  
pp. 1324-1327
Author(s):  
Guo Hua Hui

This article focuses on the virtual-switch characteristics of noise-induced stochastic resonance in the leaky integrate-and-fire (LIF) model. Gaussian white noise was added to the simulated LIF neuron, and the conditions for spike initiation were studied. As the noise intensity increased, the evoked spikes became more and more frequent; at a suitable white noise intensity, enough excitatory spikes were produced to form spike trains. We expect that a virtual switch exploiting this mechanism can be designed for the neural data processing field.
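The mechanism described above, where a subthreshold input begins to elicit spikes once noise is added, can be sketched with a minimal LIF simulation. The parameter values (membrane time constant, threshold, DC drive) are illustrative assumptions, not taken from the article.

```python
import math
import random

def simulate_lif(noise_sigma, i_dc=0.8, tau=0.02, v_th=1.0, v_reset=0.0,
                 dt=1e-4, t_max=1.0, seed=0):
    """Euler-Maruyama simulation of a leaky integrate-and-fire neuron
    driven by a subthreshold DC input plus Gaussian white noise.
    Returns the number of spikes emitted in [0, t_max]."""
    rng = random.Random(seed)
    sqrt_dt = math.sqrt(dt)
    v, spikes = 0.0, 0
    for _ in range(int(t_max / dt)):
        # membrane equation tau * dv/dt = -v + I, plus additive white noise
        v += (-v + i_dc) * dt / tau + noise_sigma * sqrt_dt * rng.gauss(0.0, 1.0)
        if v >= v_th:        # threshold crossing: record a spike and reset
            spikes += 1
            v = v_reset
    return spikes

# i_dc = 0.8 < v_th, so the noiseless neuron never fires; raising the
# noise intensity makes threshold crossings increasingly frequent.
print(simulate_lif(noise_sigma=0.0), simulate_lif(noise_sigma=2.0))
```

Because the DC drive is subthreshold, every spike in this sketch is noise-induced, which is the "switching on" effect the abstract describes.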

2008 ◽  
Vol 08 (02) ◽  
pp. L229-L235 ◽  
Author(s):  
LEI ZHANG ◽  
JUN HE ◽  
AIGUO SONG

Recently, it was reported that some saturation nonlinearities can effectively act as noise-aided signal-to-noise ratio (SNR) amplifiers. In this letter we consider the signal detection performance of saturation nonlinearities driven by a sinusoidal signal buried in Gaussian white noise. It is shown that the signal detection statistics still undergo a nonmonotonic evolution as the noise is raised. In particular, we show that an improvement of the SNR at the first harmonic does not imply that signal detection performance can be improved through stochastic resonance. This study may also complement other reports on stochastic resonance in saturation nonlinearities.
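A detection statistic of the kind considered here can be computed with a short simulation. The choice of tanh(x) as the saturation nonlinearity, the matched-filter-style statistic, and all parameter values are illustrative assumptions, not the letter's exact setup.

```python
import math
import random

def detection_statistic(noise_sigma, amp=2.0, freq=5.0, dt=1e-3,
                        n=20000, seed=4):
    """Matched-filter-style detection statistic for a sinusoid passed
    through a saturating nonlinearity y = tanh(x) in Gaussian white
    noise: correlation of the output with the known template,
    normalized by the output standard deviation."""
    rng = random.Random(seed)
    ys, corr = [], 0.0
    for k in range(n):
        template = math.sin(2 * math.pi * freq * k * dt)
        y = math.tanh(amp * template + noise_sigma * rng.gauss(0.0, 1.0))
        ys.append(y)
        corr += y * template
    mean = sum(ys) / n
    std = math.sqrt(sum((y - mean) ** 2 for y in ys) / n)
    return (corr / n) / std if std > 0 else 0.0

# The statistic degrades as the noise grows large, whatever the SNR
# behavior at the first harmonic may be.
for sigma in (0.5, 1.0, 2.0, 4.0):
    print(sigma, detection_statistic(sigma))
```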


2004 ◽  
Vol 92 (2) ◽  
pp. 959-976 ◽  
Author(s):  
Renaud Jolivet ◽  
Timothy J. Lewis ◽  
Wulfram Gerstner

We demonstrate that single-variable integrate-and-fire models can quantitatively capture the dynamics of a physiologically detailed model for fast-spiking cortical neurons. Through a systematic set of approximations, we reduce the conductance-based model to two variants of integrate-and-fire models. In the first variant (nonlinear integrate-and-fire model), parameters depend on the instantaneous membrane potential, whereas in the second variant, they depend on the time elapsed since the last spike [Spike Response Model (SRM)]. The direct reduction links features of the simple models to biophysical features of the full conductance-based model. To quantitatively test the predictive power of the SRM and of the nonlinear integrate-and-fire model, we compare spike trains in the simple models to those in the full conductance-based model when the models are subjected to identical randomly fluctuating input. For random current input, the simple models reproduce 70–80 percent of the spikes in the full model (with temporal precision of ±2 ms) over a wide range of firing frequencies. For random conductance injection, up to 73 percent of spikes are coincident. We also present a technique for numerically optimizing parameters in the SRM and the nonlinear integrate-and-fire model based on spike trains in the full conductance-based model. This technique can be used to tune simple models to reproduce spike trains of real neurons.
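The ±2 ms spike-coincidence comparison used above can be sketched with a simple matching routine. The greedy one-to-one matching rule below is an assumption for illustration, not necessarily the exact coincidence measure used in the paper.

```python
def coincidence_fraction(reference, model, window=0.002):
    """Fraction of reference spikes matched by a model spike within
    +/- window seconds; each model spike may match at most one
    reference spike. Both inputs are sorted lists of spike times (s)."""
    matched, j = 0, 0
    for t in reference:
        # skip model spikes that occur too early to match t
        while j < len(model) and model[j] < t - window:
            j += 1
        if j < len(model) and abs(model[j] - t) <= window:
            matched += 1
            j += 1            # consume this model spike
    return matched / len(reference) if reference else 0.0

ref   = [0.010, 0.050, 0.100, 0.200]
model = [0.011, 0.049, 0.150, 0.201]
print(coincidence_fraction(ref, model))  # 3 of 4 reference spikes matched
```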


2019 ◽  
Vol 79 (2) ◽  
pp. 509-532 ◽  
Author(s):  
Marius E. Yamakou ◽  
Tat Dat Tran ◽  
Luu Hoang Duc ◽  
Jürgen Jost

1996 ◽  
Vol 53 (1) ◽  
pp. 1273-1275 ◽  
Author(s):  
François Chapeau-Blondeau ◽  
Xavier Godivier ◽  
Nicolas Chambet

2016 ◽  
Vol 2016 ◽  
pp. 1-7 ◽  
Author(s):  
Peiming Shi ◽  
Pei Li ◽  
Shujun An ◽  
Dongying Han

Stochastic resonance (SR) is investigated in a multistable system driven by Gaussian white noise. Using adiabatic elimination theory and three-state theory, the signal-to-noise ratio (SNR) is derived. We examine the effects of the noise intensity and of the system parameters b, c, and d on the SNR. The results show that the SNR is a nonmonotonic function of the noise intensity; multistable SR therefore occurs in this system, and the height of the peak changes with the system parameters.
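The noise-aided, nonmonotonic behavior characteristic of SR can be illustrated with a minimal threshold device: a subthreshold sinusoid produces no output at zero noise, while moderate noise pushes crossings preferentially near the signal peaks. The hard threshold, the correlation proxy for the output signal content, and all parameters are illustrative assumptions, not the multistable system studied here.

```python
import math
import random

def threshold_correlation(noise_sigma, amp=0.5, freq=5.0, dt=1e-3,
                          n=20000, threshold=1.0, seed=1):
    """Correlate the binary output of a hard threshold device with a
    subthreshold sinusoid; a simple proxy for the output signal
    content used in stochastic-resonance demonstrations."""
    rng = random.Random(seed)
    acc = 0.0
    for k in range(n):
        s = amp * math.sin(2 * math.pi * freq * k * dt)
        y = 1.0 if s + noise_sigma * rng.gauss(0.0, 1.0) > threshold else 0.0
        acc += y * s          # accumulate output-signal correlation
    return acc / n

no_noise  = threshold_correlation(0.0)   # signal alone never crosses
mid_noise = threshold_correlation(0.5)   # noise elicits signal-locked output
print(no_noise, mid_noise)
```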


2008 ◽  
Vol 18 (09) ◽  
pp. 2833-2839 ◽  
Author(s):  
N. V. AGUDOV ◽  
A. V. KRICHIGIN

The phenomenon of stochastic resonance is studied in overdamped nonlinear monostable systems driven by a periodic signal and Gaussian white noise. It is shown that the signal power amplification, as a function of the input noise intensity, can behave differently depending on the nonlinearity: it can grow monotonically, decrease monotonically, or reach a maximum at a certain value of the noise intensity. Nevertheless, the output signal-to-noise ratio is shown to be always a decreasing function of the input noise intensity.


2003 ◽  
Vol 15 (8) ◽  
pp. 1761-1788 ◽  
Author(s):  
Benjamin Lindner ◽  
André Longtin ◽  
Adi Bulsara

We study the one-dimensional normal form of a saddle-node system under the influence of additive Gaussian white noise and a static “bias current” input parameter, a model that can be looked upon as the simplest version of a type I neuron with stochastic input. This is in contrast with the numerous studies devoted to the noise-driven leaky integrate-and-fire neuron. We focus on the firing rate and coefficient of variation (CV) of the interspike interval density, for which scaling relations with respect to the input parameter and noise intensity are derived. Quadrature formulas for rate and CV are numerically evaluated and compared to numerical simulations of the system and to various approximation formulas obtained in different limiting cases of the model. We also show that caution must be used to extend these results to the neuron model with multiplicative Gaussian white noise. The correspondence between the first passage time statistics for the saddle-node model and the neuron model is obtained only in the Stratonovich interpretation of the stochastic neuron model, while previous results have focused only on the Ito interpretation. The correct Stratonovich interpretation yields CVs that are still relatively high, although smaller than in the Ito interpretation; it also produces certain qualitative differences, especially at larger noise intensities. Our analysis provides useful relations for assessing the distance to threshold and the level of synaptic noise in real type I neurons from their firing statistics. We also briefly discuss the effect of finite boundaries (finite values of threshold and reset) on the firing statistics.
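The firing rate and CV statistics discussed above can be estimated by direct simulation of the noisy saddle-node normal form with a finite threshold and reset. The parameter values below (a suprathreshold bias, moderate noise, symmetric finite boundaries) are illustrative assumptions.

```python
import math
import random

def isi_statistics(beta=0.5, sigma=0.4, x_th=10.0, x_reset=-10.0,
                   dt=1e-3, n_spikes=100, seed=2):
    """Euler-Maruyama simulation of the saddle-node normal form
    dx = (beta + x^2) dt + sigma dW with a finite firing threshold
    x_th and reset x_reset. Returns the mean and coefficient of
    variation (CV) of the interspike intervals."""
    rng = random.Random(seed)
    sqrt_dt = math.sqrt(dt)
    x, t, isis = x_reset, 0.0, []
    while len(isis) < n_spikes:
        x += (beta + x * x) * dt + sigma * sqrt_dt * rng.gauss(0.0, 1.0)
        t += dt
        if x >= x_th:          # "spike": passage through the finite threshold
            isis.append(t)
            x, t = x_reset, 0.0
    mean = sum(isis) / len(isis)
    var = sum((i - mean) ** 2 for i in isis) / len(isis)
    return mean, math.sqrt(var) / mean

# beta > 0 puts the model in the oscillatory (suprathreshold) regime,
# so firing is fairly regular and the CV stays well below 1.
mean_isi, cv = isi_statistics()
print(mean_isi, cv)
```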


2012 ◽  
Vol 67 (2) ◽  
pp. 239-259 ◽  
Author(s):  
Susanne Ditlevsen ◽  
Priscilla Greenwood

2020 ◽  
Author(s):  
Ismael Jaras ◽  
Taiki Harada ◽  
Marcos E. Orchard ◽  
Pedro E. Maldonado ◽  
Rodrigo C. Vergara

It is widely accepted that the brain, like any other physical system, is subject to physical constraints restricting its operation. The brain's metabolic demands are particularly critical for proper neuronal function, but the impact of these constraints is still poorly understood. Detailed single-neuron models have recently begun to integrate metabolic constraints, but the computational resources these models require make it difficult to explore the dynamics that such constraints impose on extended neural networks. Thus, there is a need for a sufficiently simple neuron model that incorporates metabolic activity and allows the exploration of neural network dynamics. This work introduces an energy-dependent extension of the leaky integrate-and-fire (LIF) neuron model (EDLIF) to account for the effects of metabolic constraints on single-neuron behavior. This simple energy-dependent model predicts real spike trains better, in the spike-coincidence sense, than the classical leaky integrate-and-fire model. It can describe the relationship between average firing rate and ATP cost, and can replicate a neuron's behavior under a clinical condition such as amyotrophic lateral sclerosis. The simplicity of the energy-dependent model presented here makes it computationally efficient and thus suitable for studying the dynamics of large neural networks.

Author summary: Any physical system or biological tissue is restricted by physical constraints bounding its behavior, and the brain is not free from these constraints. Energetic disorders in the brain have been linked to several neurodegenerative diseases, highlighting the relevance of maintaining a critical balance between energy production and consumption in neurons. These observations motivate the development of mathematical tools that can help us understand how the brain's behavior depends on metabolism. One of the essential building blocks for this task is the mathematical representation of neurons through models, allowing computational simulation of single neurons and neural networks. Here we construct a simple and computationally cheap energy-dependent neuron model that allows a neuron's behavior to be studied from an energetic perspective. The model is contrasted with one of the most widely used neuron models and shows better predictive capability when recordings from real neurons are used. It can also replicate a neuron's behavior under a specific neurodegenerative disease, which the classical model cannot. This simple model is promising because it allows neuronal networks to be simulated and studied from a metabolism-dependent perspective.
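As a toy illustration of the general idea of an energy-dependent LIF neuron, the sketch below couples a deterministic LIF voltage to an abstract energy pool that is depleted by spiking and raises the effective threshold when low. The coupling rule and all parameters are invented for illustration; they are not the EDLIF equations of this work.

```python
def simulate_edlif(e_cost, i_dc=1.5, tau=0.02, v_th0=1.0, dt=1e-4,
                   t_max=2.0, e_max=1.0, tau_e=0.5):
    """Toy energy-dependent LIF neuron: each spike consumes e_cost
    from an abstract energy pool that recovers exponentially toward
    e_max; a depleted pool raises the effective firing threshold.
    Returns the spike count over [0, t_max]."""
    v, e, spikes = 0.0, e_max, 0
    for _ in range(int(t_max / dt)):
        v += (-v + i_dc) * dt / tau          # standard LIF membrane dynamics
        e += (e_max - e) * dt / tau_e        # energy recovers toward e_max
        v_th = v_th0 * (2.0 - e / e_max)     # low energy -> higher threshold
        if v >= v_th:
            spikes += 1
            v = 0.0
            e = max(0.0, e - e_cost)         # metabolic cost of a spike
    return spikes

# With zero metabolic cost the model reduces to an ordinary LIF neuron;
# with a per-spike cost, energy depletion progressively slows firing.
print(simulate_edlif(e_cost=0.0), simulate_edlif(e_cost=0.05))
```

This captures, in miniature, the qualitative link between firing rate and energy cost that the abstract describes, while remaining cheap enough to embed in a network simulation.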

