Optimization of Output Spike Train Encoding for a Spiking Neuron Based on its Spatio–Temporal Input Pattern

2020 ◽ Vol 12 (3) ◽ pp. 427-438
Author(s): Aboozar Taherkhani, Georgina Cosma, Thomas Martin McGinnity
1977 ◽ Vol 40 (3) ◽ pp. 626-646
Author(s): C. K. Knox, S. Kubota, R. E. Poppele

1. Responses of DSCT neurons to random electrical stimulation of peripheral nerves of the hindleg at group I intensity were studied using cross-correlation analysis of the output spike train with the stimulus. Three types of response were found: type 1 was due to monosynaptic activation of DSCT cells, type 2 resulted from inhibition of those cells, and type 3 was due to a long-latency excitation that was probably polysynaptic.
2. Most of the units studied responded to stimulation of both proximal and distal flexor and extensor nerves. The convergence of afferent input on DSCT cells is much more extensive than has been observed previously, with type 2 and type 3 responses together accounting for 80% of the observed responses. We attribute this to the sensitivity of the analysis in detecting small changes in postsynaptic excitability.
3. The results of the study, particularly the derivation of postsynaptic excitability changes, generally confirm those of earlier work employing intracellular recording.
4. By varying stimulus rate and stimulus intensity in the group I range and simulating the resulting correlations, we conclude that excitability changes in DSCT cells are the net result of complex interactions involving excitation and inhibition. A summary of these findings is presented as a model for the minimum circuitry necessary to account for the observed behavior.
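The correlogram estimate underlying this kind of analysis can be sketched as follows. This is a minimal illustration on synthetic spike times, not the authors' code; the function name and all parameters (bin width, correlation window, rates) are illustrative choices.

```python
import numpy as np

def cross_correlogram(stim_times, spike_times, bin_width=1e-3, window=0.02):
    """Estimate the stimulus-to-spike cross-correlogram as a firing rate (spikes/s)."""
    edges = np.arange(-window, window + bin_width, bin_width)
    counts = np.zeros(edges.size - 1)
    for s in stim_times:
        rel = spike_times - s                          # spike latencies relative to this stimulus
        rel = rel[(rel >= -window) & (rel < window)]
        counts += np.histogram(rel, bins=edges)[0]
    lags = edges[:-1] + bin_width / 2                  # bin centers
    return lags, counts / (stim_times.size * bin_width)

# Toy data: random stimulation plus a monosynaptic-like response 3 ms after each stimulus
rng = np.random.default_rng(0)
stim = np.sort(rng.uniform(0.0, 10.0, 200))            # stimulus times (s)
background = rng.uniform(0.0, 10.0, 500)               # spontaneous spikes
spikes = np.sort(np.concatenate([background, stim + 0.003]))
lags, rate = cross_correlogram(stim, spikes)
```

A short-latency excitation (a type 1 response in the terminology above) shows up as a sharp peak in the correlogram a few milliseconds after lag zero, riding on the flat background rate.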


2021
Author(s): Giuseppe de Alteriis, Enrico Cataldo, Alberto Mazzoni, Calogero Maria Oddo

The Izhikevich artificial spiking neuron model is among the most widely employed models in neuromorphic engineering and computational neuroscience, due to its biological plausibility and the affordable computational effort required to discretize it. It has also been adopted for applications with limited computational resources in embedded systems. It is therefore important to strike a compromise between error and computational expense when solving the model's equations numerically. Here we investigate the effects of discretization and identify the solver that realizes the best compromise between accuracy and computational cost, given an available amount of Floating Point Operations per Second (FLOPS). We considered three fixed-step Ordinary Differential Equation (ODE) solvers frequently used in computational neuroscience: the Euler method, the Runge-Kutta 2 method, and the Runge-Kutta 4 method. To quantify the error produced by each solver, we used the Victor-Purpura spike train distance from an ideal solution of the ODE. Counterintuitively, we found that simple methods such as Euler and Runge-Kutta 2 can outperform more complex ones (i.e., Runge-Kutta 4) in the numerical solution of the Izhikevich model when the same FLOPS are allocated in the comparison. Moreover, we quantified the neuron rest time (with input below threshold, resulting in no output spikes) necessary for the numerical solution to converge to the ideal solution and therefore cancel the error accumulated during the spike train; in this analysis we found that the required rest time is independent of the firing rate and the spike train duration. Our results generalize in a straightforward manner to other spiking neuron models and provide a systematic analysis of fixed-step neural ODE solvers towards a trade-off between discretization accuracy and computational cost.
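For reference, the cheapest of the three solvers compared, forward Euler, applied to the Izhikevich model can be sketched as follows. The parameter values are the standard regular-spiking set from Izhikevich's formulation; the function name, step size, and test input are illustrative assumptions, not the paper's exact setup.

```python
import numpy as np

def izhikevich_euler(I, dt=0.1, T=1000.0, a=0.02, b=0.2, c=-65.0, d=8.0):
    """Forward-Euler integration of the Izhikevich model; returns spike times in ms."""
    v, u = c, b * c                         # start from the resting state
    spike_times = []
    for k in range(int(T / dt)):
        # one Euler step costs a fixed, small number of FLOPs, which is why
        # shrinking dt with Euler can compete with RK4 at an equal FLOPS budget
        dv = 0.04 * v * v + 5.0 * v + 140.0 - u + I
        du = a * (b * v - u)
        v += dt * dv
        u += dt * du
        if v >= 30.0:                       # threshold crossing: register a spike, then reset
            spike_times.append(k * dt)
            v = c
            u += d
    return np.array(spike_times)

spikes = izhikevich_euler(I=10.0)           # tonic spiking for a regular-spiking cell
```

An RK2 or RK4 variant of the same loop evaluates the right-hand side two or four times per step, so at a fixed FLOPS budget it must use a correspondingly larger dt, which is the trade-off the study quantifies.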


1977 ◽ Vol 40 (3) ◽ pp. 616-625
Author(s): C. K. Knox, R. E. Poppele

1. Theoretical expressions for the cross-correlation function are described which relate the output spike train of a neuron to an input spike train. The cross-correlation function is related to a convolution integral of two functions: 1) a waiting-time density, which describes the probability of observing the next succeeding output spike given an arbitrary input; and 2) a conditional output autocorrelation function, which contains information related to the statistical properties of the output spike train itself, and to the carry-over of the effects of an input to subsequent intervals.
2. The primary synaptic effect appears in the cross-correlation function as a distorted version of the derivative of the PSP. Depending on the duration of the evoked excitability change relative to the mean output interspike interval, periodicities due to the spontaneous activity of the cell appear to a greater or lesser extent in the cross-correlation.
3. To estimate underlying excitability changes using correlation techniques, one must estimate both the cross-correlation and the conditional output autocorrelation functions. In cases where the excitability changes are short and do not carry forward to subsequent intervals, the more readily estimated unconditional output autocorrelation can be used in place of the conditional correlation.
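The convolution relation in point 1 can be illustrated with a toy discretization. The exponential waiting-time density and damped-oscillatory autocorrelation below are arbitrary demonstration choices, not forms derived in the paper.

```python
import numpy as np

dt = 1e-3                                    # 1 ms bins
t = np.arange(0.0, 0.2, dt)                  # lag axis (s)

# toy waiting-time density w(t), normalized so that sum(w) * dt = 1
w = np.exp(-t / 0.01)
w /= w.sum() * dt

# toy conditional output autocorrelation a(t): mean rate plus a decaying oscillation
rate = 20.0                                  # mean output rate (spikes/s)
a = rate * (1.0 + np.exp(-t / 0.03) * np.cos(2.0 * np.pi * t / 0.05))

# discrete version of the convolution integral C(tau) = ∫ w(s) a(tau - s) ds,
# trimmed back to the original lag window
C = np.convolve(w, a)[: t.size] * dt
```

Because the waiting-time density integrates to one, the resulting cross-correlation settles back to the mean output rate at long lags, while the oscillatory structure of the autocorrelation shows through at short lags, matching the qualitative behavior described in point 2.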


2013 ◽ Vol 107 ◽ pp. 3-10
Author(s): Ammar Mohemmed, Stefan Schliebs, Satoshi Matsuda, Nikola Kasabov

Author(s): Pietro Quaglio, Alper Yegenoglu, Emiliano Torre, Dominik M. Endres, Sonja Grün

2012 ◽ Vol 22 (04) ◽ pp. 1250012
Author(s): Ammar Mohemmed, Stefan Schliebs, Satoshi Matsuda, Nikola Kasabov

Spiking Neural Networks (SNNs) have been shown to be suitable tools for the processing of spatio-temporal information. However, due to their inherent complexity, the formulation of efficient supervised learning algorithms for SNNs is difficult and remains an important open problem. This article presents SPAN, a spiking neuron that is able to learn associations of arbitrary spike trains in a supervised fashion, allowing the processing of spatio-temporal information encoded in the precise timing of spikes. The idea of the proposed algorithm is to transform spike trains during the learning phase into analog signals so that common mathematical operations can be performed on them. Using this conversion, it is possible to apply the well-known Widrow-Hoff rule directly to the transformed spike trains in order to adjust the synaptic weights and to achieve a desired input/output spike behavior of the neuron. In the presented experimental analysis, the proposed learning algorithm is evaluated with respect to its learning capabilities, memory capacity, robustness to noisy stimuli, and classification performance. Differences and similarities between SPAN and two related algorithms, ReSuMe and Chronotron, are discussed.
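The conversion-plus-Widrow-Hoff idea can be sketched as follows, assuming an alpha-shaped kernel for the spike-to-analog transformation and showing a single learning step. Function names, the kernel choice, and all parameters are illustrative assumptions, not the published SPAN implementation.

```python
import numpy as np

def alpha_kernel(t, tau=5.0):
    """Alpha-shaped kernel (peak value 1 at t = tau); time in ms."""
    return np.where(t >= 0.0, (t / tau) * np.exp(1.0 - t / tau), 0.0)

def to_analog(spike_times, t_grid, tau=5.0):
    """Superpose one kernel per spike so the spike train becomes an analog signal."""
    trace = np.zeros_like(t_grid)
    for s in spike_times:
        trace += alpha_kernel(t_grid - s, tau)
    return trace

def widrow_hoff_step(weights, X, y_desired, y_actual, lr=0.01, dt=0.1):
    """One Widrow-Hoff update on the converted signals: dw_i = lr * ∫ x_i(t) e(t) dt."""
    err = y_desired - y_actual
    return weights + lr * dt * (X @ err)

t = np.arange(0.0, 100.0, 0.1)                         # time grid (ms)
inputs = [[10.0, 30.0], [70.0], [20.0, 50.0]]          # input spike times per synapse
X = np.stack([to_analog(s, t) for s in inputs])        # analog input traces
y_desired = to_analog([35.0], t)                       # desired output spike at 35 ms
y_actual = to_analog([80.0], t)                        # erroneous actual output at 80 ms
w = widrow_hoff_step(np.zeros(3), X, y_desired, y_actual)
```

The sign pattern matches the intuition behind the rule: the synapse whose spikes precede the desired output time is strengthened, while the one whose spike coincides with the erroneous late output is weakened.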

