The Journal of Mathematical Neuroscience
Latest Publications

Total documents: 158 (last five years: 41)
H-index: 21 (last five years: 3)
Published by: Springer (BioMed Central Ltd.)
ISSN: 2190-8567

2021, Vol 11 (1)
Author(s): Elif Köksal Ersöz, Fabrice Wendling

Abstract
Mathematical models at multiple temporal and spatial scales can unveil the fundamental mechanisms of critical transitions in brain activities. Neural mass models (NMMs) consider the average temporal dynamics of interconnected neuronal subpopulations without explicitly representing the underlying cellular activity. The mesoscopic level offered by the neural mass formulation has been used to model electroencephalographic (EEG) recordings and to investigate various cerebral mechanisms, such as the generation of physiological and pathological brain activities. In this work, we consider an NMM widely accepted in the context of epilepsy, which includes four interacting neuronal subpopulations with different synaptic kinetics. Due to the resulting three-time-scale structure, the model yields complex oscillations of relaxation and bursting types. By applying the principles of geometric singular perturbation theory, we unveil the existence of canard solutions and detail how they organize the complex oscillations and excitability properties of the model. In particular, we show that boundaries between pathological epileptic discharges and physiological background activity are determined by the canard solutions. Finally, we report the existence of canard-mediated small-amplitude frequency-specific oscillations in simulated local field potentials for decreased inhibition conditions. Interestingly, such oscillations are actually observed in intracerebral EEG signals recorded in epileptic patients during pre-ictal periods, close to seizure onsets.
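The four-subpopulation model itself is not spelled out in the abstract. As a minimal illustration of the neural mass formulation it builds on, the sketch below integrates the classical three-population Jansen–Rit model with its standard parameter values; the additional fast inhibitory subpopulation and the slow-fast canard analysis of the paper are not reproduced here.

```python
# Minimal Jansen-Rit neural mass model (three subpopulations), a sketch of the
# kind of formulation the paper extends with a fourth, faster inhibitory population.
import numpy as np
from scipy.integrate import solve_ivp

# Standard Jansen-Rit parameters
A, B = 3.25, 22.0            # excitatory / inhibitory synaptic gains (mV)
a, b = 100.0, 50.0           # inverse synaptic time constants (1/s)
C = 135.0
C1, C2, C3, C4 = C, 0.8 * C, 0.25 * C, 0.25 * C
e0, v0, r = 2.5, 6.0, 0.56   # sigmoid parameters

def sigm(v):
    """Population firing-rate sigmoid."""
    return 2 * e0 / (1 + np.exp(r * (v0 - v)))

def jansen_rit(t, y, p=220.0):
    """y = (y0, y1, y2, y3, y4, y5); p is the external input rate (pulses/s)."""
    y0, y1, y2, y3, y4, y5 = y
    dy3 = A * a * sigm(y1 - y2) - 2 * a * y3 - a**2 * y0
    dy4 = A * a * (p + C2 * sigm(C1 * y0)) - 2 * a * y4 - a**2 * y1
    dy5 = B * b * C4 * sigm(C3 * y0) - 2 * b * y5 - b**2 * y2
    return [y3, y4, y5, dy3, dy4, dy5]

sol = solve_ivp(jansen_rit, (0, 5), np.zeros(6), max_step=1e-3)
lfp = sol.y[1] - sol.y[2]    # proxy for the simulated EEG/LFP signal
print(lfp[-5:])
```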


2021, Vol 11 (1)
Author(s): Erik D. Fagerholm, W. M. C. Foulkes, Karl J. Friston, Rosalyn J. Moran, Robert Leech

Abstract
The principle of stationary action is a cornerstone of modern physics, providing a powerful framework for investigating dynamical systems found in classical mechanics through to quantum field theory. However, computational neuroscience, despite its heavy reliance on concepts from physics, is anomalous in this regard: its main equations of motion are not compatible with a Lagrangian formulation and hence with the principle of stationary action. Taking the Dynamic Causal Modelling (DCM) neuronal state equation as an instructive archetype of the first-order linear differential equations commonly found in computational neuroscience, we show that it is possible to make certain modifications to this equation to render it compatible with the principle of stationary action. Specifically, we show that a Lagrangian formulation of the DCM neuronal state equation is facilitated by using a complex dependent variable, an oscillatory solution, and a Hermitian intrinsic connectivity matrix. We first demonstrate proof of principle by using Bayesian model inversion to show that both the original and modified models can be correctly identified via in silico data generated directly from their respective equations of motion. We then provide motivation for adopting the modified models in neuroscience by using three different types of publicly available in vivo neuroimaging datasets, together with open source MATLAB code, to show that the modified (oscillatory) model provides a more parsimonious explanation for some of these empirical time series. It is our hope that this work will, in combination with existing techniques, allow researchers to explore the symmetries and associated conservation laws within neural systems, and to exploit the computational expediency afforded by direct variational techniques.
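The modified state equation is not written out in the abstract. As a toy check of the ingredients it names (a complex dependent variable, an oscillatory solution, a Hermitian intrinsic connectivity matrix), the sketch below integrates $\dot{z} = iAz$ with $A$ Hermitian, one plausible reading of the construction, and verifies numerically that the squared norm of the state is conserved, which is the kind of conservation law a variational formulation exposes.

```python
# Toy check: with a Hermitian "connectivity" matrix A, the complex linear system
# dz/dt = i A z evolves unitarily, so the squared norm of the state is conserved.
# This illustrates the abstract's ingredients (complex variable, oscillatory
# solution, Hermitian matrix); it is not the paper's actual DCM equation.
import numpy as np
from scipy.integrate import solve_ivp

rng = np.random.default_rng(0)
M = rng.normal(size=(3, 3)) + 1j * rng.normal(size=(3, 3))
A = (M + M.conj().T) / 2            # Hermitian intrinsic connectivity (assumed form)

def rhs(t, z):
    return 1j * A @ z

z0 = rng.normal(size=3) + 1j * rng.normal(size=3)
sol = solve_ivp(rhs, (0.0, 10.0), z0, rtol=1e-10, atol=1e-12)

norms = np.linalg.norm(sol.y, axis=0)
print(norms.max() - norms.min())    # ~0: the state norm is a conserved quantity
```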


2021, Vol 11 (1)
Author(s): Karina Kolodina, John Wyller, Anna Oleynik, Mads Peter Sørensen

Abstract
We study pattern formation in a 2-population homogenized neural field model of the Hopfield type in one spatial dimension with periodic microstructure. The connectivity functions are periodically modulated in both the synaptic footprint and the spatial scale. It is shown that the nonlocal synaptic interactions promote a finite-bandwidth instability. The stability method relies on a sequence of wavenumber-dependent invariants of the $2\times 2$ stability matrices representing the sequence of Fourier-transformed linearized evolution equations for the perturbation imposed on the homogeneous background. The generic picture of the instability structure consists of a finite set of well-separated gain bands. In the shallow firing rate regime the nonlinear development of the instability is determined by means of the translation-invariant model with connectivity kernels replaced by the corresponding period-averaged connectivity functions. In the steep firing rate regime the pattern formation process depends sensitively on the spatial localization of the connectivity kernels: for strongly localized kernels this process is determined by the translation-invariant model with period-averaged connectivity kernels, whereas the complementary regime of weak and moderate localization requires the homogenized model as a starting point for the analysis. We follow the development of the instability numerically into the nonlinear regime for both steep and shallow firing rate functions when the connectivity kernels are modeled by means of an exponentially decaying function. We also study the pattern forming process numerically as a function of the heterogeneity parameters in four different regimes ranging from the weakly modulated case to the strongly heterogeneous case. For the weakly modulated regime, we observe that stable spatial oscillations are formed in the steep firing rate regime, whereas we get spatiotemporal oscillations in the shallow regime of the firing rate functions.
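A hedged sketch of the linear-stability step described above, restricted to the translation-invariant (period-averaged) case: for a two-population Hopfield-type field with exponentially decaying kernels, each wavenumber contributes one $2\times 2$ matrix, and the gain bands are read off from the maximal real part of its eigenvalues. Connectivity strengths, footprints and the firing-rate slope below are placeholders, not the paper's values.

```python
# Dispersion relation for a translation-invariant two-population neural field:
# linearizing u_t = -u + W * f(u) about a homogeneous state gives, per wavenumber k,
# the 2x2 matrix J(k) = -I + s * W_hat(k), where W_hat(k) collects the Fourier
# transforms of the connectivity kernels and s is the firing-rate slope.
import numpy as np

def w_hat(k, sigma):
    """Fourier transform of the normalized exponential kernel exp(-|x|/sigma) / (2 sigma)."""
    return 1.0 / (1.0 + (sigma * k) ** 2)

# Placeholder connectivity strengths (+ excitatory, - inhibitory) and footprints.
W = np.array([[1.2, -1.0],
              [1.5, -0.5]])
sigma = np.array([[0.5, 1.0],
                  [0.6, 1.2]])
s = 4.0                      # slope of the firing-rate function at the fixed point

ks = np.linspace(0.0, 20.0, 2000)
growth = np.empty_like(ks)
for i, k in enumerate(ks):
    Jk = -np.eye(2) + s * W * w_hat(k, sigma)   # elementwise: each kernel has its own width
    growth[i] = np.linalg.eigvals(Jk).real.max()

unstable = ks[growth > 0]
print("unstable band:", (unstable.min(), unstable.max()) if unstable.size else "none")
```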


2021, Vol 11 (1)
Author(s): Andrea Ferrario, James Rankin

Abstract
In the auditory streaming paradigm, alternating sequences of pure tones can be perceived as a single galloping rhythm (integration) or as two sequences with separated low and high tones (segregation). Although studied for decades, the neural mechanisms underlying this perceptual grouping of sound remain a mystery. With the aim of identifying a plausible minimal neural circuit that captures this phenomenon, we propose a firing rate model with two periodically forced neural populations coupled by fast direct excitation and slow delayed inhibition. By analyzing the model in a non-smooth, slow-fast regime we analytically prove the existence of a rich repertoire of dynamical states and of their parameter-dependent transitions. We impose plausible parameter restrictions and link all states with perceptual interpretations. Regions of stimulus parameters occupied by states linked with each percept match those found in behavioural experiments. Our model suggests that slow inhibition masks the perception of subsequent tones during segregation (forward masking), whereas fast excitation enables integration for large pitch differences between the two tones.
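A crude numerical sketch of the circuit motif described above (not the paper's non-smooth slow-fast analysis): two rate units receive alternating tone inputs, excite each other directly, and inhibit each other through a slow, delayed pathway implemented with a history buffer. All parameter values are placeholders.

```python
# Two periodically forced rate units with fast mutual excitation and slow, delayed
# mutual inhibition -- a crude Euler sketch of the motif in the abstract.
import numpy as np

dt, T = 1e-3, 4.0                      # time step (s), total time (s)
tau_u, tau_s = 0.01, 0.3               # fast unit / slow inhibition time constants (s)
delay = 0.05                           # inhibitory delay (s)
w_exc, w_inh = 0.3, 1.5                # coupling strengths (placeholders)
period, tone_dur = 0.25, 0.06          # tone repetition period and tone duration (s)

n, d = int(T / dt), int(delay / dt)
u = np.zeros((2, n))                   # unit activities (unit 0: A tones, unit 1: B tones)
s = np.zeros((2, n))                   # slow inhibitory variables

def f(x):                              # threshold-linear firing-rate function
    return np.maximum(x, 0.0)

for t in range(1, n):
    phase = (t * dt) % period
    # Alternating tones: A in the first half of each cycle, B in the second half.
    inp = np.array([1.0 if phase < tone_dur else 0.0,
                    1.0 if period / 2 <= phase < period / 2 + tone_dur else 0.0])
    s_del = s[:, t - d] if t > d else np.zeros(2)
    drive = inp + w_exc * u[::-1, t - 1] - w_inh * s_del[::-1]   # cross excitation/inhibition
    u[:, t] = u[:, t - 1] + dt * (-u[:, t - 1] + f(drive)) / tau_u
    s[:, t] = s[:, t - 1] + dt * (-s[:, t - 1] + u[:, t - 1]) / tau_s

print(u[:, -5:])                        # late-time activities of the two units
```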


2021, Vol 11 (1)
Author(s): Hugues Berry, Stéphane Genet

Abstract
The neurons of the deep cerebellar nuclei (DCNn) represent the main functional link between the cerebellar cortex and the rest of the central nervous system. Therefore, understanding the electrophysiological properties of DCNn is of fundamental importance for understanding the overall functioning of the cerebellum. Experimental data suggest that DCNn can reversibly switch between two states: the firing of spikes (F state) and a stable depolarized state (SD state). We introduce a new biophysical model of the DCNn membrane electro-responsiveness to investigate how the interplay between the conductances documented in DCNn gives rise to these states. In the model, the F state emerges as an isola of limit cycles, i.e. a closed loop of periodic solutions disconnected from the branch of SD fixed points. This bifurcation structure endows the model with the ability to reproduce the $\text{F}\to \text{SD}$ transition triggered by hyperpolarizing current pulses. The model also reproduces the $\text{F}\to \text{SD}$ transition induced by blocking Ca currents and ascribes this transition to the blocking of the high-threshold Ca current. The model suggests that intracellular current injections can trigger fully reversible $\text{F}\leftrightarrow \text{SD}$ transitions. Investigation of low-dimensional reduced models suggests that the voltage-dependent Na current plays a prominent role in these dynamical features. Finally, simulations of the model suggest that physiological synaptic inputs may trigger $\text{F}\leftrightarrow \text{SD}$ transitions. These transitions could explain the puzzling observation of positively correlated activities of connected Purkinje cells and DCNn even though the former inhibit the latter.
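The coexistence of a stable firing (limit-cycle) state with a stable depolarized fixed point, so that current pulses can switch between them, can be illustrated with a toy radial normal form rather than the full biophysical model. The sketch below is only an analogy for that bifurcation structure, not the DCNn model.

```python
# Toy illustration of F <-> SD-like bistability: the radial equation
#   dr/dt = r * (mu + 2 r^2 - r^4),   mu in (-1, 0),
# has a stable fixed point at r = 0 ("SD-like") coexisting with a stable limit cycle
# at r = sqrt(1 + sqrt(1 + mu)) ("F-like"), separated by an unstable cycle at
# r = sqrt(1 - sqrt(1 + mu)). Instantaneous pulses that reset r across the unstable
# cycle switch the system between the two states. An analogy only, not the DCN model.
import numpy as np

mu = -0.5
r_unstable = np.sqrt(1 - np.sqrt(1 + mu))      # ~0.54: separatrix between the two states
dt, n = 1e-3, 60000
r = np.empty(n)
r[0] = np.sqrt(1 + np.sqrt(1 + mu))            # start on the stable ("F") cycle

for t in range(1, n):
    r[t] = r[t - 1] + dt * r[t - 1] * (mu + 2 * r[t - 1] ** 2 - r[t - 1] ** 4)
    if t == 20000:
        r[t] = 0.3 * r_unstable                # inward pulse: decays to the "SD" state
    if t == 40000:
        r[t] = 2.0 * r_unstable                # outward pulse: returns to the "F" cycle

print(r[19990], r[39990], r[-1])               # F-cycle radius, ~0 (SD), F-cycle radius again
```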


2021, Vol 11 (1)
Author(s): Matias Calderini, Jean-Philippe Thivierge

Abstract
Decoding approaches provide a useful means of estimating the information contained in neuronal circuits. In this work, we analyze the expected classification error of a decoder based on Fisher linear discriminant analysis. We provide expressions that relate decoding error to the specific parameters of a population model that performs linear integration of sensory input. Results show conditions that lead to beneficial and detrimental effects of noise correlation on decoding. Further, the proposed framework sheds light on the contribution of neuronal noise, highlighting cases where, counter-intuitively, increased noise may lead to improved decoding performance. Finally, we examine the impact of dynamical parameters, including the neuronal leak and integration time constant, on decoding. Overall, this work presents a fruitful approach to the study of decoding using a comprehensive theoretical framework that merges dynamical parameters with estimates of readout error.
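A sketch of the standard quantity behind such an analysis: for two equally likely Gaussian response classes sharing a covariance, the expected error of the Fisher linear discriminant readout is $\Phi(-d'/2)$ with $d'^2 = \Delta\mu^{\top}\Sigma^{-1}\Delta\mu$. The code below checks this against a Monte Carlo run with a uniform noise-correlation structure; the numbers are placeholders, not the paper's population model.

```python
# Expected error of a Fisher linear discriminant between two Gaussian stimulus
# classes with shared covariance: P(error) = Phi(-d'/2), d'^2 = dmu^T Sigma^-1 dmu.
# The covariance mixes private noise with a uniform noise correlation c, to probe
# how correlation changes decodability. All values are placeholders.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(1)
N, c, var = 20, 0.1, 1.0                       # population size, noise correlation, variance
dmu = rng.normal(scale=0.3, size=N)            # difference between the two mean responses
Sigma = var * ((1 - c) * np.eye(N) + c * np.ones((N, N)))

w = np.linalg.solve(Sigma, dmu)                # Fisher discriminant weights
dprime = np.sqrt(dmu @ w)
p_err_theory = norm.cdf(-dprime / 2)

# Monte Carlo check with equal priors, class means at +/- dmu/2, optimal threshold at 0.
trials = 200_000
L = np.linalg.cholesky(Sigma)
labels = rng.integers(0, 2, size=trials)
x = (labels[:, None] - 0.5) * dmu + rng.normal(size=(trials, N)) @ L.T
decisions = (x @ w > 0).astype(int)
print(p_err_theory, np.mean(decisions != labels))
```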


2021, Vol 11 (1)
Author(s): Isam Al-Darabsah, Sue Ann Campbell

Abstract
In this work, we consider a general conductance-based neuron model with the inclusion of the acetylcholine-sensitive M-current. We study bifurcations in the parameter space consisting of the applied current $I_{app}$, the maximal conductance of the M-current $g_{M}$, and the conductance of the leak current $g_{L}$. We give precise conditions for the model that ensure the existence of a Bogdanov–Takens (BT) point and show that such a point can occur by varying $I_{app}$ and $g_{M}$. We discuss the case when the BT point becomes a Bogdanov–Takens–cusp (BTC) point and show that such a point can occur in the three-dimensional parameter space. The results of the bifurcation analysis are applied to different neuronal models and are verified and supplemented by numerical bifurcation diagrams generated using a software package. We conclude that there is a transition in the neuronal excitability type organised by the BT point: the neuron switches from Class I to Class II as the conductance of the M-current increases.
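As a hedged aside on what a BT point organizes locally: near such a point the dynamics reduce to the two-parameter normal form $\dot{x} = y$, $\dot{y} = \beta_1 + \beta_2 x + x^2 + xy$, whose unfolding contains the saddle-node, Hopf and homoclinic curves relevant to the change of excitability class. The sketch below merely integrates this normal form at one placeholder parameter value; it is not the conductance-based model or its numerical continuation.

```python
# Bogdanov-Takens normal form, xdot = y, ydot = beta1 + beta2*x + x^2 + x*y.
# Its two-parameter unfolding (saddle-node, Hopf and homoclinic curves meeting at
# beta1 = beta2 = 0) is what organizes the switch in excitability type discussed in
# the abstract; this sketch only integrates the normal form at one parameter choice.
import numpy as np
from scipy.integrate import solve_ivp

beta1, beta2 = -0.2, 0.3          # placeholder unfolding parameters

def bt(t, z):
    x, y = z
    return [y, beta1 + beta2 * x + x**2 + x * y]

# Equilibria lie at y = 0, x^2 + beta2*x + beta1 = 0; for these parameters the
# equilibrium near x = -0.62 is a stable focus, and we start close to it.
sol = solve_ivp(bt, (0.0, 50.0), [-0.6, 0.05], max_step=0.01)
print(sol.y[:, -1])               # final state, near the stable equilibrium
```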


2021, Vol 11 (1)
Author(s): Antonios Georgiou, Mikhail Katkov, Misha Tsodyks

Abstract
Memory and forgetting constitute two sides of the same coin, and although the former has been extensively investigated, the latter is often overlooked. A possible approach to better understand forgetting is to develop phenomenological models that implement its putative mechanisms in the most elementary way possible, and then experimentally test the theoretical predictions of these models. One such mechanism proposed in previous studies is retrograde interference, whereby a memory can be erased by subsequently acquired memories. In the current contribution, we hypothesize that retrograde erasure is controlled by relevant “importance” measures, such that more important memories eliminate less important ones acquired earlier. We show that some versions of the resulting mathematical model are broadly compatible with the previously reported power-law forgetting time course and match well the results of our recognition experiments with long, randomly assembled streams of words.
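One elementary reading of the proposed mechanism (a reading of the abstract, not necessarily the paper's exact model): give each incoming item an i.i.d. importance value and let a memory survive only while no more important item has arrived after it. Under that rule the probability of surviving $n$ subsequent acquisitions is $1/(n+1)$, a power-law forgetting curve, which the short simulation below reproduces.

```python
# Toy retrograde-interference model: each item gets an i.i.d. "importance" value,
# and an item is retained after n further acquisitions only if none of those n items
# is more important. Then P(retained after n) = 1/(n+1), i.e. power-law forgetting.
# One elementary reading of the abstract's mechanism, not the paper's exact model.
import numpy as np

rng = np.random.default_rng(0)
streams, length = 20_000, 50
importance = rng.random((streams, length))

# Retention of the first item of each stream as a function of the number of
# subsequently acquired items.
first = importance[:, :1]
later_max = np.maximum.accumulate(importance[:, 1:], axis=1)
retained = first > later_max                      # still the most important so far?
empirical = retained.mean(axis=0)

lags = np.arange(1, length)
print(np.c_[lags[:5], empirical[:5], 1.0 / (lags[:5] + 1)])   # simulated vs 1/(n+1)
```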


2021, Vol 11 (1)
Author(s): Selma Souihel, Bruno Cessac

Abstract
We analyse the potential effects of lateral connectivity (amacrine cells and gap junctions) on motion anticipation in the retina. Our main result is that lateral connectivity can, under conditions analysed in the paper, trigger a wave of activity enhancing the anticipation mechanism provided by local gain control (Berry et al. in Nature 398(6725):334–338, 1999; Chen et al. in J. Neurosci. 33(1):120–132, 2013). We illustrate these predictions with two examples studied in the experimental literature: differential motion sensitive cells (Baccus and Meister in Neuron 36(5):909–919, 2002) and direction sensitive cells whose direction sensitivity is inherited from an asymmetry in gap junction connectivity (Trenholm et al. in Nat. Neurosci. 16:154–156, 2013). Finally, we present reconstructions of retinal responses to 2D visual inputs to assess the ability of our model to anticipate motion in the case of three different 2D stimuli.
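A minimal sketch of the local gain-control mechanism the paper builds on (Berry et al. 1999), without the lateral connectivity that is the paper's contribution: a moving bar drives an array of units whose gain is depressed by their own recent activity, so gain is lower where the bar has already been and the population response is skewed toward the bar's leading edge. Parameters are placeholders.

```python
# Minimal 1D sketch of motion anticipation through local gain control (no lateral
# connectivity, unlike the paper): a Gaussian bar moves across an array of units
# whose gain is depressed by their own low-pass-filtered activity. Gain is lower
# where the bar has already been, which skews the response toward the leading edge.
import numpy as np

nx, nt, dt = 200, 2000, 1e-3
x = np.linspace(0.0, 10.0, nx)
v, width = 3.0, 0.4                      # bar speed and width (placeholders)
tau_a, k = 0.2, 5.0                      # adaptation time constant and strength

gain = np.ones(nx)
adapt = np.zeros(nx)
for t in range(nt):
    center = v * t * dt
    stim = np.exp(-((x - center) ** 2) / (2 * width ** 2))
    response = gain * stim
    adapt += dt * (response - adapt) / tau_a        # low-pass-filtered activity
    gain = 1.0 / (1.0 + k * adapt)                  # activity-dependent gain depression

# Compare the location of the response peak with the bar's centre at the last step.
print(x[np.argmax(response)], v * (nt - 1) * dt)
```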


2021, Vol 11 (1)
Author(s): Ugo Boscain, Dario Prandi, Ludovic Sacchelli, Giuseppina Turco

Abstract
The reconstruction mechanisms built by the human auditory system during sound reconstruction are still a matter of debate. The purpose of this study is to propose a mathematical model of sound reconstruction based on the functional architecture of the auditory cortex (A1). The model is inspired by the geometrical modelling of vision, which has undergone substantial development in the last ten years. There are, however, fundamental dissimilarities, due to the different role played by time and the different group of symmetries. The algorithm transforms the degraded sound into an ‘image’ in the time–frequency domain via a short-time Fourier transform. Such an image is then lifted to the Heisenberg group and is reconstructed via a Wilson–Cowan integro-differential equation. Preliminary numerical experiments are provided, showing the good reconstruction properties of the algorithm on synthetic sounds concentrated around two frequencies.
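A rough sketch of the pipeline described above, with strong simplifications: the degraded sound is mapped to a time-frequency image by a short-time Fourier transform, a Wilson-Cowan-style rate equation with a plain Gaussian interaction kernel (standing in for the lift to the Heisenberg group and the paper's kernel) is relaxed on that image, and the result is pulled back with the inverse STFT. Everything below is an assumption-laden stand-in, not the paper's algorithm.

```python
# Rough sketch of an STFT -> Wilson-Cowan-style evolution -> inverse STFT pipeline.
# The Heisenberg-group lift and the paper's specific kernel are replaced by a plain
# Gaussian interaction in the time-frequency plane; a crude stand-in only.
import numpy as np
from scipy.signal import stft, istft
from scipy.ndimage import gaussian_filter

fs = 8000.0
t = np.arange(0, 1.0, 1 / fs)
clean = np.sin(2 * np.pi * 440 * t) + np.sin(2 * np.pi * 660 * t)  # two concentrated frequencies
degraded = clean.copy()
degraded[3000:3400] = 0.0                                          # a short missing chunk

f, tt, Z = stft(degraded, fs=fs, nperseg=256)

# Wilson-Cowan-style relaxation on the magnitude image:
#   da/dt = -a + S(|Z| + interaction * a), with a Gaussian interaction kernel.
a = np.zeros_like(np.abs(Z))
dt_wc, steps = 0.1, 50
for _ in range(steps):
    drive = np.abs(Z) + 0.5 * gaussian_filter(a, sigma=(1.0, 2.0))
    a += dt_wc * (-a + np.tanh(drive))

# Re-impose the processed magnitude on the original phases and invert.
Z_out = a * np.exp(1j * np.angle(Z))
_, reconstructed = istft(Z_out, fs=fs, nperseg=256)
print(reconstructed.shape)
```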

