On the effect of remote and proximal distractors on saccadic behavior: A challenge to neural-field models

2012 ◽  
Vol 12 (12) ◽  
pp. 14-14 ◽  
Author(s):  
S. Casteau ◽  
F. Vitu

2002 ◽  
Vol 14 (8) ◽  
pp. 1801-1825 ◽  
Author(s):  
Thomas Wennekers

This article presents an approximation method to reduce the spatiotemporal behavior of localized activation peaks (also called “bumps”) in nonlinear neural field equations to a set of coupled ordinary differential equations (ODEs) for only the amplitudes and tuning widths of these peaks. This enables a simplified analysis of steady-state receptive fields and their stability, as well as spatiotemporal point spread functions and dynamic tuning properties. A lowest-order approximation for peak amplitudes alone shows that much of the well-studied behavior of small neural systems (e.g., the Wilson-Cowan oscillator) should carry over to localized solutions in neural fields. Full spatiotemporal response profiles can further be reconstructed from this low-dimensional approximation. The method is applied to two standard neural field models: a one-layer model with difference-of-gaussians connectivity kernel and a two-layer excitatory-inhibitory network. Similar models have been previously employed in numerical studies addressing orientation tuning of cortical simple cells. Explicit formulas for tuning properties, instabilities, and oscillation frequencies are given, and exemplary spatiotemporal response functions, reconstructed from the low-dimensional approximation, are compared with full network simulations.
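The one-layer model with difference-of-gaussians connectivity that this abstract refers to can be illustrated with a short simulation. The following is a minimal sketch, not the paper's implementation: all parameter values (kernel gains and widths, input tuning, grid size) are assumptions chosen only so that a stable, tuned bump forms, from which a peak amplitude and a half-height tuning width can be read off.

```python
import numpy as np

def dog_kernel(d, a_e=2.0, s_e=0.2, a_i=1.0, s_i=0.6):
    """Difference-of-gaussians lateral connectivity (illustrative parameters)."""
    return (a_e * np.exp(-d**2 / (2 * s_e**2))
            - a_i * np.exp(-d**2 / (2 * s_i**2)))

def simulate_bump(n=128, steps=4000, dt=0.01, tau=1.0, input_amp=1.0):
    # orientation space: a ring of circumference pi
    x = np.linspace(-np.pi / 2, np.pi / 2, n, endpoint=False)
    dx = x[1] - x[0]
    d = x[:, None] - x[None, :]
    d = (d + np.pi / 2) % np.pi - np.pi / 2       # wrap distances on the ring
    W = dog_kernel(d) * dx                        # discretized integral operator
    h = input_amp * np.exp(-x**2 / (2 * 0.3**2))  # tuned feed-forward input
    u = np.zeros(n)
    f = lambda v: np.maximum(v, 0.0)              # semilinear rate function
    for _ in range(steps):                        # Euler relaxation to steady state
        u = u + dt / tau * (-u + W @ f(u) + h)
    return x, u

x, u = simulate_bump()
peak = float(u.max())
above_half = x[u >= peak / 2]
width = float(above_half.max() - above_half.min())  # half-height tuning width
```

The scalars `peak` and `width` are exactly the kind of low-dimensional summary the article's amplitude-and-width ODE reduction tracks, here obtained by brute-force simulation of the full field.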


2019 ◽  
Vol 15 (11) ◽  
pp. e1007442 ◽  
Author(s):  
Michael E. Rule ◽  
David Schnoerr ◽  
Matthias H. Hennig ◽  
Guido Sanguinetti

Author(s):  
Dimitris A. Pinotsis ◽  
Marco Leite ◽  
Karl J. Friston

2009 ◽  
Vol 102 (2) ◽  
pp. 145-154 ◽  
Author(s):  
Serafim Rodrigues ◽  
David Barton ◽  
Frank Marten ◽  
Moses Kibuuka ◽  
Gonzalo Alarcon ◽  
...  

2015 ◽  
Vol 297 ◽  
pp. 88-101 ◽  
Author(s):  
K. Dijkstra ◽  
S.A. van Gils ◽  
S.G. Janssens ◽  
Yu.A. Kuznetsov ◽  
S. Visser

2020 ◽  
Vol 10 (1) ◽  
Author(s):  
Len Spek ◽  
Yuri A. Kuznetsov ◽  
Stephan A. van Gils

A neural field models the large-scale behaviour of large groups of neurons. We extend previous results for these models by including a diffusion term into the neural field, which models direct, electrical connections. We extend known and prove new sun-star calculus results for delay equations to be able to include diffusion and explicitly characterise the essential spectrum. For a certain class of connectivity functions in the neural field model, we are able to compute its spectral properties and the first Lyapunov coefficient of a Hopf bifurcation. By examining a numerical example, we find that the addition of diffusion suppresses non-synchronised steady-states while favouring synchronised oscillatory modes.
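The qualitative effect of the diffusion term can be seen in a toy relaxation, sketched below. This is an assumption-laden caricature, not the paper's delay model: a rate field on a ring with a saturating local nonlinearity, a localized input, and an added term D * u_xx standing in for the direct electrical connections. Comparing steady profiles with and without diffusion shows the smoothing that works against spatially non-uniform (non-synchronised) states.

```python
import numpy as np

def laplacian(u, dx):
    """Second difference with periodic boundary conditions."""
    return (np.roll(u, 1) - 2.0 * u + np.roll(u, -1)) / dx**2

def relax(D, n=100, steps=8000, dt=0.001):
    x = np.linspace(0.0, 2.0 * np.pi, n, endpoint=False)
    dx = x[1] - x[0]
    h = np.exp(-(x - np.pi)**2)       # localized input bump
    u = np.zeros(n)
    for _ in range(steps):            # Euler relaxation toward steady state
        u = u + dt * (-u + 0.5 * np.tanh(u) + h + D * laplacian(u, dx))
    return u

u_no_diff = relax(D=0.0)
u_diff = relax(D=0.5)
# with diffusion the profile is flatter: the peak drops and the bump spreads
```

The diffusion coefficient `D = 0.5` and the weight `0.5` on the nonlinearity are arbitrary illustrative choices; the time step satisfies the explicit-scheme stability bound dt * D / dx^2 < 1/2.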


2019 ◽  
Author(s):  
M. E. Rule ◽  
D. Schnoerr ◽  
M. H. Hennig ◽  
G. Sanguinetti

Large-scale neural recordings are becoming increasingly better at providing a window into functional neural networks in the living organism. Interpreting such rich data sets, however, poses fundamental statistical challenges. The neural field models of Wilson, Cowan and colleagues remain the mainstay of mathematical population modeling owing to their interpretable, mechanistic parameters and amenability to mathematical analysis. We developed a method based on moment closure to interpret neural field models as latent state-space point-process models, making mean field models amenable to statistical inference. We demonstrate that this approach can infer latent neural states, such as active and refractory neurons, in large populations. After validating this approach with synthetic data, we apply it to high-density recordings of spiking activity in the developing mouse retina. This confirms the essential role of a long-lasting refractory state in shaping spatio-temporal properties of neonatal retinal waves. This conceptual and methodological advance opens up new theoretical connections between mathematical theory and point-process state-space models in neural data analysis.

Significance: Developing statistical tools to connect single-neuron activity to emergent collective dynamics is vital for building interpretable models of neural activity. Neural field models relate single-neuron activity to emergent collective dynamics in neural populations, but integrating them with data remains challenging. Recently, latent state-space models have emerged as a powerful tool for constructing phenomenological models of neural population activity. The advent of high-density multi-electrode array recordings now enables us to examine large-scale collective neural activity. We show that classical neural field approaches can yield latent state-space equations and demonstrate inference for a neural field model of excitatory spatiotemporal waves that emerge in the developing retina.
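The latent states mentioned in the abstract (active and refractory neurons) suggest a population cycling through quiescent, active, and refractory compartments. Below is a first-order mean-field sketch of such a Q→A→R→Q cycle; all rate constants are illustrative assumptions, and the paper's moment-closure treatment additionally tracks correlations, which this sketch omits.

```python
def qar_step(q, a, r, dt, rho=0.1, gamma=2.0, delta=1.0, beta=0.05):
    """One Euler step of a mean-field quiescent/active/refractory cycle.

    q, a, r are population fractions; the three fluxes below move mass
    around the cycle, so q + a + r is conserved exactly.
    """
    excite = (rho + gamma * a) * q   # spontaneous + recurrent excitation
    decay = delta * a                # active -> refractory
    recover = beta * r               # refractory -> quiescent (slow, long-lasting)
    return (q + dt * (recover - excite),
            a + dt * (excite - decay),
            r + dt * (decay - recover))

q, a, r = 0.9, 0.1, 0.0
for _ in range(10000):
    q, a, r = qar_step(q, a, r, dt=0.01)
# the fractions remain a valid distribution over the three states
```

The small recovery rate `beta` plays the role of the long-lasting refractory state: mass accumulates in `r` after a burst of activity and only slowly becomes excitable again.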


2001 ◽  
Vol 13 (8) ◽  
pp. 1721-1747 ◽  
Author(s):  
Thomas Wennekers

We present a general approximation method for the mathematical analysis of spatially localized steady-state solutions in nonlinear neural field models. These models comprise several layers of excitatory and inhibitory cells. Coupling kernels between and inside layers are assumed to be gaussian shaped. In response to spatially localized (i.e., tuned) inputs, such networks typically reveal stationary localized activity profiles in the different layers. Qualitative properties of these solutions, like response amplitudes and tuning widths, are approximated for a whole class of nonlinear rate functions that obey a power law above some threshold and that are zero below. A special case of these functions is the semilinear function, which is commonly used in neural field models. The method is then applied to models for orientation tuning in cortical simple cells: first, to the one-layer model with “difference of gaussians” connectivity kernel developed by Carandini and Ringach (1997) as an abstraction of the biologically detailed simulations of Somers, Nelson, and Sur (1995); second, to a two-field model comprising excitatory and inhibitory cells in two separate layers. Under certain conditions, both models have the same steady states. Comparing simulations of the field models and results derived from the approximation method, we find that the approximation well predicts the tuning behavior of the full model. Moreover, explicit formulas for approximate amplitudes and tuning widths in response to changing input strength are given and checked numerically. Comparing the network behavior for different nonlinearities, we find that the only rate function (from the class of functions under study) that leads to constant tuning widths and a linear increase of firing rates in response to increasing input is the semilinear function. 
For other nonlinearities, the qualitative network response depends on whether the model neurons operate in a convex (e.g., x^2) or concave (e.g., sqrt(x)) regime of their rate function. In the first case, tuning gradually changes from input driven at low input strength (broad tuning strongly depending on the input and roughly linear amplitudes in response to input strength) to recurrently driven at moderate input strength (sharp tuning, supra-linear increase of amplitudes in response to input strength). For concave rate functions, the network reveals stable hysteresis between a state at low firing rates and a tuned state at high rates. This means that the network can “memorize” tuning properties of a previously shown stimulus. Sigmoid rate functions can combine both effects. In contrast to the Carandini-Ringach model, the two-field model further reveals oscillations with typical frequencies in the beta and gamma range, when the excitatory and inhibitory connections are relatively strong. This suggests a rhythmic modulation of tuning properties during cortical oscillations.
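The contrast between the semilinear and convex rate functions can be seen in a scalar caricature (an illustrative assumption, not the paper's field model): a single recurrent unit iterated to its fixed point u = w*f(u) + c. With the semilinear f(u) = [u]_+ and w < 1 the fixed point is u = c/(1-w), exactly linear in the input c; with the convex f(u) = [u]_+^2 the response grows supra-linearly, echoing the shift from an input-driven to a recurrently driven regime.

```python
def fixed_point(f, w, c, iters=500):
    """Iterate the recurrent map u -> w*f(u) + c to its stable fixed point."""
    u = 0.0
    for _ in range(iters):
        u = w * f(u) + c
    return u

relu = lambda u: max(u, 0.0)       # semilinear rate function
quad = lambda u: max(u, 0.0) ** 2  # convex rate function

inputs = [0.1, 0.2, 0.4]
lin = [fixed_point(relu, w=0.5, c=c) for c in inputs]  # exactly u = 2c
sup = [fixed_point(quad, w=0.5, c=c) for c in inputs]
# doubling c doubles the semilinear response; the quadratic response more than doubles
```

For the quadratic case the fixed point solves u = 0.5*u^2 + c, i.e. u = 1 - sqrt(1 - 2c), which is supra-linear in c for 0 < c < 1/2; the iteration converges to this stable root from u = 0.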

