Distance- and Direction-Dependent Synaptic Weight Distributions for Directional Spike Propagation in a Recurrent Network: Self-actuated Shutdown of Synaptic Plasticity

Author(s):  
Toshikazu Samura ◽  
Yutaka Sakai ◽  
Hatsuo Hayashi ◽  
Takeshi Aihara
2021 ◽  
Vol 17 (4) ◽  
pp. 1-21
Author(s):  
He Wang ◽  
Nicoleta Cucu Laurenciu ◽  
Yande Jiang ◽  
Sorin Cotofana

Design and implementation of artificial neuromorphic systems able to provide brain-akin computation and/or bio-compatible interfacing ability are crucial for understanding the human brain’s complex functionality and unleashing brain-inspired computation’s full potential. To this end, the realization of energy-efficient, low-area, and bio-compatible artificial synapses, which sustain the signal transmission between neurons, is of particular interest for any large-scale neuromorphic system. Graphene is a prime candidate material with excellent electronic properties, atomic dimensions, and low-energy envelope perspectives, and has already proven effective for logic gate implementations. Furthermore, distinct from any other material used in current artificial synapse implementations, graphene is biocompatible, which offers perspectives for neural interfaces. In view of this, we investigate the feasibility of graphene-based synapses to emulate various synaptic plasticity behaviors and look into their potential area and energy consumption for large-scale implementations. In this article, we propose a generic graphene-based synapse structure, which can emulate the fundamental synaptic functionalities, i.e., Spike-Timing-Dependent Plasticity (STDP) and Long-Term Plasticity. Additionally, the graphene synapse is programmable by means of a back-gate bias voltage and can exhibit either excitatory or inhibitory behavior. We investigate its capability to obtain different potentiation/depression time scales for STDP with identical synaptic weight change amplitude when the input spike duration varies. Our simulation results, for various synaptic plasticities, indicate that a maximum 30% synaptic weight change and potentiation/depression time scales ranging from [-1.5 ms, 1.1 ms] to [-32.2 ms, 24.1 ms] are achievable.
We further explore the effect of our proposal at the Spiking Neural Network (SNN) level by performing NEST-based simulations of a small SNN implemented with 5 leaky integrate-and-fire neurons connected via graphene-based synapses. Our experiments indicate that the number of SNN firing events depends strongly on the synaptic plasticity type and varies monotonically with the input spike frequency. Moreover, for graphene-based Hebbian STDP and a spike duration of 20 ms, we obtain SNN behavior relatively similar to that provided by the same SNN with biological STDP. The proposed graphene-based synapse requires a small area (max. 30 nm²), operates at low voltage (200 mV), and can emulate various plasticity types, which makes it an outstanding candidate for implementing large-scale brain-inspired computation systems.
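As a rough illustration of the pairing behavior described in this abstract, the sketch below models a generic exponential STDP window whose amplitude is capped at the reported 30% maximum weight change; the time constants reuse the reported extreme time-scale bounds, but the functional form and all parameters are assumptions, not the graphene device model itself:

```python
import math

def stdp_dw(dt_ms, a_plus=0.3, a_minus=0.3, tau_plus=24.1, tau_minus=32.2):
    """Generic exponential STDP window: fractional weight change as a
    function of spike timing dt = t_post - t_pre (ms). Amplitudes are
    capped at 0.3 to mirror the reported 30% maximum weight change."""
    if dt_ms >= 0:
        return a_plus * math.exp(-dt_ms / tau_plus)    # causal: potentiation
    return -a_minus * math.exp(dt_ms / tau_minus)      # anti-causal: depression

# causal pairings strengthen the synapse, anti-causal pairings weaken it
dw_causal = stdp_dw(5.0)
dw_anticausal = stdp_dw(-5.0)
```

Varying the time constants (here fixed) is the knob the article associates with input spike duration.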


2010 ◽  
Vol 68 ◽  
pp. e436
Author(s):  
Takaaki Aoki ◽  
Yuri Kamitani ◽  
Toshio Aoyagi

1998 ◽  
Vol 10 (3) ◽  
pp. 529-547 ◽  
Author(s):  
Kenneth D. Miller

A simple model of correlation-based synaptic plasticity via axonal sprouting and retraction (Elliott, Howarth, & Shadbolt, 1996a) is shown to be equivalent to the class of correlation-based models (Miller, Keller, & Stryker, 1989), although these were formulated in terms of weight modification of anatomically fixed synapses. Both models maximize the same measure of synaptic correlation, subject to certain constraints on connectivity. Thus, the analyses of the correlation-based models suffice to characterize the behavior of the sprouting-and-retraction model. More detailed models are needed for theoretical distinctions to be drawn between plasticity via sprouting and retraction, weight modification, or a combination. The model of Elliott et al. involves stochastic search through allowed weight patterns for those that improve correlations. That of Miller et al. instead follows dynamical equations that determine continuous changes of the weights that improve correlations. The identity of these two approaches is shown to depend on the use of subtractive constraint enforcement in the models of Miller et al. More generally, to model the idea that neural development acts to maximize some measure of correlation subject to a constraint on the summed synaptic weight, the constraint must be enforced subtractively in a dynamical model.
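The subtractive constraint enforcement at issue here can be sketched as a toy Hebbian update in which the mean weight change is subtracted from every weight, so the summed synaptic weight is conserved exactly; the correlation matrix and learning rate are illustrative stand-ins, not the full arbor-constrained dynamics of Miller et al.:

```python
import numpy as np

def hebbian_step_subtractive(w, C, lr=0.01):
    """One Hebbian step dw = lr * C @ w, followed by subtractive
    normalization: subtracting the mean change from each weight
    conserves sum(w) while letting correlations shape the pattern."""
    dw = lr * C @ w
    dw -= dw.mean()          # subtractive enforcement of the sum constraint
    return w + dw

C = np.array([[1.0, 0.5],    # toy input-correlation matrix
              [0.5, 1.0]])
w = np.array([0.4, 0.6])
w_new = hebbian_step_subtractive(w, C)
```

A multiplicative scheme (rescaling w to fix its sum) would instead suppress the competition between inputs that the subtractive version preserves.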


2021 ◽  
Author(s):  
Miriam Bell ◽  
Padmini Rangamani

Synaptic plasticity involves the modification of both biochemical and structural components of neurons. Many studies have revealed that the change in the number density of the glutamatergic receptor AMPAR at the synapse is proportional to the synaptic weight update; an increase in AMPAR density corresponds to strengthening of synapses, while a decrease in AMPAR density weakens synaptic connections. The dynamics of AMPAR are thought to be regulated by upstream signaling, primarily the calcium-CaMKII pathway, trafficking to and from the synapse, and influx from extrasynaptic sources. Here, we have developed a set of models using compartmental ordinary differential equations to systematically investigate contributions of signaling and trafficking variations on AMPAR dynamics at the synaptic site. We find that model properties, including network architecture and parameters, significantly affect the integration of fast upstream species by slower downstream species. Furthermore, we predict that the model outcome, as determined by bound AMPAR at the synaptic site, depends on (a) the choice of signaling model (bistable CaMKII or monostable CaMKII dynamics), (b) trafficking versus influx contributions, and (c) frequency of stimulus. Therefore, AMPAR dynamics can have unexpected dependencies when upstream signaling dynamics (such as CaMKII and PP1) are coupled with trafficking modalities.
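A minimal sketch of the kind of coupling this abstract describes — a fast calcium-driven upstream species feeding a slower downstream AMPAR variable — could look like the following forward-Euler integration; the rate constants and the lumping of trafficking into on/off rates are hypothetical, not the paper's compartmental model:

```python
def simulate_ampar(t_end=100.0, dt=0.01, k_on=0.05, k_off=0.02,
                   k_camkii=0.1, tau_camkii=20.0, ca_pulse=1.0):
    """Toy two-variable ODE system (illustrative only): a brief calcium
    stimulus activates CaMKII, which drives AMPAR binding at the synaptic
    site; trafficking to/from the synapse is lumped into k_on and k_off.
    Returns the final (camkii, ampar) levels, with ampar in [0, 1]."""
    n = int(t_end / dt)
    camkii, ampar = 0.0, 0.0
    for i in range(n):
        ca = ca_pulse if i * dt < 5.0 else 0.0   # brief calcium stimulus
        camkii += dt * (k_camkii * ca - camkii / tau_camkii)
        ampar += dt * (k_on * camkii * (1.0 - ampar) - k_off * ampar)
    return camkii, ampar

camkii_final, ampar_final = simulate_ampar()
```

Even this toy version shows the timescale-separation issue the abstract highlights: the fast calcium transient is only felt downstream through the slower CaMKII and AMPAR variables.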


eLife ◽  
2021 ◽  
Vol 10 ◽  
Author(s):  
Aaron D Milstein ◽  
Yiding Li ◽  
Katie C Bittner ◽  
Christine Grienberger ◽  
Ivan Soltesz ◽  
...  

Learning requires neural adaptations thought to be mediated by activity-dependent synaptic plasticity. A relatively non-standard form of synaptic plasticity driven by dendritic calcium spikes, or plateau potentials, has been reported to underlie place field formation in rodent hippocampal CA1 neurons. Here we found that this behavioral timescale synaptic plasticity (BTSP) can also reshape existing place fields via bidirectional synaptic weight changes that depend on the temporal proximity of plateau potentials to pre-existing place fields. When evoked near an existing place field, plateau potentials induced less synaptic potentiation and more depression, suggesting BTSP might depend inversely on postsynaptic activation. However, manipulations of place cell membrane potential and computational modeling indicated that this anti-correlation actually results from a dependence on current synaptic weight such that weak inputs potentiate and strong inputs depress. A network model implementing this bidirectional synaptic learning rule suggested that BTSP enables population activity, rather than pairwise neuronal correlations, to drive neural adaptations to experience.
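The weight dependence reported above (weak inputs potentiate, strong inputs depress under the same plateau) can be captured schematically by a rule that moves each eligible weight toward a common target; the target value, learning rate, and the scalar "plateau" eligibility signal are hypothetical simplifications of the paper's learning rule:

```python
def btsp_update(w, plateau, w_target=1.0, lr=0.5):
    """Schematic bidirectional plateau-potential rule: when a plateau
    occurs (plateau in [0, 1] gating the update), the weight relaxes
    toward w_target, so the sign of the change depends on the current
    weight rather than on postsynaptic activity per se."""
    return w + lr * plateau * (w_target - w)

dw_weak = btsp_update(0.2, 1.0) - 0.2    # weak input: moves up
dw_strong = btsp_update(1.8, 1.0) - 1.8  # strong input: moves down
```

This form makes the anti-correlation with pre-existing place-field activation fall out naturally: inputs already strong near the old field sit above the target and are depressed.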


2009 ◽  
Vol 21 (12) ◽  
pp. 3408-3428 ◽  
Author(s):  
Christian Leibold ◽  
Michael H. K. Bendels

Short-term synaptic plasticity is modulated by long-term synaptic changes. There is, however, no general agreement on the computational role of this interaction. Here, we derive a learning rule for the release probability and the maximal synaptic conductance in a circuit model with combined recurrent and feedforward connections that allows learning to discriminate among natural inputs. Short-term synaptic plasticity thereby provides a nonlinear expansion of the input space of a linear classifier, whereas the random recurrent network serves to decorrelate the expanded input space. Computer simulations reveal that the twofold increase in the number of input dimensions through short-term synaptic plasticity improves the performance of a standard perceptron by up to 100%. The distributions of release probabilities and maximal synaptic conductances at the capacity limit strongly depend on the balance between excitation and inhibition. The model also suggests a new computational interpretation of spikes evoked by stimuli outside the classical receptive field. These neuronal activities may reflect decorrelation of the expanded stimulus space by intracortical synaptic connections.
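The twofold nonlinear input expansion described here can be sketched by concatenating each input rate with its steady-state depressed transmission (a Tsodyks-Markram-style saturation); the release parameter U and recovery time constant are illustrative, and the paper's derived learning rule for these quantities is not reproduced:

```python
import numpy as np

def stp_expand(rates, U=0.5, tau_rec=0.2):
    """Nonlinear expansion via short-term depression: each input rate r
    is paired with its steady-state transmitted rate U*r/(1 + U*tau_rec*r),
    a saturating transform, doubling the dimension seen by a linear
    classifier downstream."""
    depressed = rates * U / (1.0 + U * tau_rec * rates)
    return np.concatenate([rates, depressed])

x = np.array([5.0, 20.0, 80.0])   # input firing rates (Hz)
x_expanded = stp_expand(x)
```

Because the depressed channel saturates while the raw channel stays linear, patterns that are colinear in the original space can become linearly separable in the expanded one.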


2017 ◽  
Author(s):  
Simona Olmi ◽  
David Angulo-Garcia ◽  
Alberto Imparato ◽  
Alessandro Torcini

Neurons in the intact brain receive a continuous and irregular synaptic bombardment from excitatory and inhibitory presynaptic neurons, which determines the firing activity of the stimulated neuron. In order to investigate the influence of inhibitory stimulation on the firing time statistics, we consider Leaky Integrate-and-Fire neurons subject to inhibitory instantaneous postsynaptic potentials. In particular, we report exact results for the firing rate, the coefficient of variation, and the spike train spectrum for various synaptic weight distributions. Our results are not limited to stimulations of infinitesimal amplitude, but apply as well to finite-amplitude post-synaptic potentials, thus being able to capture the effect of rare and large spikes. The developed methods are also able to reproduce the average firing properties of heterogeneous neuronal populations.
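Although this paper derives the firing statistics exactly, a Monte Carlo sketch of the setup — a leaky integrate-and-fire neuron with constant drive and Poisson inhibitory inputs arriving as instantaneous, finite-size downward jumps — conveys the model; all parameters below are illustrative:

```python
import math
import random

def lif_isi_stats(n_spikes=1000, dt=0.1, tau=20.0, v_th=1.0, mu=0.06,
                  inh_rate=0.02, inh_jump=0.1, seed=1):
    """Leaky integrate-and-fire neuron with constant drive mu and Poisson
    inhibitory inputs of finite amplitude inh_jump (instantaneous PSPs).
    Returns the mean inter-spike interval (ms) and its coefficient of
    variation, estimated from n_spikes threshold crossings."""
    random.seed(seed)
    v, t, last, isis = 0.0, 0.0, 0.0, []
    while len(isis) < n_spikes:
        v += dt * (mu - v / tau)              # leaky integration
        if random.random() < inh_rate * dt:   # Poisson inhibitory arrival
            v -= inh_jump                     # instantaneous downward jump
        t += dt
        if v >= v_th:                         # threshold crossing: spike
            isis.append(t - last)
            last, v = t, 0.0                  # reset membrane potential
    mean = sum(isis) / len(isis)
    var = sum((x - mean) ** 2 for x in isis) / len(isis)
    return mean, math.sqrt(var) / mean

mean_isi, cv = lif_isi_stats()
```

With mu above threshold the neuron fires nearly regularly, and the finite inhibitory jumps are the sole source of interval variability — exactly the regime where infinitesimal-amplitude (diffusion) approximations miss the effect of rare, large spikes.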

