Crosstalk between biochemical signaling and trafficking determines AMPAR dynamics in synaptic plasticity

2021 ◽  
Author(s):  
Miriam Bell ◽  
Padmini Rangamani

Synaptic plasticity involves the modification of both biochemical and structural components of neurons. Many studies have shown that the change in the number density of the glutamatergic receptor AMPAR at the synapse is proportional to the synaptic weight update: an increase in AMPAR density corresponds to strengthening of synapses, while a decrease weakens synaptic connections. The dynamics of AMPAR are thought to be regulated by upstream signaling, primarily the calcium-CaMKII pathway, by trafficking to and from the synapse, and by influx from extrasynaptic sources. Here, we have developed a set of models using compartmental ordinary differential equations to systematically investigate the contributions of signaling and trafficking variations to AMPAR dynamics at the synaptic site. We find that model properties, including network architecture and parameter values, significantly affect the integration of fast upstream species by slower downstream species. Furthermore, we predict that the model outcome, as determined by bound AMPAR at the synaptic site, depends on (a) the choice of signaling model (bistable or monostable CaMKII dynamics), (b) the relative contributions of trafficking and influx, and (c) the frequency of the stimulus. Therefore, AMPAR dynamics can have unexpected dependencies when upstream signaling dynamics (such as CaMKII and PP1) are coupled with trafficking modalities.
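The cascade described above, fast upstream calcium driving slower CaMKII activation, which in turn controls bound AMPAR, can be illustrated with a minimal compartmental ODE sketch. All rate constants and the three-species reduction are hypothetical simplifications, not the authors' model:

```python
import numpy as np
from scipy.integrate import solve_ivp

# Minimal sketch of a signaling-to-trafficking cascade: a fast calcium
# transient drives slower CaMKII activation, which promotes AMPAR binding
# at the synaptic site. Rate constants are illustrative placeholders.
def rhs(t, y, k_act=2.0, k_deact=0.5, k_on=1.0, k_off=0.1):
    ca, camkii, ampar = y
    dca = -5.0 * ca                                       # fast Ca2+ decay
    dcamkii = k_act * ca - k_deact * camkii               # slower activation
    dampar = k_on * camkii * (1 - ampar) - k_off * ampar  # bound AMPAR fraction
    return [dca, dcamkii, dampar]

# Initial calcium pulse, no active CaMKII, no bound AMPAR
sol = solve_ivp(rhs, (0, 20), [1.0, 0.0, 0.0], dense_output=True)
print(sol.y[2, -1])  # bound AMPAR fraction at t = 20
```

Varying the time-scale separation between the species (here the factor of 10 between calcium decay and CaMKII deactivation) is one way to probe how slower downstream species integrate fast upstream input.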

2021 ◽  
Vol 17 (4) ◽  
pp. 1-21
Author(s):  
He Wang ◽  
Nicoleta Cucu Laurenciu ◽  
Yande Jiang ◽  
Sorin Cotofana

Design and implementation of artificial neuromorphic systems able to provide brain-akin computation and/or bio-compatible interfacing ability are crucial for understanding the human brain’s complex functionality and unleashing the full potential of brain-inspired computation. To this end, the realization of energy-efficient, low-area, and bio-compatible artificial synapses, which sustain the signal transmission between neurons, is of particular interest for any large-scale neuromorphic system. Graphene is a prime candidate material with excellent electronic properties, atomic dimensions, and low-energy envelope perspectives, which has already been proven effective for logic gate implementations. Furthermore, distinct from any other materials used in current artificial synapse implementations, graphene is biocompatible, which offers perspectives for neural interfaces. In view of this, we investigate the feasibility of graphene-based synapses to emulate various synaptic plasticity behaviors and look into their potential area and energy consumption for large-scale implementations. In this article, we propose a generic graphene-based synapse structure, which can emulate the fundamental synaptic functionalities, i.e., Spike-Timing-Dependent Plasticity (STDP) and Long-Term Plasticity. Additionally, the graphene synapse is programmable by means of a back-gate bias voltage and can exhibit either excitatory or inhibitory behavior. We investigate its capability to obtain different potentiation/depression time scales for STDP with identical synaptic weight change amplitude when the input spike duration varies. Our simulation results, for various synaptic plasticities, indicate that a maximum 30% synaptic weight change and potentiation/depression time scale ranges from [-1.5 ms, 1.1 ms] to [-32.2 ms, 24.1 ms] are achievable.
We further explore the effect of our proposal at the Spiking Neural Network (SNN) level by performing NEST-based simulations of a small SNN implemented with 5 leaky-integrate-and-fire neurons connected via graphene-based synapses. Our experiments indicate that the number of SNN firing events depends strongly on the synaptic plasticity type and varies monotonically with the input spike frequency. Moreover, for graphene-based Hebbian STDP and a spike duration of 20 ms, we obtain SNN behavior similar to that of the same SNN with biological STDP. The proposed graphene-based synapse requires a small area (max. 30 nm²), operates at low voltage (200 mV), and can emulate various plasticity types, which makes it an outstanding candidate for implementing large-scale brain-inspired computation systems.
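The Hebbian STDP behavior evaluated above can be sketched with the standard pair-based exponential window. The amplitudes and time constants below merely echo the reported ranges and are not the graphene device model:

```python
import math

# Sketch of a pair-based STDP weight update (parameters are illustrative,
# chosen to echo the reported ranges, not the graphene device model):
# potentiation when the presynaptic spike precedes the postsynaptic one,
# depression otherwise, each with its own time scale.
def stdp_dw(dt_ms, a_plus=0.30, a_minus=0.30, tau_plus=24.1, tau_minus=32.2):
    """dt_ms = t_post - t_pre in milliseconds; returns the weight change."""
    if dt_ms >= 0:
        return a_plus * math.exp(-dt_ms / tau_plus)    # potentiation branch
    return -a_minus * math.exp(dt_ms / tau_minus)      # depression branch

print(stdp_dw(5.0))   # pre before post -> positive weight change
print(stdp_dw(-5.0))  # post before pre -> negative weight change
```

Stretching or compressing `tau_plus`/`tau_minus` while keeping `a_plus`/`a_minus` fixed corresponds to the different potentiation/depression time scales with identical weight-change amplitude discussed in the abstract.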


Author(s):  
Sivaganesan S ◽  
Maria Antony S ◽  
Udayakumar E

A hybrid analog/digital very large-scale integration (VLSI) implementation of a spiking neural network with programmable synaptic weights was designed. The synaptic weight values are stored in an asynchronous content-addressable memory (CAM) module, which is interfaced to a fast current-mode event-driven DAC that produces synaptic currents with the appropriate amplitude values. The chip acts as a transceiver: it receives asynchronous events as input, performs neural computations with hybrid analog/digital circuits on the input spikes, and eventually produces digital asynchronous events as output. Input, output, and synaptic weight values are transmitted to/from the chip using a common communication protocol based on the address-event representation (AER). Using this representation, it is possible to interface the device to a workstation or a microcontroller and explore the effect of different types of spike-timing-dependent plasticity (STDP) learning algorithms for updating the synaptic weight values in the CAM module.
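The core idea of AER is that each spike is communicated as a (timestamp, address) pair, so only active neurons consume bus bandwidth. A minimal encoder/decoder sketch follows; the field widths are illustrative and not the chip's actual packet format:

```python
# Sketch of the address-event representation (AER) idea: a spike is sent as
# a word packing a timestamp and the source neuron's address. Field widths
# here are illustrative assumptions, not the actual chip protocol.
def encode_event(timestamp_us, neuron_addr, addr_bits=8):
    assert 0 <= neuron_addr < (1 << addr_bits)
    return (timestamp_us << addr_bits) | neuron_addr

def decode_event(word, addr_bits=8):
    return word >> addr_bits, word & ((1 << addr_bits) - 1)

word = encode_event(1234, 42)
print(decode_event(word))  # recovers (timestamp, address)
```

A workstation-side STDP experiment would then simply stream such words to and from the device, reading out pre/post spike times from the decoded events.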


2019 ◽  
Vol 21 (1) ◽  
pp. 143 ◽  
Author(s):  
Mario Stampanoni Bassi ◽  
Ennio Iezzi ◽  
Luigi Pavone ◽  
Georgia Mandolesi ◽  
Alessandra Musella ◽  
...  

Multiple sclerosis (MS) is a chronic inflammatory disease of the central nervous system (CNS) characterized by demyelinating white matter lesions and neurodegeneration, with a variable clinical course. Brain network architecture provides efficient information processing and resilience to damage. The peculiar organization characterized by a low number of highly connected nodes (hubs) confers high resistance to random damage. Anti-homeostatic synaptic plasticity, in particular long-term potentiation (LTP), represents one of the main physiological mechanisms underlying clinical recovery after brain damage. Different types of synaptic plasticity, including both anti-homeostatic and homeostatic mechanisms (synaptic scaling), contribute to shape brain networks. In MS, altered synaptic functioning induced by inflammatory mediators may represent a further cause of brain network collapse in addition to demyelination and grey matter atrophy. We propose that impaired LTP expression and pathologically enhanced upscaling may contribute to disrupting brain network topology in MS, weakening resilience to damage and negatively influencing the disease course.


Processes ◽  
2020 ◽  
Vol 8 (4) ◽  
pp. 414
Author(s):  
Robert Dürr ◽  
Andreas Bück

Population balance modeling is an established framework to describe the dynamics of particle populations in disperse phase systems found in a broad field of industrial, civil, and medical applications. The resulting population balance equations account for the dynamics of the number density distribution functions and represent (systems of) partial differential equations which require sophisticated numerical solution techniques due to the general lack of analytical solutions. A specific class of solution algorithms, so-called moment methods, is based on the reduction of complex models to a set of ordinary differential equations characterizing the dynamics of integral quantities of the number density distribution function. However, in general, the moment equations do not form a closed set, and one has to rely on approximate closure methods. In this contribution, a concise overview of the most prominent approximate moment methods is given.
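To illustrate the moment reduction, consider the special case of pure size-independent growth, where the moment hierarchy happens to close exactly: dm_k/dt = k·G·m_{k-1}. The growth rate and initial moments below are illustrative values, not from the paper:

```python
from scipy.integrate import solve_ivp

# Sketch of a moment method for a pure size-independent growth process,
# where the hierarchy closes exactly: dm_k/dt = k * G * m_{k-1}.
# G and the initial moments are illustrative values.
G = 0.1  # constant growth rate

def moment_rhs(t, m):
    m0, m1, m2 = m  # m0 = total number, m1 ~ total size, m2 ~ spread
    return [0.0, G * m0, 2.0 * G * m1]

sol = solve_ivp(moment_rhs, (0, 10), [1.0, 0.5, 0.4])
print(sol.y[:, -1])  # m0 is conserved; m1 and m2 grow
```

For aggregation or size-dependent growth, the right-hand side of dm_k/dt would involve moments (or integrals) outside the solved set, which is exactly where the approximate closures surveyed in the paper come in.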


2017 ◽  
Author(s):  
Naoki Hiratani ◽  
Tomoki Fukai

Recent experimental studies suggest that, in cortical microcircuits of the mammalian brain, the majority of neuron-to-neuron connections are realized by multiple synapses. However, it is not known whether such redundant synaptic connections provide any functional benefit. Here, we show that redundant synaptic connections enable near-optimal learning in cooperation with synaptic rewiring. By constructing a simple dendritic neuron model, we demonstrate that with multisynaptic connections, synaptic plasticity approximates a sample-based Bayesian filtering algorithm known as particle filtering, and wiring plasticity implements its resampling process. Applying the proposed framework to a detailed single neuron model, we show that the model accounts for many experimental observations, including the dendritic position dependence of spike-timing-dependent plasticity, and the functional synaptic organization on the dendritic tree based on the stimulus selectivity of presynaptic neurons. Our study provides a novel conceptual framework for synaptic plasticity and rewiring.
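The resampling step that the authors map onto wiring plasticity can be sketched with standard systematic resampling: low-weight particles (synapses) are pruned and replaced by copies of high-weight ones. This is a generic particle-filter primitive, not the authors' specific implementation:

```python
import numpy as np

# Sketch of systematic resampling, the particle-filter step the abstract
# maps onto synaptic rewiring: particles (synapses) with low weight are
# discarded and replaced by duplicates of high-weight ones.
rng = np.random.default_rng(0)

def systematic_resample(weights):
    n = len(weights)
    positions = (rng.random() + np.arange(n)) / n      # stratified positions
    cumsum = np.cumsum(weights / np.sum(weights))      # normalized CDF
    return np.searchsorted(cumsum, positions)          # surviving indices

w = np.array([0.70, 0.20, 0.05, 0.05])
idx = systematic_resample(w)
print(idx)  # the dominant particle is duplicated, low-weight ones tend to vanish
```

In the multisynaptic picture, each index corresponds to a synaptic contact between the same pre/post pair, and duplication of an index corresponds to forming a new contact near a strong existing one.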


2020 ◽  
Author(s):  
Daniel Udvary ◽  
Philipp Harth ◽  
Jakob H. Macke ◽  
Hans-Christian Hege ◽  
Christiaan P.J. de Kock ◽  
...  

Developmental programs that guide neurons and their neurites into specific subvolumes of the mammalian neocortex give rise to lifelong constraints for the formation of synaptic connections. To what degree do these constraints affect cortical wiring diagrams? Here we introduce an inverse modeling approach to show how cortical networks would appear if they were solely due to the spatial distributions of neurons and neurites. We find that neurite packing density and morphological diversity will inevitably translate into non-random pairwise and higher-order connectivity statistics. More importantly, we show that these non-random wiring properties are not arbitrary, but instead reflect the specific structural organization of the underlying neuropil. Our predictions are consistent with the empirically observed wiring specificity from subcellular to network scales. Thus, independent of learning and genetically encoded wiring rules, many of the properties that define the neocortex’s characteristic network architecture may emerge as a result of neuron and neurite development.
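The central claim, that packing density and morphological diversity alone yield non-random connectivity, can be illustrated with a toy simulation (not the authors' model): if the expected synapse count between a pair scales with the product of their local neurite densities, the resulting degree distribution is overdispersed relative to a random (Poisson) network.

```python
import numpy as np

# Toy illustration (assumed setup, not the authors' inverse model): expected
# synapse counts scale with the product of per-neuron neurite densities.
# Heterogeneous densities alone then produce non-random degree statistics.
rng = np.random.default_rng(2)
n = 500
density = rng.lognormal(mean=0.0, sigma=0.8, size=n)  # morphological diversity
rate = 0.01 * np.outer(density, density)              # pairwise synapse rates
synapses = rng.poisson(rate)                          # sampled synapse counts
degrees = (synapses > 0).sum(axis=1)                  # connected partners
print(degrees.var() / degrees.mean())  # > 1 indicates overdispersion
```

A homogeneous-density control (all densities equal) would drive this variance-to-mean ratio toward 1, which is one simple way to isolate the contribution of morphological diversity.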


2018 ◽  
Author(s):  
Sang-Yoon Kim ◽  
Woochang Lim

We study burst synchronization (BS), which is relevant to neural information processing in health and disease, in a Barabási-Albert scale-free network (SFN) composed of inhibitory bursting Hindmarsh-Rose neurons. This inhibitory neuronal population has adaptive dynamic synaptic strengths governed by inhibitory spike-timing-dependent plasticity (iSTDP). In previous works that did not consider iSTDP, BS was found to appear in a range of noise intensities for fixed synaptic inhibition strengths. In contrast, in the present work we take iSTDP into consideration and investigate its effect on BS by varying the noise intensity. Our main new result is the occurrence of a Matthew effect in inhibitory synaptic plasticity: good BS gets better via LTD, while bad BS gets worse via LTP. This kind of Matthew effect in inhibitory synaptic plasticity contrasts with that in excitatory synaptic plasticity, where good (bad) synchronization gets better (worse) via LTP (LTD). We note that, due to inhibition, the roles of LTD and LTP in inhibitory synaptic plasticity are reversed in comparison with those in excitatory synaptic plasticity. Moreover, the emergence of LTD and LTP of synaptic inhibition strengths is investigated in detail via a microscopic method based on the distributions of time delays between the pre- and post-synaptic burst onset times. Finally, in the presence of iSTDP, we investigate the effects of network architecture on BS by varying the symmetric attachment degree l* and the asymmetry parameter Δl in the SFN.
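One widely used iSTDP window is the symmetric Vogels-Sprekeler form, sketched below as an illustration of how inhibitory plasticity differs from the asymmetric excitatory STDP window; the functional form and parameters are a common choice from the literature, not necessarily the rule used in this paper:

```python
import math

# Sketch of a symmetric iSTDP window of the Vogels-Sprekeler type
# (a common choice in the literature; parameters are illustrative):
# near-coincident pre/post activity potentiates inhibition, while
# temporally isolated presynaptic activity depresses it.
def istdp_dw(dt_ms, eta=0.01, tau=20.0, alpha=0.2):
    """dt_ms = t_post - t_pre in milliseconds; returns the weight change."""
    return eta * (math.exp(-abs(dt_ms) / tau) - alpha)

print(istdp_dw(0.0))    # coincident bursts -> inhibition strengthened (LTP)
print(istdp_dw(100.0))  # widely separated bursts -> inhibition weakened (LTD)
```

Because strengthening inhibition suppresses postsynaptic firing, LTP of inhibitory synapses tends to desynchronize bursting while LTD promotes it, consistent with the reversed roles of LTD and LTP noted in the abstract.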


Author(s):  
R. М. Peleshchak ◽  
V. V. Lytvyn ◽  
О. І. Cherniak ◽  
І. R. Peleshchak ◽  
М. V. Doroshenko

Context. To reduce the computational resource time in the problems of diagnosing and recognizing distorted images based on a fully connected stochastic pseudospin neural network, it becomes necessary to thin out synaptic connections between neurons, which is solved using the method of diagonalizing the matrix of synaptic connections without losing the interaction between all neurons in the network. Objective. To create an architecture of a stochastic pseudospin neural network with diagonal synaptic connections, without losing the interaction between all the neurons in the layer, in order to reduce its learning time. Method. The paper uses the Householder method, the method of compressing input images based on the diagonalization of the matrix of synaptic connections, and the computer mathematics system MATLAB for converting a fully connected neural network into a tridiagonal form with hidden synaptic connections between all neurons. Results. We developed a model of a stochastic neural network architecture with sparse renormalized synaptic connections that takes deleted synaptic connections into account. Based on the transformation of the synaptic connection matrix of a fully connected neural network into a Hessenberg matrix with tridiagonal synaptic connections, we proposed a renormalized local Hebb rule. Using the computer mathematics system Wolfram Mathematica 11.3, we calculated, as a function of the number of neurons N, the relative tuning time of synaptic connections (per iteration) in a stochastic pseudospin neural network with a tridiagonal connection matrix, relative to the tuning time of synaptic connections (per iteration) in a fully connected synaptic neural network. Conclusions. We found that with an increase in the number of neurons, the tuning time of synaptic connections (per iteration) in a stochastic pseudospin neural network with a tridiagonal connection matrix, relative to the tuning time of synaptic connections (per iteration) in a fully connected synaptic neural network, decreases according to a hyperbolic law. Depending on the direction of the pseudospin neurons, we proposed a classification of the renormalized neural network into ferromagnetic, antiferromagnetic, and dipole-glass structures.
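The key reduction step, bringing a fully connected synaptic matrix to tridiagonal form by Householder reflections without losing inter-neuron interactions, can be sketched with SciPy's Hessenberg routine (the synaptic matrix here is random, purely for illustration):

```python
import numpy as np
from scipy.linalg import hessenberg

# Sketch of Householder reduction of a fully connected synaptic matrix:
# for a symmetric coupling matrix, Hessenberg form is tridiagonal, and the
# orthogonal similarity preserves the spectrum (i.e., no interaction is
# lost, only re-expressed). Matrix values are random, for illustration.
rng = np.random.default_rng(1)
n = 6
J = rng.standard_normal((n, n))
J = (J + J.T) / 2                   # symmetric synaptic couplings

H, Q = hessenberg(J, calc_q=True)   # H = Q.T @ J @ Q, tridiagonal here
print(np.allclose(Q.T @ J @ Q, H))
print(np.allclose(np.sort(np.linalg.eigvalsh(J)),
                  np.sort(np.linalg.eigvalsh(H))))
```

Because the tridiagonal matrix has O(N) nonzero couplings versus O(N²) in the fully connected network, a per-iteration weight update over the nonzero entries scales down roughly as 1/N, consistent with the hyperbolic decrease reported above.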

