synaptic noise
Recently Published Documents

Total documents: 110 (last five years: 10)
H-index: 27 (last five years: 1)

2021 ◽ Vol 17 (12) ◽ pp. e1009639
Author(s): Lou Zonca, David Holcman

Rhythmic neuronal network activity underlies brain oscillations. To investigate how connected neuronal networks contribute to the emergence of the α-band and to the regulation of Up and Down states, we study a model based on synaptic short-term depression-facilitation with afterhyperpolarization (AHP). We find that the α-band is generated by the network behavior near the attractor of the Up state. Coupling inhibitory and excitatory networks by reciprocal connections leads to the emergence of a stable α-band during Up states, as reflected in the spectrogram. To better characterize the emergence and stability of thalamocortical oscillations containing α and δ rhythms during anesthesia, we model the interaction of two excitatory networks with one inhibitory network, showing that this minimal topology underlies the generation of a persistent α-band in the neuronal voltage, characterized by dominant Up states over Down states. Finally, we show that the α-band emerges when external inputs are suppressed, while it fragments when synaptic noise is small or inhibitory inputs increase. To conclude, α-oscillations could result from the synaptic dynamics of interacting excitatory neuronal networks with and without AHP, a principle that could apply to other rhythms.
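The abstract does not spell out the depression-facilitation equations; a minimal sketch of Tsodyks-Markram-style short-term synaptic dynamics with an additive synaptic-noise term, integrated by Euler-Maruyama, might look as follows (all parameter values are illustrative assumptions, not the authors' settings):

```python
import numpy as np

# Tsodyks-Markram-style short-term depression (x) and facilitation (u),
# driven by a presynaptic rate r(t) plus additive synaptic noise.
# All parameters are illustrative, not taken from the paper.
tau_d, tau_f = 0.5, 0.3      # depression / facilitation time constants (s)
U, sigma = 0.2, 0.05         # baseline release probability, noise amplitude
dt, T = 1e-3, 5.0            # time step and total duration (s)
steps = int(T / dt)

r = 10.0 * np.ones(steps)    # presynaptic rate (Hz), constant for simplicity
x, u = 1.0, U                # available resources, utilization
trace = np.empty(steps)
rng = np.random.default_rng(0)

for t in range(steps):
    # depression: resources recover toward 1 and are consumed by release u*x*r
    dx = ((1.0 - x) / tau_d - u * x * r[t]) * dt
    # facilitation: utilization decays toward U and is boosted by firing
    du = ((U - u) / tau_f + U * (1.0 - u) * r[t]) * dt
    x += dx + sigma * np.sqrt(dt) * rng.standard_normal()  # synaptic noise
    u += du
    x, u = np.clip(x, 0.0, 1.0), np.clip(u, 0.0, 1.0)
    trace[t] = u * x * r[t]  # effective synaptic drive
```

The effective drive u*x*r is what depresses during sustained Up-state firing and recovers during Down states; an AHP current would enter the same loop as an additional activity-dependent hyperpolarizing term.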


2021
Author(s): Barbara Feulner, Matthew G. Perich, Raeed H. Chowdhury, Lee E. Miller, Juan Álvaro Gallego, ...

Animals can rapidly adapt their movements to external perturbations. This adaptation is paralleled by changes in single-neuron activity in the motor cortices. Behavioural and neural recording studies suggest that when animals learn to counteract a visuomotor perturbation, these changes originate from altered inputs to the motor cortices rather than from changes in local connectivity, as neural covariance is largely preserved during adaptation. Since measuring synaptic changes in vivo remains very challenging, we used a modular recurrent network model to compare the expected changes in neural activity following learning through altered inputs (Hinput) and learning through local connectivity changes (Hlocal). Learning under Hinput produced small changes in neural activity and largely preserved the neural covariance, in good agreement with neural recordings in monkeys. Surprisingly, given the presumed dependence of stable neural covariance on preserved circuit connectivity, Hlocal led to only slightly larger changes in neural activity and covariance than Hinput. This similarity arises because Hlocal requires only small, correlated connectivity changes to counteract the perturbation, which also provide the network with significant robustness against simulated synaptic noise. Simulations of tasks that impose increasingly larger behavioural changes revealed a growing difference between Hinput and Hlocal, which could be exploited when designing future experiments.
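One way to make the covariance-preservation comparison concrete is to quantify the overlap between the dominant covariance subspaces of activity before and after learning. The sketch below uses principal angles between the top-k principal subspaces; this is a generic metric and an assumption, not necessarily the measure used in the paper:

```python
import numpy as np

def covariance_similarity(act_before, act_after, k=10):
    """Overlap between the top-k principal subspaces of two activity
    matrices (time x neurons); 1 = covariance fully preserved.
    A generic metric, not necessarily the one used in the paper."""
    def top_subspace(a, k):
        cov = np.cov(a, rowvar=False)
        _, vecs = np.linalg.eigh(cov)     # eigenvalues in ascending order
        return vecs[:, -k:]               # top-k eigenvectors
    u, v = top_subspace(act_before, k), top_subspace(act_after, k)
    # mean squared cosine of the principal angles between the two subspaces
    s = np.linalg.svd(u.T @ v, compute_uv=False)
    return float(np.mean(s ** 2))

rng = np.random.default_rng(1)
pre = rng.standard_normal((1000, 100))
post = pre + 0.1 * rng.standard_normal((1000, 100))  # small activity change
print(covariance_similarity(pre, post))              # close to 1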


2021 ◽ Vol 15
Author(s): Timothy Olsen, Alberto Capurro, Maša Švent, Nadia Pilati, Charles Large, ...

Spontaneous subthreshold activity in the central nervous system is fundamental to information processing and transmission, as it amplifies and optimizes subthreshold signals, thereby improving action potential initiation and maintaining reliable firing. This form of spontaneous activity, which is frequently considered noise, is particularly important at auditory synapses, where acoustic information is encoded by rapid and temporally precise firing rates. In contrast, when present in excess, this form of noise becomes detrimental to acoustic information, as it contributes to the generation and maintenance of auditory disorders such as tinnitus. The most prominent contribution to subthreshold noise is spontaneous synaptic transmission (synaptic noise). Although numerous studies have examined the role of synaptic noise in single-cell excitability, little is known about its presynaptic modulation, owing in part to the difficulty of combining noise modulation with monitoring of synaptic release. Here we study synaptic noise in the dorsal cochlear nucleus (DCN) of the mouse auditory brainstem and show that pharmacological potentiation of Kv3 K+ currents reduces the level of synaptic bombardment onto DCN principal fusiform cells. Using a transgenic mouse line (SyG37) expressing SyGCaMP2-mCherry, a calcium sensor that targets presynaptic terminals, we show that positive Kv3 K+ current modulation decreases calcium influx in a fifth of presynaptic boutons. Furthermore, while maintaining rapid and precise spike timing, positive Kv3 K+ current modulation increases the synchronization of local circuit neurons by reducing spontaneous activity. In conclusion, our study identifies a unique presynaptic mechanism which reduces synaptic noise at auditory synapses and contributes to the coherent activation of neurons in a local auditory brainstem circuit. This form of modulation highlights a new therapeutic target, namely the presynaptic bouton, for ameliorating the effects of hearing disorders that depend on aberrant spontaneous activity within the central auditory system.


2021 ◽ Vol 15
Author(s): Wenzhe Guo, Mohammed E. Fouda, Ahmed M. Eltawil, Khaled Nabil Salama

Various hypotheses of information representation in the brain, referred to as neural codes, have been proposed to explain information transmission between neurons. Neural coding plays an essential role in enabling brain-inspired spiking neural networks (SNNs) to perform different tasks. To search for the best coding scheme, we performed an extensive comparative study of the impact and performance of four important neural coding schemes, namely rate coding, time-to-first-spike (TTFS) coding, phase coding, and burst coding. The comparative study was carried out using a biological two-layer SNN trained with an unsupervised spike-timing-dependent plasticity (STDP) algorithm. Various aspects of network performance were considered, including classification accuracy, processing latency, synaptic operations (SOPs), hardware implementation, network compression efficacy, input and synaptic noise resilience, and synaptic fault tolerance. Classification tasks on the Modified National Institute of Standards and Technology (MNIST) and Fashion-MNIST datasets were used in our study. For hardware implementation, area and power consumption were estimated for these coding schemes, and the network compression efficacy was analyzed using pruning and quantization techniques. Different types of input noise and noise variations in the datasets were considered and applied. Furthermore, the robustness of each coding scheme to the non-ideality-induced synaptic noise and faults in analog neuromorphic systems was studied and compared. Our results show that TTFS coding is the best choice for achieving the highest computational performance with very low hardware implementation overhead. TTFS coding requires 4x/7.5x lower processing latency and 3.5x/6.5x fewer SOPs than rate coding during the training/inference process. Phase coding is the most resilient scheme to input noise. Burst coding offers the highest network compression efficacy and the best overall robustness to hardware non-idealities for both training and inference. The study presented in this paper reveals the design space created by the choice of each coding scheme, allowing designers to frame each scheme in terms of its strengths and weaknesses given a design's constraints and considerations in neuromorphic systems.
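To make the difference between the two headline schemes concrete, here is a minimal sketch of rate coding versus TTFS coding for a single normalized pixel intensity (a schematic illustration, not the paper's implementation):

```python
import numpy as np

def rate_encode(intensity, t_window=100, rng=None):
    """Poisson-like rate coding: spike probability per 1 ms bin scales with
    intensity in [0, 1]. Returns a binary spike train of length t_window."""
    rng = rng or np.random.default_rng()
    return (rng.random(t_window) < intensity).astype(int)

def ttfs_encode(intensity, t_window=100):
    """Time-to-first-spike coding: brighter pixels fire earlier.
    Returns a spike train with a single spike (none for intensity 0)."""
    train = np.zeros(t_window, dtype=int)
    if intensity > 0:
        train[int(round((1.0 - intensity) * (t_window - 1)))] = 1
    return train

# A bright pixel: many spikes under rate coding, one early spike under TTFS.
print(rate_encode(0.8, rng=np.random.default_rng(0)).sum())  # ~80 spikes
print(np.argmax(ttfs_encode(0.8)))                           # spike at t = 20
```

The sketch also makes the latency and SOP gap plausible: rate coding emits on the order of intensity x window spikes per input, whereas TTFS emits at most one, and the brightest (most informative) inputs arrive earliest.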


Author(s): Jan Karbowski

Excitatory synaptic signaling in cortical circuits is thought to be metabolically expensive. Two fundamental brain functions, learning and memory, are associated with long-term synaptic plasticity, but we know very little about the energetics of these slow biophysical processes. This study investigates the energy requirements of information storage in plastic synapses for an extended version of BCM plasticity with a decay term, stochastic noise, and a nonlinear dependence of the neuron's firing rate on synaptic current (adaptation). It is shown that synaptic weights in this model exhibit bistability. In order to analyze the system analytically, it is reduced to a simple dynamic mean-field for a population-averaged plastic synaptic current. Next, using the concepts of nonequilibrium thermodynamics, we derive the energy rate (entropy production rate) for plastic synapses and a corresponding Fisher information for coding presynaptic input. That energy, which is of chemical origin, is primarily used for battling fluctuations in the synaptic weights and presynaptic firing rates; it increases steeply with synaptic weight, and more uniformly though nonlinearly with presynaptic firing. At the onset of synaptic bistability, Fisher information and memory lifetime both increase sharply, by a few orders of magnitude, but the plasticity energy rate changes only mildly. This implies that a huge gain in the precision of stored information does not have to cost large amounts of metabolic energy, which suggests that synaptic information is not directly limited by energy consumption. Interestingly, for very weak synaptic noise, such a limit on synaptic coding accuracy is imposed instead by a derivative of the plasticity energy rate with respect to the mean presynaptic firing, and this relationship has a general character that is independent of the plasticity type. An estimate for primate neocortex reveals that the relative metabolic cost of BCM-type synaptic plasticity, as a fraction of the neuronal cost related to fast synaptic transmission and spiking, can vary from negligible to substantial, depending on the synaptic noise level and presynaptic firing.
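The abstract does not reproduce the extended plasticity equation; a common form of BCM with a decay term and additive noise, dw/dt = r_pre * r_post * (r_post - theta) - w/tau_w + xi(t), with a sliding threshold theta tracking the mean squared postsynaptic rate, can be simulated as below (the linear rate model and all parameter values are assumptions for illustration; the paper's exact equations may differ):

```python
import numpy as np

# Extended BCM plasticity with weight decay and additive synaptic noise
# (a standard textbook form; the paper's exact equations may differ).
dt, T = 1e-3, 20.0
tau_w, tau_th, sigma = 10.0, 1.0, 0.02   # decay, threshold, noise scales
w, theta = 0.5, 1.0
rng = np.random.default_rng(2)
weights = []

for _ in range(int(T / dt)):
    r_pre = 5.0                               # presynaptic rate (Hz)
    r_post = w * r_pre                        # linear rate model (assumed)
    # BCM term: LTD below the sliding threshold, LTP above it
    dw = (r_pre * r_post * (r_post - theta) - w / tau_w) * dt
    w += dw + sigma * np.sqrt(dt) * rng.standard_normal()
    theta += ((r_post ** 2 - theta) / tau_th) * dt   # sliding threshold
    w = max(w, 0.0)
    weights.append(w)
```

In this toy setting the weight dynamics have a low and a high stable branch, which is the kind of bistability whose energetics the paper analyzes.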


Author(s): Elena Agliari, Giordano De Marzo

The retrieval capabilities of associative neural networks are known to be impaired by fast noise, which endows neuron behavior with some degree of stochasticity, and by slow noise, due to interference among stored memories. Here, we allow for another source of noise, referred to as "synaptic noise," which may stem from (i) corrupted information provided during learning, (ii) shortcomings occurring in the learning stage, or (iii) flaws occurring in the storing stage, and which accordingly affects the couplings among neurons. Indeed, we prove that this kind of noise can also lead to a breakdown of retrieval and, just like slow noise, its effect can be softened by relying on density, namely by allowing p-body interactions among neurons.
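For the standard pairwise (p = 2) case, "synaptic noise" in this sense can be illustrated by perturbing the Hebbian couplings of a Hopfield network and checking retrieval; the sketch below is a generic toy model, not the paper's dense p-body construction:

```python
import numpy as np

rng = np.random.default_rng(3)
N, P, noise = 500, 10, 0.3          # neurons, stored patterns, coupling noise

xi = rng.choice([-1, 1], size=(P, N))           # random binary patterns
J = (xi.T @ xi) / N                              # Hebbian couplings
J += noise * rng.standard_normal((N, N)) / np.sqrt(N)  # "synaptic noise"
J = (J + J.T) / 2                                # keep couplings symmetric
np.fill_diagonal(J, 0.0)

# Zero-temperature asynchronous retrieval from a corrupted cue
s = xi[0].copy()
s[rng.random(N) < 0.1] *= -1                     # flip 10% of the cue
for _ in range(20):
    for i in rng.permutation(N):
        s[i] = 1 if J[i] @ s >= 0 else -1

print("retrieval overlap:", abs(s @ xi[0]) / N)  # ~1 if memory recovered
```

Raising the noise amplitude in this toy model eventually drives the overlap toward zero, the breakdown of retrieval the paper proves; the p-body generalization softens the effect.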


Author(s): Thomas Boraud

This chapter reviews the general principles that are necessary for a neural system to make decisions. A glance at the literature shows that the simplest system able to generate an imbalance between two populations of neurons subjected to the same activation consists of two interconnected populations of inhibitory neurons. These two populations exert lateral inhibition on each other. In order for a differential response to emerge, noise is necessary; synaptic noise is considered the main source of noise in the nervous system. The chapter then goes on to look at positive feedback. It also studies learning processes in the nervous system and explores neural plasticity rules, particularly the Hebbian learning rule.
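The symmetry-breaking principle described here is easy to demonstrate in a toy rate model: two identical populations with mutual inhibition and equal drive remain balanced in the noiseless case, while a small noise term pushes the system to one of two winner-take-all states. All parameters below are illustrative assumptions:

```python
import numpy as np

# Two identical populations with mutual inhibition and a shared drive:
# without noise they stay balanced; synaptic noise breaks the symmetry.
rng = np.random.default_rng(4)
dt, tau = 1e-3, 0.02          # time step (s), population time constant (s)
w_inh, drive, sigma = 2.0, 1.0, 0.05
r1 = r2 = 1.0 / 3.0           # start at the balanced fixed point

for _ in range(int(2.0 / dt)):
    i1 = drive - w_inh * r2   # each population inhibits the other
    i2 = drive - w_inh * r1
    r1 += dt / tau * (-r1 + max(i1, 0.0)) + sigma * np.sqrt(dt) * rng.standard_normal()
    r2 += dt / tau * (-r2 + max(i2, 0.0)) + sigma * np.sqrt(dt) * rng.standard_normal()
    r1, r2 = max(r1, 0.0), max(r2, 0.0)

print(f"r1 = {r1:.2f}, r2 = {r2:.2f}")  # one population wins, the other is silenced
```

With w_inh > 1 the balanced state r1 = r2 is unstable, so any noise-induced asymmetry is amplified until one population silences the other, which is exactly why noise is a prerequisite for a differential response.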


Author(s): Thomas Boraud

This chapter highlights the role of noise in decision-making processes, its nature, and its consequences for behaviour and rationality. Noise is a prerequisite for the system to generate a choice, and synaptic noise is considered to be the main source of noise in the nervous system. At the network level, these phenomena can be amplified by bifurcation processes, especially in networks that rely on randomly interconnected populations of excitatory and inhibitory neurons. This bifurcation phenomenon belongs to what physics calls chaotic processes. Associated with stochastic phenomena, bifurcation leads to equilibrium states that can be very far apart. The chapter then goes on to explain that the apparent irrationality of behaviour is intrinsic to the properties of the decision-making network. This approach provides an alternative explanation for individual and inter-individual variability in behaviour. The variability of behaviour that results from these processes may provide an evolutionary advantage by allowing individuals of each species to switch from exploitation to exploration behaviour.


2020 ◽ Vol 6 (25) ◽ pp. eaba4856
Author(s): Guo Zhang, Ke Yu, Tao Wang, Ting-Ting Chen, Wang-Ding Yuan, ...

Behavioral variability often arises from variable activity in the behavior-generating neural network. The synaptic mechanisms underlying this variability are poorly understood. We show that synaptic noise, in conjunction with weak feedforward excitation, generates variable motor output in the Aplysia feeding system. A command-like neuron (CBI-10) triggers rhythmic motor programs that are more variable than programs triggered by CBI-2. CBI-10 weakly excites a pivotal pattern-generating interneuron (B34) that is strongly activated by CBI-2. The activation properties of B34 substantially account for the degree of program variability. CBI-10– and CBI-2–induced EPSPs in B34 vary in amplitude across trials, suggesting that there is synaptic noise. Computational studies show that synaptic noise is required for program variability. Further, at network state transition points, when synaptic conductance is low, moderate noise levels promote maximum program variability. Thus, synaptic strength and noise act together in a nonlinear manner to determine the degree of variability within a feedforward network.
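The core interaction of synaptic strength and noise near threshold can be caricatured with a toy calculation: with a strong mean EPSP the downstream neuron is recruited almost deterministically, while with weak excitation the same amplitude noise makes recruitment, and hence the motor output, probabilistic from trial to trial. Threshold, amplitudes, and noise level below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(5)

def trigger_prob(mean_epsp, noise_sd, threshold=1.0, trials=100_000):
    """Fraction of trials on which a noisy EPSP crosses threshold."""
    epsp = mean_epsp + noise_sd * rng.standard_normal(trials)
    return (epsp > threshold).mean()

# Strong excitation recruits the interneuron reliably; weak excitation plus
# the same synaptic noise yields variable, trial-to-trial recruitment.
for name, mean_epsp in (("strong (CBI-2-like)", 1.5), ("weak (CBI-10-like)", 0.9)):
    p = trigger_prob(mean_epsp, noise_sd=0.3)
    print(f"{name}: trigger probability {p:.2f}")
```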


2019
Author(s): Milad Lankarany

Reliable propagation of firing rates, specifically of slow modulations of asynchronous spikes in fairly short time windows (20-500 ms), across multiple layers of a feedforward network (FFN) receiving background synaptic noise has proven difficult to capture in spiking models. In this paper, we explore how the information carried by asynchronous spikes is disrupted in the first layer of a typical FFN, and which factors can enable reliable information representation. Our rationale is that the reliable propagation of information across the layers of an FFN is likely if that information can be preserved in the first layer. In a typical FFN, each layer comprises a certain number (the network size) of excitatory neurons, modeled here as leaky integrate-and-fire (LIF) neurons, receiving correlated input (a common stimulus from the upstream layer) plus independent background synaptic noise. We develop a reduced network model of the FFN that captures the main features of a conventional all-to-all connected FFN. Exploiting the reduced network model, synaptic weights are calculated using a closed-form optimization framework that minimizes the mean squared error between the stimulus reconstructed from the spikes of the first layer and the original common stimulus. We further explore how the representation of asynchronous spikes in an FFN changes with other factors, such as the network size and the level of background synaptic noise, while synaptic weights are optimized for each scenario. We show that not only the synaptic weights but also the network size and the level of background synaptic noise are crucial for preserving a reliable representation of asynchronous spikes in the first layer of an FFN. This work sheds light on how information about slowly time-varying fluctuations of the firing rate can be transmitted in multi-layered FFNs.
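The closed-form weight optimization can be caricatured as a linear least-squares decode: given noisy first-layer responses to a slow stimulus, choose readout weights minimizing the reconstruction MSE. The sketch below replaces LIF spike trains with noisy linear responses, so it is a schematic stand-in for the paper's framework, with all parameters assumed:

```python
import numpy as np

rng = np.random.default_rng(6)

# Closed-form (least-squares) weights reconstructing a slow stimulus from
# a noisy first-layer population; a schematic stand-in for the paper's
# optimization framework, not its actual equations.
T, N = 2000, 50                       # time bins, neurons in the layer
t = np.arange(T)
stimulus = np.sin(2 * np.pi * t / 500) + 0.5 * np.sin(2 * np.pi * t / 180)

gains = rng.uniform(0.5, 1.5, N)              # heterogeneous neuron gains
noise = 0.5 * rng.standard_normal((T, N))     # background synaptic noise
responses = stimulus[:, None] * gains + noise # population activity (T x N)

# w minimizing ||responses @ w - stimulus||^2 (normal equations via lstsq)
w, *_ = np.linalg.lstsq(responses, stimulus, rcond=None)
reconstruction = responses @ w
mse = np.mean((reconstruction - stimulus) ** 2)
print(f"reconstruction MSE: {mse:.4f}")
```

Increasing N or lowering the noise amplitude shrinks the MSE, mirroring the abstract's point that network size and background noise level matter alongside the optimized weights.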

