HIERARCHICAL MODULARITY OF THE FUNCTIONAL NEURAL NETWORK ORGANIZED BY SPIKE TIMING DEPENDENT SYNAPTIC PLASTICITY

2007 ◽  
Vol 21 (23n24) ◽  
pp. 4124-4129
Author(s):  
CHANG-WOO SHIN ◽  
SEUNGHWAN KIM

We study the emergent functional neural network organized through synaptic reorganization by spike-timing-dependent plasticity (STDP). We show that the small-world and scale-free functional structures organized by STDP, in the case of synaptic balance, exhibit hierarchical modularity.
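As context for the plasticity rule named above, a minimal pair-based STDP update can be sketched as follows; the window shape is the standard asymmetric exponential, and the parameter values are illustrative, not those of the paper.

```python
import math

# Pair-based STDP: the weight change depends on the spike-time difference
# dt = t_post - t_pre (ms). a_plus, a_minus and tau are illustrative values.
def stdp_dw(dt, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Return the weight change for one pre/post spike pair."""
    if dt > 0:        # pre fires before post -> potentiation (LTP)
        return a_plus * math.exp(-dt / tau)
    if dt < 0:        # post fires before pre -> depression (LTD)
        return -a_minus * math.exp(dt / tau)
    return 0.0
```

Setting a_minus slightly larger than a_plus, as here, is a common way to keep the rule depression-dominated for uncorrelated spiking.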

2010 ◽  
Vol 22 (8) ◽  
pp. 2059-2085 ◽  
Author(s):  
Daniel Bush ◽  
Andrew Philippides ◽  
Phil Husbands ◽  
Michael O'Shea

Rate-coded Hebbian learning, as characterized by the BCM formulation, is an established computational model of synaptic plasticity. Recently it has been demonstrated that changes in the strength of synapses in vivo can also depend explicitly on the relative timing of pre- and postsynaptic firing. Computational modeling of this spike-timing-dependent plasticity (STDP) has demonstrated that it can provide inherent stability or competition based on local synaptic variables. However, it has also been demonstrated that these properties rely on synaptic weights being either depressed or unchanged by an increase in mean stochastic firing rates, which directly contradicts empirical data. Several analytical studies have addressed this apparent dichotomy and identified conditions under which distinct and disparate STDP rules can be reconciled with rate-coded Hebbian learning. The aim of this research is to verify, unify, and expand on these previous findings by manipulating each element of a standard computational STDP model in turn. This allows us to identify the conditions under which this plasticity rule can replicate experimental data obtained using both rate and temporal stimulation protocols in a spiking recurrent neural network. Our results describe how the relative scale of mean synaptic weights and their dependence on stochastic pre- or postsynaptic firing rates can be manipulated by adjusting the exact profile of the asymmetric learning window and temporal restrictions on spike pair interactions respectively. These findings imply that previously disparate models of rate-coded autoassociative learning and temporally coded heteroassociative learning, mediated by symmetric and asymmetric connections respectively, can be implemented in a single network using a single plasticity rule. 
However, we also demonstrate that forms of STDP that can be reconciled with rate-coded Hebbian learning do not generate inherent synaptic competition, and thus some additional mechanism is required to guarantee long-term input-output selectivity.
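The abstract's point about temporal restrictions on spike pair interactions can be illustrated by comparing all-to-all pairing with a nearest-neighbour restriction. This is a generic sketch, not the authors' exact model; the window parameters are assumptions.

```python
import math

def window(dt, a_plus=1.0, a_minus=1.05, tau=20.0):
    # Asymmetric exponential STDP window (illustrative parameters).
    if dt > 0:
        return a_plus * math.exp(-dt / tau)
    if dt < 0:
        return -a_minus * math.exp(dt / tau)
    return 0.0

def total_dw_all_to_all(pre, post):
    # Every pre/post spike pair contributes to the weight change.
    return sum(window(tpost - tpre) for tpre in pre for tpost in post)

def total_dw_nearest(pre, post):
    # Restricted pairing: each post spike interacts only with the
    # most recent preceding pre spike.
    dw = 0.0
    for tpost in post:
        earlier = [t for t in pre if t < tpost]
        if earlier:
            dw += window(tpost - max(earlier))
    return dw
```

For the same spike trains the two schemes generally give different net weight changes, which is exactly the degree of freedom the abstract says can be used to tune the rate dependence of the rule.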


2018 ◽  
Author(s):  
Sang-Yoon Kim ◽  
Woochang Lim

We are concerned with burst synchronization (BS), related to neural information processing in health and disease, in the Barabási-Albert scale-free network (SFN) composed of inhibitory bursting Hindmarsh-Rose neurons. This inhibitory neuronal population has adaptive dynamic synaptic strengths governed by inhibitory spike-timing-dependent plasticity (iSTDP). In previous works without iSTDP, BS was found to appear in a range of noise intensities for fixed synaptic inhibition strengths. In contrast, in the present work, we take iSTDP into consideration and investigate its effect on BS by varying the noise intensity. Our main new result is the occurrence of a Matthew effect in inhibitory synaptic plasticity: good BS gets better via LTD, while bad BS gets worse via LTP. This is in contrast to the Matthew effect in excitatory synaptic plasticity, where good (bad) synchronization gets better (worse) via LTP (LTD). We note that, due to inhibition, the roles of LTD and LTP in inhibitory synaptic plasticity are reversed in comparison with those in excitatory synaptic plasticity. Moreover, the emergence of LTD and LTP of synaptic inhibition strengths is investigated in detail via a microscopic method based on the distributions of time delays between the pre- and the post-synaptic burst onset times. Finally, in the presence of iSTDP we investigate the effects of network architecture on BS by varying the symmetric attachment degree l* and the asymmetry parameter Δl in the SFN.
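The Barabási-Albert scale-free topology underlying this study can be sketched by standard preferential attachment. This is a generic illustration of the growth rule, not the authors' exact construction with the attachment parameters l* and Δl.

```python
import random

def barabasi_albert(n, m, seed=0):
    """Grow a Barabasi-Albert scale-free graph: each new node attaches
    to m existing nodes with probability proportional to their degree."""
    rng = random.Random(seed)
    # Start from a small fully connected core of m nodes.
    edges = [(i, j) for i in range(m) for j in range(i + 1, m)]
    # Degree-weighted pool: each node appears once per edge endpoint.
    targets = [node for e in edges for node in e]
    for new in range(m, n):
        chosen = set()
        while len(chosen) < m:
            chosen.add(rng.choice(targets) if targets else rng.randrange(new))
        for t in chosen:
            edges.append((new, t))
            targets.extend([new, t])
    return edges
```

Sampling attachment targets from the endpoint pool implements "rich get richer" growth, which produces the power-law degree distribution characteristic of SFNs.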


PLoS ONE ◽  
2020 ◽  
Vol 15 (12) ◽  
pp. e0244683
Author(s):  
Lei Guo ◽  
Enyu Kan ◽  
Youxi Wu ◽  
Huan Lv ◽  
Guizhi Xu

With the continuous improvement of automation and informatization, the electromagnetic environment has become increasingly complex, and traditional protection methods for electronic systems are facing serious challenges. Biological nervous systems, by contrast, are self-adaptive under neural regulation, which suggests a new approach to electromagnetic protection drawing on this self-adaptive advantage. In this study, a scale-free spiking neural network (SFSNN) is constructed in which the Izhikevich neuron model is employed as a node and a synaptic plasticity model including excitatory and inhibitory synapses is employed as an edge. Under white Gaussian noise, the noise suppression abilities of SFSNNs with a high average clustering coefficient (ACC) and SFSNNs with a low ACC are studied comparatively, and the noise suppression mechanism of the SFSNN is explored. The experimental results demonstrate the following. (1) The SFSNN has a certain degree of noise suppression ability, and SFSNNs with a high ACC have higher noise suppression performance than SFSNNs with a low ACC. (2) The neural information processing of the SFSNN is the linkage effect of dynamic changes in neuron firing, synaptic weights, and topological characteristics. (3) Synaptic plasticity is the intrinsic factor of the noise suppression ability of the SFSNN.
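The Izhikevich neuron model used as the network node can be sketched with simple Euler integration. The regular-spiking parameters (a, b, c, d), step size, and input current below are illustrative, not the paper's settings.

```python
def izhikevich_step(v, u, i_ext, a=0.02, b=0.2, c=-65.0, d=8.0, dt=0.5):
    """One Euler step of the Izhikevich model (regular-spiking parameters).
    v is the membrane potential (mV), u the recovery variable.
    Returns (v, u, spiked)."""
    v += dt * (0.04 * v * v + 5.0 * v + 140.0 - u + i_ext)
    u += dt * a * (b * v - u)
    if v >= 30.0:              # spike threshold reached: reset
        return c, u + d, True
    return v, u, False

def run(i_ext, steps=2000):
    """Drive one neuron with a constant current and count spikes."""
    v, u, spikes = -65.0, -13.0, 0
    for _ in range(steps):
        v, u, fired = izhikevich_step(v, u, i_ext)
        spikes += fired
    return spikes
```

With zero input the neuron settles to rest and stays silent; with a sufficiently strong constant current it fires tonically, which is the basic node behaviour the SFSNN builds on.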


2021 ◽  
Author(s):  
Jacopo Bono ◽  
Sara Zannone ◽  
Victor Pedrosa ◽  
Claudia Clopath

We describe a framework where a biologically plausible spiking neural network mimicking hippocampal layers learns a cognitive map known as the successor representation. We show analytically how, on the algorithmic level, the learning follows the TD(λ) algorithm, which emerges from the underlying spike-timing dependent plasticity rule. We then analyze the implications of this framework, uncovering how behavioural activity and experience replays can play complementary roles when learning the representation of the environment, how we can learn relations over behavioural timescales with synaptic plasticity acting on the range of milliseconds, and how the learned representation can be flexibly encoded by allowing state-dependent delay discounting through neuromodulation and altered firing rates.
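The successor representation that the spiking network is shown to learn can be illustrated, at the algorithmic level, by a tabular TD sketch (TD(λ) with λ = 0). All parameters and the toy environment below are illustrative assumptions, not the paper's model.

```python
import random

def learn_sr(transitions, n_states, alpha=0.1, gamma=0.9, episodes=200, seed=0):
    """Learn the successor representation M with tabular TD(0):
    M[s][j] estimates the expected discounted future occupancy of
    state j when starting from state s."""
    rng = random.Random(seed)
    m = [[1.0 if i == j else 0.0 for j in range(n_states)]
         for i in range(n_states)]
    for _ in range(episodes):
        s = rng.randrange(n_states)
        for _ in range(50):
            s2 = rng.choice(transitions[s])       # take one step
            for j in range(n_states):
                # TD target: current occupancy plus discounted successor row.
                target = (1.0 if s == j else 0.0) + gamma * m[s2][j]
                m[s][j] += alpha * (target - m[s][j])
            s = s2
    return m
```

On a deterministic ring of states, each row of M converges so that nearer future states receive larger discounted occupancy, which is the cognitive-map structure the abstract refers to.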


In this paper, a spiking neural network (SNN) that can discriminate odor data is presented. Spike-timing-dependent plasticity (STDP) refers to plasticity that is controlled by the time difference between presynaptic and postsynaptic spikes. Using this STDP rule, the synaptic weights between the mitral cells and the cortical cells are modified. To indicate whether the circuit has correctly identified the odor, the SNN gives either a high or a low response at the output for any odor presented as input.
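The mitral-to-cortical weight modification and the high/low output decision described above can be sketched as follows. The weight bounds, learning rates, and threshold are hypothetical; this is not the paper's circuit.

```python
import math

def stdp_update(w, dt, a_plus=0.005, a_minus=0.005, tau=20.0,
                w_min=0.0, w_max=1.0):
    """Update one mitral-to-cortical weight from a spike-pair time
    difference dt = t_cortical - t_mitral (ms); bounds are illustrative."""
    if dt > 0:
        w += a_plus * math.exp(-dt / tau)
    elif dt < 0:
        w -= a_minus * math.exp(dt / tau)
    return min(w_max, max(w_min, w))

def odor_response(mitral_rates, weights, threshold=1.0):
    """Thresholded cortical readout: 'high' or 'low' for a given odor."""
    drive = sum(r * w for r, w in zip(mitral_rates, weights))
    return "high" if drive >= threshold else "low"
```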


2018 ◽  
Author(s):  
Sang-Yoon Kim ◽  
Woochang Lim

We consider the Watts-Strogatz small-world network (SWN) consisting of inhibitory fast spiking Izhikevich interneurons. This inhibitory neuronal population has adaptive dynamic synaptic strengths governed by the inhibitory spike-timing-dependent plasticity (iSTDP). In previous works without iSTDP, fast sparsely synchronized rhythms, associated with diverse cognitive functions, were found to appear in a range of large noise intensities for fixed strong synaptic inhibition strengths. Here, we investigate the effect of iSTDP on fast sparse synchronization (FSS) by varying the noise intensity D. We employ an asymmetric anti-Hebbian time window for the iSTDP update rule [which is in contrast to the Hebbian time window for the excitatory STDP (eSTDP)]. Depending on values of D, population-averaged values of saturated synaptic inhibition strengths are potentiated [long-term potentiation (LTP)] or depressed [long-term depression (LTD)] in comparison with the initial mean value, and dispersions from the mean values of LTP/LTD are much increased when compared with the initial dispersion, independently of D. In most cases of LTD where the effect of mean LTD is dominant in comparison with the effect of dispersion, good synchronization (with higher spiking measure) is found to get better via LTD, while bad synchronization (with lower spiking measure) is found to get worse via LTP. This kind of Matthew effect in inhibitory synaptic plasticity is in contrast to that in excitatory synaptic plasticity where good (bad) synchronization gets better (worse) via LTP (LTD). Emergences of LTD and LTP of synaptic inhibition strengths are intensively investigated via a microscopic method based on the distributions of time delays between the pre- and the post-synaptic spike times. Furthermore, we also investigate the effects of network architecture on FSS by changing the rewiring probability p of the SWN in the presence of iSTDP.
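The Watts-Strogatz small-world topology, including the rewiring probability p varied in this study, can be sketched by the standard ring-lattice construction. This is a generic illustration, not the authors' exact network.

```python
import random

def watts_strogatz(n, k, p, seed=0):
    """Build a Watts-Strogatz graph: a ring lattice in which each node is
    linked to its k nearest neighbours on one side, with each lattice edge
    rewired to a random endpoint with probability p."""
    rng = random.Random(seed)
    edges = set()
    for i in range(n):
        for j in range(1, k + 1):
            a, b = i, (i + j) % n
            if rng.random() < p:       # rewire the far endpoint
                b = rng.randrange(n)
                while b == a or (a, b) in edges or (b, a) in edges:
                    b = rng.randrange(n)
            edges.add((a, b))
    return sorted(edges)
```

At p = 0 the graph is a regular lattice; small p already creates the shortcuts that give the small-world regime, and p = 1 approaches a random graph, which is the architecture axis the abstract explores.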

