Homeostatic structural plasticity leads to the formation of memory engrams through synaptic rewiring in recurrent networks

Author(s):  
Júlia V. Gallinaro ◽  
Nebojša Gašparović ◽  
Stefan Rotter

Abstract: Brain networks store new memories using functional and structural synaptic plasticity. Memory formation is generally attributed to Hebbian plasticity, while homeostatic plasticity is thought to have an ancillary role in stabilizing network dynamics. Here we report that homeostatic plasticity alone can also lead to the formation of stable memories. We analyze this phenomenon using a new theory of network remodeling, combined with numerical simulations of recurrent spiking neural networks that exhibit structural plasticity based on firing rate homeostasis. These networks are able to store repeatedly presented patterns and recall them upon the presentation of incomplete cues. Storing is fast, governed by the homeostatic drift. In contrast, forgetting is slow, driven by a diffusion process. Joint stimulation of neurons induces the growth of associative connections between them, leading to the formation of memory engrams. In conclusion, homeostatic structural plasticity induces a specific type of “silent memories”, different from conventional attractor states.
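The homeostatic growth rule at the heart of this kind of model can be sketched in a few lines of Python (a minimal illustration, not the authors' simulation code; the growth rate, target rate, and element counts below are hypothetical):

```python
def update_elements(rate, target, elements, growth_rate=0.1):
    """Firing-rate homeostasis: grow synaptic elements while activity is
    below the target rate, retract them while it is above."""
    elements += growth_rate * (target - rate)
    return max(elements, 0.0)

# Two neurons held below / above a hypothetical target rate of 5 Hz.
low, high = 3.0, 3.0  # initial synaptic-element counts (hypothetical)
for _ in range(50):
    low = update_elements(rate=2.0, target=5.0, elements=low)
    high = update_elements(rate=8.0, target=5.0, elements=high)
# The under-stimulated neuron grows elements; the over-stimulated one
# retracts all of them.
```

In the full model, free elements across the network are randomly paired to form synapses, which is how jointly stimulated neurons end up preferentially wired to one another.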

2021 ◽  
Vol 23 (6) ◽  
pp. 317-326
Author(s):  
E.A. Ryndin ◽  
N.V. Andreeva ◽  
V.V. Luchinin ◽  
K.S. Goncharov ◽  
...  

In the current era, the design and development of artificial neural networks that exploit the architecture of the human brain have evolved rapidly. Artificial neural networks effectively solve a wide range of tasks common to artificial intelligence, involving data classification and recognition, prediction, forecasting, and adaptive control of object behavior. The biologically inspired principles underlying ANN operation offer certain advantages over the conventional von Neumann architecture, including unsupervised learning, architectural flexibility, adaptability to environmental change, and high performance at significantly reduced power consumption due to heavily parallel and asynchronous data processing. In this paper, we present the circuit design of the main functional blocks (neurons and synapses) intended for the hardware implementation of a perceptron-based feedforward spiking neural network. As the third generation of artificial neural networks, spiking neural networks process data using spikes, which are discrete events (or functions) that occur at points in time. Neurons in spiking neural networks emit precisely timed spikes and communicate with each other via spikes transmitted through synaptic connections, or synapses, with adaptable, scalable weights. One prospective approach to emulating synaptic behavior in hardware-implemented spiking neural networks is to use non-volatile memory devices with analog conduction modulation (memristive structures). Here we propose a circuit design for functional analogues of a memristive structure to mimic synaptic plasticity, together with pre- and postsynaptic neurons, which could be used to develop circuit designs of spiking neural network architectures with different training algorithms, including the spike-timing-dependent plasticity learning rule. Two different circuits of an electronic synapse were developed. The first is an analog synapse with a photoresistive optocoupler used to ensure the tunable conductivity needed for synaptic plasticity emulation, while the second is a digital synapse, in which the synaptic weight is stored as a digital code with direct conversion into conductivity (without a digital-to-analog converter and photoresistive optocoupler). The results of prototyping the developed circuits for electronic analogues of synapses and pre- and postsynaptic neurons, along with a study of transient processes, are presented. The developed approach could provide a basis for the ASIC design of spiking neural networks based on CMOS (complementary metal oxide semiconductor) technology.
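As a software reference point for the learning rule these circuits are meant to support, a pair-based spike-timing-dependent plasticity update can be sketched as follows (a hedged illustration only; the amplitudes `a_plus`, `a_minus` and time constant `tau` are hypothetical values, not parameters from the paper):

```python
import math

def stdp(delta_t, a_plus=0.01, a_minus=0.012, tau=20.0):
    """Pair-based STDP: potentiate when the presynaptic spike precedes the
    postsynaptic one (delta_t = t_post - t_pre > 0), depress otherwise.
    The magnitude decays exponentially with the spike-time difference."""
    if delta_t > 0:
        return a_plus * math.exp(-delta_t / tau)
    return -a_minus * math.exp(delta_t / tau)

w = 0.5
w += stdp(10.0)   # pre fires 10 ms before post: potentiation
w += stdp(-10.0)  # post fires 10 ms before pre: depression
```

In a memristive (or memristor-emulating) synapse, each such weight increment would correspond to a small conductance change of the device.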


2003 ◽  
Vol 14 (5) ◽  
pp. 980-992 ◽  
Author(s):  
N. Mehrtash ◽  
Dietmar Jung ◽  
H.H. Hellmich ◽  
T. Schoenauer ◽  
Vi Thanh Lu ◽  
...  

2006 ◽  
Vol 18 (1) ◽  
pp. 60-79 ◽  
Author(s):  
Hédi Soula ◽  
Guillaume Beslon ◽  
Olivier Mazet

In this letter, we study the effect of a unique initial stimulation on random recurrent networks of leaky integrate-and-fire neurons. Indeed, given stochastic connectivity, this so-called spontaneous mode exhibits various nontrivial dynamics. The study is based on a mathematical formalism that allows us to examine the variability of the subsequent dynamics according to the parameters of the weight distribution. Under the independence hypothesis (e.g., in the case of very large networks), we are able to compute the average number of neurons that fire at a given time: the spiking activity. In accordance with numerical simulations, we prove that this spiking activity reaches a steady state. We characterize this steady state and explore the transients.
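The claim that the spiking activity settles into a steady state can be illustrated with a toy mean-field iteration (a sketch under the independence hypothesis only; the sigmoid transfer function and its gain and bias are assumptions, not the letter's actual formalism):

```python
import math

def next_activity(a, gain=3.0, bias=-1.5):
    """One mean-field step for the fraction of neurons spiking, assuming
    each neuron's inputs are independent (the very-large-network limit)."""
    return 1.0 / (1.0 + math.exp(-(gain * a + bias)))

a = 0.9  # the unique initial stimulation sets the starting activity
trace = [a]
for _ in range(200):
    a = next_activity(a)
    trace.append(a)
# The iteration settles onto a fixed point: the steady-state spiking activity.
```

With these toy parameters the map contracts toward its fixed point at 0.5, so the transient decays and the activity is stationary thereafter.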


IEEE Access ◽  
2021 ◽  
pp. 1-1
Author(s):  
Mahima Ma Weerasinghe ◽  
Josafath I Espinosa Ramos ◽  
Grace Y Wang ◽  
Dave Parry

Author(s):  
Alejandro Jiménez-Rodríguez ◽  
Luis Fernando Castillo ◽  
Manuel González

In this paper, a mechanism of emotional bias in decision making is studied using spiking neural networks to simulate the associative and recurrent networks involved. The results obtained are in line with those proposed by A. Damasio as part of the Somatic Marker Hypothesis; in particular, in the absence of emotional input, decision making is driven by the rational input alone. Appropriate representations for the Objective and Emotional Values are also suggested, given a spike representation (code) of the information.
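The qualitative mechanism, rational values biased by an emotional marker, can be sketched without spikes at all (a hypothetical illustration; the function name and weights are invented for this sketch and are not from the paper):

```python
def decide(rational, emotional=None, w_r=1.0, w_e=2.0):
    """Score each option by combining its rational value with an emotional
    bias; with no emotional input the rational values alone decide."""
    scores = []
    for i, r in enumerate(rational):
        e = emotional[i] if emotional is not None else 0.0
        scores.append(w_r * r + w_e * e)
    return scores.index(max(scores))

print(decide([0.9, 0.6]))                        # rational input alone picks option 0
print(decide([0.9, 0.6], emotional=[0.0, 0.5]))  # an emotional marker flips the choice to 1
```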


2020 ◽  
Author(s):  
Alan Eric Akil ◽  
Robert Rosenbaum ◽  
Krešimir Josić

Abstract: The dynamics of local cortical networks are irregular, but correlated. Dynamic excitatory–inhibitory balance is a plausible mechanism that generates such irregular activity, but it remains unclear how balance is achieved and maintained in plastic neural networks. In particular, it is not fully understood how plasticity-induced changes in the network affect balance and, in turn, how correlated, balanced activity impacts learning. How do the dynamics of balanced networks change under different plasticity rules? How does correlated spiking activity in recurrent networks change the evolution of weights, their eventual magnitude, and their structure across the network? To address these questions, we develop a general theory of plasticity in balanced networks. We show that balance can be attained and maintained under plasticity-induced weight changes. We find that correlations in the input mildly but significantly affect the evolution of synaptic weights. Under certain plasticity rules, we find an emergence of correlations between firing rates and synaptic weights. Under these rules, synaptic weights converge to a stable manifold in weight space, with their final configuration dependent on the initial state of the network. Lastly, we show that our framework can also describe the dynamics of plastic balanced networks when subsets of neurons receive targeted optogenetic input.
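In the standard large-coupling limit of a balanced network, recurrent input cancels the external drive, so the population rates solve a linear system. A toy two-population example (the weights and drive below are illustrative numbers chosen for this sketch, not values from the paper):

```python
# Balanced-state rates in the large-coupling limit: recurrent input cancels
# the external drive, so the rates (r_e, r_i) solve W @ r + F * x = 0.
w_ee, w_ei = 1.0, -2.0   # recurrent weights (illustrative)
w_ie, w_ii = 1.0, -1.8
f_e, f_i = 1.0, 0.8      # feedforward weights (illustrative)
x = 5.0                  # external input rate

# Solve the 2x2 linear system by Cramer's rule.
det = w_ee * w_ii - w_ei * w_ie
r_e = (-f_e * x * w_ii + f_i * x * w_ei) / det
r_i = (-w_ee * f_i * x + w_ie * f_e * x) / det
# Both rates come out positive, so this toy network admits a balanced state.
```

Plasticity changes the entries of W over time; the paper's question is whether such a balanced solution with positive rates continues to exist as the weights evolve.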


Author(s):  
Priyadarshini Panda ◽  
Jason M. Allred ◽  
Shriram Ramanathan ◽  
Kaushik Roy

2017 ◽  
Vol 372 (1715) ◽  
pp. 20160259 ◽  
Author(s):  
Friedemann Zenke ◽  
Wulfram Gerstner

We review a body of theoretical and experimental research on Hebbian and homeostatic plasticity, starting from a puzzling observation: while homeostasis of synapses found in experiments is a slow compensatory process, most mathematical models of synaptic plasticity use rapid compensatory processes (RCPs). Even worse, with the slow homeostatic plasticity reported in experiments, simulations of existing plasticity models cannot maintain network stability unless further control mechanisms are implemented. To solve this paradox, we suggest that in addition to slow forms of homeostatic plasticity there are RCPs which stabilize synaptic plasticity on short timescales. These rapid processes may include heterosynaptic depression triggered by episodes of high postsynaptic firing rate. While slower forms of homeostatic plasticity are not sufficient to stabilize Hebbian plasticity, they are important for fine-tuning neural circuits. Taken together, we suggest that learning and memory rely on an intricate interplay of diverse plasticity mechanisms on different timescales which jointly ensure stability and plasticity of neural circuits. This article is part of the themed issue ‘Integrating Hebbian and homeostatic plasticity’.
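The interplay of a Hebbian term with a rapid compensatory process can be sketched with a simple rate-based rule (a minimal illustration; the specific heterosynaptic-depression form and all constants are assumptions made for this sketch):

```python
def update_weights(weights, pre, post, eta=0.01, beta=0.05, theta=1.0):
    """Hebbian growth plus a rapid compensatory process: heterosynaptic
    depression engages only when the postsynaptic rate exceeds theta,
    pulling weights down fast enough to prevent runaway potentiation."""
    new = []
    for w, x in zip(weights, pre):
        dw = eta * x * post                      # Hebbian term
        dw -= beta * w * max(post - theta, 0.0)  # rapid compensatory term
        new.append(max(w + dw, 0.0))
    return new

pre = [1.0, 1.0]
weights = [0.5, 0.5]
for _ in range(1000):
    post = sum(w * x for w, x in zip(weights, pre))
    weights = update_weights(weights, pre, post)
# Weights settle at a finite value instead of growing without bound.
```

Without the compensatory term the pure Hebbian loop is unstable (weights and postsynaptic rate reinforce each other); the rapid depression term gives the dynamics a stable fixed point.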

