Self-organization of a recurrent network under ongoing synaptic plasticity

2015 ◽  
Vol 62 ◽  
pp. 11-19 ◽  
Author(s):  
Takaaki Aoki

2005 ◽  
Vol 8 (11) ◽  
pp. 1418-1420 ◽  
Author(s):  
György Buzsáki ◽  
James J Chrobak

2010 ◽  
Vol 68 ◽  
pp. e436 ◽  
Author(s):  
Takaaki Aoki ◽  
Yuri Kamitani ◽  
Toshio Aoyagi

2009 ◽  
Vol 21 (12) ◽  
pp. 3408-3428 ◽  
Author(s):  
Christian Leibold ◽  
Michael H. K. Bendels

Short-term synaptic plasticity is modulated by long-term synaptic changes. There is, however, no general agreement on the computational role of this interaction. Here, we derive a learning rule for the release probability and the maximal synaptic conductance in a circuit model with combined recurrent and feedforward connections that allows learning to discriminate among natural inputs. Short-term synaptic plasticity thereby provides a nonlinear expansion of the input space of a linear classifier, whereas the random recurrent network serves to decorrelate the expanded input space. Computer simulations reveal that the twofold increase in the number of input dimensions through short-term synaptic plasticity improves the performance of a standard perceptron by up to 100%. The distributions of release probabilities and maximal synaptic conductances at the capacity limit strongly depend on the balance between excitation and inhibition. The model also suggests a new computational interpretation of spikes evoked by stimuli outside the classical receptive field. These neuronal activities may reflect decorrelation of the expanded stimulus space by intracortical synaptic connections.
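The core idea of this abstract, that a nonlinear expansion of the input space lets a linear classifier separate patterns it otherwise cannot, can be illustrated with a minimal sketch. This is not the paper's circuit model: the product feature below merely stands in for the extra input dimensions that short-term synaptic plasticity contributes.

```python
# Sketch: a perceptron on raw XOR inputs fails, but the same perceptron
# succeeds once the input space is nonlinearly expanded by one extra
# dimension (here x1*x2, a stand-in for the STP-derived dimensions).

def train_perceptron(samples, epochs=200, lr=1.0):
    """Classic perceptron rule; returns (weights, bias)."""
    dim = len(samples[0][0])
    w, b = [0.0] * dim, 0.0
    for _ in range(epochs):
        for x, y in samples:  # y is +1 or -1
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1
            if pred != y:     # update only on mistakes
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y
    return w, b

def accuracy(samples, w, b):
    hits = sum(
        1 for x, y in samples
        if (1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1) == y
    )
    return hits / len(samples)

xor = [((0, 0), -1), ((0, 1), 1), ((1, 0), 1), ((1, 1), -1)]
expanded = [((x1, x2, x1 * x2), y) for (x1, x2), y in xor]  # added dimension

raw_acc = accuracy(xor, *train_perceptron(xor))
exp_acc = accuracy(expanded, *train_perceptron(expanded))
```

XOR is not linearly separable in the raw two-dimensional space, so the raw perceptron cannot reach full accuracy; in the expanded space the weights (1, 1, -2) with bias -0.5 separate the classes, so the perceptron provably converges.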


2013 ◽  
Vol 4 (1) ◽  
pp. 19-28 ◽  
Author(s):  
Sharadindu Roy ◽  
Prof Samer Sen Sarma ◽  
Soumyadip Chakravorty ◽  
Suvodip Maity

Abstract This paper presents approaches to the TSP (Travelling Salesman Problem) using Artificial Neural Networks, and gives a comparative study of several ANN methods for solving it. The Travelling Salesman Problem is a classical combinatorial optimization problem that is simple to state but very difficult to solve: find the shortest possible tour through a set of N vertices such that each vertex is visited exactly once. The TSP can be solved by a Hopfield network, a Self-Organizing Map, or a Simultaneous Recurrent Network. A Hopfield net is a fully connected network in which every vertex is connected to every other, both forward and backward, so a walk starting from any vertex can visit all the other vertices exactly once and return to the starting vertex.
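Before comparing neural approaches, it helps to pin down the objective they all approximate. A brute-force sketch that finds the exact shortest closed tour for small N (the Euclidean distances and the four-city instance below are illustrative assumptions, not from the paper; a Hopfield network instead encodes tour validity and tour length in an energy function that its dynamics minimize):

```python
import itertools
import math

def tour_length(order, coords):
    """Total length of the closed tour visiting coords in the given order."""
    return sum(
        math.dist(coords[order[i]], coords[order[(i + 1) % len(order)]])
        for i in range(len(order))
    )

def brute_force_tsp(coords):
    """Exact shortest closed tour. Vertex 0 is fixed as the start so
    rotations of the same tour are not counted twice. Feasible only
    for small N, since (N-1)! permutations are examined."""
    n = len(coords)
    best_order, best_len = None, float("inf")
    for rest in itertools.permutations(range(1, n)):
        order = (0,) + rest
        length = tour_length(order, coords)
        if length < best_len:
            best_order, best_len = order, length
    return best_order, best_len

# Illustrative instance: four cities at the corners of a unit square.
# The optimal tour runs around the perimeter, with length 4.
cities = [(0.0, 0.0), (1.0, 0.0), (0.0, 1.0), (1.0, 1.0)]
order, length = brute_force_tsp(cities)
```

The factorial blow-up of this exact search is exactly why the heuristic ANN methods compared in the paper are of interest for larger N.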


2021 ◽  
Vol 9 ◽  
Author(s):  
Roxana Zeraati ◽  
Viola Priesemann ◽  
Anna Levina

Self-organized criticality has been proposed as a universal mechanism for the emergence of scale-free dynamics in many complex systems, and possibly in the brain. While such scale-free patterns have been identified experimentally in many different types of neural recordings, the biological principles behind their emergence remain unknown. Utilizing different network models and motivated by experimental observations, synaptic plasticity has been proposed as a possible mechanism to self-organize brain dynamics toward a critical point. In this review, we discuss how various biologically plausible plasticity rules operating across multiple timescales are implemented in the models and how they alter the network's dynamical state by modifying the number and strength of the connections between neurons. Some of these rules help stabilize criticality; others need additional mechanisms to prevent divergence from the critical state. We propose that rules capable of bringing the network to criticality can be classified by how long the near-critical dynamics persists after they are disabled. Finally, we discuss the role of self-organization and criticality in computation. Overall, the concept of criticality helps to shed light on brain function and self-organization, yet living neural networks seem to harness not only criticality for computation, but also deviations from it.
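The review's central mechanism, a plasticity rule that drives network dynamics toward a critical point, can be made concrete with a toy sketch. This is a hedged illustration, not any specific model from the surveyed literature: in a stochastic branching network, a homeostatic rule that holds the firing rate at a target level implicitly drives the branching ratio m = k·p toward the critical value m = 1.

```python
import random

def homeostatic_branching(n=200, k=10, steps=3000,
                          target_rate=0.1, eta=0.01, seed=1):
    """Toy branching network: each active neuron activates each of its
    k targets with probability p. A homeostatic rule nudges p so the
    firing rate sits at target_rate; at the fixed point every active
    neuron produces one active descendant on average, i.e. k*p = 1."""
    rng = random.Random(seed)
    p = 0.02                       # start subcritical (m = k*p = 0.2)
    active = int(n * target_rate)
    for _ in range(steps):
        # count activations triggered by the currently active neurons
        nxt = sum(1 for _ in range(active * k) if rng.random() < p)
        nxt = max(1, min(nxt, n))  # external-drive floor, hard ceiling
        rate = nxt / n
        # homeostatic update: raise p when the rate is too low, lower
        # it when too high; criticality emerges as a side effect
        p = max(0.0, p + eta * (target_rate - rate))
        active = nxt
    return k * p                   # branching ratio after self-organization

m = homeostatic_branching()
```

Why the fixed point is critical: a stationary rate requires that the expected number of descendants per active neuron, k·p, equals one, so holding the rate constant pins the dynamics near the critical branching ratio. Disabling the rule lets p drift, which mirrors the review's proposed classification by how long near-critical dynamics persists afterwards.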


2018 ◽  
Author(s):  
Yann Sweeney ◽  
Claudia Clopath

Abstract Long-term imaging of sensory cortex reveals a diverse range of stimulus response stability: some neurons retain stimulus responses that are stable over days whereas other neurons have highly plastic stimulus responses. Using a recurrent network model, we explore whether this observation could be due to an underlying diversity in the synaptic plasticity of neurons. We find that, in a network with diverse learning rates, neurons with fast rates are more coupled to population activity than neurons with slow rates. This phenomenon, which we call a plasticity-coupling link, surprisingly predicts that neurons with high population coupling exhibit more long-term stimulus response variability than neurons with low population coupling. We substantiate this prediction using recordings from the Allen Brain Observatory which track the orientation preferences of 15,000 neurons in mouse visual cortex. In agreement with our model, a neuron’s population coupling is correlated with the plasticity of its orientation preference. Finally, we show that high population coupling helps plastic neurons alter their stimulus preference during a simple perceptual learning task, but hinders the ability of stable neurons to provide an instructive signal for learning. This suggests a particular functional architecture: a stable ‘backbone’ of stimulus representation formed by neurons with slow synaptic plasticity and low population coupling, on top of which lies a flexible substrate of neurons with fast synaptic plasticity and high population coupling.
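The population-coupling measure this abstract relies on is commonly computed as the correlation between a neuron's activity and the summed activity of the rest of the population. A sketch on synthetic data (the data and noise levels are illustrative assumptions, not the Allen Brain Observatory recordings or the paper's model):

```python
import math
import random

def pearson(a, b):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    sa = math.sqrt(sum((x - ma) ** 2 for x in a))
    sb = math.sqrt(sum((y - mb) ** 2 for y in b))
    return cov / (sa * sb)

def population_coupling(activity):
    """activity: list of per-neuron time series. A neuron's population
    coupling is its correlation with the summed activity of all the
    *other* neurons (the neuron itself is excluded from the sum)."""
    couplings = []
    for i, trace in enumerate(activity):
        rest = [sum(a[t] for j, a in enumerate(activity) if j != i)
                for t in range(len(trace))]
        couplings.append(pearson(trace, rest))
    return couplings

# Synthetic example: neurons 0-3 follow a shared population signal
# (high coupling); neurons 4-5 are independent noise (low coupling).
rng = random.Random(0)
T = 500
shared = [rng.gauss(0, 1) for _ in range(T)]
activity = [[s + rng.gauss(0, 0.5) for s in shared] for _ in range(4)]
activity += [[rng.gauss(0, 1) for _ in range(T)] for _ in range(2)]
c = population_coupling(activity)
```

In the paper's terms, the plasticity-coupling link predicts that neurons scoring high on this measure (like neurons 0-3 here) show the most long-term stimulus response variability.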


2015 ◽  
Vol 16 (S1) ◽  
Author(s):  
Sakyasingha Dasgupta ◽  
Christian Tetzlaff ◽  
Tomas Kulvicius ◽  
Florentin Wörgötter
