Continuous Attractors
Recently Published Documents

TOTAL DOCUMENTS: 40 (last five years: 9)
H-INDEX: 9 (last five years: 1)
eLife, 2021, Vol. 10
Author(s): Davide Spalla, Isabel Maria Cornacchia, Alessandro Treves

Episodic memory has a dynamic nature: when we recall past episodes, we retrieve not only their content, but also their temporal structure. The phenomenon of replay, in the hippocampus of mammals, offers a remarkable example of this temporal dynamics. However, most quantitative models of memory treat memories as static configurations, neglecting the temporal unfolding of the retrieval process. Here, we introduce a continuous attractor network model with a memory-dependent asymmetric component in the synaptic connectivity, which spontaneously breaks the equilibrium of the memory configurations and produces dynamic retrieval. The detailed analysis of the model with analytical calculations and numerical simulations shows that it can robustly retrieve multiple dynamical memories, and that this feature is largely independent of the details of its implementation. By calculating the storage capacity, we show that the dynamic component does not impair memory capacity, and can even enhance it in certain regimes.
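A minimal numerical sketch of the mechanism described above, assuming a simple ring geometry and threshold-linear rate units (all names and parameter values are illustrative, not the authors' implementation): a symmetric cosine connectivity stabilizes a bump of activity, while an added asymmetric sine component breaks the equilibrium and makes the bump travel, so the stored pattern is retrieved as a temporal sequence rather than as a fixed point.

```python
# Sketch only: ring attractor with an asymmetric connectivity component.
# Parameters (N, gamma, tau, dt, the normalization step) are assumptions.
import numpy as np

N = 256
theta = np.linspace(0, 2 * np.pi, N, endpoint=False)
d = theta[:, None] - theta[None, :]            # pairwise angular differences

J_sym = np.cos(d)                              # symmetric part: stabilizes a bump
J_asym = np.sin(d)                             # asymmetric part: pushes the bump around
gamma = 0.3                                    # strength of the asymmetric component
J = (J_sym + gamma * J_asym) / N

tau, dt = 10.0, 1.0
r = np.exp(np.cos(theta))                      # initial bump of activity
r_total = r.sum()

for t in range(500):
    h = J @ r                                  # recurrent input
    r += dt / tau * (-r + np.maximum(h, 0.0))  # threshold-linear rate dynamics
    r *= r_total / max(r.sum(), 1e-12)         # crude global inhibition (fixed total activity)

# With gamma > 0 the bump's peak drifts steadily along the ring,
# i.e. the memory is replayed as a sequence rather than held as a fixed point.
print("final bump position (rad):", theta[np.argmax(r)])
```

Setting gamma = 0 recovers a static continuous attractor; increasing gamma speeds up the drift, loosely mirroring the role of the memory-dependent asymmetric component in the model.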


2021, Vol. 15
Author(s): Ian D. Jordan, Piotr Aleksander Sokół, Il Memming Park

Gated recurrent units (GRUs) are specialized memory elements for building recurrent neural networks. Despite their incredible success on various tasks, including extracting dynamics underlying neural data, little is understood about the specific dynamics representable in a GRU network. As a result, it is difficult to know a priori both how well a GRU network will perform on a given task and how well it can mimic the underlying behavior of its biological counterparts. Using a continuous-time analysis, we gain intuition about the inner workings of GRU networks. We restrict our presentation to low dimensions, allowing for comprehensive visualization. We found a surprisingly rich repertoire of dynamical features, including stable limit cycles (nonlinear oscillations), multi-stable dynamics with various topologies, and homoclinic bifurcations. At the same time, we were unable to train GRU networks to produce continuous attractors, which are hypothesized to exist in biological neural networks. We contextualize the usefulness of the different kinds of observed dynamics and support our claims experimentally.
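As a rough illustration of the object being analyzed (a sketch with arbitrary randomly drawn weights, not the paper's code), the autonomous GRU update can be iterated in two dimensions and the resulting trajectory inspected for fixed points, limit cycles, or multistability:

```python
# Sketch only: iterate a 2-unit GRU with no external input and record its trajectory.
# The weight scale, seed, and number of steps are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
n = 2                                        # low-dimensional, for easy visualization
W_z, W_r, W_h = (rng.normal(scale=1.5, size=(n, n)) for _ in range(3))
b_z, b_r, b_h = np.zeros(n), np.zeros(n), np.zeros(n)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gru_step(h):
    """One autonomous GRU update (input omitted to expose the intrinsic dynamics)."""
    z = sigmoid(W_z @ h + b_z)               # update gate
    r = sigmoid(W_r @ h + b_r)               # reset gate
    h_cand = np.tanh(W_h @ (r * h) + b_h)    # candidate state
    return (1.0 - z) * h + z * h_cand        # convex mixture: small z gives slow, ODE-like flow

h = rng.normal(size=n)
traj = [h.copy()]
for _ in range(500):
    h = gru_step(h)
    traj.append(h.copy())
traj = np.array(traj)
print("last states:\n", traj[-3:])
```

Plotting such trajectories in the (h1, h2) plane for many initial conditions and weight draws reveals fixed points, oscillations, and multistability, but no continuum of marginally stable states, consistent with the difficulty of obtaining continuous attractors reported above.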


2021
Author(s): Jintao Gu, Sukbin Lim

Working memory is a core component of critical cognitive functions such as planning and decision-making. Persistent activity that lasts long after stimulus offset has been considered a neural substrate for working memory. Attractor dynamics based on network interactions can successfully reproduce such persistent activity, but they require fine-tuning of network connectivity, in particular to form the continuous attractors suggested for working memory encoding analog signals. Here, we investigate whether specific forms of synaptic plasticity can mitigate such tuning problems in two representative working memory models, namely rate-coded and location-coded persistent activity. We consider two prominent types of plasticity rules: differential plasticity, which targets the slip of instantaneous neural activity, and homeostatic plasticity, which regularizes the long-term average of activity; both have been proposed to fine-tune the weights in an unsupervised manner. Consistent with previous findings, differential plasticity alone was enough to recover graded persistent activity with little sensitivity to learning parameters. For the maintenance of spatially structured persistent activity, however, differential plasticity could recover persistent activity, but the recovered pattern could be irregular across stimulus locations. Homeostatic plasticity, on the other hand, robustly recovered smooth spatial patterns under particular types of synaptic perturbations, such as perturbations of the incoming synapses onto the entire population or onto local populations, while it was not effective against perturbations of the outgoing synapses from local populations. Combining it with differential plasticity, instead, recovered location-coded persistent activity for a broader range of perturbations, suggesting that the two plasticity rules compensate for each other.
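A minimal sketch of the two rule types being compared, with placeholder functional forms, learning rates, and targets chosen by assumption (not the authors' implementation): a "differential" update that uses the instantaneous change of activity as its error signal, and a homeostatic update that multiplicatively scales incoming weights toward a target average rate.

```python
# Sketch only: illustrative forms of the two unsupervised rules; not the paper's equations.
import numpy as np

N = 100
rng = np.random.default_rng(1)
W = rng.normal(scale=1.0 / np.sqrt(N), size=(N, N))   # (perturbed) recurrent weights
r_prev = np.abs(rng.normal(size=N))                   # rates at the previous time step
r = np.abs(rng.normal(size=N))                        # current rates during the delay
r_avg = 0.5 * (r + r_prev)                            # stand-in for a long-term average rate

eta_diff, eta_homeo = 1e-3, 1e-4                      # learning rates (assumed)
r_target = 0.5                                        # homeostatic set point (assumed)

def differential_update(W, r, r_prev, dt=1.0):
    """Penalize drift of delay activity: anti-Hebbian term driven by dr/dt."""
    dr = (r - r_prev) / dt
    return W - eta_diff * np.outer(dr, r)

def homeostatic_update(W, r_avg):
    """Scale each neuron's incoming weights so its average rate approaches the target."""
    scaling = 1.0 + eta_homeo * (r_target - r_avg)
    return scaling[:, None] * W

W = differential_update(W, r, r_prev)                 # rule 1: differential plasticity
W = homeostatic_update(W, r_avg)                      # rule 2: homeostatic plasticity
print("weight norm after one combined update:", np.linalg.norm(W))
```

The key contrast is only in the error signal each rule uses: the differential rule reacts to the instantaneous change of activity, whereas the homeostatic rule reacts to a slow average, which is why combining them can cover complementary classes of perturbations.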


2021, Vol. 429, pp. 25-32
Author(s): Wanyu Xiang, Jiali Yu, Zhang Yi, Chunxiao Wang, Qing Gao, ...

2021, Vol. 419, pp. 1-8
Author(s): Wenshuang Chen, Jiali Yu, Zhang Yi, Hong Qu

2020
Author(s): Davide Spalla, Isabel M. Cornacchia, Alessandro Treves

Episodic memory has a dynamic nature: when we recall past episodes, we retrieve not only their content, but also their temporal structure. The phenomenon of replay, in the hippocampus of mammals, offers a remarkable example of this temporal dynamics. However, most quantitative models of memory treat memories as static configurations, neglecting the temporal unfolding of the retrieval process. Here we introduce a continuous attractor network model with a memory-dependent asymmetric component in the synaptic connectivity, which spontaneously breaks the equilibrium of the memory configurations and produces dynamic retrieval. The detailed analysis of the model with analytical calculations and numerical simulations shows that it can robustly retrieve multiple dynamical memories, and that this feature is largely independent of the details of its implementation. By calculating the storage capacity, we show that the dynamic component does not impair memory capacity, and can even enhance it in certain regimes.


2020, Vol. 32 (6), pp. 1033-1068
Author(s): Weishun Zhong, Zhiyue Lu, David J. Schwab, Arvind Murugan

Continuous attractors have been used to understand recent neuroscience experiments where persistent activity patterns encode internal representations of external attributes like head direction or spatial location. However, the conditions under which the emergent bump of neural activity in such networks can be manipulated by space- and time-dependent external sensory or motor signals are not understood. Here, we find fundamental limits on how rapidly internal representations encoded along continuous attractors can be updated by an external signal. We apply these results to place cell networks to derive a velocity-dependent nonequilibrium memory capacity in neural networks.
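A minimal sketch of this setting, with every parameter value an illustrative assumption (not the paper's model): a bump on a ring attractor is driven by an external cue rotating at angular velocity omega; for small omega the bump tracks the cue with a modest lag, and the lag grows with omega until tracking fails, which is the kind of velocity-dependent limit the paper formalizes.

```python
# Sketch only: ring-attractor bump driven by a rotating external cue.
# N, tau, amp, omega and the normalization step are assumptions for illustration.
import numpy as np

N = 200
theta = np.linspace(0, 2 * np.pi, N, endpoint=False)
J = np.cos(theta[:, None] - theta[None, :]) / N       # symmetric ring connectivity

tau, dt, steps = 10.0, 1.0, 2000
omega = 0.005                                         # cue angular velocity (rad per step)
amp = 0.05                                            # cue strength

r = np.maximum(np.cos(theta), 0.0)                    # initial bump centred at angle 0
r_total = r.sum()                                     # fixed total activity (global inhibition proxy)

for t in range(steps):
    cue = amp * np.cos(theta - omega * t * dt)        # rotating external drive
    h = J @ r + cue
    r += dt / tau * (-r + np.maximum(h, 0.0))         # threshold-linear rate dynamics
    r *= r_total / max(r.sum(), 1e-12)

bump_angle = theta[np.argmax(r)]
cue_angle = (omega * steps * dt) % (2 * np.pi)
print(f"bump at {bump_angle:.2f} rad, cue at {cue_angle:.2f} rad (lag grows with omega)")
```

Sweeping omega and amp in this toy model gives a crude picture of the trade-off between drive strength and update speed that the paper quantifies analytically.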


2019, Vol. 97 (12), pp. 2462-2473
Author(s): Jiali Yu, Xiong Dai, Wenshuang Chen, Chunxiao Wang, Jin Qi
