Sleep prevents catastrophic forgetting in spiking neural networks by forming joint synaptic weight representations

2019
Author(s): Ryan Golden, Jean Erik Delanois, Pavel Sanda, Maxim Bazhenov

Abstract
Artificial neural networks overwrite previously learned tasks when trained sequentially, a phenomenon known as catastrophic forgetting. In contrast, the brain learns continuously and typically learns best when new learning is interleaved with periods of sleep for memory consolidation. In this study, we used a spiking neural network to study the mechanisms behind catastrophic forgetting and the role of sleep in preventing it. The network could be trained to learn a complex foraging task but exhibited catastrophic forgetting when trained sequentially on multiple tasks. New task training moved the synaptic weight configuration away from the manifold representing the old tasks, leading to forgetting. Interleaving new task training with periods of off-line reactivation, mimicking biological sleep, mitigated catastrophic forgetting by pushing the synaptic weight configuration towards the intersection of the solution manifolds representing multiple tasks. The study reveals a possible strategy of synaptic weight dynamics that the brain applies during sleep to prevent forgetting and optimize learning.
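The manifold picture in this abstract can be caricatured with a toy model: treat each task as a single linear constraint on a two-dimensional weight vector, so each task's solution manifold is a line in weight space. This is a hypothetical minimal sketch, not the paper's spiking network; the task definitions (`task_a`, `task_b`), learning rate, and step counts are invented for illustration.

```python
# Each "task" is a linear constraint w . x = y; its solution set (manifold)
# is a line in 2-D weight space. Sequential training on task B drifts w off
# task A's manifold (catastrophic forgetting); interleaving replay of A with
# B pushes w toward the intersection of the two manifolds.

def sgd_step(w, x, y, lr=0.1):
    # One gradient step on the loss 0.5 * (w . x - y)^2
    err = w[0] * x[0] + w[1] * x[1] - y
    return [w[0] - lr * err * x[0], w[1] - lr * err * x[1]]

def loss(w, x, y):
    return 0.5 * (w[0] * x[0] + w[1] * x[1] - y) ** 2

task_a = ([1.0, 0.0], 1.0)   # manifold: w0 = 1
task_b = ([1.0, 1.0], 0.0)   # manifold: w0 + w1 = 0; intersection at w = (1, -1)

# Sequential training: A then B -> w ends up on B's manifold but off A's
w = [0.0, 0.0]
for _ in range(200):
    w = sgd_step(w, *task_a)
for _ in range(200):
    w = sgd_step(w, *task_b)
loss_a_sequential = loss(w, *task_a)   # stays large: task A is forgotten

# Interleaved "replay": alternate B steps with A steps -> both tasks retained
w = [0.0, 0.0]
for _ in range(200):
    w = sgd_step(w, *task_a)
for _ in range(400):
    w = sgd_step(w, *task_b)
    w = sgd_step(w, *task_a)
# w now sits near (1, -1), the intersection of the two solution manifolds
```

With these constraints the alternating updates contract toward the unique fixed point satisfying both tasks, while the purely sequential schedule converges to the projection of A's solution onto B's manifold, away from A's.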

2020 · Vol 21 (20) · pp. 7447
Author(s): Amanda M. Leonetti, Ming Yin Chu, Fiona O. Ramnaraign, Samuel Holm, Brandon J. Walters

Investigation into the role of methylation of the adenosine base of RNA (m6A) has only recently begun, but it quickly became apparent that m6A can control and fine-tune many aspects of mRNA, from splicing to translation. The ability of m6A to regulate translation distally, away from traditional sites near the nucleus, quickly caught the eye of neuroscientists because of its implications for selective protein translation at synapses. Work in the brain has demonstrated that m6A is functionally required for many neuronal functions, but two in particular are covered at length here: the role of m6A in (1) neuron development and (2) memory formation. The purpose of this review is not to cover all data about m6A in the brain. Instead, this review focuses on connecting mechanisms of m6A function in neuron development with m6A's known function in memory formation. We introduce the concept of "translational priming," discuss how current data fit into this model, and then speculate on how m6A-mediated translational priming during memory consolidation can regulate learning and memory locally at the synapse.


1993 · Vol 03 (02) · pp. 279-291
Author(s): B. DOYON, B. CESSAC, M. QUOY, M. SAMUELIDES

The occurrence of chaos in recurrent neural networks is thought to depend on the architecture and on the synaptic coupling strength. It is studied here for a randomly diluted architecture. We derive a bifurcation parameter, independent of the connectivity, that allows sustained activity and the occurrence of chaos when it reaches a critical value. Even for weak connectivity and small network size, our numerical results accord with the theoretical ones previously established for fully connected, infinite-sized networks. Moreover, the route towards chaos is numerically checked to be quasiperiodic, whatever the type of the first bifurcation. In the discussion, we connect these results to recent theoretical results on highly diluted networks. Hints are provided for further investigations to elicit the role of chaotic dynamics in the cognitive processes of the brain.
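The dependence of chaos on coupling strength can be illustrated with a standard random-network toy model: a discrete-time rate network with Gaussian weights, not the paper's diluted spiking architecture. The gain values, network size, and perturbation size below are arbitrary choices for the sketch; below the critical gain a small perturbation decays, above it nearby trajectories diverge.

```python
import math
import random

def divergence(g, n=50, steps=100, seed=1):
    """Run two trajectories of x(t+1) = tanh(W x(t)) from nearby initial
    conditions and return their final Euclidean distance. Weights are
    Gaussian with standard deviation g / sqrt(n), so g acts as the
    coupling-strength (bifurcation) parameter."""
    rng = random.Random(seed)
    w = [[rng.gauss(0.0, g / math.sqrt(n)) for _ in range(n)] for _ in range(n)]

    def step(x):
        return [math.tanh(sum(w[i][j] * x[j] for j in range(n))) for i in range(n)]

    x1 = [rng.uniform(-1.0, 1.0) for _ in range(n)]
    x2 = [xi + 1e-6 for xi in x1]          # tiny perturbation of the start
    for _ in range(steps):
        x1, x2 = step(x1), step(x2)
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(x1, x2)))

weak = divergence(g=0.5)    # sub-critical gain: activity and the perturbation decay
strong = divergence(g=3.0)  # super-critical gain: the perturbation is amplified
```

For the weak coupling the spectral radius of the weight matrix is below one, so the perturbation shrinks geometrically; for the strong coupling the linearized dynamics are expansive and the two trajectories decorrelate.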


2020 · Vol 34 (10) · pp. 13933-13934
Author(s): Timothy Tadros, Giri Krishnan, Ramyaa Ramyaa, Maxim Bazhenov

Artificial neural networks (ANNs) are known to suffer from catastrophic forgetting: when learning multiple tasks, they perform well on the most recently learned task while failing on previously learned tasks. In biological networks, sleep is known to play a role in memory consolidation and incremental learning. Motivated by the processes known to be involved in sleep generation in biological networks, we developed an algorithm that implements a sleep-like phase in ANNs. In an incremental learning framework, we demonstrate that sleep is able to recover older tasks that were otherwise forgotten. We show that sleep creates unique representations of each class of inputs, and that neurons relevant to previous tasks fire during sleep, simulating replay of previously learned memories.
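A sleep-like phase of the kind described can be caricatured as noise-driven replay with a Hebbian update: random input noise reactivates output units through the existing weights, which reinforces the weights that drove the reactivation. This is a hypothetical minimal sketch with binary units and a single weight matrix, not the authors' algorithm; all parameter values and the initial weights are invented for illustration.

```python
import random

def sleep_phase(w, n_steps=2000, lr=0.01, noise_rate=0.2, threshold=0.5):
    """Drive the input layer with random noise, let output units fire
    through the existing weights, and apply a Hebbian update: weights
    between co-active input/output pairs are potentiated, and weights
    onto an active output from a silent input are depressed."""
    n_in, n_out = len(w), len(w[0])
    for _ in range(n_steps):
        pre = [1.0 if random.random() < noise_rate else 0.0 for _ in range(n_in)]
        post = [1.0 if sum(pre[i] * w[i][j] for i in range(n_in)) > threshold
                else 0.0 for j in range(n_out)]
        for i in range(n_in):
            for j in range(n_out):
                if post[j]:
                    w[i][j] += lr if pre[i] else -0.5 * lr
    return w

random.seed(0)
# Output 0 was previously "trained" to respond to inputs 0 and 1;
# input 2 is irrelevant to it, and output 1 is too weakly driven to fire.
w = [[0.6, 0.0],
     [0.6, 0.0],
     [0.0, 0.1]]
sleep_phase(w)
# Noise-driven replay reactivates output 0, strengthening its weights from
# inputs 0 and 1 while depressing the unused connection from input 2.
```

The point of the sketch is only that spontaneous, noise-driven reactivation preferentially re-fires the units that learned a task, so a local Hebbian rule can reinforce that task's synaptic footprint without presenting its training data again.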


2019
Author(s): Ji Won Bang, Dobromir Rahnev

Abstract
Previously learned information is known to be reactivated during periods of quiet wakefulness, and such awake reactivation is considered a key mechanism for memory consolidation. We recently demonstrated that feature-specific awake reactivation occurs in early visual cortex immediately after extensive visual training on a novel task. To understand the exact role of awake reactivation, here we investigated whether such reactivation depends specifically on task novelty. Subjects completed a brief visual task that was either novel or extensively trained on previous days. Replicating our previous results, we found that awake reactivation occurs for the novel task even after a brief learning period. Surprisingly, however, brief exposure to the extensively trained task led to "awake suppression," such that neural activity immediately after the exposure diverged from the pattern for the trained task. Further, subjects who showed greater performance improvement exhibited stronger awake suppression. These results suggest that the brain engages in different post-task processing depending on prior visual training.


eLife · 2020 · Vol 9
Author(s): Oscar C González, Yury Sokolov, Giri P Krishnan, Jean Erik Delanois, Maxim Bazhenov

Continual learning remains an unsolved problem in artificial neural networks. The brain has evolved mechanisms to prevent catastrophic forgetting of old knowledge during new training. Building upon data suggesting the importance of sleep in learning and memory, we tested the hypothesis that sleep protects old memories from being forgotten after new learning. In a thalamocortical model, training a new memory interfered with previously learned old memories, leading to degradation and forgetting of the old memory traces. Simulating sleep after new learning reversed the damage and enhanced old and new memories. We found that when a new memory competed for previously allocated neuronal/synaptic resources, sleep replay changed the synaptic footprint of the old memory to allow overlapping neuronal populations to store multiple memories. Our study predicts that memory storage is dynamic, and that sleep enables continual learning by combining consolidation of new memory traces with reconsolidation of old memory traces to minimize interference.


2021
Author(s): Xiangbin Teng, Ru-Yuan Zhang

Complex human behaviors involve perceiving continuous stimuli and planning actions at sequential time points, as in perceiving and producing speech or music. To guide adaptive behavior, the brain needs to internally anticipate a sequence of prospective moments. How does the brain achieve this sequential temporal anticipation without relying on any external timing cues? To answer this question, we designed a premembering task: we tagged three temporal locations in white noise by asking human listeners to detect a tone presented at one of the temporal locations. Using novel modulation analyses, we selectively probed the anticipatory processes guided by memory in trials containing only flat noise. A multiscale anticipation scheme was revealed: neural power modulation in the delta band encodes noise duration on a supra-second scale, while modulations in the alpha-beta band range mark the tagged temporal locations on a subsecond scale and correlate with tone-detection performance. To unveil the functional role of these neural observations, we turned to recurrent neural networks (RNNs) optimized for the behavioral task. The RNN hidden dynamics resembled the neural modulations; further analyses and perturbations of the RNNs suggest that the neural power modulations in the alpha/beta band emerged from selectively suppressing irrelevant noise periods and increasing sensitivity to the anticipated temporal locations. Our neural, behavioral, and modelling findings converge to demonstrate that sequential temporal anticipation involves a process of dynamic gain control: to anticipate a few meaningful moments is also to actively ignore the irrelevant events that happen most of the time.


Author(s): Jay Schulkin

Chapter 7 discusses how, while CRF is intimately involved in organ development, it is also linked to devolution of function and to conditions of danger. CRF expression itself reveals developmental changes, particularly in the brain. CRF is linked to diverse forms of learning and to the timing of events, but it may either enhance or degrade learning and memory. CRF tends to enhance salience and visibility; therefore, learning and memory consolidation may be enhanced. However, excessive CRF expression begins to compromise these essential capabilities and promotes neural atrophy and deterioration. The role of information molecules is to promote survival systems across life cycles. On the adaptive side, CRF promotes change and attention to change; on the nonadaptive side, CRF promotes decreased tissue capability and the acceleration of aging in end-organ systems, as this chapter discusses.


2019 · Vol 3 (2) · pp. 27-33
Author(s): Maria Antonia Velez Tuarez, Ronald Ivan Zamora Delgado, Olga Viviana Torres Teran, Maria Elena Moya Martine

This article provides a brief analysis of an important human organ and its influence on personal and formal learning in the educational field. The specific topics investigated are the brain and its importance in learning, the characteristics of the brain's hemispheres in learning, and the contribution of neuroscience to the teaching-learning process. The first topic addresses how the brain influences learning and the role of memory in that process. The next topic focuses on the characteristics and functions of the two brain hemispheres. The last topic deals with the contribution of neuroscience to the educational field, detailing how neural networks, combined with the environment in which the student performs, make learning possible. A descriptive methodology, based on a review of current bibliographic sources, was used. The purpose of this document is to provide the reader with accurate and up-to-date sources of information on an organ that integrates complex and necessary ideas for the human being.

