Training Excitatory-Inhibitory Recurrent Neural Networks for Cognitive Tasks: A Simple and Flexible Framework

PLOS Computational Biology ◽  
2016 ◽  
Vol 12 (2) ◽  
pp. e1004792 ◽  
Author(s):  
H. Francis Song ◽  
Guangyu R. Yang ◽  
Xiao-Jing Wang
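
The title refers to a framework for training recurrent networks whose units are divided into separate excitatory and inhibitory populations, consistent with Dale's principle. As a rough, generic sketch of how such a sign constraint is commonly imposed on recurrent weights (the 80/20 split, the rectification scheme, and all names below are illustrative assumptions, not the paper's implementation):

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: 80% excitatory, 20% inhibitory units (an assumption,
# not necessarily the paper's choice).
N_E, N_I = 80, 20
N = N_E + N_I

# Fixed diagonal sign matrix: +1 for excitatory columns, -1 for inhibitory.
D = np.diag(np.concatenate([np.ones(N_E), -np.ones(N_I)]))

# An unconstrained parameter matrix is what gets trained; the effective
# recurrent weights are rectified and then signed, so every outgoing weight
# of a given unit shares that unit's sign.
W_free = rng.normal(0.0, 1.0 / np.sqrt(N), (N, N))
W_eff = np.maximum(W_free, 0.0) @ D

# One Euler step of a standard rate network using the constrained weights.
dt, tau = 1.0, 20.0
x = rng.normal(0.0, 0.1, N)          # hidden state
r = np.maximum(x, 0.0)               # rectified-linear firing rates
x = x + (dt / tau) * (-x + W_eff @ r)
```
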
iScience ◽  
2021 ◽  
pp. 103178
Author(s):  
Yichen Henry Liu ◽  
Junda Zhu ◽  
Christos Constantinidis ◽  
Xin Zhou

2016 ◽  
Author(s):  
Thomas Miconi

Abstract: Neural activity during cognitive tasks exhibits complex dynamics that flexibly encode task-relevant variables. Chaotic recurrent networks, which spontaneously generate rich dynamics, have been proposed as a model of cortical computation during cognitive tasks. However, existing methods for training these networks are either biologically implausible or require a continuous, real-time error signal to guide learning. Here we show that a biologically plausible learning rule can train such recurrent networks, guided solely by delayed, phasic rewards at the end of each trial. Networks endowed with this learning rule can successfully learn nontrivial tasks requiring flexible (context-dependent) associations, memory maintenance, nonlinear mixed selectivities, and coordination among multiple outputs. The resulting networks replicate complex dynamics previously observed in animal cortex, such as dynamic encoding of task features and selective integration of sensory inputs. We conclude that recurrent neural networks offer a plausible model of cortical dynamics during both learning and performance of flexible behavior.
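
The learning rule is described here only at a high level: trial-end, reward-modulated plasticity in a chaotic recurrent network. The sketch below illustrates the general flavor of such a rule, a perturbation-driven eligibility trace consumed only by a delayed, end-of-trial reward compared with its running average; the network sizes, the toy one-unit "task", and all constants are assumptions for illustration, not the paper's exact algorithm.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative constants; none are taken from the paper.
N, T, dt, tau = 100, 200, 1.0, 10.0
W = rng.normal(0.0, 1.5 / np.sqrt(N), (N, N))     # gain > 1: chaotic regime
lr, p_perturb, amp_perturb = 1e-4, 0.03, 0.5
avg_reward = 0.0                                  # running average of reward

def run_trial(W, target):
    """Run one trial: apply random exploratory perturbations, accumulate an
    eligibility trace, and return it with a scalar end-of-trial reward."""
    x = rng.normal(0.0, 0.1, N)                   # hidden state
    elig = np.zeros_like(W)                       # eligibility trace
    for _ in range(T):
        r = np.tanh(x)
        # Sparse exploratory perturbations injected into the dynamics.
        zeta = amp_perturb * rng.normal(0.0, 1.0, N) * (rng.random(N) < p_perturb)
        x = x + (dt / tau) * (-x + W @ r) + zeta
        # Credit each weight with the presynaptic rate that coincided with
        # the perturbation of its postsynaptic unit.
        elig += np.outer(zeta, r)
    r_out = np.tanh(x[0])                         # treat unit 0 as the output
    reward = -(r_out - target) ** 2               # delayed, phasic reward
    return elig, reward

for trial in range(500):
    elig, reward = run_trial(W, target=1.0)
    # Weights change only at trial end, scaled by reward relative to its
    # recent average (better-than-expected trials reinforce the trace).
    W += lr * (reward - avg_reward) * elig
    avg_reward += 0.05 * (reward - avg_reward)
```
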


2020 ◽  
Author(s):  
Yichen Henry Liu ◽  
Junda Zhu ◽  
Christos Constantinidis ◽  
Xin Zhou

Abstract: Working memory and response inhibition are functions that mature relatively late in life, after adolescence, paralleling the maturation of the prefrontal cortex. The link between behavioral and neural maturation is not obvious, however, making it challenging to understand how neural activity underlies the maturation of cognitive function. To gain insight into the nature of the observed changes in prefrontal activity between adolescence and adulthood, we investigated the progressive changes in unit activity of recurrent neural networks (RNNs) as they were trained to perform working memory and response inhibition tasks. These training-related changes included increased delay-period activity during working memory tasks and increased activation in antisaccade tasks. These findings reveal universal properties underlying the neuronal computations behind cognitive tasks and explicate the nature of changes that occur as a result of developmental maturation.
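
The study trains RNNs on working memory and response inhibition tasks and then examines their unit activity. As a loose, simplified illustration (the task timing, network sizes, architecture, and training details below are assumptions, not the authors' setup), one can train a small RNN on a toy delayed-response task with standard gradient descent and then measure its delay-period activity:

```python
import torch
import torch.nn as nn

# Assumed toy task layout: stimulus, delay, response epochs (in time steps).
T_STIM, T_DELAY, T_RESP = 10, 30, 10
T_TOTAL = T_STIM + T_DELAY + T_RESP
N_IN, N_HID, N_OUT, BATCH = 2, 64, 2, 32

def make_batch():
    """Toy delayed-response trials: remember which of two cues was shown
    during the stimulus epoch and report it after the delay."""
    cue = torch.randint(0, 2, (BATCH,))
    x = torch.zeros(BATCH, T_TOTAL, N_IN)
    x[torch.arange(BATCH), :T_STIM, cue] = 1.0    # cue present only early on
    return x, cue

class TaskRNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.rnn = nn.RNN(N_IN, N_HID, batch_first=True, nonlinearity="tanh")
        self.readout = nn.Linear(N_HID, N_OUT)
    def forward(self, x):
        h, _ = self.rnn(x)                        # (batch, time, hidden)
        logits = self.readout(h[:, -T_RESP:].mean(dim=1))
        return logits, h

model = TaskRNN()
opt = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.CrossEntropyLoss()

for step in range(500):
    x, y = make_batch()
    logits, _ = model(x)
    loss = loss_fn(logits, y)
    opt.zero_grad()
    loss.backward()
    opt.step()

# Analysis in the spirit of the study: mean absolute unit activity during
# the delay epoch of the trained network.
with torch.no_grad():
    x, _ = make_batch()
    _, h = model(x)
    delay_activity = h[:, T_STIM:T_STIM + T_DELAY].abs().mean()
    print(f"mean delay-period activity: {delay_activity.item():.3f}")
```
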


eLife ◽  
2017 ◽  
Vol 6 ◽  
Author(s):  
Thomas Miconi


