Generation of scale-invariant sequential activity in linear recurrent networks


2020
Vol. 32 (7)
pp. 1379-1407
Author(s):  
Yue Liu ◽  
Marc W. Howard

Sequential neural activity has been observed in many parts of the brain and has been proposed as a neural mechanism for memory. The natural world expresses temporal relationships at a wide range of scales. Because we cannot know the relevant scales a priori, it is desirable that memory, and thus the generated sequences, is scale invariant. Although recurrent neural network models have been proposed as a mechanism for generating sequences, the requirements for scale-invariant sequences are not known. This letter reports the constraints that enable a linear recurrent neural network model to generate scale-invariant sequential activity. A straightforward eigendecomposition analysis results in two independent conditions that are required for scale invariance for connectivity matrices with real, distinct eigenvalues. First, the eigenvalues of the network must be geometrically spaced. Second, the eigenvectors must be related to one another via translation. These constraints are easily generalizable for matrices that have complex and distinct eigenvalues. Analogous albeit less compact constraints hold for matrices with degenerate eigenvalues. These constraints, along with considerations on initial conditions, provide a general recipe to build linear recurrent neural networks that support scale-invariant sequential activity.
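
The following is a minimal numerical sketch of that recipe, not the authors' code; the ratio c, the shifted ±1 eigenvector template, and the equal-loading initial condition are illustrative choices. It builds a connectivity matrix satisfying both conditions and checks that successive units peak at geometrically spaced times.

```python
# Sketch (assumed construction, not the paper's code): a linear network
# dx/dt = A x whose connectivity A has (1) geometrically spaced
# eigenvalues and (2) eigenvectors that are translated copies of a
# template. Units then fire in a scale-invariant sequence.
import numpy as np
from scipy.linalg import expm

N = 10
c = np.sqrt(2.0)                      # geometric ratio between eigenvalues
s = 0.5 * c ** (-np.arange(N))        # decay rates; condition 1: s_n = s_0 c^{-n}
lams = -s                             # eigenvalues: real, distinct, stable

# Condition 2: the eigenvectors (columns of V) are translated copies of
# one template: mode k drives unit k with +1 and unit k+2 with -1.
V = np.zeros((N, N))
for k in range(N):
    V[k, k] = 1.0
    if k + 2 < N:
        V[k + 2, k] = -1.0

A = V @ np.diag(lams) @ np.linalg.inv(V)      # connectivity matrix

x0 = V @ np.ones(N)                           # equal loading on every mode
ts = np.geomspace(0.05, 200, 600)
X = np.array([expm(A * t) @ x0 for t in ts])  # x(t) = e^{At} x0

# Because c^2 = 2, unit n (for n >= 2) traces exp(-s_n t) - exp(-2 s_n t):
# one template curve with time rescaled by c from unit to unit.
peaks = ts[np.argmax(X[:, 2:], axis=0)]
print(peaks[1:] / peaks[:-1])                 # ratios ≈ c ≈ 1.414
```

Under this construction every unit's activity is the same function of rescaled time, so peak times form a geometric progression and peak widths grow in proportion to peak time, which is exactly the signature of scale invariance the analysis demands.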



Author(s):  
Alex Warstadt ◽  
Amanpreet Singh ◽  
Samuel R. Bowman

This paper investigates the ability of artificial neural networks to judge the grammatical acceptability of a sentence, with the goal of testing their linguistic competence. We introduce the Corpus of Linguistic Acceptability (CoLA), a set of 10,657 English sentences labeled as grammatical or ungrammatical from published linguistics literature. As baselines, we train several recurrent neural network models on acceptability classification and find that our models outperform the unsupervised models of Lau et al. (2016) on CoLA. Error analysis on specific grammatical phenomena reveals that both Lau et al.'s models and ours learn systematic generalizations like subject-verb-object order. However, all models we test perform far below human level on a wide range of grammatical constructions.
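
As a toy illustration of the kind of RNN baseline described, and not the authors' models (theirs are larger and trained on the full CoLA corpus), an acceptability classifier can be as small as an LSTM over token embeddings with a binary head; the vocabulary and two-sentence training set below are fabricated for the demo.

```python
# Toy acceptability classifier (illustrative only, not the paper's baselines).
import torch
import torch.nn as nn

class AcceptabilityRNN(nn.Module):
    """Encode a token sequence with an LSTM; classify its final state."""
    def __init__(self, vocab_size, embed_dim=32, hidden_dim=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, embed_dim)
        self.lstm = nn.LSTM(embed_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, 1)   # acceptable vs. not

    def forward(self, token_ids):
        _, (h, _) = self.lstm(self.embed(token_ids))
        return self.head(h[-1]).squeeze(-1)    # one logit per sentence

# Fabricated two-sentence "dataset", just to show the training-loop shape.
vocab = {"<pad>": 0, "the": 1, "dog": 2, "barks": 3, "bark": 4}
sents = torch.tensor([[1, 2, 3], [1, 2, 4]])   # "the dog barks" / "the dog bark"
labels = torch.tensor([1.0, 0.0])              # 1 = acceptable

model = AcceptabilityRNN(len(vocab))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
loss_fn = nn.BCEWithLogitsLoss()
for _ in range(100):
    opt.zero_grad()
    loss = loss_fn(model(sents), labels)
    loss.backward()
    opt.step()
print(torch.sigmoid(model(sents)))  # probabilities should approach [1, 0]
```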





2014
Vol. 538
pp. 167-170
Author(s):  
Hui Zhong Mao ◽  
Chen Qiao ◽  
Wen Feng Jing ◽  
Xi Chen ◽  
Jin Qin Mao

This paper presents a global convergence theory for the discrete-time uniform pseudo projection anti-monotone network with a quasi-symmetric connection matrix, which removes the previous constraints on the connection matrix. The theory widens the range of applications of the discrete-time uniform pseudo projection anti-monotone network and is valid for many kinds of discrete recurrent neural network models.
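
The abstract does not spell out the network's update rule, so the sketch below is only a generic, hypothetical illustration of the model class such convergence theorems address: a discrete-time recurrent iteration with a projection-type activation, run until it settles at a fixed point.

```python
# Generic discrete-time recurrent iteration x_{k+1} = P(W x_k + b),
# where P is a projection-like (clipping) activation. Illustrative only;
# the paper's specific network and weaker conditions are not reproduced here.
import numpy as np

rng = np.random.default_rng(0)
n = 5
S = rng.normal(size=(n, n))
W = 0.5 * (S + S.T)                  # symmetric connection matrix (illustrative)
W /= 1.1 * np.max(np.abs(np.linalg.eigvalsh(W)))  # spectral norm < 1
b = rng.normal(size=n)

P = lambda v: np.clip(v, -1.0, 1.0)  # projection onto the box [-1, 1]^n

x = rng.normal(size=n)
for k in range(200):
    x_new = P(W @ x + b)
    if np.linalg.norm(x_new - x) < 1e-10:   # iterate has converged
        break
    x = x_new
print(k, x)                          # the state settles to a fixed point
```

With the spectral norm kept below one and a nonexpansive projection, the iteration is a contraction, which is one simple route to global convergence; the paper's theorem establishes convergence under weaker (quasi-symmetric) conditions on the connection matrix.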



2017
Author(s):  
Charlie W. Zhao ◽  
Mark J. Daley ◽  
J. Andrew Pruszynski

First-order tactile neurons have spatially complex receptive fields. Here we use machine learning tools to show that such complexity arises for a wide range of training sets and network architectures, and benefits network performance, especially on more difficult tasks and in the presence of noise. Our work suggests that spatially complex receptive fields are normatively good given the biological constraints of the tactile periphery.
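
A loose sketch of this style of experiment, using a hypothetical toy task rather than the authors' tactile data or architectures: train a small network on a noisy discrimination problem and read out the first layer's weights as receptive fields.

```python
# Toy receptive-field experiment (illustrative, not the paper's setup).
import torch
import torch.nn as nn

torch.manual_seed(0)
G = 8                                    # toy 8x8 "tactile" grid

def make_batch(n=256, noise=0.3):
    """Random oriented bars on the grid; task: horizontal vs. vertical."""
    x = torch.zeros(n, G, G)
    y = torch.randint(0, 2, (n,))
    pos = torch.randint(0, G, (n,))
    for i in range(n):
        if y[i] == 0:
            x[i, pos[i], :] = 1.0        # horizontal bar
        else:
            x[i, :, pos[i]] = 1.0        # vertical bar
    return x.view(n, -1) + noise * torch.randn(n, G * G), y

model = nn.Sequential(nn.Linear(G * G, 16), nn.ReLU(), nn.Linear(16, 2))
opt = torch.optim.Adam(model.parameters(), lr=1e-2)
for step in range(300):
    x, y = make_batch()
    loss = nn.functional.cross_entropy(model(x), y)
    opt.zero_grad()
    loss.backward()
    opt.step()

# Each row of the first layer's weights, reshaped to the grid, is that
# hidden unit's receptive field; count sign changes as a crude proxy for
# spatial complexity (complex fields mix excitatory and inhibitory lobes).
rf = model[0].weight.detach().view(-1, G, G)
print((rf.sign().diff(dim=-1) != 0).float().mean().item())
```

Sign changes across a receptive field are only a rough stand-in for the complexity measures the paper actually quantifies.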





2020
Vol. 31 (3)
pp. 287-296
Author(s):  
Ahmed A. Moustafa ◽  
Angela Porter ◽  
Ahmed M. Megreya

Many students suffer from anxiety when performing numerical calculations. Mathematics anxiety is a condition that has a negative effect on educational outcomes and future employment prospects. While there are a multitude of behavioral studies on mathematics anxiety, its underlying cognitive and neural mechanisms remain unclear. This article provides a systematic review of cognitive studies that investigated mathematics anxiety. As there are no prior neural network models of mathematics anxiety, this article discusses how previous neural network models of mathematical cognition could be adapted to simulate the neural and behavioral studies of mathematics anxiety. In other words, here we provide a novel integrative network theory on the links between mathematics anxiety, cognition, and brain substrates. This theoretical framework may explain the impact of mathematics anxiety on a range of cognitive and neuropsychological tests. Therefore, it could improve our understanding of the cognitive and neurological mechanisms underlying mathematics anxiety, and it also has important applications. Indeed, a better understanding of mathematics anxiety could inform more effective therapeutic techniques that in turn could lead to significant improvements in educational outcomes.


