Coding and robustness of signal processing in streaming recurrent neural networks

Author(s):  
Vasiliy Osipov ◽  
Viktor Nikiforov

Introduction: When substantiating promising architectures of streaming recurrent neural networks, it becomes necessary to assess their stability in processing various input signals. For this purpose, stability diagrams are constructed, containing simulation results for each node of the diagram. Such an assessment can be time-consuming and computationally intensive, especially when analyzing large neural networks. Purpose: To find methods for quickly constructing such diagrams and assessing the stability of streaming recurrent neural networks. Results: Analysis of the stability diagrams under study showed that their nodes are grouped into contiguous zones sharing the same ratio of input-signal processing defects. With this in mind, the article proposes a method for constructing these diagrams by traversing the boundaries of their zones. With this approach, simulation need not be performed for the interior nodes of each zone; it is required only for nodes adjacent to zone boundaries. As a result, the number of nodes for which simulation sessions must be performed is reduced by an order of magnitude. The influence of input-signal coding types on the stability of streaming recurrent neural networks has also been investigated: representing input signals as sequences of single pulses with intersecting elements can provide greater stability than pulses without any intersection.
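The idea of classifying only nodes near zone boundaries can be sketched as follows. This is not the authors' exact boundary-walking algorithm; it uses recursive corner subdivision in the same spirit, and `simulate` is a cheap hypothetical stand-in for a costly simulation session. It relies on the paper's observation that zones are contiguous, so a rectangle whose corners all agree is assumed uniform and filled without further simulation.

```python
from functools import lru_cache

calls = 0

@lru_cache(maxsize=None)
def simulate(i, j):
    """Placeholder for one costly simulation session at grid node (i, j).

    Returns the node's defect-ratio class; here two zones split by a
    straight boundary stand in for a real stability diagram.
    """
    global calls
    calls += 1
    return 0 if i + j < 24 else 1

def fill(diagram, i0, j0, i1, j1):
    """Classify all nodes of the rectangle [i0..i1] x [j0..j1]."""
    corners = {simulate(i, j) for i in (i0, i1) for j in (j0, j1)}
    if len(corners) == 1:
        # Interior of a zone: label every node without simulating it.
        c = next(iter(corners))
        for i in range(i0, i1 + 1):
            for j in range(j0, j1 + 1):
                diagram[(i, j)] = c
        return
    if i1 - i0 <= 1 and j1 - j0 <= 1:
        # Minimal rectangle straddling a boundary: simulate each node.
        for i in range(i0, i1 + 1):
            for j in range(j0, j1 + 1):
                diagram[(i, j)] = simulate(i, j)
        return
    # Mixed corners: subdivide and recurse toward the zone boundary.
    im, jm = (i0 + i1) // 2, (j0 + j1) // 2
    for a0, a1 in ((i0, im), (im, i1)):
        for b0, b1 in ((j0, jm), (jm, j1)):
            fill(diagram, a0, b0, a1, b1)

diagram = {}
fill(diagram, 0, 0, 16, 16)
print(f"nodes: {17 * 17}, simulation sessions: {calls}")
```

On this 17 x 17 grid, simulation sessions are run only along the zone boundary, while interior nodes are labeled for free, mirroring the order-of-magnitude saving described in the abstract.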

2021 ◽  
pp. 1-43
Author(s):  
Alfred Rajakumar ◽  
John Rinzel ◽  
Zhe S. Chen

Abstract Recurrent neural networks (RNNs) have been widely used to model sequential neural dynamics ("neural sequences") of cortical circuits in cognitive and motor tasks. Efforts to incorporate biological constraints and Dale's principle will help elucidate the neural representations and mechanisms of the underlying circuits. We trained an excitatory-inhibitory RNN to learn neural sequences in a supervised manner and studied the representations and dynamic attractors of the trained network. The trained RNN robustly triggered the sequence in response to various input signals and interpolated time-warped inputs for sequence representation. Interestingly, a learned sequence can repeat periodically when the RNN evolves beyond the duration of a single sequence. The eigenspectrum of the learned recurrent connectivity matrix, containing both growing and damping modes, together with the RNN's nonlinearity, was sufficient to generate a limit cycle attractor. We further examined the stability of dynamic attractors while training the RNN to learn two sequences. Together, our results provide a general framework for understanding neural sequence representation in the excitatory-inhibitory RNN.
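The eigenspectrum analysis mentioned above can be illustrated on a random Dale-constrained weight matrix (this is a generic sketch, not the authors' trained network; sizes and gains are illustrative assumptions). In a continuous-time rate RNN of the form tau dx/dt = -x + W r(x), eigenvalues of W with real part greater than 1 mark growing modes, and complex pairs among them can seed oscillations that the nonlinearity caps into a limit cycle.

```python
import numpy as np

rng = np.random.default_rng(0)
N, n_exc = 100, 80                         # 80 excitatory, 20 inhibitory units
# Dale's principle: each unit's outgoing weights share one sign, so
# excitatory columns are non-negative and inhibitory columns non-positive.
W = np.abs(rng.normal(0.0, 1.0 / np.sqrt(N), (N, N)))
W[:, n_exc:] *= -5.0                       # flip (and strengthen) inhibition

eig = np.linalg.eigvals(W)
growing = eig[eig.real > 1.0]              # unstable against the -x leak term
print(f"growing modes: {len(growing)}")
```

Plotting `eig` in the complex plane and checking which modes cross the Re = 1 line is the standard way to read off whether the linearized dynamics can sustain growth before the nonlinearity saturates it.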


2014 ◽  
Vol 2014 ◽  
pp. 1-7 ◽  
Author(s):  
Lei Ding ◽  
Hong-Bing Zeng ◽  
Wei Wang ◽  
Fei Yu

This paper investigates the stability of static recurrent neural networks (SRNNs) with a time-varying delay. Based on the complete delay-decomposing approach and quadratic separation framework, a novel Lyapunov-Krasovskii functional is constructed. By employing a reciprocally convex technique to consider the relationship between the time-varying delay and its varying interval, some improved delay-dependent stability conditions are presented in terms of linear matrix inequalities (LMIs). Finally, a numerical example is provided to show the merits and the effectiveness of the proposed methods.
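The full delay-dependent LMI conditions require a semidefinite-programming solver, but the Lyapunov machinery underneath them can be sketched in its simplest, delay-free linear form (the matrices `A`, `P`, `Q` here are generic placeholders, not taken from the paper): asymptotic stability of dx/dt = A x is certified by a positive-definite solution P of the Lyapunov equation A^T P + P A = -Q.

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

# A hypothetical stable test matrix; its eigenvalues are -2 and -3.
A = np.array([[-2.0, 1.0],
              [0.0, -3.0]])

# Solve A^T P + P A = -I. solve_continuous_lyapunov(a, q) solves
# a X + X a^H = q, so passing a = A.T gives the equation we want.
P = solve_continuous_lyapunov(A.T, -np.eye(2))

# Stability certificate: P must be symmetric positive definite.
stable = bool(np.all(np.linalg.eigvalsh((P + P.T) / 2) > 0))
print("stable:", stable)
```

The paper's Lyapunov-Krasovskii functional extends this quadratic form with delay-decomposed integral terms, which is what turns the resulting feasibility conditions into delay-dependent LMIs.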


2020 ◽  
Author(s):  
Laércio Oliveira Junior ◽  
Florian Stelzer ◽  
Liang Zhao

Echo State Networks (ESNs) are recurrent neural networks that map an input signal to a high-dimensional dynamical system, called the reservoir, and possess adaptive output weights. The output weights are trained so that the ESN's output signal fits the desired target signal. Classical reservoirs are sparse, randomly connected networks. In this article, we investigate the effect of different network topologies on the performance of ESNs. Specifically, we use two types of networks to construct clustered ESN reservoirs: the clustered Erdős–Rényi and the clustered Barabási–Albert network model. Moreover, we compare the performance of these clustered ESNs (CESNs) and classical ESNs with random reservoirs by applying them to two different tasks: frequency filtering and the reconstruction of chaotic signals. By using a clustered topology, one can achieve a significant increase in the ESN's performance.
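A minimal ESN with a clustered Erdős–Rényi-style reservoir can be sketched as follows. The cluster count, connection densities, spectral radius, and the one-step-ahead sine-prediction task are illustrative assumptions, not the article's exact setup; the readout is trained by ridge regression, the standard ESN training procedure.

```python
import numpy as np

rng = np.random.default_rng(1)
n, k = 300, 3                                    # reservoir size, clusters
c = n // k
mask = rng.random((n, n)) < 0.02                 # sparse inter-cluster links
for b in range(k):                               # denser intra-cluster blocks
    s = slice(b * c, (b + 1) * c)
    mask[s, s] = rng.random((c, c)) < 0.2
W = np.where(mask, rng.normal(size=(n, n)), 0.0)
W *= 0.9 / np.max(np.abs(np.linalg.eigvals(W)))  # scale spectral radius to 0.9
w_in = rng.normal(size=n)

u = np.sin(0.2 * np.arange(501))                 # input: a sine wave
x, states = np.zeros(n), []
for t in range(500):
    x = np.tanh(W @ x + w_in * u[t])             # reservoir state update
    states.append(x)

X = np.array(states[100:])                       # drop a 100-step washout
y = u[101:501]                                   # target: the next sample
w_out = np.linalg.solve(X.T @ X + 1e-6 * np.eye(n), X.T @ y)  # ridge readout
nrmse = np.sqrt(np.mean((X @ w_out - y) ** 2)) / np.std(y)
print(f"train NRMSE: {nrmse:.4f}")
```

Only `w_out` is trained; the clustered reservoir `W` stays fixed, so comparing topologies (clustered vs. classical random) amounts to swapping the mask-construction step while keeping everything else identical.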


1999 ◽  
Vol 09 (02) ◽  
pp. 95-98 ◽  
Author(s):  
ANKE MEYER-BÄSE

This paper is concerned with the asymptotic hyperstability of recurrent neural networks. Based on these stability results, we derive necessary and sufficient conditions on the network parameters. The results we achieve are more general than those based on Lyapunov methods, since they impose milder constraints on the connection weights than the conventional results and do not assume symmetry of the weights.

