Recurrent Network Models, Reservoir Computing

Author(s):  
Robert Legenstein
Neuron ◽  
2016 ◽  
Vol 90 (1) ◽  
pp. 128-142 ◽  
Author(s):  
Kanaka Rajan ◽  
Christopher D. Harvey ◽  
David W. Tank

Neural Computation ◽  
2010 ◽  
Vol 22 (5) ◽  
pp. 1272-1311 ◽  
Author(s):  
Lars Büsing ◽  
Benjamin Schrauwen ◽  
Robert Legenstein

Reservoir computing (RC) systems are powerful models for online computations on input sequences. They consist of a memoryless readout neuron that is trained on top of a randomly connected recurrent neural network. RC systems are commonly used in two flavors: with analog neurons or with binary (spiking) neurons in the recurrent circuit. Previous work indicated a fundamental difference in the behavior of these two implementations of the RC idea: the performance of an RC system built from binary neurons seems to depend strongly on the network connectivity structure, whereas in networks of analog neurons no such clear dependency has been observed. In this letter, we address this apparent dichotomy by investigating the influence of the network connectivity (parameterized by the neuron in-degree) on a family of network models that interpolates between analog and binary networks. Our analyses are based on a novel estimation of the Lyapunov exponent of the network dynamics with the help of branching process theory, on rank measures that estimate the kernel quality and generalization capability of recurrent networks, and on a novel mean field predictor for computational performance. These analyses reveal that the phase transition between ordered and chaotic network behavior in binary circuits qualitatively differs from the one in analog circuits, leading to differences in the integration of information over short and long timescales. This explains the decreased computational performance observed in densely connected binary circuits. The mean field predictor is also used to bound the memory function of recurrent circuits of binary neurons.
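
As a rough illustration of the ordered-chaotic transition discussed above, the sketch below numerically estimates the largest Lyapunov exponent of a random recurrent network with fixed in-degree by tracking the growth of a small state perturbation. This is not the branching-process estimator used in the letter; the network size, in-degree, weight gain, and tanh nonlinearity are illustrative assumptions, and binary circuits would call for a damage-spreading (bit-flip) estimate instead, because their update map is discontinuous.

```python
# Minimal numerical sketch (assumed setup, not the paper's estimator):
# approximate the largest Lyapunov exponent of a random recurrent network
# in which every neuron has in-degree k, using analog (tanh) units.
import numpy as np

def random_in_degree_net(n=500, k=10, gain=1.2, seed=0):
    """Weight matrix in which every unit receives exactly k random inputs."""
    rng = np.random.default_rng(seed)
    w = np.zeros((n, n))
    for i in range(n):
        pre = rng.choice(n, size=k, replace=False)
        w[i, pre] = rng.normal(0.0, gain / np.sqrt(k), size=k)
    return w

def lyapunov_estimate(w, steps=2000, eps=1e-8, seed=1):
    """Average log expansion rate of an eps-perturbation along one trajectory."""
    rng = np.random.default_rng(seed)
    x = rng.standard_normal(w.shape[0])
    d = eps * rng.standard_normal(w.shape[0])
    total = 0.0
    for _ in range(steps):
        x_new = np.tanh(w @ x)
        d_new = np.tanh(w @ (x + d)) - x_new
        total += np.log(np.linalg.norm(d_new) / eps)
        d = d_new * (eps / np.linalg.norm(d_new))  # renormalize the perturbation
        x = x_new
    return total / steps  # > 0 suggests chaos, < 0 suggests the ordered regime

w = random_in_degree_net(k=10, gain=1.2)
print("estimated Lyapunov exponent:", lyapunov_estimate(w))
```

Sweeping the in-degree and the weight gain in such a simulation is one empirical way to locate the order-chaos boundary that the letter characterizes analytically.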


Author(s):  
Teijiro Isokawa ◽  
Nobuyuki Matsui ◽  
Haruhiko Nishimura

Quaternions are a class of hypercomplex numbers, a four-dimensional extension of the complex numbers that is used extensively in fields such as modern physics and computer graphics. Although neural networks employing quaternions have so far seen fewer applications than complex-valued neural networks, their number has been increasing recently. In this chapter, the authors describe two types of quaternionic neural network models. The first is a multilayer perceptron based on 3D geometrical affine transformations by quaternions; the operations this network can perform are translation, dilatation, and spatial rotation in three-dimensional space. Several examples are provided to demonstrate the utility of this network. The second is a Hopfield-type recurrent network whose parameters are directly encoded as quaternions. The stability of this network is demonstrated by proving that its energy decreases monotonically with respect to changes in the neuron states. The fundamental properties of this network are illustrated using a network of three neurons.
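
The geometric operations named above can be made concrete with a short sketch of quaternion arithmetic: a 3D point encoded as a pure quaternion (0, x, y, z) is rotated as q v q*, then dilated and translated. This shows only the underlying algebra, not the network itself, and the function and variable names are illustrative.

```python
# Sketch of the quaternion operations behind the affine-transformation network:
# rotation of a 3D point via q v q*, followed by dilatation and translation.
import numpy as np

def qmul(p, q):
    """Hamilton product of quaternions given as (w, x, y, z) arrays."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([
        w1*w2 - x1*x2 - y1*y2 - z1*z2,
        w1*x2 + x1*w2 + y1*z2 - z1*y2,
        w1*y2 - x1*z2 + y1*w2 + z1*x2,
        w1*z2 + x1*y2 - y1*x2 + z1*w2,
    ])

def qconj(q):
    """Quaternion conjugate (w, -x, -y, -z)."""
    return q * np.array([1.0, -1.0, -1.0, -1.0])

def affine(point3d, axis, angle, scale, shift3d):
    """Rotate `point3d` about `axis` by `angle`, dilate by `scale`, then translate."""
    axis = np.asarray(axis, float) / np.linalg.norm(axis)
    q = np.concatenate(([np.cos(angle / 2)], np.sin(angle / 2) * axis))  # unit quaternion
    v = np.concatenate(([0.0], point3d))                                 # pure quaternion
    rotated = qmul(qmul(q, v), qconj(q))[1:]                             # q v q*
    return scale * rotated + np.asarray(shift3d, float)

# (1, 0, 0) rotated 90 degrees about z, dilated by 2, shifted by (0, 0, 1) -> ~(0, 2, 1)
print(affine([1.0, 0.0, 0.0], axis=[0, 0, 1], angle=np.pi / 2, scale=2.0, shift3d=[0, 0, 1]))
```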


Author(s):  
RAYMOND S. T. LEE ◽  
JAMES N. K. LIU

Financial prediction is one of the most widely studied applications in contemporary scientific research. In this paper, we present NORN Predictor, a fully integrated Neural Oscillatory-based Recurrent Network system for financial prediction that provides both (a) long-term trend prediction and (b) short-term stock price prediction. A major characteristic of the proposed system is the automation of conventional financial technical analysis techniques, such as market pattern analysis, via the NOEGM (Neural Oscillatory-based Elastic Graph Matching) model and its integration with time-difference recurrent neural network models, providing a fully integrated and automated tool for the analysis and investigation of stock investments. From the implementation point of view, the stock pricing information of 33 major Hong Kong stocks over the period from 1990 to 1999 was adopted for system training and evaluation. Compared with contemporary neural prediction models, the proposed system achieves promising results in terms of efficiency and accuracy.


Nanophotonics ◽  
2017 ◽  
Vol 6 (3) ◽  
pp. 561-576 ◽  
Author(s):  
Guy Van der Sande ◽  
Daniel Brunner ◽  
Miguel C. Soriano

We review a novel paradigm that has emerged in analogue neuromorphic optical computing. The goal is to implement a reservoir computer in optics, where information is encoded in the intensity and phase of the optical field. Reservoir computing is a bio-inspired approach especially suited for processing time-dependent information. The reservoir’s complex and high-dimensional transient response to the input signal is capable of universal computation. The reservoir does not need to be trained, which makes it very well suited for optics. As such, much of the promise of photonic reservoirs lies in their minimal hardware requirements, a tremendous advantage over other hardware-intensive neural network models. We review the two main approaches to optical reservoir computing: networks implemented with multiple discrete optical nodes and the continuous system of a single nonlinear device coupled to delayed feedback.
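
The second approach (a single nonlinear node with delayed feedback) is easy to caricature in software. The sketch below time-multiplexes the input with a random mask to create virtual nodes and then trains only a linear readout by ridge regression, reflecting the point that the reservoir itself is not trained. The sin² nonlinearity stands in for an optoelectronic intensity response, the inertia-based coupling between neighbouring virtual nodes found in real delay systems is omitted for brevity, and all constants are assumptions.

```python
# Simplified software caricature of a delay-based reservoir: one nonlinear node,
# N virtual nodes created by masking the input over one delay period,
# and a linear readout trained by ridge regression.
import numpy as np

def delay_reservoir(u, n_virtual=50, feedback=0.8, scale=0.5, seed=0):
    """Return a (len(u), n_virtual) state matrix for a scalar input sequence u."""
    rng = np.random.default_rng(seed)
    mask = rng.uniform(-1.0, 1.0, n_virtual)     # input mask over one delay period
    delay_line = np.zeros(n_virtual)             # node states one delay period ago
    states = np.zeros((len(u), n_virtual))
    for t, u_t in enumerate(u):
        for i in range(n_virtual):
            drive = scale * mask[i] * u_t + feedback * delay_line[i]
            delay_line[i] = np.sin(drive) ** 2   # stand-in for an optical intensity response
            states[t, i] = delay_line[i]
    return states

# Toy task: recall the input from 3 steps back; only the readout weights are trained.
u = np.random.default_rng(1).uniform(-1, 1, 1000)
target = np.roll(u, 3)
X = delay_reservoir(u)
ridge = 1e-6
w_out = np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ target)
print("training MSE:", np.mean((X @ w_out - target) ** 2))
```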


2018 ◽  
Vol 8 (11) ◽  
pp. 2018 ◽  
Author(s):  
Setu Shah ◽  
Zina Ben Miled ◽  
Rebecca Schaefer ◽  
Steve Berube

Predicting water demand is becoming increasingly critical because of the scarcity of this natural resource, and the subject has been the focus of numerous studies by researchers around the world. Several models that predict water demand using both statistical and machine learning techniques have been proposed, and they have successfully identified features that influence water demand trends in rural and metropolitan areas. However, while these models, including recurrent network models proposed by the authors, are able to predict normal water demand, most have difficulty estimating potential deviations from the norm. Outliers in water demand can arise for various reasons, including high temperatures and voluntary or mandatory consumption restrictions imposed by water utility companies. Estimating these deviations is necessary, especially for water utility companies with a small service footprint, in order to plan water distribution efficiently. This paper proposes a differential learning model that can help capture both over-consumption and under-consumption. The proposed differential model builds on a previously proposed recurrent neural network model that was successfully used to predict water demand in central Indiana.
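
Since the abstract does not spell out the architecture, the sketch below only illustrates the general idea of a differential (residual) model under assumed inputs: a base model supplies the normal-demand prediction, and a second, deliberately simple linear model is fitted to the deviations using exogenous drivers such as heat excess or a restriction flag. It is not the authors' model.

```python
# Rough sketch of a differential (residual) demand model under assumed inputs:
# a second model learns the deviation (actual - predicted normal demand)
# from exogenous drivers; the base recurrent predictor is treated as given.
import numpy as np

def fit_residual_model(normal_pred, actual, exog, ridge=1e-3):
    """Ridge-regularized linear map from exogenous features to the demand deviation."""
    residual = actual - normal_pred
    X = np.column_stack([exog, np.ones(len(exog))])      # add a bias column
    return np.linalg.solve(X.T @ X + ridge * np.eye(X.shape[1]), X.T @ residual)

def predict_total(normal_pred, exog, beta):
    """Normal-demand prediction plus the estimated deviation."""
    X = np.column_stack([exog, np.ones(len(exog))])
    return normal_pred + X @ beta

# Toy usage with synthetic data: deviations driven by heat excess and a restriction flag.
rng = np.random.default_rng(0)
normal = 100 + 10 * np.sin(np.linspace(0, 6 * np.pi, 365))           # assumed base prediction
exog = np.column_stack([np.maximum(rng.normal(25, 7, 365) - 30, 0),  # degrees above 30 C
                        rng.integers(0, 2, 365)])                    # restriction in effect?
actual = normal + 2.0 * exog[:, 0] - 5.0 * exog[:, 1] + rng.normal(0, 1, 365)
beta = fit_residual_model(normal, actual, exog)
print("estimated deviation coefficients:", beta)
```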

