Mutual Information and Topology 1: Asymmetric Neural Network

Author(s):  
David Dominguez ◽  
Kostadin Koroutchev ◽  
Eduardo Serrano ◽  
Francisco B. Rodríguez
2007 ◽  
Vol 19 (4) ◽  
pp. 956-973 ◽  

A wide range of networks, including those with small-world topology, can be modeled by the connectivity ratio and randomness of the links. Both learning and attractor abilities of a neural network can be measured by the mutual information (MI) as a function of the load and the overlap between patterns and retrieval states. In this letter, we use MI to search for the optimal topology with regard to the storage and attractor properties of the network in an Amari-Hopfield model. We find that while an optimal storage implies an extremely diluted topology, a large basin of attraction leads to moderate levels of connectivity. This optimal topology is related to the clustering and path length of the network. We also build a diagram for the dynamical phases with random or local initial overlap and show that very diluted networks lose their attractor ability.
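The MI used here is computed from the joint statistics of a stored pattern and the corresponding retrieval state. A minimal sketch of such an estimate for ±1 binary units (the function name and the plug-in estimator are illustrative, not the authors' code):

```python
import numpy as np

def overlap_mutual_information(pattern, state):
    """MI (bits) between a stored +/-1 pattern and the retrieval state,
    estimated from their joint empirical distribution."""
    joint = np.zeros((2, 2))
    # map -1/+1 to 0/1 and accumulate the joint histogram
    for s, t in zip((pattern + 1) // 2, (state + 1) // 2):
        joint[s, t] += 1
    joint /= joint.sum()
    ps = joint.sum(axis=1, keepdims=True)   # marginal of the pattern
    pt = joint.sum(axis=0, keepdims=True)   # marginal of the state
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / (ps @ pt)[nz])).sum())
```

Perfect retrieval of a balanced pattern yields 1 bit per neuron; an uncorrelated state yields 0, which is how MI tracks the loss of attractor ability in very diluted networks.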


Entropy ◽  
2021 ◽  
Vol 23 (6) ◽  
pp. 711
Author(s):  
Mina Basirat ◽  
Bernhard C. Geiger ◽  
Peter M. Roth

Information plane analysis, describing the mutual information between the input and a hidden layer and between a hidden layer and the target over time, has recently been proposed to analyze the training of neural networks. Since the activations of a hidden layer are typically continuous-valued, this mutual information cannot be computed analytically and must thus be estimated, resulting in apparently inconsistent or even contradictory results in the literature. The goal of this paper is to demonstrate how information plane analysis can still be a valuable tool for analyzing neural network training. To this end, we complement the prevailing binning estimator for mutual information with a geometric interpretation. With this geometric interpretation in mind, we evaluate the impact of regularization and interpret phenomena such as underfitting and overfitting. In addition, we investigate neural network learning in the presence of noisy data and noisy labels.
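The binning estimator mentioned above discretizes continuous activations and then computes MI from the resulting joint histogram. A minimal sketch for one scalar activation and a discrete target (equal-width bins assumed; bin count and function name are illustrative):

```python
import numpy as np

def binned_mutual_information(x, y, n_bins=30):
    """Estimate I(X;Y) in bits by discretizing the continuous values x
    into equal-width bins; y holds discrete class labels."""
    edges = np.linspace(x.min(), x.max(), n_bins)
    x_binned = np.digitize(x, edges)          # bin index per sample
    joint = np.zeros((n_bins + 2, int(y.max()) + 1))
    for xb, yb in zip(x_binned, y):
        joint[xb, yb] += 1
    joint /= joint.sum()
    px = joint.sum(axis=1, keepdims=True)
    py = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float((joint[nz] * np.log2(joint[nz] / (px @ py)[nz])).sum())
```

Because the estimate depends on the number of bins, different binning choices can place the same training run at different points in the information plane, which is one source of the inconsistencies the paper discusses.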


2020 ◽  
Author(s):  
Huihui Dai

The formation of runoff is extremely complicated, and forecasting future runoff from previous runoff or meteorological data alone is not sufficient. In order to improve the precision of the medium- and long-term runoff forecast model, a group of forecast factors is selected from meteorological factors, such as rainfall, temperature, and air pressure, and from the circulation factors released by the National Meteorological Center, using the methods of mutual information and principal component analysis respectively. Forecast results in the Qujiang Catchment suggest that the climatic-factor-based BP neural network hydrological forecasting model has a better forecasting effect when the factors are selected by mutual information than by principal component analysis.
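Selecting forecast factors by mutual information amounts to ranking each candidate series by its estimated MI with the runoff target and keeping the strongest ones. A minimal sketch under assumed inputs (a factor matrix `X` with one candidate per column and a runoff series `y`; the histogram-based estimator and the function name are illustrative):

```python
import numpy as np

def select_factors_by_mi(X, y, k=3, n_bins=10):
    """Rank candidate forecast factors (columns of X) by their binned
    mutual information with the target series y; return the top-k
    column indices and all scores."""
    def mi(a, b):
        # joint histogram of the two discretized series
        counts, _, _ = np.histogram2d(a, b, bins=n_bins)
        p = counts / counts.sum()
        pa = p.sum(axis=1, keepdims=True)
        pb = p.sum(axis=0, keepdims=True)
        nz = p > 0
        return float((p[nz] * np.log(p[nz] / (pa @ pb)[nz])).sum())
    scores = np.array([mi(X[:, j], y) for j in range(X.shape[1])])
    return np.argsort(scores)[::-1][:k], scores
```

Unlike principal component analysis, which ranks directions by variance alone, this ranking captures nonlinear dependence between a factor and the runoff, which is consistent with the better forecasting effect reported for the MI-selected factors.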


2010 ◽  
Vol 40-41 ◽  
pp. 930-936 ◽  
Author(s):  
Cong Gui Yuan ◽  
Xin Zheng Zhang ◽  
Shu Qiong Xu

A nonlinear correlative time series prediction method is presented in this paper. It is based on the mutual information of the time series and an orthogonal polynomial basis neural network. The inputs of the network are selected by mutual information, and orthogonal polynomial bases are used as activation functions. The network is trained by an error iterative learning algorithm. The proposed method is tested on two well-known nonlinear time series prediction problems: the gas furnace data time series and the Mackey-Glass time series.
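The abstract does not specify which orthogonal polynomial family or training rule is used; the following minimal sketch assumes a Chebyshev basis and a plain LMS-style iterative error correction, purely to illustrate the idea of a polynomial-basis network with a linear output layer:

```python
import numpy as np

def chebyshev_basis(x, degree):
    """Chebyshev polynomials T_0..T_degree evaluated at x in [-1, 1],
    via the recurrence T_n = 2 x T_{n-1} - T_{n-2}."""
    T = [np.ones_like(x), x]
    for _ in range(2, degree + 1):
        T.append(2 * x * T[-1] - T[-2])
    return np.stack(T[: degree + 1], axis=-1)   # shape (N, degree+1)

def fit_polynomial_net(x, y, degree=5, lr=0.2, epochs=2000):
    """One-input network: a fixed orthogonal-polynomial feature layer
    followed by linear output weights, trained by iterative error
    correction (gradient steps on the mean squared error)."""
    Phi = chebyshev_basis(x, degree)
    w = np.zeros(degree + 1)
    for _ in range(epochs):
        err = Phi @ w - y                       # prediction error
        w -= lr * Phi.T @ err / len(x)          # LMS-style update
    return w
```

For multistep problems such as Mackey-Glass, the scalar input would be replaced by the MI-selected lagged values, with one basis expansion per input.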

