How to Generate Ordered Maps by Maximizing the Mutual Information between Input and Output Signals

1989 ◽  
Vol 1 (3) ◽  
pp. 402-411 ◽  
Author(s):  
Ralph Linsker

A learning rule that performs gradient ascent in the average mutual information between input and output signals is derived for a system having feedforward and lateral interactions. Several processes emerge as components of this learning rule: Hebb-like modification, and cooperation and competition among processing nodes. Topographic map formation is demonstrated using the learning rule. An analytic expression relating the average mutual information to the response properties of nodes and their geometric arrangement is derived in certain cases. This yields a relation between the local map magnification factor and the probability distribution in the input space. The results provide new links between unsupervised learning and information-theoretic optimization in a system whose properties are biologically motivated.
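
For the linear-Gaussian special case, the mutual information has a closed form and its gradient combines a Hebbian term with lateral decorrelation; below is a minimal numpy sketch of plain gradient ascent on that expression. The noise variance, dimensions, and row renormalization are illustrative choices, not taken from the paper, and the paper's local lateral-interaction dynamics are not reproduced.

```python
import numpy as np

rng = np.random.default_rng(0)
n_in, n_out, sigma2, eta = 8, 4, 0.1, 0.05      # sigma2: output noise variance

# Input covariance estimated from toy Gaussian data.
C = np.cov(rng.standard_normal((n_in, 1000)))
W = 0.1 * rng.standard_normal((n_out, n_in))    # feedforward weights

def mutual_info(W):
    # For y = W x + noise, with Gaussian x and isotropic Gaussian noise:
    # I(x; y) = 1/2 [log det(W C W^T + sigma2 I) - n_out log sigma2]
    Q = W @ C @ W.T + sigma2 * np.eye(n_out)
    return 0.5 * (np.linalg.slogdet(Q)[1] - n_out * np.log(sigma2))

for _ in range(200):
    Q = W @ C @ W.T + sigma2 * np.eye(n_out)
    grad = np.linalg.solve(Q, W @ C)            # dI/dW = Q^{-1} W C: a Hebbian
                                                # term (W C) gated by lateral
                                                # decorrelation (Q^{-1})
    W += eta * grad
    # Clamp row norms so the (otherwise unbounded) objective stays finite.
    W /= np.maximum(1.0, np.linalg.norm(W, axis=1, keepdims=True))

print(mutual_info(W))
```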

1997 ◽  
Vol 9 (8) ◽  
pp. 1661-1665 ◽  
Author(s):  
Ralph Linsker

This note presents a local learning rule that enables a network to maximize the mutual information between input and output vectors. The network's output units may be nonlinear, and the distribution of input vectors is arbitrary. The local algorithm also serves to compute the inverse C⁻¹ of an arbitrary square connection weight matrix.
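
Linsker's local inversion rule itself is not reproduced here; as a stand-in with the same flavor — computing C⁻¹ by repeated multiply-and-accumulate steps — here is the classical Newton–Schulz iteration (an assumption of this sketch, not the paper's algorithm):

```python
import numpy as np

def newton_schulz_inverse(C, n_iter=50):
    """Approximate C^{-1} via the iteration X <- X (2I - C X).

    Converges when the spectral radius of (I - X0 C) is below 1; the
    standard safe start is X0 = C^T / (||C||_1 ||C||_inf).
    """
    n = C.shape[0]
    X = C.T / (np.linalg.norm(C, 1) * np.linalg.norm(C, np.inf))
    I = np.eye(n)
    for _ in range(n_iter):
        X = X @ (2 * I - C @ X)
    return X

C = np.array([[4.0, 1.0], [2.0, 3.0]])
print(newton_schulz_inverse(C) @ C)   # ~ identity
```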


1992 ◽  
Vol 4 (5) ◽  
pp. 691-702 ◽  
Author(s):  
Ralph Linsker

A network that develops to maximize the mutual information between its output and the signal portion of its input (which is admixed with noise) is useful for extracting salient input features, and may provide a model for aspects of biological neural network function. I describe a local synaptic learning rule that performs stochastic gradient ascent in this information-theoretic quantity, for the case in which the input-output mapping is linear and the input signal and noise are multivariate gaussian. Feedforward connection strengths are modified by a Hebbian rule during a "learning" phase in which examples of input signal plus noise are presented to the network, and by an anti-Hebbian rule during an "unlearning" phase in which examples of noise alone are presented. Each recurrent lateral connection has two values of connection strength, one for each phase; these values are updated by an anti-Hebbian rule.
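
A schematic of the two-phase feedforward update, with the paper's recurrent lateral connections omitted; the step size, noise level, and weight-norm clamp are placeholders for illustration:

```python
import numpy as np

rng = np.random.default_rng(1)
n_in, n_out, eta = 8, 4, 0.01
W = 0.1 * rng.standard_normal((n_out, n_in))   # feedforward weights

def signal():
    return rng.standard_normal(n_in)           # toy Gaussian "signal"

def noise():
    return 0.3 * rng.standard_normal(n_in)     # toy Gaussian noise

for _ in range(1000):
    # "Learning" phase: signal plus noise, Hebbian update.
    x = signal() + noise()
    y = W @ x
    W += eta * np.outer(y, x)
    # "Unlearning" phase: noise alone, anti-Hebbian update.
    x = noise()
    y = W @ x
    W -= eta * np.outer(y, x)
    W /= max(1.0, np.linalg.norm(W))           # keep weights bounded

print(np.round(W, 2))
```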


1997 ◽  
Vol 9 (3) ◽  
pp. 595-606 ◽  
Author(s):  
Marc M. Van Hulle

This article introduces an extremely simple and local learning rule for topographic map formation. The rule, called the maximum entropy learning rule (MER), maximizes the unconditional entropy of the map's output for any type of input distribution. The aim of this article is to show that MER is a viable strategy for building topographic maps that maximize the average mutual information of the output responses to noiseless input signals when only input noise and noise-added input signals are available.
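
A one-dimensional toy version of an equiprobabilistic update in the spirit of MER: the winning unit and its lattice neighbors take a fixed-magnitude, sign-only step toward the input, which drives all units toward equal win probabilities and hence maximum output entropy. The neighborhood and step details here are illustrative and may differ from the published rule.

```python
import numpy as np

rng = np.random.default_rng(2)
n_units, eta = 10, 0.01
w = np.sort(rng.uniform(0, 1, n_units))        # 1-D codebook weights

for _ in range(20000):
    v = rng.beta(2, 5)                         # skewed input distribution
    i = np.argmin(np.abs(w - v))               # winner-take-all
    for j in (i - 1, i, i + 1):                # winner plus lattice neighbors
        if 0 <= j < n_units:
            w[j] += eta * np.sign(v - w[j])    # fixed-magnitude, sign-only step

# With equiprobable quantization regions each unit wins ~1/n of the time,
# so the output entropy approaches its maximum, log(n).
wins = np.bincount([np.argmin(np.abs(w - rng.beta(2, 5))) for _ in range(5000)],
                   minlength=n_units)
print(wins / wins.sum())
```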


Author(s):  
Nguyen N. Tran ◽  
Ha X. Nguyen

A capacity analysis for generally correlated wireless multi-hop multiple-input multiple-output (MIMO) channels is presented in this paper. The channel at each hop is spatially correlated, the source symbols are mutually correlated, and the additive Gaussian noises are colored. First, by invoking the Karush–Kuhn–Tucker (KKT) conditions for the optimality of convex programming, we derive the source symbol covariance that maximizes the mutual information between the channel input and the channel output when full channel knowledge is available at the transmitter. Second, we formulate the average mutual information maximization problem when only the channel statistics are available at the transmitter. Since this problem is all but impossible to solve analytically, a numerical interior-point method is employed to obtain the optimal solution. Furthermore, to reduce the computational complexity, an asymptotic closed-form solution is derived by maximizing an upper bound on the objective function. Simulation results show that the average mutual information obtained by the asymptotic design is very close to that obtained by the optimal design, at a small fraction of the computational cost.
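
For the single-hop special case with white noise and full channel knowledge, the KKT conditions reduce to the classic water-filling allocation over the channel's eigenmodes; a sketch of that reduction follows (the paper's multi-hop, colored-noise, correlated-source problem is substantially more general):

```python
import numpy as np

def waterfill(gains, power):
    """Maximize sum_i log(1 + g_i p_i) s.t. sum_i p_i = power, p_i >= 0.
    The KKT conditions give p_i = max(0, mu - 1/g_i), with the water
    level mu chosen so the active powers meet the budget."""
    order = np.argsort(gains)[::-1]            # strongest modes first
    g = gains[order]
    for k in range(len(g), 0, -1):             # try k active modes
        mu = (power + np.sum(1.0 / g[:k])) / k
        if mu - 1.0 / g[k - 1] >= 0:           # weakest active mode feasible
            break
    p = np.empty_like(g)
    p[order] = np.maximum(0.0, mu - 1.0 / g)   # map back to original order
    return p

H = np.random.default_rng(3).standard_normal((4, 4))   # toy MIMO channel
gains = np.linalg.svd(H, compute_uv=False) ** 2        # eigenmode gains
p = waterfill(gains, power=10.0)
print(p, np.sum(np.log2(1.0 + gains * p)))             # allocation, capacity (bits)
```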


2020 ◽  
Vol 501 (1) ◽  
pp. 994-1001 ◽
Author(s):  
Suman Sarkar ◽  
Biswajit Pandey ◽  
Snehasish Bhattacharjee

We use an information-theoretic framework to analyse data from the Galaxy Zoo 2 project and study whether there are any statistically significant correlations between the presence of bars in spiral galaxies and their environment. We measure the mutual information between the barredness of galaxies and their environments in a volume-limited sample (M_r ≤ −21) and compare it with the same measurement in data sets where (i) the bar/unbar classifications are randomized and (ii) the spatial distribution of galaxies is shuffled on different length scales. We assess the statistical significance of the differences in the mutual information using a t-test and find that neither randomization of the morphological classifications nor shuffling of the spatial distribution alters the mutual information in a statistically significant way. The non-zero mutual information between barredness and environment arises from the finite and discrete nature of the data set and can be entirely explained by mock Poisson distributions. We also separately compare the cumulative distribution functions of the barred and unbarred galaxies as a function of their local density. Using a Kolmogorov–Smirnov test, we find that the null hypothesis cannot be rejected even at the 75 per cent confidence level. Our analysis indicates that environments do not play a significant role in the formation of a bar, which is largely determined by the internal processes of the host galaxy.
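
The core measurement — mutual information between a binary bar label and a discretized environment measure, compared against label-randomized copies — can be sketched as below; the toy data, binning, and null-sample size are illustrative, not the paper's pipeline:

```python
import numpy as np

def mutual_information(x, y, bins=10):
    """MI (in nats) between a discrete label x and a continuous variable y,
    estimated from the joint histogram after quantile-binning y."""
    y_binned = np.digitize(y, np.quantile(y, np.linspace(0, 1, bins + 1)[1:-1]))
    joint = np.histogram2d(x, y_binned, bins=(2, bins))[0]
    p = joint / joint.sum()
    px, py = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
    nz = p > 0
    return np.sum(p[nz] * np.log(p[nz] / (px @ py)[nz]))

rng = np.random.default_rng(4)
barred = rng.integers(0, 2, 5000)              # toy bar/unbar labels
density = rng.lognormal(size=5000)             # toy local density

mi = mutual_information(barred, density)
# Null distribution: randomize the labels, keep the environments fixed.
null = [mutual_information(rng.permutation(barred), density) for _ in range(100)]
print(mi, np.mean(null), np.std(null))         # compare MI against the null
```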


2000 ◽  
Author(s):  
Paul B. Deignan ◽  
Peter H. Meckl ◽  
Matthew A. Franchek ◽  
Salim A. Jaliwala ◽  
George G. Zhu

A methodology for the intelligent, model-independent selection of an appropriate set of input signals for the system identification of an unknown process is demonstrated. In modeling this process, it is shown that the terms of a simple nonlinear polynomial model may also be determined through the analysis of the average mutual information between inputs and the output. Average mutual information can be thought of as a nonlinear correlation coefficient and can be calculated from input/output data alone. The methodology described here is especially applicable to the development of virtual sensors.
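
A sketch of that idea: estimate the average mutual information between each candidate input and the output from a joint histogram, then rank the candidates. The example uses a squared relationship, which a linear correlation coefficient misses but AMI detects; all signals here are synthetic placeholders.

```python
import numpy as np

def avg_mutual_info(x, y, bins=16):
    """Histogram estimate of I(X;Y) in bits — a 'nonlinear correlation
    coefficient' computable from input/output data alone."""
    joint = np.histogram2d(x, y, bins=bins)[0]
    p = joint / joint.sum()
    px, py = p.sum(1, keepdims=True), p.sum(0, keepdims=True)
    nz = p > 0
    return np.sum(p[nz] * np.log2(p[nz] / (px @ py)[nz]))

rng = np.random.default_rng(5)
n = 5000
u1 = rng.uniform(-1, 1, n)                     # relevant input (nonlinearly)
u2 = rng.uniform(-1, 1, n)                     # irrelevant input
y = u1 ** 2 + 0.05 * rng.standard_normal(n)    # output of the unknown process

# u1 has near-zero linear correlation with y, but high AMI:
for name, u in [("u1", u1), ("u2", u2)]:
    print(name, np.corrcoef(u, y)[0, 1], avg_mutual_info(u, y))
```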


2021 ◽  
Vol 2021 (9) ◽  
Author(s):  
Alex May

We prove a theorem showing that the existence of "private" curves in the bulk of AdS implies that two regions of the dual CFT share strong correlations. A private curve is a causal curve which avoids the entanglement wedge of a specified boundary region $\mathcal{U}$. The implied correlation is measured by the conditional mutual information $I(\mathcal{V}_1 : \mathcal{V}_2 \mid \mathcal{U})$, which is $O(1/G_N)$ when a private causal curve exists. The regions $\mathcal{V}_1$ and $\mathcal{V}_2$ are specified by the endpoints of the causal curve and the placement of the region $\mathcal{U}$. This gives a causal perspective on the conditional mutual information in AdS/CFT, analogous to the causal perspective on the mutual information given by earlier work on the connected wedge theorem. We give an information-theoretic argument for our theorem, along with a bulk geometric proof. In the geometric perspective, the theorem follows from the maximin formula and entanglement wedge nesting. In the information-theoretic approach, the theorem follows from resource requirements for sending private messages over a public quantum channel.
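
For reference, the conditional mutual information appearing in the theorem is the standard combination of von Neumann entropies:

$$ I(\mathcal{V}_1 : \mathcal{V}_2 \mid \mathcal{U}) \;=\; S(\mathcal{V}_1\mathcal{U}) + S(\mathcal{V}_2\mathcal{U}) - S(\mathcal{V}_1\mathcal{V}_2\mathcal{U}) - S(\mathcal{U}) $$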

