Bivariate distributions as saddle points of mutual information

1978 ◽  
Vol 15 (3) ◽  
pp. 523-530
Author(s):  
Geung Ho Kim ◽  
H. T. David

Fix a bivariate distribution F on X × Y, considered as a pair (α, {Fx}), where α is a marginal distribution on X and {Fx} is a collection of conditional distributions on Y. For essentially every (β, {Gx}) satisfying a certain pair of moment conditions determined by (α, {Fx}), J(β, {Fx}) ≤ J(α, {Fx}) ≤ J(α, {Gx}), where J is mutual information. This relates to two sorts of extremization of mutual information of relevance to communication theory and statistics.
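
To make the saddle-point statement concrete, here is a minimal Python sketch (our own construction, not from the paper; the function name and the finite-alphabet setting are assumptions) of the quantity J(β, {Fx}): mutual information as a function of an input marginal β and a family of conditionals {Fx}. The two inequalities can then be probed numerically for particular choices of (β, {Gx}).

```python
import numpy as np

def mutual_information(beta, F):
    """J(beta, {F_x}) in nats for finite alphabets.

    beta : (m,) marginal distribution on X.
    F    : (m, n) row-stochastic matrix; row F[x] is the conditional
           distribution F_x on Y.
    """
    beta = np.asarray(beta, dtype=float)
    F = np.asarray(F, dtype=float)
    joint = beta[:, None] * F                # P(x, y) = beta(x) F_x(y)
    p_y = joint.sum(axis=0)                  # induced marginal on Y
    indep = beta[:, None] * p_y[None, :]     # product of the marginals
    mask = joint > 0
    return float((joint[mask] * np.log(joint[mask] / indep[mask])).sum())

# Sanity checks: a noiseless channel attains log 2 ~ 0.693 nats under a
# uniform binary input; a channel with identical rows carries nothing.
alpha = np.array([0.5, 0.5])
print(mutual_information(alpha, np.eye(2)))                 # ~0.693
print(mutual_information(alpha, [[0.3, 0.7], [0.3, 0.7]]))  # 0.0
```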



Author(s):  
Charles K. Amponsah ◽  
Tomasz J. Kozubowski ◽  
Anna K. Panorska

We propose a new stochastic model describing the joint distribution of (X, N), where N is a counting variable and X is the sum of N independent gamma random variables. We present the main properties of this general model, which include the marginal and conditional distributions, integral transforms, moments and parameter estimation. We also discuss in more detail a special case where N has a heavy-tailed discrete Pareto distribution. An example from finance illustrates the modeling potential of this new mixed bivariate distribution.
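
As a rough illustration (not the paper's parameterization; the names sample_gamma_sum and tail_index are ours), the pair (X, N) can be simulated by drawing a heavy-tailed count N and then exploiting the fact that a sum of n i.i.d. gamma variables is again gamma with the shape multiplied by n. The discrete Pareto below is only a stand-in, obtained by discretizing a continuous Pareto draw.

```python
import numpy as np

rng = np.random.default_rng(7)

def sample_gamma_sum(size, shape, rate, tail_index):
    """Draw pairs (X, N): N a heavy-tailed count, X the sum of N
    i.i.d. gamma(shape, rate) terms.

    The count is the ceiling of a continuous Pareto draw, a common
    discretization used here only as a stand-in for the paper's
    discrete Pareto law.
    """
    n = np.maximum(np.ceil(rng.pareto(tail_index, size=size)), 1).astype(int)
    # Sum of n i.i.d. gamma(shape, rate) is gamma(n * shape, rate).
    x = rng.gamma(n * shape, 1.0 / rate)
    return x, n

x, n = sample_gamma_sum(100_000, shape=2.0, rate=1.5, tail_index=2.5)
# Conditional-mean check: E[X | N = k] = k * shape / rate.
print(x[n == 1].mean(), 2.0 / 1.5)
```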


2017 ◽  
Vol 825 ◽  
pp. 704-742 ◽  
Author(s):  
Jose M. Pozo ◽  
Arjan J. Geers ◽  
Maria-Cruz Villa-Uriol ◽  
Alejandro F. Frangi

Flow complexity is related to a number of phenomena in science and engineering and has been approached from the perspective of chaotic dynamical systems, ergodic processes or mixing of fluids, to name just a few. To the best of our knowledge, all existing methods to quantify flow complexity are valid only for infinite time evolution, for closed systems, or for the mixing of two substances. We introduce an index of flow complexity, coined the interlacing complexity index (ICI), valid for single-phase flow in an open system with inlet and outlet regions, involving finite times. ICI is based on Shannon’s mutual information (MI) and is inspired by an analogy between inlet–outlet open flow systems and communication systems in communication theory. The roles of transmitter, receiver and communication channel are played, respectively, by the inlet, the outlet and the flow transport between them. A perfectly laminar flow in a straight tube can be compared to an ideal communication channel where the transmitted and received messages are identical and hence the MI between input and output is maximal. For more complex flows, generated by more intricate conditions or geometries, the ability to discriminate the outlet position from knowledge of the inlet position is decreased, reducing the corresponding MI. The behaviour of the ICI has been tested with numerical experiments on diverse flow cases. The results indicate that the ICI provides a sensitive complexity measure with an intuitive interpretation in a diversity of conditions, in agreement with other observations such as Dean vortices and subjective visual assessments. As a crucial component of the ICI formulation, we also introduce the natural distribution of streamlines and the natural distribution of world-lines, with invariance properties with respect to the cross-section used to parameterize them, valid for any type of mass-preserving flow.
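
Leaving aside the natural-distribution machinery, the core of the construction can be sketched as a plug-in MI estimate between binned inlet and outlet cross-section positions of traced streamlines. This is a simplification under our own assumptions (the function name, scalar coordinates and the histogram binning are ours), not the paper's ICI formulation itself.

```python
import numpy as np

def inlet_outlet_mi(inlet, outlet, bins=16):
    """Plug-in estimate (nats) of the mutual information between
    binned inlet and outlet positions, one pair per streamline.
    A simplified stand-in for the ICI, which additionally weights
    streamlines by their natural distribution.
    """
    counts, _, _ = np.histogram2d(inlet, outlet, bins=bins)
    p = counts / counts.sum()
    p_in = p.sum(axis=1, keepdims=True)
    p_out = p.sum(axis=0, keepdims=True)
    mask = p > 0
    return float((p[mask] * np.log(p[mask] / (p_in * p_out)[mask])).sum())

rng = np.random.default_rng(0)
s = rng.uniform(size=50_000)
print(inlet_outlet_mi(s, s))                   # 'laminar': near log(16)
print(inlet_outlet_mi(s, rng.permutation(s)))  # fully mixed: near 0
```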


2013 ◽  
Vol 27 (2) ◽  
pp. 261-275 ◽  
Author(s):  
Ramesh C. Gupta ◽  
S.N.U.A. Kirmani ◽  
N. Balakrishnan

We consider here a general class of bivariate distributions from a reliability point of view, and refer to it as the class of generalized Marshall–Olkin bivariate distributions. This class includes as special cases the Marshall–Olkin bivariate exponential distribution and the class of bivariate distributions studied recently by Sarhan and Balakrishnan [25]. For this class, the reliability, survival, hazard, and mean residual life functions are all derived, and their monotonicity is discussed for the marginal as well as the conditional distributions. These functions are also studied for the series and parallel systems based on this bivariate distribution. Finally, the Clayton association measure for this bivariate model is derived in terms of the hazard gradient.
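
For reference, the Marshall–Olkin bivariate exponential named as a special case has the well-known joint survival function S(x, y) = exp(−λ₁x − λ₂y − λ₃ max(x, y)); a small sketch (the function name is ours):

```python
import numpy as np

def mo_survival(x, y, lam1, lam2, lam3):
    """Joint survival function P(X > x, Y > y) of the Marshall-Olkin
    bivariate exponential, the classical special case of this class."""
    return np.exp(-lam1 * x - lam2 * y - lam3 * np.maximum(x, y))

# The marginals are exponential with rates lam1 + lam3 and lam2 + lam3:
print(mo_survival(1.0, 0.0, 0.5, 0.7, 0.3), np.exp(-(0.5 + 0.3)))
```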


Author(s):  
Muhammad Qaiser Shahbaz ◽  
Jumanah Ahmed Darwish ◽  
Lutfiah Ismail Al Turk

Bivariate distributions are useful for the simultaneous modeling of two random variables and provide a way of modeling complex joint phenomena. In this article, a new bivariate distribution is proposed, known as the bivariate transmuted Burr (BTB) distribution. This new bivariate distribution is an extension of the univariate transmuted Burr (TB) distribution to two variables. The proposed BTB distribution is explored in detail, and the marginal and conditional distributions are obtained. Joint and conditional moments, along with the hazard rate functions, are also derived. Maximum likelihood estimation (MLE) of the BTB parameters is carried out. Finally, a real-data application of the BTB distribution is given; the proposed BTB distribution is observed to provide a suitable fit for the data used.
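
The univariate building block is standard: a Burr baseline pushed through the quadratic rank transmutation map. A minimal sketch of that ingredient, assuming the Burr XII form of the baseline (the paper's bivariate coupling is not reproduced here):

```python
import numpy as np

def burr_cdf(x, c, k):
    """Burr XII distribution function F(x) = 1 - (1 + x**c)**(-k), x > 0."""
    x = np.asarray(x, dtype=float)
    return 1.0 - (1.0 + x**c) ** (-k)

def transmuted_burr_cdf(x, c, k, lam):
    """Transmuted Burr CDF via the quadratic rank transmutation map,
    G(x) = (1 + lam) * F(x) - lam * F(x)**2, with |lam| <= 1."""
    f = burr_cdf(x, c, k)
    return (1.0 + lam) * f - lam * f**2

print(transmuted_burr_cdf(1.0, c=2.0, k=3.0, lam=0.4))  # ~0.9188
```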


Author(s):  
Fred Dretske

The mathematical theory of information (also called communication theory) defines a quantity called mutual information that exists between a source, s, and receiver, r. Mutual information is a statistical construct, a quantity defined in terms of conditional probabilities between the events occurring at r and s. If what happens at r depends on what happens at s to some degree, then there is a communication ‘channel’ between r and s, and mutual information at r about s. If, on the other hand, the events at two points are statistically independent, there is zero mutual information. Philosophers and psychologists are attracted to information theory because of its potential as a useful tool in describing an organism’s cognitive relations to the world. The attractions are especially great for those who seek a naturalistic account of knowledge, an account that avoids normative – and, therefore, scientifically unusable – ideas such as rational warrant, sufficient reason and adequate justification. According to this approach, philosophically problematic notions like evidence, knowledge, recognition and perception – perhaps even meaning – can be understood in communication terms. Perceptual knowledge, for instance, might best be rendered in terms of a brain (r) receiving mutual information about a worldly source (s) via sensory channels. When incoming signals carry appropriate information, suitably equipped brains ‘decode’ these signals, extract information and thereby come to know what is happening in the outside world. Perception becomes information-produced belief.
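
In the finite, discrete case the construct is elementary: from a source prior P(s) and channel conditionals P(r | s), the mutual information is a single expectation, and it vanishes exactly when the events at r are statistically independent of those at s. A tiny illustration in bits (our own, with assumed names):

```python
import numpy as np

def mi_bits(p_s, p_r_given_s):
    """Mutual information (bits) between source s and receiver r, from
    the prior p_s and the channel matrix p_r_given_s, whose row
    p_r_given_s[s] is the conditional distribution of r given s."""
    p_s = np.asarray(p_s, dtype=float)
    chan = np.asarray(p_r_given_s, dtype=float)
    joint = p_s[:, None] * chan
    p_r = joint.sum(axis=0)
    mask = joint > 0
    return float((joint[mask] *
                  np.log2(joint[mask] / (p_s[:, None] * p_r)[mask])).sum())

print(mi_bits([0.5, 0.5], [[0.9, 0.1], [0.1, 0.9]]))  # noisy channel: > 0
print(mi_bits([0.5, 0.5], [[0.5, 0.5], [0.5, 0.5]]))  # independent: 0.0
```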

