A tight lower bound on the mutual information of a binary and an arbitrary finite random variable as a function of the variational distance

Author(s):  
Arno G. Stefani ◽  
Johannes B. Huber ◽  
Christophe Jardin ◽  
Heinrich Sticht


2011 ◽  
Vol 23 (7) ◽  
pp. 1862-1898 ◽  
Author(s):  
Nathan D. VanderKraats ◽  
Arunava Banerjee

For any memoryless communication channel with a binary-valued input and a one-dimensional real-valued output, we introduce a probabilistic lower bound on the mutual information given empirical observations on the channel. The bound is built on the Dvoretzky-Kiefer-Wolfowitz inequality and is distribution-free. A quadratic-time algorithm is described for computing the bound and its corresponding class-conditional distribution functions. We compare our approach to existing techniques and show that our bound is superior to a method, inspired by Fano's inequality, in which the continuous random variable is discretized.
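
The Dvoretzky-Kiefer-Wolfowitz (DKW) ingredient of the bound is easy to state concretely. The sketch below (helper name and Gaussian test data are illustrative; this is not the authors' quadratic-time algorithm) computes, for a sample from one class-conditional output distribution, the empirical CDF together with the distribution-free DKW band that contains the true CDF everywhere with probability at least 1 - alpha:

```python
import numpy as np

def dkw_band(samples, alpha):
    """Empirical CDF of `samples` plus its distribution-free DKW band.

    By the DKW inequality, sup_x |F_n(x) - F(x)| <= eps holds with
    probability at least 1 - alpha, where eps = sqrt(ln(2/alpha) / (2n)).
    """
    x = np.sort(np.asarray(samples, dtype=float))
    n = x.size
    ecdf = np.arange(1, n + 1) / n
    eps = np.sqrt(np.log(2.0 / alpha) / (2.0 * n))
    return x, np.clip(ecdf - eps, 0.0, 1.0), np.clip(ecdf + eps, 0.0, 1.0)

# One band per class-conditional output distribution of the channel.
rng = np.random.default_rng(0)
y_given_0 = rng.normal(0.0, 1.0, 500)   # illustrative outputs for input bit 0
x0, lower0, upper0 = dkw_band(y_given_0, alpha=0.05)
```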


2016 ◽  
Vol 2016 ◽  
pp. 1-7 ◽  
Author(s):  
Wenzhou Wang ◽  
Limeng Shi ◽  
Xiaoqian Zhu

The dependencies between different business lines of banks strongly affect the accuracy of operational risk estimation, and these dependencies are far more complicated than simple linear correlation. Whereas the Pearson correlation coefficient rests on the assumption of a linear association, mutual information, which measures all the information about one random variable contained in another, is a powerful alternative. From mutual information, a generalized correlation coefficient that captures both linear and nonlinear dependence can be derived. This paper models the correlation between business lines using mutual information and a normal copula. An experiment on a real-world Chinese bank operational risk data set shows that modeling the dependencies between business lines with mutual information is more reasonable than using linear correlation.
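
As a rough illustration of the generalized correlation coefficient referred to above, the sketch below uses the standard relation lambda = sqrt(1 - exp(-2 I(X;Y))) together with a naive histogram plug-in estimate of mutual information (in nats); the bin count and the estimator itself are illustrative choices, not the paper's:

```python
import numpy as np

def mutual_information(x, y, bins=20):
    """Plug-in mutual information estimate (nats) from a 2-D histogram."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = pxy / pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of x
    py = pxy.sum(axis=0, keepdims=True)   # marginal of y
    nz = pxy > 0
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

def generalized_corr(x, y, bins=20):
    """Generalized correlation coefficient sqrt(1 - exp(-2 I))."""
    return float(np.sqrt(1.0 - np.exp(-2.0 * mutual_information(x, y, bins))))

rng = np.random.default_rng(1)
x = rng.normal(size=5000)
print(generalized_corr(x, x**2))        # large: strong nonlinear dependence
print(abs(np.corrcoef(x, x**2)[0, 1]))  # near zero: Pearson misses it
```

For jointly Gaussian variables this coefficient reduces to the absolute Pearson correlation, while for the purely nonlinear pair (x, x^2) it remains large even though the Pearson coefficient is near zero.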


2003 ◽  
Vol 42 (03) ◽  
pp. 260-264 ◽  
Author(s):  
W. A. Benish

Objectives: This paper demonstrates that diagnostic test performance can be quantified as the average amount of information the test result (R) provides about the disease state (D). Methods: A fundamental concept of information theory, mutual information, is directly applicable to this problem. This statistic quantifies the amount of information that one random variable contains about another. Prior to performing a diagnostic test, R and D are random variables; hence, their mutual information, I(D;R), is the amount of information that R provides about D. Results: I(D;R) is a function of both (1) the pretest probabilities of the disease state and (2) the set of conditional probabilities relating each possible test result to each possible disease state. The area under the receiver operating characteristic curve (AUC) is a popular measure of diagnostic test performance which, in contrast to I(D;R), is independent of the pretest probabilities; it is a function of only the set of conditional probabilities. The AUC is therefore not a measure of diagnostic information. Conclusions: Because I(D;R) depends on the pretest probabilities, knowledge of the setting in which a diagnostic test is employed is a necessary condition for quantifying the amount of information it provides. Advantages of I(D;R) over the AUC are that it can be calculated without invoking an arbitrary curve-fitting routine, it is applicable to situations in which multiple diagnoses are under consideration, and it quantifies test performance in meaningful units (bits of information).
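
Because I(D;R) is fully determined by the two ingredients listed under Results, it can be computed directly. A minimal sketch (the function name and the example prevalence, sensitivity, and specificity are hypothetical):

```python
import numpy as np

def diagnostic_information(p_d, p_r_given_d):
    """Mutual information I(D;R) in bits.

    p_d:         pretest probabilities of the disease states, shape (k,)
    p_r_given_d: conditional probabilities of each test result given each
                 disease state, shape (k, m), rows summing to 1
    """
    p_d = np.asarray(p_d, dtype=float)
    p_r_given_d = np.asarray(p_r_given_d, dtype=float)
    p_dr = p_d[:, None] * p_r_given_d        # joint P(D, R)
    p_r = p_dr.sum(axis=0, keepdims=True)    # marginal P(R)
    nz = p_dr > 0
    return float(np.sum(p_dr[nz] * np.log2(p_dr[nz] / (p_d[:, None] * p_r)[nz])))

# Binary disease and test: 10% prevalence, 90% sensitivity, 95% specificity.
print(diagnostic_information([0.9, 0.1], [[0.95, 0.05], [0.10, 0.90]]))
```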


2010 ◽  
Vol 47 (4) ◽  
pp. 1191-1194 ◽  
Author(s):  
Paweł Hitczenko

We establish an upper bound on the tails of a random variable that arises as a solution of a stochastic difference equation. In the nonnegative case our bound is similar to a lower bound obtained in Goldie and Grübel (1996).
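
To make the object of study concrete, here is a hedged Monte Carlo sketch of such a stochastic difference equation (a perpetuity-type recursion R = AR + B; the coefficient distributions are illustrative, not those treated in the paper), printing empirical tail probabilities at a few thresholds:

```python
import numpy as np

rng = np.random.default_rng(2)
n_paths, n_steps = 100_000, 200
r = np.zeros(n_paths)
for _ in range(n_steps):
    a = rng.uniform(0.0, 1.1, n_paths)   # E[log A] < 0, so R converges a.s.
    b = rng.exponential(1.0, n_paths)
    r = a * r + b                        # one step of the difference equation
for t in (5.0, 10.0, 20.0):
    print(t, (r > t).mean())             # empirical tail P(R > t)
```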


2013 ◽  
Vol 50 (4) ◽  
pp. 909-917 ◽  
Author(s):  
M. Bondareva

In this paper we discuss a nondecreasing lower bound for the Poisson cumulative distribution function (CDF) at z standard deviations above the mean λ, where z and λ are parameters. This is important because the normal distribution, used as an approximation for the Poisson CDF, may overestimate or underestimate its value. A sharp nondecreasing lower bound in the form of a step function is constructed. As a corollary of the bound's properties, for a given percentile level α and parameter λ, the minimal z is obtained such that, for any Poisson random variable with mean greater than or equal to λ, its αth percentile is at most z standard deviations above its mean. For Poisson-distributed control parameters, the corollary allows simple policies measuring performance in terms of standard deviations from a benchmark.
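
The over- and underestimation that motivates the bound is easy to observe numerically. A small sketch (not the paper's step-function bound) comparing the exact Poisson CDF at z standard deviations above the mean with the normal approximation Φ(z):

```python
import numpy as np
from scipy.stats import norm, poisson

z = 1.0
for lam in (1, 5, 25, 100):
    x = np.floor(lam + z * np.sqrt(lam))   # integer support point below lam + z*sqrt(lam)
    print(lam, round(poisson.cdf(x, lam), 4), round(norm.cdf(z), 4))
```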


Author(s):  
Sandra Bender ◽  
Meik Dörpinghaus ◽  
Gerhard P. Fettweis

We consider a real continuous-time bandlimited additive white Gaussian noise channel with 1-bit output quantization. On such a channel the information is carried by the temporal distances of the zero-crossings of the transmit signal. We derive an approximate lower bound on the capacity by lower-bounding the mutual information rate for input signals with exponentially distributed zero-crossing distances, a sine-shaped transition waveform, and an average power constraint. The focus is on the behavior in the mid-to-high signal-to-noise ratio (SNR) regime above 10 dB. For hard bandlimited channels, the lower bound on the mutual information rate saturates as the SNR grows to infinity. For a given SNR, the loss with respect to the unquantized additive white Gaussian noise channel depends solely on the ratio of the channel bandwidth to the rate parameter of the exponential distribution. We complement these findings with an approximate upper bound on the mutual information rate for the specific signaling scheme and show that both bounds are close in the SNR domain of approximately 10–20 dB.
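
A toy simulation may help to picture the signaling scheme. The sketch below draws exponentially distributed zero-crossing distances, builds the corresponding two-level waveform, and applies noise followed by 1-bit quantization; the rectangular (rather than sine-shaped) transitions, the sampling rate, and the SNR value are simplifying assumptions, and no bandlimitation is enforced:

```python
import numpy as np

rng = np.random.default_rng(3)
fs, lam, snr_db = 1000.0, 2.0, 15.0          # sample rate, crossing rate, SNR
gaps = rng.exponential(1.0 / lam, size=50)   # zero-crossing distances (s)
t_cross = np.cumsum(gaps)
t = np.arange(0.0, t_cross[-1], 1.0 / fs)
level = 1.0 - 2.0 * (np.searchsorted(t_cross, t) % 2)  # +/-1 between crossings
noise = rng.normal(0.0, 10 ** (-snr_db / 20.0), t.shape)
rx = np.sign(level + noise)                  # 1-bit output quantization
est_cross = t[np.flatnonzero(np.diff(rx) != 0)]  # detected sign flips
```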


2021 ◽  
pp. 1-11 ◽  
Author(s):  
Bruce A. McArthur ◽  
Anthony W. Isenor

This paper examines a new interpretation of spatial mutual information based on the mutual information between an attribute value and a spatial random variable. This interpretation permits the measurement of variations in spatial mutual information over the domain, not only answering the question of whether a spatial dependency exists and how strong it is, but also allowing the identification of where such dependencies exist. Using simulated and real vessel-reporting data, the properties of this new interpretation of spatial mutual information are explored, and the utility of the technique in detecting spatial boundaries between regions of data with different statistical properties is examined. The technique is shown to successfully identify vessel traffic boundaries, crossing points between traffic lanes, and transitions between regions having differing vessel movement patterns.
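
As a rough sketch of the underlying quantity (a single global estimate; the paper's contribution is to localize it so that it varies over the domain), the mutual information between a binned attribute and the grid cell an observation falls in can be computed as follows (all names and binning choices are illustrative):

```python
import numpy as np

def spatial_mutual_information(x, y, attr, n_cells=8, n_bins=8):
    """MI (bits) between a binned attribute and its spatial grid cell."""
    cx = np.digitize(x, np.linspace(x.min(), x.max(), n_cells + 1)[1:-1])
    cy = np.digitize(y, np.linspace(y.min(), y.max(), n_cells + 1)[1:-1])
    cell = cx * n_cells + cy                  # the spatial random variable
    a = np.digitize(attr, np.linspace(attr.min(), attr.max(), n_bins + 1)[1:-1])
    joint = np.zeros((n_cells * n_cells, n_bins))
    np.add.at(joint, (cell, a), 1.0)          # joint histogram of (cell, attr)
    joint /= joint.sum()
    pc = joint.sum(axis=1, keepdims=True)
    pa = joint.sum(axis=0, keepdims=True)
    nz = joint > 0
    return float(np.sum(joint[nz] * np.log2(joint[nz] / (pc @ pa)[nz])))

rng = np.random.default_rng(4)
x, y = rng.uniform(size=2000), rng.uniform(size=2000)
attr = np.where(x < 0.5, rng.normal(0, 1, 2000), rng.normal(3, 1, 2000))
print(spatial_mutual_information(x, y, attr))  # > 0: attribute varies with location
```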

