A marginal characterization of entropy functions for conditional mutually independent random variables (with application to Wyner's common information)

Author(s):  
Qi Chen ◽  
Fan Cheng ◽  
Tie Liu ◽  
Raymond W. Yeung
1983 ◽  
Vol 20 (1) ◽  
pp. 202-208 ◽  
Author(s):  
George Kimeldorf ◽  
Peter F. Thall

It has been recently proved that if N, X1, X2, … are non-constant mutually independent random variables with X1, X2, … identically distributed and N non-negative and integer-valued, then the independence of X1 + · · · + XN and N − (X1 + · · · + XN) implies that X1 is Bernoulli and N is Poisson. A well-known theorem in point process theory due to Fichtner characterizes a Poisson process in terms of a sum of independent thinnings. In the present article, simultaneous generalizations of both of these results are provided, including a joint characterization of the multinomial distribution and the Poisson process.
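The forward direction of this result can be illustrated numerically (a sketch, not from the paper; all parameter values below are arbitrary): if N is Poisson and the Xi are i.i.d. Bernoulli, then the thinned count S = X1 + · · · + XN and the remainder N − S should be independent, so their sample correlation should be near zero.

```python
import math
import random

random.seed(0)

def poisson(lam):
    """Draw a Poisson(lam) variate (Knuth's multiplicative algorithm)."""
    limit, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= limit:
            return k
        k += 1

lam, prob, trials = 3.0, 0.4, 100_000
s_vals, rest_vals = [], []
for _ in range(trials):
    n = poisson(lam)
    s = sum(1 for _ in range(n) if random.random() < prob)  # Bernoulli thinning of N
    s_vals.append(s)
    rest_vals.append(n - s)

def corr(a, b):
    """Sample correlation coefficient of two equal-length sequences."""
    m = len(a)
    ma, mb = sum(a) / m, sum(b) / m
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b)) / m
    va = sum((x - ma) ** 2 for x in a) / m
    vb = sum((y - mb) ** 2 for y in b) / m
    return cov / (va * vb) ** 0.5

# S should look Poisson(lam * prob) with mean 1.2, and be uncorrelated with N - S.
print(sum(s_vals) / trials, corr(s_vals, rest_vals))
```

The theorem says the converse is the striking part: this independence can only happen when the Xi are Bernoulli and N is Poisson.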



1972 ◽  
Vol 9 (3) ◽  
pp. 681-683
Author(s):  
Leon Podkaminer

The probabilities of the occurrence of n events in a certain time period are calculated under the assumption that the time intervals between neighbouring events are mutually independent random variables satisfying some analytic conditions.
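One concrete instance of this setup (an illustrative sketch, not Podkaminer's construction; the horizon and sample size are arbitrary): when the inter-event intervals are exponential with unit rate, the number of events in [0, t] is Poisson(t), so the probability of no events in [0, t] is e^(−t).

```python
import math
import random

random.seed(42)

def count_events(t_end, draw_interval):
    """Number of renewal events in [0, t_end] for i.i.d. inter-event intervals."""
    t, n = 0.0, 0
    while True:
        t += draw_interval()
        if t > t_end:
            return n
        n += 1

t_end, trials = 2.0, 100_000
counts = [count_events(t_end, lambda: random.expovariate(1.0))
          for _ in range(trials)]
p0 = counts.count(0) / trials
print(p0, math.exp(-t_end))  # empirical vs exact value e^{-2} ≈ 0.1353
```

Swapping `draw_interval` for any other sampler gives the general renewal-counting probabilities the abstract refers to.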


1967 ◽  
Vol 4 (2) ◽  
pp. 402-405 ◽  
Author(s):  
H. D. Miller

Let X(t) be the position at time t of a particle undergoing a simple symmetrical random walk in continuous time, i.e. the particle starts at the origin at time t = 0 and at times T1, T1 + T2, … it undergoes jumps ξ1, ξ2, …, where the time intervals T1, T2, … between successive jumps are mutually independent random variables each following the exponential density e^(−t), while the jumps, which are independent of the Ti, are mutually independent random variables with the distribution P(ξi = +1) = P(ξi = −1) = 1/2. The process X(t) is clearly a Markov process whose state space is the set of all integers.
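The description above translates directly into a simulation (a sketch; the time horizon and sample size are arbitrary choices). Since X(t) is a sum of ±1 jumps occurring at unit-rate exponential times, E[X(t)] = 0 and Var X(t) = t.

```python
import random

random.seed(1)

def walk_position(t_end):
    """Position at time t_end of the continuous-time simple symmetric walk."""
    t, x = 0.0, 0
    while True:
        t += random.expovariate(1.0)  # exponential holding time, density e^{-t}
        if t > t_end:
            return x
        x += random.choice((1, -1))   # jump +1 or -1 with probability 1/2 each

t_end, n = 5.0, 20_000
samples = [walk_position(t_end) for _ in range(n)]
mean = sum(samples) / n
var = sum((x - mean) ** 2 for x in samples) / n
print(mean, var)  # should be near 0 and near t_end = 5
```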


1981 ◽  
Vol 18 (3) ◽  
pp. 652-659 ◽  
Author(s):  
M. J. Phillips

The negative exponential distribution is characterized in terms of two independent random variables. Only one of the random variables has a negative exponential distribution whilst the other can belong to a wide class of distributions. This result is then applied to two models for the reliability of a system of two modules subject to revealed and unrevealed faults to show when the models are equivalent. It is also shown, under certain conditions, that the system availability is only independent of the distribution of revealed failure times in one module when unrevealed failure times in the other module have a negative exponential distribution.
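The special role the negative exponential distribution plays in such reliability models stems from its memoryless property, P(T > s + t | T > s) = P(T > t). The check below is a hedged numerical illustration of that property only, not of the paper's two-module model; the rate and thresholds are arbitrary.

```python
import math
import random

random.seed(7)

# Failure times T ~ Exp(rate); the rate is an arbitrary choice for illustration.
rate, n = 0.5, 200_000
samples = [random.expovariate(rate) for _ in range(n)]

s, t = 1.0, 2.0
survived_s = [x for x in samples if x > s]
# Conditional survival P(T > s + t | T > s) vs unconditional P(T > t).
cond = sum(1 for x in survived_s if x > s + t) / len(survived_s)
uncond = math.exp(-rate * t)  # exact P(T > t) = e^{-rate * t}
print(cond, uncond)
```

For any other failure-time distribution the two quantities differ, which is why availability can become sensitive to the distribution of failure times in the non-exponential module.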




1967 ◽  
Vol 4 (1) ◽  
pp. 123-129 ◽  
Author(s):  
C. B. Mehr

Distributions of some random variables have been characterized by independence of certain functions of these random variables. For example, let X and Y be two independent and identically distributed random variables having the gamma distribution. Laha showed that U = X + Y and V = X/Y are also independent random variables. Lukacs showed that U and V are independently distributed if, and only if, X and Y have the gamma distribution. Ferguson characterized the exponential distribution in terms of the independence of X − Y and min (X, Y). The best-known of these characterizations is that first proved by Kac, which states that if random variables X and Y are independent, then X + Y and X − Y are independent if, and only if, X and Y are jointly Gaussian with the same variance. In this paper, Kac's hypotheses have been somewhat modified. In so doing, we obtain a larger class of distributions which we shall call class λ1. A subclass λ0 of λ1 enjoys many nice properties of the Gaussian distribution, in particular, in non-linear filtering.
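Kac's characterization can be probed numerically (an illustrative sketch; distributions and sample size are arbitrary choices). For i.i.d. X, Y with equal variance, X + Y and X − Y are always uncorrelated, so the simulation instead correlates their squares: for Gaussian inputs (X + Y)² and (X − Y)² are independent, while for uniform inputs they are visibly correlated, exposing the dependence of X + Y and X − Y.

```python
import random

random.seed(3)

def sq_corr(draw, n=50_000):
    """Sample correlation of (X+Y)^2 and (X-Y)^2 for i.i.d. X, Y ~ draw."""
    a, b = [], []
    for _ in range(n):
        x, y = draw(), draw()
        a.append((x + y) ** 2)
        b.append((x - y) ** 2)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((u - ma) * (v - mb) for u, v in zip(a, b)) / n
    va = sum((u - ma) ** 2 for u in a) / n
    vb = sum((v - mb) ** 2 for v in b) / n
    return cov / (va * vb) ** 0.5

g = sq_corr(lambda: random.gauss(0.0, 1.0))     # near 0: consistent with independence
u = sq_corr(lambda: random.uniform(-1.0, 1.0))  # clearly negative: dependence
print(g, u)
```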


1981 ◽  
Vol 18 (1) ◽  
pp. 316-320 ◽  
Author(s):  
George Kimeldorf ◽  
Detlef Plachky ◽  
Allan R. Sampson

Let N, X1, X2, ··· be non-constant independent random variables with X1, X2, ··· being identically distributed and N being non-negative and integer-valued. It is shown that the independence of X1 + · · · + XN and N − (X1 + · · · + XN) implies that the Xi's have a Bernoulli distribution and N has a Poisson distribution. Other related characterization results are considered.

