Separation of the maxima in samples of geometric random variables

2011 ◽  
Vol 5 (2) ◽  
pp. 271-282 ◽  
Author(s):  
Charlotte Brennan ◽  
Arnold Knopfmacher ◽  
Toufik Mansour ◽  
Stephan Wagner

We consider samples of n geometric random variables $W_1, W_2, \dots, W_n$ where $\mathbb{P}\{W_j = i\} = pq^{i-1}$, for $1 \le j \le n$, with $p + q = 1$. For each fixed integer $d > 0$, we study the probability that the distance between consecutive maxima in these samples is at least $d$. We derive a probability generating function for such samples and from it obtain an exact formula for the probability as a double sum. Using Rice's method we obtain asymptotic estimates for these probabilities. As a consequence of these results, we determine the average minimum separation of the maxima in a sample of n geometric random variables with at least two maxima.
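
The exact double-sum and the Rice asymptotics are not reproduced here, but the probability in question is easy to cross-check numerically. The following sketch is a plain Monte Carlo estimate under the stated model; the sample size n, parameter p, gap d, and the helper names (geometric_sample, maxima_separated) are illustrative choices, not part of the paper.

```python
import math
import random

def geometric_sample(n, p):
    """n i.i.d. geometric variables with P{W_j = i} = p * q**(i-1), i >= 1 (inversion method)."""
    q = 1.0 - p
    return [max(1, math.ceil(math.log(1.0 - random.random()) / math.log(q)))
            for _ in range(n)]

def maxima_separated(sample, d):
    """True if every pair of consecutive occurrences of the maximum is at least d apart."""
    m = max(sample)
    positions = [j for j, w in enumerate(sample) if w == m]
    return all(b - a >= d for a, b in zip(positions, positions[1:]))

def estimate_probability(n=50, p=0.5, d=3, trials=100_000):
    """Monte Carlo estimate of P{consecutive maxima are at least d apart}."""
    hits = sum(maxima_separated(geometric_sample(n, p), d) for _ in range(trials))
    return hits / trials

if __name__ == "__main__":
    print(estimate_probability())
```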

1982 ◽  
Vol 19 (A) ◽  
pp. 321-326 ◽  
Author(s):  
J. Gani

A direct proof of the expression for the limit probability generating function (p.g.f.) of the sum of Markov Bernoulli random variables is outlined. This depends on the larger eigenvalue of the transition probability matrix of their Markov chain.
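
The role of the larger eigenvalue can be made concrete with a small transfer-matrix computation. In the sketch below, the two-state chain, its transition probabilities, and the function names are illustrative assumptions; the point is only that $E[s^{S_n}]$ for the sum $S_n$ of Markov Bernoulli variables is a product of $s$-weighted transition matrices, whose geometric growth rate is the dominant eigenvalue.

```python
import numpy as np

def pgf_markov_bernoulli(s, n, P, pi0):
    """E[s^(X_1+...+X_n)] for a two-state Markov chain X_k in {0, 1}.

    P   -- 2x2 transition matrix, P[i, j] = P(X_{k+1} = j | X_k = i)
    pi0 -- distribution of X_1
    """
    D = np.diag([1.0, s])      # weight s for each visit to state 1
    v = pi0 @ D                # X_1 contributes as well
    M = P @ D
    for _ in range(n - 1):
        v = v @ M
    return v.sum()

if __name__ == "__main__":
    P = np.array([[0.7, 0.3],
                  [0.4, 0.6]])             # illustrative transition probabilities
    pi0 = np.array([0.5, 0.5])
    s, n = 0.9, 400
    g = pgf_markov_bernoulli(s, n, P, pi0)
    lam = max(abs(np.linalg.eigvals(P @ np.diag([1.0, s]))))
    print(g ** (1.0 / n), lam)             # per-step decay rate vs. dominant eigenvalue
```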


2007 ◽  
Vol DMTCS Proceedings vol. AH,... (Proceedings) ◽  
Author(s):  
Margaret Archibald ◽  
Arnold Knopfmacher

We consider samples of n geometric random variables $(Γ_1, Γ_2, \dots, Γ_n)$ where $\mathbb{P}\{Γ_j = i\} = pq^{i-1}$, for $1 \le j \le n$, with $p + q = 1$. The parameter we study is the position of the first occurrence of the maximum value in such a sample. We derive a probability generating function for this position with which we compute the first two (factorial) moments. The asymptotic technique known as Rice's method then yields the main terms as well as the Fourier expansions of the fluctuating functions arising in the expected value and the variance.
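
As a sanity check on the expected value, the position of the first maximum can also be estimated by simulation. The parameters and helper names below are arbitrary illustrative choices, not taken from the paper.

```python
import math
import random

def geometric_sample(n, p):
    """n i.i.d. geometric variables with P{Gamma_j = i} = p * q**(i-1), i >= 1."""
    q = 1.0 - p
    return [max(1, math.ceil(math.log(1.0 - random.random()) / math.log(q)))
            for _ in range(n)]

def first_max_position(sample):
    """1-based index of the first occurrence of the maximum value."""
    return sample.index(max(sample)) + 1

def mean_first_max_position(n=100, p=0.5, trials=50_000):
    """Monte Carlo estimate of the expected position of the first maximum."""
    total = sum(first_max_position(geometric_sample(n, p)) for _ in range(trials))
    return total / trials

if __name__ == "__main__":
    print(mean_first_max_position())
```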


1979 ◽  
Vol 16 (3) ◽ 
pp. 513-525 ◽  
Author(s):  
Andrew D. Barbour ◽  
H.-J. Schuh

It is well known that, in a Bienaymé–Galton–Watson process $(Z_n)$ with $1 < m = EZ_1 < \infty$ and $EZ_1 \log Z_1 < \infty$, the sequence of random variables $Z_n m^{-n}$ converges a.s. to a non-degenerate limit. When $m = \infty$, an analogous result holds: for any $0 < α < 1$, it is possible to find functions $U$ such that $α^n U(Z_n)$ converges a.s. to a non-degenerate limit. In this paper, some sufficient conditions, expressed in terms of the probability generating function of $Z_1$ and of its distribution function, are given under which a particular pair $(α, U)$ is appropriate for $(Z_n)$. The most stringent set of conditions reduces, when $U(x) \equiv x$, to the requirements $EZ_1 = 1/α$, $EZ_1 \log Z_1 < \infty$.
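
For the finite-mean case, the almost sure convergence of $Z_n m^{-n}$ is easy to observe numerically. The sketch below uses a Poisson offspring law purely as an illustrative choice (it satisfies $EZ_1 \log Z_1 < \infty$); nothing here reflects the infinite-mean constructions of the paper.

```python
import numpy as np

def gw_trajectory(m=2.0, generations=25, z0=1, rng=None):
    """Z_0, ..., Z_N for a Galton-Watson process with Poisson(m) offspring."""
    rng = rng or np.random.default_rng()
    traj = [z0]
    for _ in range(generations):
        z = traj[-1]
        # the sum of z i.i.d. Poisson(m) offspring counts is Poisson(m * z)
        traj.append(int(rng.poisson(m * z)) if z > 0 else 0)
    return traj

if __name__ == "__main__":
    m = 2.0
    for _ in range(5):
        traj = gw_trajectory(m)
        # on survival, Z_n / m**n should settle near a random limit W
        print([round(z / m ** k, 3) for k, z in enumerate(traj)][-5:])
```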


1998 ◽  
Vol 12 (3) ◽  
pp. 321-323
Author(s):  
Mitsushi Tamaki

We explicitly give the probability mass function and the probability generating function of the first k-record index for a sequence of independent and identically distributed random variables that take on a finite set of possible values. We also compute its factorial moments.
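
The factorial moments here come from differentiating the probability generating function at $s = 1$, i.e. $E[X(X-1)\cdots(X-r+1)] = g^{(r)}(1)$. The sketch below only illustrates that standard relationship, using a geometric law as an arbitrary stand-in distribution; it is not the k-record index distribution derived in the paper.

```python
import sympy as sp

s, p = sp.symbols("s p", positive=True)
q = 1 - p

# p.g.f. of a geometric law on {1, 2, ...}: g(s) = p*s / (1 - q*s)
g = p * s / (1 - q * s)

# the r-th factorial moment E[X(X-1)...(X-r+1)] equals the r-th derivative of g at s = 1
for r in (1, 2, 3):
    moment = sp.simplify(sp.diff(g, s, r).subs(s, 1))
    print(r, moment)   # 1/p, 2*q/p**2, 6*q**2/p**3
```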


2021 ◽  
Vol 0 (0) ◽  
Author(s):  
Nikolaos Halidias

In this note we study the probability of and the mean time to absorption for discrete-time Markov chains. In particular, we are interested in estimating the mean time to absorption when absorption is not certain, and we connect it with some other known results. By computing a suitable probability generating function, we are able to estimate the mean time to absorption when absorption is not certain, giving some applications concerning the random walk. Furthermore, we investigate the probability that a Markov chain reaches a set $A$ before a set $B$, generalizing this result to a sequence of sets $A_1, A_2, \dots, A_k$.
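
For a finite chain these quantities reduce to linear systems obtained by first-step analysis. The sketch below is a gambler's-ruin illustration only (finite state space, so absorption is certain, unlike the note's main setting); the state space size, bias p, and function names are assumptions made for the example.

```python
import numpy as np

def gamblers_ruin(N=10, p=0.6, start=3):
    """Biased random walk on {0, ..., N}, up with probability p, absorbed at 0 and N.

    Returns (P(hit 0 before N), E[time to absorption]) from `start`, via the
    first-step systems h_i = q*h_{i-1} + p*h_{i+1} and t_i = 1 + q*t_{i-1} + p*t_{i+1}
    with boundary values h_0 = 1, h_N = 0, t_0 = t_N = 0.
    """
    q = 1.0 - p
    states = list(range(1, N))                 # transient (interior) states
    n = len(states)
    A = np.eye(n)
    for k, i in enumerate(states):
        if i - 1 >= 1:
            A[k, k - 1] -= q
        if i + 1 <= N - 1:
            A[k, k + 1] -= p
    b_hit = np.array([q if i == 1 else 0.0 for i in states])   # boundary term from h_0 = 1
    h = np.linalg.solve(A, b_hit)
    t = np.linalg.solve(A, np.ones(n))
    return h[start - 1], t[start - 1]

if __name__ == "__main__":
    print(gamblers_ruin())
```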


Mathematics ◽  
2021 ◽  
Vol 9 (8) ◽  
pp. 868
Author(s):  
Khrystyna Prysyazhnyk ◽  
Iryna Bazylevych ◽  
Ludmila Mitkova ◽  
Iryna Ivanochko

The homogeneous branching process with migration and continuous time is considered. We investigated the distribution of the period-life τ, i.e., the length of the time interval between the moment when the process is initiated by a positive number of particles and the moment when there are no individuals in the population for the first time. The probability generating function of the random process, which describes the behavior of the process within the period-life, was obtained. The boundary theorem for the period-life of the subcritical or critical branching process with migration was found.
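
Purely to illustrate the notion of the period-life, the sketch below simulates a continuous-time birth-death process with a constant inflow rate (a stand-in for the paper's migration mechanism, which is not specified in the abstract) and measures the time until the population first empties; all rates are illustrative assumptions.

```python
import random

def period_life(z0=1, birth=0.9, death=1.0, migration=0.2, t_max=1e6):
    """Time until the population first hits 0, for a continuous-time process with
    per-individual birth rate `birth`, per-individual death rate `death`, and a
    constant inflow rate `migration` (Gillespie simulation)."""
    t, z = 0.0, z0
    while z > 0 and t < t_max:
        total = birth * z + death * z + migration
        t += random.expovariate(total)
        u = random.random() * total
        if u < birth * z:
            z += 1                       # a birth
        elif u < (birth + death) * z:
            z -= 1                       # a death
        else:
            z += 1                       # an arriving migrant
    return t

if __name__ == "__main__":
    samples = [period_life() for _ in range(10_000)]
    print(sum(samples) / len(samples))   # rough mean period-life for these rates
```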


1975 ◽  
Vol 12 (3) ◽  
pp. 507-514 ◽  
Author(s):  
Henry Braun

The problem of approximating an arbitrary probability generating function (p.g.f.) $g(\cdot)$ by a polynomial $L_N(\cdot)$ is considered. It is shown that if the coefficients $r_j$ of $L_N$ are chosen so that $L_N(\cdot)$ agrees with $g(\cdot)$ to $k$ derivatives at $s = 1$ and to $(N - k)$ derivatives at $s = 0$, then $L_N$ is in fact an upper or lower bound to $g$; the nature of the bound depends only on $k$ and not on $N$. Applications of the results to the problems of finding bounds for extinction probabilities, extinction time distributions and moments of branching process distributions are examined.
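
One concrete special case (taking $k = 0$, so that $L_N$ matches $g$ to $N$ derivatives at $s = 0$): because a p.g.f. has non-negative power-series coefficients, the truncated series is a lower bound for $g$ on $[0, 1]$, and iterating it from 0 yields a lower bound on the extinction probability of the associated branching process. The sketch below checks this numerically for a Poisson offspring law; the offspring distribution, the mean m, and the function names are illustrative assumptions, not taken from the paper.

```python
import math

def poisson_pgf(s, m):
    """p.g.f. of a Poisson(m) offspring law: exp(m*(s - 1))."""
    return math.exp(m * (s - 1.0))

def truncated_pgf(s, m, N):
    """Degree-N Taylor polynomial of the Poisson p.g.f. at s = 0 (a lower bound on [0, 1])."""
    return sum(math.exp(-m) * (m * s) ** j / math.factorial(j) for j in range(N + 1))

def smallest_fixed_point(f, iterations=5000):
    """Smallest fixed point of an increasing map f on [0, 1], by monotone iteration from 0."""
    x = 0.0
    for _ in range(iterations):
        x = f(x)
    return x

if __name__ == "__main__":
    m = 1.5                                   # supercritical, so extinction probability < 1
    exact = smallest_fixed_point(lambda s: poisson_pgf(s, m))
    for N in (2, 4, 8):
        lower = smallest_fixed_point(lambda s, N=N: truncated_pgf(s, m, N))
        print(N, round(lower, 6), "<=", round(exact, 6))
```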

