On determining absorption probabilities for Markov chains in random environments

1981 ◽  
Vol 13 (2) ◽  
pp. 369-387 ◽  
Author(s):  
Richard D. Bourgin ◽  
Robert Cogburn

The general framework of a Markov chain in a random environment is presented and the problem of determining extinction probabilities is discussed. An efficient method for determining absorption probabilities and criteria for certain absorption are presented in the case that the environmental process is a two-state Markov chain. These results are then applied to birth and death, queueing and branching chains in random environments.
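The paper treats absorption when the environment is itself a two-state Markov chain. As a minimal illustration of the kind of computation involved (a sketch, not the authors' method), the following sets up a small birth-and-death chain on a joint (population, environment) state space and solves for absorption probabilities by first-step analysis; all transition probabilities and state labels are assumed for illustration.

```python
import numpy as np

# Hypothetical example: a small birth-and-death chain on {0, 1, 2, 3} whose
# step-up probability depends on a two-state environment e in {0, 1}.
# Population levels 0 and 3 are absorbing; the joint chain lives on
# (population, environment) pairs.

p_up = {0: 0.6, 1: 0.4}            # birth probability in each environment (assumed)
env = np.array([[0.9, 0.1],        # two-state environmental Markov chain (assumed)
                [0.2, 0.8]])

states = [(i, e) for i in range(4) for e in range(2)]
idx = {s: k for k, s in enumerate(states)}
P = np.zeros((len(states), len(states)))

for (i, e), k in idx.items():
    if i in (0, 3):                          # absorbing population levels
        P[k, k] = 1.0
        continue
    for e2 in range(2):                      # chain steps using the current environment,
        P[k, idx[(i + 1, e2)]] += env[e, e2] * p_up[e]         # then the environment moves
        P[k, idx[(i - 1, e2)]] += env[e, e2] * (1 - p_up[e])

transient = [k for (i, e), k in idx.items() if i not in (0, 3)]
absorb_at_0 = [k for (i, e), k in idx.items() if i == 0]

Q = P[np.ix_(transient, transient)]          # transient -> transient block
R = P[np.ix_(transient, absorb_at_0)]        # transient -> {population 0} block

# First-step analysis: h = Q h + R 1, so (I - Q) h = R 1
h = np.linalg.solve(np.eye(len(transient)) - Q, R.sum(axis=1))
for k, prob in zip(transient, h):
    print(states[k], "P(absorbed at 0) =", round(float(prob), 4))
```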


2021 ◽  
Vol 0 (0) ◽  
Author(s):  
Nikolaos Halidias

Abstract: In this note we study the probability of absorption and the mean time to absorption for discrete-time Markov chains. In particular, we are interested in estimating the mean time to absorption when absorption is not certain, and we connect it with some other known results. By computing a suitable probability generating function we are able to estimate the mean time to absorption when absorption is not certain, giving some applications concerning the random walk. Furthermore, we investigate the probability that a Markov chain reaches a set $A$ before a set $B$, generalizing this result to a sequence of sets $A_1, A_2, \dots, A_k$.
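Both quantities in this abstract, the probability of reaching one set before another and the mean time to absorption, reduce to linear systems via first-step analysis on a finite state space. A minimal sketch for a simple random walk follows; the walk parameters and boundary sets are assumptions, not the paper's examples, and the sketch covers only the finite case in which absorption is certain (the paper's focus on uncertain absorption is subtler).

```python
import numpy as np

# Hypothetical setting: a simple random walk on {0, 1, ..., N}, stepping up with
# probability p and down with probability 1 - p, absorbed at A = {0} and B = {N}.
# On this finite interval absorption is certain, so both quantities below follow
# from first-step analysis.

N, p = 10, 0.45
interior = list(range(1, N))               # transient states 1, ..., N-1
pos = {i: r for r, i in enumerate(interior)}

A = np.eye(len(interior))                  # will hold I - Q
b_hit = np.zeros(len(interior))            # RHS for h(i) = P(reach 0 before N | start i)
b_time = np.ones(len(interior))            # RHS for t(i) = E[steps to absorption | start i]

for i in interior:
    r = pos[i]
    if i - 1 >= 1:
        A[r, pos[i - 1]] -= (1 - p)
    else:
        b_hit[r] += (1 - p)                # a down-step from 1 hits A = {0} immediately
    if i + 1 <= N - 1:
        A[r, pos[i + 1]] -= p
    # an up-step from N-1 is absorbed in B = {N}: it contributes to neither RHS

hit_A_before_B = np.linalg.solve(A, b_hit)
mean_absorption_time = np.linalg.solve(A, b_time)

print("P(hit 0 before N | start at 5):", round(float(hit_A_before_B[pos[5]]), 4))
print("E[steps to absorption | start at 5]:", round(float(mean_absorption_time[pos[5]]), 2))
```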


1981 ◽  
Vol 18 (1) ◽ 
pp. 19-30 ◽  
Author(s):  
Robert Cogburn ◽  
William C. Torrez

A generalization to continuous time is given for a discrete-time model of a birth and death process in a random environment. Some important properties of this process in the continuous-time setting are stated and proved, including instability and extinction conditions. When suitable absorbing barriers have been defined, methods are given for calculating extinction probabilities and the expected duration of the process.
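As a rough companion to the continuous-time setting (a Monte Carlo sketch, not the authors' analytical methods), one can simulate a birth-and-death process whose rates switch with a two-state environment and estimate the extinction probability and the mean time to extinction; all rates below are assumptions.

```python
import random

# Monte Carlo sketch: a continuous-time birth-and-death process whose
# per-individual rates switch with a two-state environment.
# All rates below are assumptions chosen only for illustration.

birth = {0: 1.0, 1: 0.6}     # per-individual birth rate in each environment
death = {0: 0.8, 1: 1.2}     # per-individual death rate in each environment
switch = 0.5                 # environment switching rate (symmetric)

def run(x0=5, t_max=200.0):
    t, x, e = 0.0, x0, 0
    while t < t_max and x > 0:
        rates = [birth[e] * x, death[e] * x, switch]
        total = sum(rates)
        t += random.expovariate(total)               # time to the next event
        u = random.uniform(0.0, total)
        if u < rates[0]:
            x += 1                                   # birth
        elif u < rates[0] + rates[1]:
            x -= 1                                   # death
        else:
            e = 1 - e                                # environment switches
        if x > 10_000:                               # treat explosion as survival
            break
    return x == 0, t

runs = [run() for _ in range(5_000)]
extinct = [t for died, t in runs if died]
print("estimated extinction probability:", len(extinct) / len(runs))
if extinct:
    print("mean time to extinction (given extinction):", sum(extinct) / len(extinct))
```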


1973 ◽  
Vol 10 (3) ◽ 
pp. 659-665 ◽ 
Author(s):  
Donald C. Raffety

R-positivity theory for Markov chains is used to obtain results for random environment branching processes whose environment random variables are independent and identically distributed and whose environmental extinction probabilities are equal. For certain processes whose eventual extinction is almost sure, it is shown that the distribution of population size conditioned by non-extinction at time n tends to a left eigenvector of the transition matrix. Limiting values of other conditional probabilities are given in terms of this left eigenvector and it is shown that the probability of non-extinction at time n approaches zero geometrically as n approaches ∞. Analogous results are obtained for processes whose extinction is not almost sure.
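The limiting objects described here, a left eigenvector of the transition matrix and a geometric rate of decay for the non-extinction probability, can be illustrated numerically on a truncated subcritical branching chain. The sketch below is only that kind of illustration, in a fixed (non-random) environment; the offspring law and the truncation level M are assumptions.

```python
import numpy as np

# Numerical illustration only (fixed environment, not the paper's random one):
# for a subcritical branching chain truncated to population sizes 0..M, the left
# Perron eigenvector of the substochastic block Q (transitions among non-extinct
# states) is the limiting distribution of population size conditioned on
# non-extinction, and the Perron eigenvalue rho < 1 is the geometric rate at
# which P(non-extinct at time n) tends to zero.

M = 30
offspring = np.array([0.3, 0.5, 0.2])       # P(0), P(1), P(2) offspring per individual

def next_size_dist(i):
    """Distribution of the next population size given current size i >= 1."""
    d = np.array([1.0])
    for _ in range(i):                      # convolve i independent offspring counts
        d = np.convolve(d, offspring)
    return d

P = np.zeros((M + 1, M + 1))
P[0, 0] = 1.0                               # extinction is absorbing
for i in range(1, M + 1):
    for j, pr in enumerate(next_size_dist(i)):
        P[i, min(j, M)] += pr               # lump any overflow into state M

Q = P[1:, 1:]                               # restriction to the non-extinct states
vals, vecs = np.linalg.eig(Q.T)             # left eigenvectors of Q
k = np.argmax(vals.real)
rho = float(vals[k].real)
left = np.abs(vecs[:, k].real)
left /= left.sum()

print("Perron eigenvalue rho =", round(rho, 4))
print("conditional mass on sizes 1..5:", np.round(left[:5], 4))
# P(non-extinct at time n) decays roughly like C * rho**n as n grows.
```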


1996 ◽  
Vol 33 (2) ◽ 
pp. 357-367 ◽  
Author(s):  
M. V. Koutras

In this paper we consider a class of reliability structures which can be efficiently described through (imbedded in) finite Markov chains. Some general results are provided for the reliability evaluation and generating functions of such systems. Finally, it is shown that a great variety of well-known reliability structures can be accommodated in this general framework, and certain properties of those structures are obtained by using their Markov chain imbedding description.
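A standard example of the imbedding idea (used here only as an illustration, with assumed values of n, k and the component reliability p) is the consecutive-k-out-of-n:F system: an auxiliary Markov chain tracks the current run of failed components, and the system's reliability is the probability that the chain avoids the absorbing "failed" state after processing n components.

```python
import numpy as np

# Well-known special case used only to illustrate the imbedding idea: a
# consecutive-k-out-of-n:F system fails when k consecutive components fail.
# n, k and the common component reliability p below are assumed values.

def consecutive_k_out_of_n_reliability(n, k, p):
    # States 0..k-1: current run length of failures; state k: system failed (absorbing).
    T = np.zeros((k + 1, k + 1))
    for r in range(k):
        T[r, 0] = p              # component works: the failure run resets
        T[r, r + 1] = 1 - p      # component fails: the failure run grows
    T[k, k] = 1.0                # absorbing system-failure state

    pi = np.zeros(k + 1)
    pi[0] = 1.0                  # start with an empty failure run
    for _ in range(n):           # imbed: process the n components one at a time
        pi = pi @ T
    return 1.0 - pi[k]           # reliability = P(never absorbed in n steps)

print(consecutive_k_out_of_n_reliability(n=10, k=3, p=0.9))
```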


2018 ◽  
Vol 33 (4) ◽  
pp. 528-563 ◽ 
Author(s):  
Joel Ratsaby

Abstract: The general problem under investigation is to understand how the complexity of a system which has been adapted to its random environment affects the level of randomness of its output (which is a function of its random input). In this paper, we consider a specific instance of this problem in which a deterministic finite-state decision system operates in a random environment that is modeled by a binary Markov chain. The system interacts with it by trying to match states of inactivity (represented by 0). Matching means that the system selects the $(t+1)$th bit from the Markov chain whenever it predicts at time $t$ that the environment will take a 0 value. The actual value at time $t+1$ may be 0 or 1, so the selected sequence of bits (which forms the system's output) may contain both binary values. To try to predict well, the system's decision function is inferred from a sample of the random environment. We are interested in assessing how non-random the output sequence may be. To do that, we apply the adapted system to a second random sample of the environment and derive an upper bound on the deviation between the average number of 1 bits in the output sequence and the probability of a 1. The bound shows that the complexity of the system has a direct effect on this deviation, and hence on how non-random the output sequence may be. The bound takes the form $O(\sqrt{2^k/n})$, where $2^k$ is the complexity of the system and $n$ is the length of the second sample.
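As a purely numerical illustration of how such a bound scales (the constant factor is not given in this summary, so only the bare rate $\sqrt{2^k/n}$ is shown):

```python
import math

# Numerical illustration only: the scaling of a deviation bound of order
# sqrt(2**k / n) in the system complexity 2**k and the sample length n.

for k in (4, 8, 12):
    for n in (10_000, 100_000, 1_000_000):
        print(f"k={k:2d}  n={n:>9,}  sqrt(2^k/n) = {math.sqrt(2 ** k / n):.4f}")
```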


1987 ◽  
Vol 24 (1) ◽ 
pp. 25-34 ◽  
Author(s):  
Richard Cornez

A generalization of a birth and death chain in a random environment $(Y_n, Z_n)$ is developed allowing for feedback to the environmental process $(Y_n)$. The resulting process is then known as a birth and death chain in a random environment with feedback. Sufficient conditions are found under which the $(Z_n)$ process goes extinct almost surely or has strictly positive probability of non-extinction.
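A Monte Carlo sketch of the feedback mechanism (illustrative only; all probabilities and the feedback rule are assumptions, not the paper's model): the environment $(Y_n)$ switches with a probability that depends on the current population $(Z_n)$, and the extinction frequency is estimated by simulation.

```python
import random

# Illustrative sketch: a discrete-time birth-and-death chain (Z_n) in a
# two-state environment (Y_n) with feedback, i.e. the environment's switching
# probability depends on the current population.  All numbers are assumptions.

p_up = {0: 0.55, 1: 0.40}                  # birth probability in each environment

def switch_prob(z):
    # Feedback: a larger population makes the environment more likely to switch.
    return min(0.05 + 0.01 * z, 0.5)

def run(z0=5, horizon=2_000):
    z, y = z0, 0
    for _ in range(horizon):
        if z == 0:
            return True                    # extinction
        if random.random() < switch_prob(z):
            y = 1 - y                      # environment reacts to the population
        z += 1 if random.random() < p_up[y] else -1
    return False                           # not extinct within the horizon

trials = 20_000
print("estimated extinction probability:", sum(run() for _ in range(trials)) / trials)
```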

