Inhomogeneous Markov chains with periodic matrices of transition probabilities and their application to simulation of meteorological processes

Author(s): N. A. Kargapolova ◽ V. A. Ogorodnikov
2004 ◽ Vol 2004 (8) ◽ pp. 421-429
Author(s): Souad Assoudou ◽ Belkheir Essebbar

This note is concerned with Bayesian estimation of the transition probabilities of a binary Markov chain observed from heterogeneous individuals. The model is founded on the Jeffreys prior, which allows the transition probabilities to be correlated. The Bayesian estimator is approximated by means of Markov chain Monte Carlo (MCMC) techniques. The performance of the Bayesian estimates is illustrated by analyzing a small simulated data set.
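As a rough numerical sketch of this setup (an illustration, not the paper's method): the snippet below simulates a binary chain with hypothetical transition probabilities p_true and q_true, then places independent row-wise Jeffreys priors Beta(1/2, 1/2) on the two rows. This is a simplification of the correlated Jeffreys prior studied in the note; with independent rows the posterior is conjugate and can be sampled directly instead of by MCMC.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical true transition probabilities of a binary Markov chain:
# p = P(next = 1 | current = 0), q = P(next = 0 | current = 1)
p_true, q_true = 0.3, 0.6

# Simulate a chain of length n
n = 500
x = np.empty(n, dtype=int)
x[0] = 0
for t in range(1, n):
    if x[t - 1] == 0:
        x[t] = rng.random() < p_true          # jump 0 -> 1 with prob p
    else:
        x[t] = rng.random() >= q_true         # stay in 1 with prob 1 - q

# Transition counts n_ij = #{t : x_t = i, x_{t+1} = j}
counts = np.zeros((2, 2))
for a, b in zip(x[:-1], x[1:]):
    counts[a, b] += 1

# Row-wise Jeffreys prior Beta(1/2, 1/2) is conjugate: each row's
# posterior is Beta(n_i,other + 1/2, n_i,same + 1/2); sample it directly.
post_p = rng.beta(counts[0, 1] + 0.5, counts[0, 0] + 0.5, size=10_000)
post_q = rng.beta(counts[1, 0] + 0.5, counts[1, 1] + 0.5, size=10_000)

print("posterior mean of p:", post_p.mean())
print("posterior mean of q:", post_q.mean())
```

Under the paper's correlated Jeffreys prior the posterior is no longer a product of Beta distributions, which is why MCMC is needed there.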


Author(s): Peter L. Chesson

Random transition probability matrices with stationary independent factors define “white noise” environment processes for Markov chains. Two examples are considered in detail. Such environment processes can be used to construct several Markov chains which are dependent, have the same transition probabilities, and are jointly a Markov chain. Transition rates for such processes are evaluated. These results have application to the study of animal movements.
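A minimal simulation of such a "white noise" environment, with Dirichlet-distributed rows as an assumed (purely illustrative) distribution for the random matrices: a fresh transition matrix is drawn independently at each step and shared by two chains, which are then dependent, have the same transition probabilities, and together form a Markov chain.

```python
import numpy as np

rng = np.random.default_rng(1)

def random_stochastic_matrix(k, rng):
    """Draw a k x k transition matrix with i.i.d. Dirichlet(1,...,1) rows."""
    return rng.dirichlet(np.ones(k), size=k)

k, n = 3, 1000
x = np.zeros(n, dtype=int)   # first chain
y = np.zeros(n, dtype=int)   # second chain, driven by the same environment

for t in range(1, n):
    P = random_stochastic_matrix(k, rng)   # fresh matrix each step: "white noise"
    x[t] = rng.choice(k, p=P[x[t - 1]])
    y[t] = rng.choice(k, p=P[y[t - 1]])

# Marginally each chain moves according to the averaged matrix E[P];
# for Dirichlet(1,1,1) rows that average is uniform (all entries 1/3).
print("fraction of time x spends in state 0:", np.mean(x == 0))
```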


1968 ◽ Vol 5 (2) ◽ pp. 401-413
Author(s): Paul J. Schweitzer

A perturbation formalism is presented which shows how the stationary distribution and fundamental matrix of a Markov chain containing a single irreducible set of states change as the transition probabilities vary. Expressions are given for the partial derivatives of the stationary distribution and fundamental matrix with respect to the transition probabilities. Semi-group properties of the generators of transformations from one Markov chain to another are investigated. It is shown that a perturbation formalism exists in the multiple-subchain case if and only if the change in the transition probabilities does not alter the number of, or intermix, the various subchains. The formalism is presented when this condition is satisfied.
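The quantities involved can be computed explicitly for a small chain. The sketch below, for a hypothetical 3-state irreducible chain, builds the stationary distribution pi and the fundamental matrix Z = (I - P + Pi)^(-1), where Pi has every row equal to pi, and checks the standard first-order perturbation formula pi(P + E) ≈ pi + pi E Z for a perturbation E whose rows sum to zero; it is a numerical illustration of the single-irreducible-set case, not a reproduction of the paper's derivations.

```python
import numpy as np

# Hypothetical 3-state irreducible chain
P = np.array([[0.5, 0.3, 0.2],
              [0.1, 0.6, 0.3],
              [0.2, 0.2, 0.6]])

def stationary(P):
    """Left eigenvector of P for eigenvalue 1, normalized to sum to 1."""
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmax(np.real(w))])
    return pi / pi.sum()

pi = stationary(P)
Pi = np.outer(np.ones(len(P)), pi)           # each row is pi
Z = np.linalg.inv(np.eye(len(P)) - P + Pi)   # fundamental matrix

# Small perturbation E of P: rows sum to 0, so P + E is still stochastic
E = 1e-3 * np.array([[-1.0, 1.0, 0.0],
                     [ 0.0, 0.0, 0.0],
                     [ 0.0, 0.0, 0.0]])
dpi = pi @ E @ Z                             # first-order change in pi
pi_exact = stationary(P + E)

# Agreement up to second-order terms in E
print(np.max(np.abs(pi + dpi - pi_exact)))
```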


1977 ◽ Vol 14 (02) ◽ pp. 298-308
Author(s): Peter R. Nelson

In a single-shelf library having infinitely many books B1, B2, …, the probability of selecting each book is assumed known. Books are removed one at a time and replaced in position k prior to the next removal. Books are moved either to the right or to the left as necessary to vacate position k. Those arrangements of books in which, after some finite position, all the books are in natural order (book i occupies position i) are considered as states of an infinite Markov chain. When k > 1, we show that the chain can never be positive recurrent. When k = 1, we find the limits of ratios of one-step transition probabilities; and when k = 1 and the chain is transient, we find the Martin exit boundary.
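The k = 1 ("move to front") case is easy to simulate on a finite truncation of the shelf. In the sketch below, the shelf size N, the ratio r, and the geometric selection probabilities are all hypothetical choices, not taken from the paper:

```python
import random

random.seed(0)

# Finite truncation of the single-shelf model: books 1..N, where book i
# is selected with (hypothetical) probability proportional to r**i.
N, r, k = 20, 0.7, 1             # k = 1 is the move-to-front case
weights = [r**i for i in range(1, N + 1)]
shelf = list(range(1, N + 1))    # start in natural order

def step(shelf):
    book = random.choices(range(1, N + 1), weights=weights)[0]
    shelf.remove(book)           # take the book off the shelf
    shelf.insert(k - 1, book)    # replace it in position k; others shift

for _ in range(5000):
    step(shelf)

# Under move-to-front, frequently requested books drift toward the front.
print(shelf[:5])
```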


1987 ◽ Vol 1 (3) ◽ pp. 251-264
Author(s): Sheldon M. Ross

In this paper we propose a new approach for estimating the transition probabilities and mean occupation times of continuous-time Markov chains. Our approach is to approximate the probability of being in a state (or the mean time already spent in a state) at time t by the probability of being in that state (or the mean time already spent in that state) at a random time that is gamma distributed with mean t.
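One standard parameterization of this randomization has a closed form: if the chain has generator matrix Q and T is gamma distributed with integer shape n and mean t, then the matrix with entries P(X_T = j | X_0 = i) equals (I - (t/n)Q)^(-n), since T is a sum of n independent exponentials and each stage contributes one resolvent factor. The sketch below compares this approximation with the exact matrix exponential for a hypothetical two-state generator:

```python
import numpy as np

# Hypothetical 2-state continuous-time chain with generator Q (rows sum to 0)
Q = np.array([[-1.0,  1.0],
              [ 2.0, -2.0]])
t = 1.5

def gamma_approx(Q, t, n):
    """Transition probabilities at an independent Gamma(n, mean t) time.
    Equals (I - (t/n) Q)^(-n): the gamma time is a sum of n independent
    exponential stages, each contributing one resolvent factor."""
    k = len(Q)
    R = np.linalg.inv(np.eye(k) - (t / n) * Q)
    return np.linalg.matrix_power(R, n)

# Exact transition probabilities exp(Qt) via eigendecomposition
w, V = np.linalg.eig(Q)
exact = (V @ np.diag(np.exp(w * t)) @ np.linalg.inv(V)).real

for n in (1, 4, 16, 64):
    err = np.max(np.abs(gamma_approx(Q, t, n) - exact))
    print(f"n = {n:3d}  max error = {err:.5f}")
```

As n grows, the gamma time concentrates around its mean t and the approximation converges to the exact transition probabilities.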


2008 ◽ Vol 45 (03) ◽ pp. 640-649
Author(s): Victor de la Peña ◽ Henryk Gzyl ◽ Patrick McDonald

Let W_n be a simple Markov chain on the integers. Suppose that X_n is a simple Markov chain on the integers whose transition probabilities coincide with those of W_n off a finite set. We prove that there is an M > 0 such that the Markov chain W_n and the joint distributions of the first hitting time and first hitting place of X_n started at the origin for the sets {-M, M} and {-(M + 1), (M + 1)} algorithmically determine the transition probabilities of X_n.


1975 ◽ Vol 12 (04) ◽ pp. 744-752
Author(s): Richard L. Tweedie

In many Markov chain models, the immediate characteristic of importance is the positive recurrence of the chain. In this note we investigate whether positivity, and also recurrence, are robust properties of Markov chains when the transition laws are perturbed. The chains we consider are on a fairly general state space: when specialised to a countable space, our results are essentially that, if the transition matrices of two irreducible chains coincide on all but a finite number of columns, then positivity of one implies positivity of both; whilst if they coincide on all but a finite number of rows and columns, recurrence of one implies recurrence of both. Examples are given to show that these results (and their general analogues) cannot in general be strengthened.

