Markovian couplings staying in arbitrary subsets of the state space

2002 ◽  
Vol 39 (1) ◽  
pp. 197-212 ◽  
Author(s):  
F. Javier López ◽  
Gerardo Sanz

Let (X_t) and (Y_t) be continuous-time Markov chains with countable state spaces E and F and let K be an arbitrary subset of E × F. We give necessary and sufficient conditions on the transition rates of (X_t) and (Y_t) for the existence of a coupling which stays in K. We also show that, when such a coupling exists, it can be chosen to be Markovian, and we give a way to construct it. In the case E = F and K ⊆ E × E, we show how the construction of the coupling can be simplified. We give some examples of the use and application of our results, including a new concept of lumpability in Markov chains.
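A hedged gloss, in notation of my own (not quoted from the paper): a coupling of (X_t) and (Y_t) is a process on E × F whose first and second components are Markov chains with the transition rates of (X_t) and (Y_t), respectively; it stays in K when, started in K, it never leaves K:

\[
\mathbb{P}\bigl( (\tilde{X}_t, \tilde{Y}_t) \in K \ \text{for all } t \ge 0 \ \big|\ (\tilde{X}_0, \tilde{Y}_0) \in K \bigr) = 1 .
\]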


1989 ◽  
Vol 26 (3) ◽  
pp. 643-648 ◽  
Author(s):  
A. I. Zeifman

We consider a non-homogeneous continuous-time Markov chain X(t) with countable state space. Definitions of uniform and strong quasi-ergodicity are introduced. The forward Kolmogorov system for X(t) is considered as a differential equation in the space of sequences l_1. Sufficient conditions for uniform quasi-ergodicity are deduced from this equation. We consider conditions of uniform and strong ergodicity in the case of proportional intensities.
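For orientation, in standard notation (mine, not the paper's): writing p(t) for the row vector of state probabilities and Q(t) for the time-dependent intensity matrix, the forward Kolmogorov system viewed as a differential equation in l_1 reads

\[
\frac{d\,p(t)}{dt} = p(t)\, Q(t), \qquad p(t) \in l_1 .
\]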


1993 ◽  
Vol 7 (4) ◽  
pp. 529-543 ◽  
Author(s):  
P. K. Pollett ◽  
P. G. Taylor

We consider the problem of establishing the existence of stationary distributions for continuous-time Markov chains directly from the transition rates Q. Given an invariant probability distribution m for Q, we show that a necessary and sufficient condition for m to be a stationary distribution for the minimal process is that Q be regular. We provide sufficient conditions for the regularity of Q that are simple to verify in practice, thus allowing one to easily identify stationary distributions for a variety of models. To illustrate our results, we shall consider three classes of multidimensional Markov chains, namely, networks of queues with batch movements, semireversible queues, and partially balanced Markov processes.
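Schematically, in the abstract's notation: m is an invariant probability distribution for Q when it solves the global balance equations, and the stated result is that such an m is stationary for the minimal process P(t) precisely when Q is regular (non-explosive):

\[
m\,Q = 0, \quad \sum_i m_i = 1, \qquad
m\,P(t) = m \ \text{for all } t \ge 0 \iff Q \ \text{is regular}.
\]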


1993 ◽  
Vol 30 (3) ◽  
pp. 518-528 ◽  
Author(s):  
Frank Ball ◽  
Geoffrey F. Yeo

We consider lumpability for continuous-time Markov chains and provide a simple probabilistic proof of necessary and sufficient conditions for strong lumpability, valid in circumstances not covered by known theory. We also consider the following marginalisability problem. Let {X(t)} = {(X_1(t), X_2(t), ..., X_m(t))} be a continuous-time Markov chain. Under what conditions are the marginal processes {X_1(t)}, {X_2(t)}, ..., {X_m(t)} also continuous-time Markov chains? We show that this is related to lumpability and that, if no two of the marginal processes can jump simultaneously, they are continuous-time Markov chains if and only if they are mutually independent. Applications to ion channel modelling and birth–death processes are discussed briefly.
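As a reminder of the notion involved (standard formulation, not quoted from the paper): a chain with intensity matrix Q = (q(i, j)) is strongly lumpable with respect to a partition {C_1, C_2, ...} of its state space when the aggregated rate from a state into each other class does not depend on which state of its class the chain occupies, i.e.

\[
\sum_{j \in C_l} q(i, j) = \sum_{j \in C_l} q(i', j)
\qquad \text{for all } i, i' \in C_k \text{ and all } l \ne k .
\]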


Mathematics ◽  
2020 ◽  
Vol 8 (2) ◽  
pp. 253 ◽  
Author(s):  
Alexander Zeifman ◽  
Victor Korolev ◽  
Yacov Satin

This paper is largely a review. It considers the two main methods used to study stability and to obtain quantitative perturbation estimates for (inhomogeneous) continuous-time Markov chains with a finite or countable state space. An approach is described to the construction of perturbation bounds for the five main classes of such chains associated with queuing models. Several specific models are considered for which the limiting characteristics and perturbation bounds for admissible “perturbed” processes are calculated.


2000 ◽  
Vol 32 (4) ◽  
pp. 1064-1076 ◽  
Author(s):  
F. Javier López ◽  
Servet Martínez ◽  
Gerardo Sanz

For continuous-time Markov chains with semigroups P, P', taking values in a partially ordered set and such that P ≤_st P', we show the existence of an order-preserving Markovian coupling and give a way to construct it. From our proof, we also obtain the conditions of Brandt and Last for stochastic domination in terms of the associated intensity matrices. Our result is applied to get necessary and sufficient conditions for the existence of Markovian couplings between two Jackson networks.
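Here ≤_st denotes stochastic domination of the semigroups; in the usual formulation (my paraphrase), P ≤_st P' means that for every bounded increasing function f, every t ≥ 0 and all ordered initial states x ≤ y,

\[
\sum_{z} P_t(x, z)\, f(z) \;\le\; \sum_{z} P'_t(y, z)\, f(z) .
\]

An order-preserving Markovian coupling is then one whose two components remain ordered for all time, provided they start ordered.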


2002 ◽  
Vol 39 (4) ◽  
pp. 901-904 ◽  
Author(s):  
P. K. Pollett ◽  
V. T. Stefanov

This note presents a method of evaluating the distribution of a path integral for Markov chains on a countable state space.
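For context (notation mine): by a path integral of a chain (X_t) with respect to a function f on the state space one usually means the additive functional

\[
W_t = \int_0^t f(X_s)\, ds ,
\]

and it is the distribution of such functionals that the note evaluates.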


1991 ◽  
Vol 23 (02) ◽  
pp. 277-292 ◽  
Author(s):  
P. K. Pollett

The problem of determining invariant measures for continuous-time Markov chains directly from their transition rates is considered. The major result provides necessary and sufficient conditions for the existence of a unique ‘single-exit’ chain with a specified invariant measure. This generalizes a result of Hou Chen-Ting and Chen Mufa that deals with symmetrically reversible chains. A simple sufficient condition for the existence of a unique honest chain for which the specified measure is invariant is also presented.
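In coordinates (my notation): a measure μ = (μ_j) is invariant for the transition rates Q = (q_{ij}) when it satisfies the balance equations

\[
\sum_{i} \mu_i\, q_{ij} = 0 \qquad \text{for all } j ,
\]

and the question is when there exists a unique single-exit (respectively, honest) chain with rates Q for which μ is invariant.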

