Lumpability and marginalisability for continuous-time Markov chains

1993 ◽  
Vol 30 (3) ◽  
pp. 518-528 ◽  
Author(s):  
Frank Ball ◽  
Geoffrey F. Yeo

We consider lumpability for continuous-time Markov chains and provide a simple probabilistic proof of necessary and sufficient conditions for strong lumpability, valid in circumstances not covered by known theory. We also consider the following marginalisability problem. Let {X(t)} = {(X1(t), X2(t), …, Xm(t))} be a continuous-time Markov chain. Under what conditions are the marginal processes {X1(t)}, {X2(t)}, …, {Xm(t)} also continuous-time Markov chains? We show that this is related to lumpability and, if no two of the marginal processes can jump simultaneously, then they are continuous-time Markov chains if and only if they are mutually independent. Applications to ion channel modelling and birth–death processes are discussed briefly.
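
The strong lumpability condition referred to here has a simple computational form: for every pair of lumping classes, each state in one class must have the same aggregate rate into the other class. The sketch below checks that condition for a finite generator; the matrix, partition and function names are illustrative and not taken from the paper.

```python
import numpy as np

def is_strongly_lumpable(Q, partition, tol=1e-10):
    """Check the classical strong-lumpability condition for a CTMC generator Q.

    Q         : (n, n) generator matrix (rows sum to zero).
    partition : list of lists of state indices forming the lumping classes.

    Condition checked: for every ordered pair of distinct classes (C_k, C_l),
    the aggregate rate sum_{j in C_l} q_ij is the same for every i in C_k.
    """
    Q = np.asarray(Q, dtype=float)
    for k, Ck in enumerate(partition):
        for l, Cl in enumerate(partition):
            if k == l:
                continue
            rates = [Q[i, Cl].sum() for i in Ck]
            if max(rates) - min(rates) > tol:
                return False
    return True

# Example: a 3-state chain in which states 1 and 2 can be lumped.
Q = np.array([[-2.0,  1.0,  1.0],
              [ 3.0, -4.0,  1.0],
              [ 3.0,  1.0, -4.0]])
print(is_strongly_lumpable(Q, [[0], [1, 2]]))  # True: states 1 and 2 each send rate 3 to {0}
```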


1989 ◽  
Vol 26 (3) ◽  
pp. 643-648 ◽  
Author(s):  
A. I. Zeifman

We consider a non-homogeneous continuous-time Markov chain X(t) with countable state space. Definitions of uniform and strong quasi-ergodicity are introduced. The forward Kolmogorov system for X(t) is considered as a differential equation in the space of sequences l1. Sufficient conditions for uniform quasi-ergodicity are deduced from this equation. We consider conditions of uniform and strong ergodicity in the case of proportional intensities.
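
As an illustration of treating the forward Kolmogorov system as a differential equation, the sketch below integrates p'(t) = p(t)Q(t) for a finite truncation of a birth–death chain with proportional intensities Q(t) = a(t)Q0. The truncation level, rates and intensity function are arbitrary choices for illustration; the paper itself works in the sequence space l1 over a countable state space.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Finite truncation of a birth-death chain (illustrative rates only).
n = 20
birth, death = 1.0, 1.5
Q0 = np.zeros((n, n))
for i in range(n):
    if i + 1 < n:
        Q0[i, i + 1] = birth
    if i > 0:
        Q0[i, i - 1] = i * death
    Q0[i, i] = -Q0[i].sum()

a = lambda t: 1.0 + 0.5 * np.sin(t)      # time-dependent intensity factor
rhs = lambda t, p: a(t) * (p @ Q0)       # forward Kolmogorov system p'(t) = p(t) Q(t)

p0 = np.zeros(n); p0[5] = 1.0            # start in state 5
sol = solve_ivp(rhs, (0.0, 10.0), p0, rtol=1e-8, atol=1e-10)
print(sol.y[:, -1].round(4), sol.y[:, -1].sum())   # distribution at t = 10 (sums to 1)
```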


1988 ◽  
Vol 2 (2) ◽  
pp. 267-268
Author(s):  
Sheldon M. Ross

In [1] an approach to approximate the transition probabilities and mean occupation times of a continuous-time Markov chain is presented. For the chain under consideration, let Pij(t) and Tij(t) denote respectively the probability that it is in state j at time t, and the total time spent in j by time t, in both cases conditional on the chain starting in state i. Also, let Y1,…, Yn be independent exponential random variables each with rate λ = n/t, which are also independent of the Markov chain.
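
The construction with n independent Exp(n/t) stages leads, via the Markov property, to the matrix approximation P(t) ≈ (I − (t/n)Q)^(−n), which converges to e^(tQ) as n grows. The sketch below, using an arbitrary small generator, illustrates this convergence; it is a minimal reading of the construction, not a reproduction of [1].

```python
import numpy as np
from scipy.linalg import expm

def transition_probs_approx(Q, t, n):
    """Approximate P(t) = expm(t*Q) by evaluating the chain at the sum of n
    independent Exp(n/t) times: P(t) ~ (I - (t/n) Q)^(-n)."""
    Q = np.asarray(Q, dtype=float)
    m = Q.shape[0]
    A = np.linalg.inv(np.eye(m) - (t / n) * Q)   # one exponential stage
    return np.linalg.matrix_power(A, n)          # n independent stages

# Example on a small generator; the error shrinks as n grows.
Q = np.array([[-1.0,  1.0,  0.0],
              [ 0.5, -1.5,  1.0],
              [ 0.0,  2.0, -2.0]])
t = 1.5
for n in (4, 16, 64):
    err = np.abs(transition_probs_approx(Q, t, n) - expm(t * Q)).max()
    print(n, err)
```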


2000 ◽  
Vol 32 (4) ◽  
pp. 1064-1076 ◽  
Author(s):  
F. Javier López ◽  
Servet Martínez ◽  
Gerardo Sanz

For continuous-time Markov chains with semigroups P, P' taking values in a partially ordered set, such that P ≤_st P', we show the existence of an order-preserving Markovian coupling and give a way to construct it. From our proof, we also obtain the conditions of Brandt and Last for stochastic domination in terms of the associated intensity matrices. Our result is applied to get necessary and sufficient conditions for the existence of Markovian couplings between two Jackson networks.
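
One way to see what an order-preserving Markovian coupling looks like in a concrete case is to drive two birth–death chains with ordered rates by a common uniformized clock and a shared uniform variable. The sketch below is only an illustration for hand-picked rates; it is neither the paper's construction nor the Brandt–Last conditions.

```python
import numpy as np

rng = np.random.default_rng(0)
N = 10  # state space {0, ..., N}

# Birth-death rates with lam_x <= lam_y and mu_x >= mu_y in every state,
# so the Y-chain stochastically dominates the X-chain.
lam_x = lambda i: 1.0 if i < N else 0.0
lam_y = lambda i: 2.0 if i < N else 0.0
mu_x  = lambda i: 1.5 if i > 0 else 0.0
mu_y  = lambda i: 1.0 if i > 0 else 0.0

Lam = 1.0 + 2.0 + 1.5 + 1.0   # dominates every combination of rates used below

def coupled_step(x, y, u):
    """One uniformized event driven by a single uniform u, preserving x <= y."""
    nx = x + 1 if u < lam_x(x) / Lam else x - 1 if u > 1 - mu_x(x) / Lam else x
    ny = y + 1 if u < lam_y(y) / Lam else y - 1 if u > 1 - mu_y(y) / Lam else y
    return nx, ny

x, y, t, T = 0, 0, 0.0, 50.0
while t < T:
    t += rng.exponential(1.0 / Lam)          # common Poisson clock
    x, y = coupled_step(x, y, rng.uniform())
    assert x <= y                            # the order is never violated
print("final states:", x, y)
```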


2002 ◽  
Vol 39 (1) ◽  
pp. 197-212 ◽  
Author(s):  
F. Javier López ◽  
Gerardo Sanz

Let (Xt) and (Yt) be continuous-time Markov chains with countable state spaces E and F and let K be an arbitrary subset of E × F. We give necessary and sufficient conditions on the transition rates of (Xt) and (Yt) for the existence of a coupling which stays in K. We also show that when such a coupling exists, it can be chosen to be Markovian and give a way to construct it. In the case E = F and K ⊆ E × E, we see how the problem of construction of the coupling can be simplified. We give some examples of use and application of our results, including a new concept of lumpability in Markov chains.


1993 ◽  
Vol 7 (4) ◽  
pp. 529-543 ◽  
Author(s):  
P. K. Pollett ◽  
P. G. Taylor

We consider the problem of establishing the existence of stationary distributions for continuous-time Markov chains directly from the transition rates Q. Given an invariant probability distribution m for Q, we show that a necessary and sufficient condition for m to be a stationary distribution for the minimal process is that Q be regular. We provide sufficient conditions for the regularity of Q that are simple to verify in practice, thus allowing one to easily identify stationary distributions for a variety of models. To illustrate our results, we shall consider three classes of multidimensional Markov chains, namely, networks of queues with batch movements, semireversible queues, and partially balanced Markov processes.
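
For a finite state space the rates are bounded, so Q is automatically regular and the invariant probability distribution can be obtained directly from mQ = 0. A minimal sketch, using an M/M/1/N queue whose invariant distribution is known in closed form as a check (the rates and truncation level are arbitrary):

```python
import numpy as np

def stationary_distribution(Q):
    """Solve m Q = 0 with sum(m) = 1 for a finite, irreducible generator Q."""
    Q = np.asarray(Q, dtype=float)
    n = Q.shape[0]
    # Replace one balance equation by the normalisation constraint.
    A = np.vstack([Q.T[:-1], np.ones(n)])
    b = np.zeros(n); b[-1] = 1.0
    return np.linalg.solve(A, b)

# M/M/1/N queue with arrival rate lam and service rate mu; its invariant
# distribution is proportional to (lam/mu)^i.
lam, mu, N = 1.0, 2.0, 8
Q = np.zeros((N + 1, N + 1))
for i in range(N + 1):
    if i < N:
        Q[i, i + 1] = lam
    if i > 0:
        Q[i, i - 1] = mu
    Q[i, i] = -Q[i].sum()

m = stationary_distribution(Q)
rho = lam / mu
print(np.allclose(m, rho ** np.arange(N + 1) * (1 - rho) / (1 - rho ** (N + 1))))  # True
```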


2020 ◽  
Vol 57 (4) ◽  
pp. 1313-1338
Author(s):  
Yuanyuan Liu ◽  
Wendi Li ◽  
Xiuqin Li

Block-structured Markov chains model a large variety of queueing problems and have many important applications in various areas. Stability properties have been well investigated for these Markov chains. In this paper we will present transient properties for two specific types of block-structured Markov chains, including M/G/1 type and GI/M/1 type. Necessary and sufficient conditions in terms of system parameters are obtained for geometric transience and algebraic transience. Possible extensions of the results to continuous-time Markov chains are also included.
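
For level-independent block-structured chains of this kind, the classical mean-drift condition compares the average upward and downward rates under the stationary phase distribution. The sketch below evaluates that condition for a small quasi-birth–death generator; it conveys the flavour of conditions expressed in terms of system parameters, but it is not the paper's geometric or algebraic transience criteria, and the blocks are invented for illustration.

```python
import numpy as np

def qbd_drift(A0, A1, A2):
    """Classical mean-drift check for a level-independent QBD generator with
    blocks A0 (level up), A1 (within level), A2 (level down).

    Returns (up_drift, down_drift); the chain is positive recurrent when
    up_drift < down_drift and transient when up_drift > down_drift."""
    A = A0 + A1 + A2                       # generator of the phase process
    n = A.shape[0]
    M = np.vstack([A.T[:-1], np.ones(n)])
    b = np.zeros(n); b[-1] = 1.0
    pi = np.linalg.solve(M, b)             # stationary phase distribution
    return pi @ A0.sum(axis=1), pi @ A2.sum(axis=1)

# Toy 2-phase example (rates are arbitrary illustrative numbers).
A0 = np.array([[1.0, 0.0], [0.0, 2.0]])       # upward rates
A2 = np.array([[3.0, 0.0], [0.0, 1.0]])       # downward rates
A1 = np.array([[-5.0, 1.0], [2.0, -5.0]])     # within-level rates; diagonal balances the rows
print(qbd_drift(A0, A1, A2))                  # up-drift < down-drift here
```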


1994 ◽  
Vol 26 (4) ◽  
pp. 919-946 ◽  
Author(s):  
Frank Ball ◽  
Robin K. Milne ◽  
Geoffrey F. Yeo

We study a bivariate stochastic process {X(t)} = {(XE(t), Z(t))}, where {XE(t)} is a continuous-time Markov chain describing the environment and {Z(t)} is the process of interest. In the context which motivated this study, {Z(t)} models the gating behaviour of a single ion channel. It is assumed that given {XE(t)}, the channel process {Z(t)} is a continuous-time Markov chain with infinitesimal generator at time t dependent on XE(t), and that the environment process {XE(t)} is not dependent on {Z(t)}. We derive necessary and sufficient conditions for {X(t)} to be time reversible, showing that then its equilibrium distribution has a product form which reflects independence of the state of the environment and the state of the channel. In the special case when the environment controls the speed of the channel process, we derive transition probabilities and sojourn time distributions for {Z(t)} by exploiting connections with Markov reward processes. Some of these results are extended to a stationary environment. Applications to problems arising in modelling multiple ion channel systems are discussed. In particular, we present ways in which a multichannel model in a random environment does and does not exhibit behaviour identical to a corresponding model based on independent and identically distributed channels.
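
The bivariate process can be assembled concretely: on pairs (environment, channel) the generator is a Kronecker term for the environment plus a block-diagonal term for the channel given each environment state. The sketch below builds this generator for the speed-controlled special case, where the channel generator in environment e is c_e times a fixed generator, and checks numerically that the equilibrium distribution then has the product form described in the abstract; the rate matrices are illustrative choices, not taken from the paper.

```python
import numpy as np

def joint_generator(QE, QZ_list):
    """Generator of the bivariate chain (environment, channel): the environment
    moves according to QE while the channel, given environment state e, moves
    according to QZ_list[e]. States are ordered (e, z) with z varying fastest."""
    m = QE.shape[0]
    nz = QZ_list[0].shape[0]
    Q = np.kron(QE, np.eye(nz))                     # environment transitions
    for e, QZ in enumerate(QZ_list):                # channel transitions within each block
        Q[e * nz:(e + 1) * nz, e * nz:(e + 1) * nz] += QZ
    return Q

def stationary(Q):
    n = Q.shape[0]
    A = np.vstack([Q.T[:-1], np.ones(n)])
    b = np.zeros(n); b[-1] = 1.0
    return np.linalg.solve(A, b)

# Environment switches between "slow" and "fast" and rescales the speed of a
# two-state (closed/open) channel, as in the speed-controlled special case.
QE = np.array([[-0.2, 0.2], [0.5, -0.5]])
QZ = np.array([[-1.0, 1.0], [3.0, -3.0]])
Q = joint_generator(QE, [0.5 * QZ, 2.0 * QZ])       # speeds 0.5 and 2.0

pi = stationary(Q).reshape(2, 2)                    # rows: environment, cols: channel
pi_env, pi_chan = pi.sum(axis=1), pi.sum(axis=0)
print(np.allclose(pi, np.outer(pi_env, pi_chan)))   # product form holds in this case
```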

