A Note on Approximating Mean Occupation Times of Continuous-Time Markov Chains

1988 ◽  
Vol 2 (2) ◽  
pp. 267-268
Author(s):  
Sheldon M. Ross

In [1] an approach to approximating the transition probabilities and mean occupation times of a continuous-time Markov chain is presented. For the chain under consideration, let Pij(t) and Tij(t) denote, respectively, the probability that it is in state j at time t and the total time spent in j by time t, in both cases conditional on the chain starting in state i. Also, let Y1, …, Yn be independent exponential random variables, each with rate λ = n/t, which are independent of the Markov chain.
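The excerpt does not reproduce the identity that makes these exponential inspection times convenient; a sketch, assuming the chain has a generator matrix Q (the symbol Q is an assumption, not named in the abstract):

```latex
% Why exponential inspection times are convenient; Q denotes the (assumed) generator.
\[
  \Pr\{X(Y_1)=j \mid X(0)=i\}
  = \int_0^\infty \lambda e^{-\lambda s} P_{ij}(s)\,ds
  = \bigl[\lambda(\lambda I - Q)^{-1}\bigr]_{ij},
  \qquad \lambda = n/t .
\]
Iterating over the $n$ independent inspection times, whose sum is gamma distributed with mean $t$,
\[
  \Pr\{X(Y_1+\cdots+Y_n)=j \mid X(0)=i\}
  = \Bigl[\bigl(I - \tfrac{t}{n}\,Q\bigr)^{-n}\Bigr]_{ij}
  \;\longrightarrow\; P_{ij}(t)
  \quad\text{as } n \to \infty .
\]
```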

1993 ◽  
Vol 30 (3) ◽  
pp. 518-528 ◽  
Author(s):  
Frank Ball ◽  
Geoffrey F. Yeo

We consider lumpability for continuous-time Markov chains and provide a simple probabilistic proof of necessary and sufficient conditions for strong lumpability, valid in circumstances not covered by known theory. We also consider the following marginalisability problem. Let {X(t)} = {(X1(t), X2(t), ···, Xm(t))} be a continuous-time Markov chain. Under what conditions are the marginal processes {X1(t)}, {X2(t)}, ···, {Xm(t)} also continuous-time Markov chains? We show that this is related to lumpability and, if no two of the marginal processes can jump simultaneously, then they are continuous-time Markov chains if and only if they are mutually independent. Applications to ion channel modelling and birth–death processes are discussed briefly.
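For intuition, a minimal numerical check of the usual strong-lumpability condition (every state of a lump must have the same aggregated rate into each other lump); the 4-state generator Q and the candidate partition below are illustrative, not taken from the paper:

```python
import numpy as np

def is_strongly_lumpable(Q, partition, tol=1e-12):
    """Check the usual strong-lumpability condition for a CTMC generator Q:
    every state in a lump must have the same total rate into each other lump."""
    for block in partition:
        for other in partition:
            if other is block:
                continue
            # Aggregated rate from each state of `block` into the lump `other`.
            rates = [sum(Q[x, z] for z in other) for x in block]
            if max(rates) - min(rates) > tol:
                return False
    return True

# Illustrative 4-state generator (rows sum to zero); candidate lumps {0, 1} and {2, 3}.
Q = np.array([[-3.0,  1.0,  1.0,  1.0],
              [ 2.0, -4.0,  0.5,  1.5],
              [ 1.0,  1.0, -2.5,  0.5],
              [ 0.5,  1.5,  2.0, -4.0]])
print(is_strongly_lumpable(Q, [[0, 1], [2, 3]]))  # True: each lump sends total rate 2.0 to the other
```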


1987 ◽  
Vol 1 (3) ◽  
pp. 251-264 ◽  
Author(s):  
Sheldon M. Ross

In this paper we propose a new approach for estimating the transition probabilities and mean occupation times of continuous-time Markov chains. Our approach is to approximate the probability of being in a state (or the mean time already spent in a state) at time t by the probability of being in that state (or the mean time already spent in that state) at a random time that is gamma distributed with mean t.
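Read together with the companion note above (a gamma time with mean t arising as the sum of n independent Exp(n/t) inspection times), the approximation can be computed as a resolvent power. A minimal sketch, in which the 3-state generator Q is an illustrative assumption:

```python
import numpy as np
from scipy.linalg import expm

def gamma_time_approx(Q, t, n):
    """State distribution at a Gamma(n, n/t) time (mean t), i.e. after n
    inspections at independent Exp(n/t) times: (I - (t/n) Q)^(-n)."""
    I = np.eye(Q.shape[0])
    one_step = np.linalg.inv(I - (t / n) * Q)   # one exponential inspection
    return np.linalg.matrix_power(one_step, n)  # n independent inspections

# Illustrative 3-state generator (not from the paper).
Q = np.array([[-2.0,  1.0,  1.0],
              [ 0.5, -1.0,  0.5],
              [ 1.0,  2.0, -3.0]])
t = 1.5
exact = expm(Q * t)                             # true transition probabilities P(t)
for n in (1, 5, 25, 125):
    err = np.max(np.abs(gamma_time_approx(Q, t, n) - exact))
    print(f"n = {n:3d}   max |error| = {err:.2e}")
```

The printed error should shrink roughly in proportion to 1/n as the gamma distribution concentrates around its mean t.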


1989 ◽  
Vol 26 (3) ◽  
pp. 643-648 ◽  
Author(s):  
A. I. Zeifman

We consider a non-homogeneous continuous-time Markov chain X(t) with countable state space. Definitions of uniform and strong quasi-ergodicity are introduced. The forward Kolmogorov system for X(t) is considered as a differential equation in the space of sequences l1. Sufficient conditions for uniform quasi-ergodicity are deduced from this equation. We consider conditions of uniform and strong ergodicity in the case of proportional intensities.
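As a reminder of the object being analysed, a sketch of the forward Kolmogorov system written in coordinates; the notation (time-dependent intensities q_{ij}(t), probability vector p(t)) is assumed rather than quoted from the paper:

```latex
% Forward Kolmogorov system for the non-homogeneous chain X(t), viewed as a
% differential equation in l_1 (notation assumed).
\[
  \frac{d p(t)}{dt} = A(t)\,p(t),
  \qquad p(t) = \bigl(p_0(t), p_1(t), \dots\bigr)^{\top} \in l_1,
\]
componentwise
\[
  \frac{d p_j(t)}{dt}
  = \sum_{i \ne j} q_{ij}(t)\,p_i(t)
    \;-\; \Bigl(\sum_{i \ne j} q_{ji}(t)\Bigr) p_j(t),
  \qquad j = 0, 1, \dots
\]
```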


1988 ◽  
Vol 2 (4) ◽  
pp. 471-474 ◽  
Author(s):  
Nico M. van Dijk

Recently, Ross [1] proposed an elegant method of approximating transition probabilities and mean occupation times in continuous-time Markov chains, based upon recursively inspecting the process at exponential times. The method turned out to be amazingly efficient for the examples investigated. However, no formal error bound was provided. Any error bound, even a rough one, is of practical interest in engineering (e.g., for determining truncation criteria or setting up an experiment). This note primarily aims to show that a simple and standard comparison relation secures a rough error bound for the method. Also, some alternative approximations are inspected.
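For orientation only, a scalar heuristic (not the comparison bound derived in the note) indicating why the n-step exponential-inspection scheme has error of order 1/n; it assumes |ta| < n so that the logarithmic expansion is valid:

```latex
% Rough scalar heuristic: one inspection step overshoots by O(n^{-2}), the n-fold
% product by O(n^{-1}).  This is an illustration, not the bound in the note.
\[
  \Bigl(1 - \frac{ta}{n}\Bigr)^{-n}
  = \exp\!\Bigl(ta + \frac{t^2 a^2}{2n} + O(n^{-2})\Bigr)
  = e^{ta}\Bigl(1 + \frac{t^2 a^2}{2n}\Bigr) + O(n^{-2}),
\]
so for fixed $t$ the error of the exponential-inspection approximation decays like $1/n$.
```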


1971 ◽  
Vol 8 (2) ◽ 
pp. 381-390 ◽  
Author(s):  
P. J. Pedler

Consider first a Markov chain with two ergodic states E1 and E2, and discrete time parameter set {0, 1, 2, ···, n}. Define the random variables Z0, Z1, Z2, ···, Zn by then the conditional probabilities for k = 1, 2, ···, n are independent of k. Thus the matrix of transition probabilities is
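The displayed definition of the Zk and the transition matrix did not survive the text extraction; for orientation only, a generic two-state chain of this kind has a transition matrix of the following form (the parameter names are illustrative, not Pedler's):

```latex
% Generic two-state transition matrix; parameter names are illustrative.
\[
  P \;=\;
  \begin{pmatrix}
    1-\alpha & \alpha \\
    \beta    & 1-\beta
  \end{pmatrix},
  \qquad 0 < \alpha,\beta < 1,
\]
where $\alpha$ is the one-step probability of moving from $E_1$ to $E_2$ and
$\beta$ that of moving from $E_2$ to $E_1$.
```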


1968 ◽  
Vol 5 (3) ◽ 
pp. 669-678 ◽  
Author(s):  
Jozef L. Teugels

A general proposition is proved stating that the exponential ergodicity of a stationary Markov chain is preserved for derived Markov chains as defined by Cohen [2], [3]. An application to a certain type of continuous-time Markov chain is included.


2014 ◽  
Vol 51 (A) ◽  
pp. 57-62
Author(s):  
Joe Gani

One of the standard methods for approximating a bivariate continuous-time Markov chain {X(t), Y(t): t ≥ 0}, which proves too difficult to solve in its original form, is to replace one of its variables by its mean. This leads to a simplified stochastic process for the remaining variable which can usually be solved, although the technique is not always optimal. In this note we consider two cases where the method is successful, for carrier infections and mutating bacteria, and one case, for SIS epidemics, where it is somewhat less so.
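A minimal sketch of the mean-replacement device for a carrier-type model of the kind mentioned; the specific formulation (a Weiss-style carrier-borne epidemic) and all parameter values are illustrative assumptions, not taken from the paper:

```python
import numpy as np

# Mean-replacement sketch for a carrier-type infection model (illustrative):
# carriers Y(t) are removed at rate mu per carrier, and susceptibles X(t) are
# infected at rate beta * X(t) * Y(t).  Replacing Y(t) by its mean
# E[Y(t)] = y0 * exp(-mu * t) leaves a pure death process for X(t) whose mean
# solves a linear ODE with the closed form used below.

beta, mu = 0.02, 1.0     # infection and carrier-removal rates (assumed values)
x0, y0 = 100, 10         # initial numbers of susceptibles and carriers

def mean_susceptibles(t):
    """E[X(t)] in the simplified (mean-replaced) process."""
    return x0 * np.exp(-(beta * y0 / mu) * (1.0 - np.exp(-mu * t)))

for t in (0.5, 1.0, 2.0, 5.0):
    print(f"t = {t:3.1f}   E[X(t)] ≈ {mean_susceptibles(t):6.2f}")
```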

