Rootfinding for Markov Chains with Quasi-Triangular Transition Matrices

1988 ◽  
Author(s):  
Carl M. Harris

1977 ◽  
Vol 9 (04) ◽  
pp. 747-764
Author(s):  
Burton Singer ◽  
Seymour Spilerman

In a wide variety of multi-wave panel studies in economics and sociology, comparisons between the observed transition matrices and predictions of them based on time-homogeneous Markov chains have revealed a special kind of discrepancy: the trace of the observed matrices tends to be larger than the trace of the predicted matrices. A common explanation for this discrepancy has been via mixtures of Markov chains. Specializing to mixtures of Markov semi-groups of the form we exhibit classes of stochastic matrices M, probability measures µ and time intervals Δ such that for k = 2, 3 and 4. These examples contradict the substantial literature which suggests — implicitly — that the above inequality should be reversed for general mixtures of Markov semi-groups.
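
As a quick numerical illustration of the discrepancy described above (not of the authors' counterexamples, whose construction is given in the paper), the sketch below mixes two made-up 2-state chains with equal weights: the trace of the population's observed two-step matrix exceeds the trace predicted by squaring the fitted one-step matrix.

```python
import numpy as np

# Two hypothetical 2-state chains mixed with equal weights (illustrative values only).
P1 = np.array([[0.9, 0.1],
               [0.1, 0.9]])   # "stayers": strong diagonal persistence
P2 = np.array([[0.5, 0.5],
               [0.5, 0.5]])   # "movers": no persistence
w1, w2 = 0.5, 0.5

M = w1 * P1 + w2 * P2                      # fitted one-step matrix of the mixed population
predicted = np.linalg.matrix_power(M, 2)   # time-homogeneous prediction for two waves
observed = w1 * P1 @ P1 + w2 * P2 @ P2     # what the heterogeneous population actually does

print(np.trace(observed), np.trace(predicted))   # 1.32 > 1.16: observed trace is larger
```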


1975 ◽  
Vol 12 (04) ◽  
pp. 744-752 ◽  
Author(s):  
Richard L. Tweedie

In many Markov chain models, the immediate characteristic of importance is the positive recurrence of the chain. In this note we investigate whether positivity, and also recurrence, are robust properties of Markov chains when the transition laws are perturbed. The chains we consider are on a fairly general state space: when specialised to a countable space, our results are essentially that, if the transition matrices of two irreducible chains coincide on all but a finite number of columns, then positivity of one implies positivity of both; whilst if they coincide on all but a finite number of rows and columns, recurrence of one implies recurrence of both. Examples are given to show that these results (and their general analogues) cannot in general be strengthened.


2008 ◽  
Vol 40 (04) ◽  
pp. 1157-1173
Author(s):  
Winfried K. Grassmann ◽  
Javad Tavakoli

This paper deals with censoring of infinite-state banded Markov chains. Censoring involves reducing the time spent in states outside a certain set of states to 0 without affecting the number of visits within this set. We show that, if all states are transient, there is, besides the standard censored Markov chain, a nonstandard censored Markov chain which is stochastic. Both the stochastic and the substochastic solutions are found by censoring a sequence of finite transition matrices. If all matrices in the sequence are stochastic, the stochastic solution arises in the limit, whereas the substochastic solution arises if the matrices in the sequence are substochastic. We also show that, if the Markov chain is recurrent, the only solution is the stochastic solution. Censoring is particularly fruitful when applied to quasi-birth-and-death (QBD) processes. It turns out that key matrices in such processes are not unique, a fact that has been observed by several authors. We note that the stochastic solution is important for the analysis of finite queues.
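
For a finite chain, the standard censored matrix mentioned above is given by the usual formula P_EE + P_EF (I - P_FF)^{-1} P_FE, where E is the retained set and F its complement; the paper's infinite-state construction censors a sequence of such finite matrices. A minimal sketch on a small made-up chain:

```python
import numpy as np

def censor(P, keep):
    """Censor a finite stochastic matrix P onto the index set `keep`:
    P_EE + P_EF (I - P_FF)^{-1} P_FE, where F is the complement of E."""
    keep = np.asarray(keep)
    out = np.setdiff1d(np.arange(P.shape[0]), keep)
    P_EE = P[np.ix_(keep, keep)]
    P_EF = P[np.ix_(keep, out)]
    P_FF = P[np.ix_(out, out)]
    P_FE = P[np.ix_(out, keep)]
    return P_EE + P_EF @ np.linalg.solve(np.eye(len(out)) - P_FF, P_FE)

# Toy 4-state chain (made-up numbers), censored onto the states {0, 1}.
P = np.array([[0.5, 0.2, 0.3, 0.0],
              [0.1, 0.6, 0.1, 0.2],
              [0.3, 0.0, 0.4, 0.3],
              [0.0, 0.4, 0.2, 0.4]])
C = censor(P, [0, 1])
print(C, C.sum(axis=1))   # row sums are 1: the censored chain is stochastic here
```

Because this toy chain is finite, irreducible and hence recurrent, the censored matrix is stochastic; in the transient infinite-state setting described above, the analogous limit can instead be substochastic.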


1998 ◽  
Vol 30 (2) ◽  
pp. 365-384 ◽  
Author(s):  
Yiqiang Q. Zhao ◽  
Wei Li ◽  
W. John Braun

In this paper, we study Markov chains with infinite state block-structured transition matrices, whose states are partitioned into levels according to the block structure, and various associated measures. Roughly speaking, these measures involve first passage times or expected numbers of visits to certain levels without hitting other levels. They are very important and often play a key role in the study of a Markov chain. Necessary and/or sufficient conditions are obtained for a Markov chain to be positive recurrent, recurrent, or transient in terms of these measures. Results are obtained for general irreducible Markov chains as well as those with transition matrices possessing some block structure. We also discuss the decomposition or the factorization of the characteristic equations of these measures. In the scalar case, we locate the zeros of these characteristic functions and therefore use these zeros to characterize a Markov chain. Examples and various remarks are given to illustrate some of the results.
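
As a scalar illustration of the kind of measures mentioned above (the paper develops their block and level analogues), the expected number of visits to each state before first entering a taboo set can be read off the fundamental matrix (I - Q)^{-1}, where Q is the transition matrix restricted to the non-taboo states; the small chain and taboo set below are made up for the example.

```python
import numpy as np

# Expected number of visits to each non-taboo state before first entering the
# taboo set, computed from the fundamental matrix (I - Q)^{-1}.
P = np.array([[0.2, 0.5, 0.3],
              [0.4, 0.3, 0.3],
              [0.1, 0.1, 0.8]])
taboo = [2]                                 # hypothetical taboo set
free = [i for i in range(P.shape[0]) if i not in taboo]
Q = P[np.ix_(free, free)]                   # transitions restricted to non-taboo states
N = np.linalg.inv(np.eye(len(free)) - Q)    # N[i, j] = E[# visits to j | start in i]
print(N)
print(N.sum(axis=1))                        # expected number of steps before hitting the taboo set
```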


2007 ◽  
Vol 424 (1) ◽  
pp. 118-131 ◽  
Author(s):  
S.J. Kirkland ◽  
Michael Neumann ◽  
Jianhong Xu

2016 ◽  
Vol 53 (3) ◽  
pp. 946-952
Author(s):  
Loïc Hervé ◽  
James Ledoux

We analyse the $\ell^2(\pi)$-convergence rate of irreducible and aperiodic Markov chains with an $N$-band transition probability matrix $P$ and with invariant distribution $\pi$. The analysis rests on two steps: first, the study of the essential spectral radius $r_{\mathrm{ess}}(P_{|\ell^2(\pi)})$ of $P$ acting on $\ell^2(\pi)$, derived from Hennion's quasi-compactness criteria; second, the connection between the spectral gap property (SG2) of $P$ on $\ell^2(\pi)$ and the $V$-geometric ergodicity of $P$. Specifically, (SG2) is shown to hold under the condition $\alpha_0 := \sum_{m=-N}^{N} \limsup_{i\to+\infty} \big(P(i,i+m)\,P^*(i+m,i)\big)^{1/2} < 1$. Moreover, $r_{\mathrm{ess}}(P_{|\ell^2(\pi)}) \le \alpha_0$. Effective bounds on the convergence rate can be obtained from a truncation procedure.
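
A worked instance of this condition, under the usual convention that P* denotes the adjoint of P on ℓ²(π), that is, P*(j, i) = π(i)P(i, j)/π(j): for the reflected random walk on the nonnegative integers with P(i, i+1) = p and P(i, i-1) = 1 - p for some p < 1/2, the invariant distribution is geometric with ratio p/(1 - p), and each band term reduces to √(p(1 - p)), so α₀ = 2√(p(1 - p)) < 1. The sketch below just evaluates this; the numbers are illustrative and not taken from the paper.

```python
import numpy as np

# alpha_0 for the reflected random walk with up-probability p and down-probability q = 1 - p.
# Away from the boundary, pi(i+1)/pi(i) = p/q, and (P(i,i+m) P*(i+m,i))^{1/2}
# equals P(i,i+m) * sqrt(pi(i)/pi(i+m)).
p = 0.3                                  # illustrative value, p < 1/2
q = 1.0 - p
rho = p / q                              # pi(i+1) / pi(i) away from the boundary
term_up = p * np.sqrt(1.0 / rho)         # m = +1:  P(i, i+1) * sqrt(pi(i) / pi(i+1))
term_down = q * np.sqrt(rho)             # m = -1:  P(i, i-1) * sqrt(pi(i) / pi(i-1))
alpha0 = term_up + term_down
print(alpha0, 2 * np.sqrt(p * q))        # both ~0.9165 < 1, so (SG2) holds for this walk
```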


1998 ◽  
Vol 30 (3) ◽  
pp. 676-692 ◽  
Author(s):  
Xi-Ren Cao

We derive formulas for the first- and higher-order derivatives of the steady state performance measures for changes in transition matrices of irreducible and aperiodic Markov chains. Using these formulas, we obtain a Maclaurin series for the performance measures of such Markov chains. The convergence range of the Maclaurin series can be determined. We show that the derivatives and the coefficients of the Maclaurin series can be easily estimated by analysing a single sample path of the Markov chain. Algorithms for estimating these quantities are provided. Markov chains consisting of transient states and multiple chains are also studied. The results can be easily extended to Markov processes. The derivation of the results is closely related to some fundamental concepts, such as group inverse, potentials, and realization factors in perturbation analysis. Simulation results are provided to illustrate the accuracy of the single sample path based estimation. Possible applications to engineering problems are discussed.
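
To make the first-order case concrete, the sketch below evaluates the classical fundamental-matrix sensitivity formula dπ/dδ = π Q (I - P + eπ)^{-1} for a perturbation P + δQ with a zero-row-sum direction Q, and checks it against a finite difference. This identity is in the spirit of the group-inverse and potential formulas discussed above; the 3-state matrices and reward vector are made-up illustrative data, not taken from the paper.

```python
import numpy as np

def stationary(P):
    """Stationary distribution of an irreducible stochastic matrix P."""
    n = P.shape[0]
    A = np.vstack([P.T - np.eye(n), np.ones(n)])   # pi (P - I) = 0 and pi . 1 = 1
    b = np.zeros(n + 1); b[-1] = 1.0
    return np.linalg.lstsq(A, b, rcond=None)[0]

P = np.array([[0.5, 0.3, 0.2],
              [0.2, 0.5, 0.3],
              [0.3, 0.3, 0.4]])
Q = np.array([[ 0.1, -0.1,  0.0],        # zero row sums: P + delta*Q stays stochastic
              [ 0.0,  0.1, -0.1],
              [-0.1,  0.0,  0.1]])
f = np.array([1.0, 2.0, 5.0])            # performance (reward) vector, eta = pi @ f

pi = stationary(P)
Z = np.linalg.inv(np.eye(3) - P + np.outer(np.ones(3), pi))   # fundamental matrix
d_eta = pi @ Q @ Z @ f                                        # analytic first derivative

delta = 1e-6                                                  # finite-difference check
d_eta_fd = (stationary(P + delta * Q) @ f - pi @ f) / delta
print(d_eta, d_eta_fd)                                        # the two values agree
```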


2014 ◽  
Vol 243 (1-2) ◽  
pp. 19-35 ◽  
Author(s):  
Konstantin Avrachenkov ◽  
Ali Eshragh ◽  
Jerzy A. Filar

