Geometric bounds on iterative approximations for nearly completely decomposable Markov chains

1990 ◽  
Vol 27 (3) ◽  
pp. 521-529 ◽  
Author(s):  
Guy Louchard ◽  
Guy Latouche

We consider a finite Markov chain with a nearly completely decomposable stochastic matrix. We determine bounds for the error when the stationary probability vector is approximated via a perturbation analysis.
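
As a concrete companion to this abstract, here is a minimal numerical sketch (my construction for illustration, not the authors' perturbation bounds): it assembles a nearly completely decomposable chain as a weak coupling of two independent blocks, forms the classical Simon-Ando/Courtois aggregation approximation to the stationary vector, and measures the actual error, which is on the order of the coupling strength eps. All sizes and parameters are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def random_stochastic(n):
    """A random n x n stochastic matrix (rows sum to 1)."""
    A = rng.random((n, n))
    return A / A.sum(axis=1, keepdims=True)

def stationary(P):
    """Left Perron eigenvector of P, normalized to a probability vector."""
    vals, vecs = np.linalg.eig(P.T)
    v = np.real(vecs[:, np.argmax(np.real(vals))])
    return v / v.sum()

n1, n2, eps = 4, 3, 1e-3
P1, P2 = random_stochastic(n1), random_stochastic(n2)
block_diag = np.block([[P1, np.zeros((n1, n2))],
                       [np.zeros((n2, n1)), P2]])
P = (1 - eps) * block_diag + eps * random_stochastic(n1 + n2)  # weak coupling

# Aggregation approximation: stationary vectors of the isolated blocks,
# weighted by the stationary vector of the 2 x 2 coupling matrix.
pi1, pi2 = stationary(P1), stationary(P2)
C = np.array([[pi1 @ P[:n1, :n1].sum(axis=1), pi1 @ P[:n1, n1:].sum(axis=1)],
              [pi2 @ P[n1:, :n1].sum(axis=1), pi2 @ P[n1:, n1:].sum(axis=1)]])
xi = stationary(C)
pi_approx = np.concatenate([xi[0] * pi1, xi[1] * pi2])

print("max abs error:", np.abs(stationary(P) - pi_approx).max())  # O(eps)
```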


2005 ◽  
Vol 37 (4) ◽  
pp. 1075-1093 ◽  
Author(s):  
Quan-Lin Li ◽  
Yiqiang Q. Zhao

In this paper, we consider the asymptotic behavior of stationary probability vectors of Markov chains of GI/G/1 type. The generating function of the stationary probability vector is explicitly expressed in terms of the R-measure. This expression of the generating function is more convenient for asymptotic analysis than those in the literature. Both the RG-factorization of the repeating row and the Wiener-Hopf equations for the boundary row are used to provide the necessary spectral properties. The stationary probability vector of a Markov chain of GI/G/1 type is shown to be light tailed if the blocks of the repeating row and the blocks of the boundary row are light tailed. We derive two classes of explicit expressions for the asymptotic behavior, the geometric tail and the semigeometric tail, based on the repeating row, the boundary row, or the minimal positive solution of a crucial equation involved in the generating function, and discuss the singularity classes of the stationary probability vector.
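
The geometric-tail conclusion is easy to observe on a toy chain. The sketch below illustrates only the phenomenon, not the paper's RG-factorization machinery: for a birth-death chain with up-probability p and down-probability q > p, the stationary probabilities satisfy pi_{k+1}/pi_k = p/q exactly, so the tail ratios of a (large) finite truncation settle at p/q. The truncation level and indices are illustrative choices.

```python
import numpy as np

p, q = 0.3, 0.5   # up / down probabilities; a self-loop takes the rest
N = 50            # truncation level, large enough to expose the tail

P = np.zeros((N, N))
for k in range(N):
    if k + 1 < N:
        P[k, k + 1] = p
    if k > 0:
        P[k, k - 1] = q
    P[k, k] = 1.0 - P[k].sum()   # self-loop absorbs the remainder

vals, vecs = np.linalg.eig(P.T)
pi = np.real(vecs[:, np.argmax(np.real(vals))])
pi /= pi.sum()

print(pi[20:25] / pi[19:24])     # each ratio is close to p/q = 0.6
```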


2013 ◽  
Vol 2013 ◽  
pp. 1-8 ◽  
Author(s):  
Bing-Yuan Pu ◽  
Ting-Zhu Huang ◽  
Chun Wen ◽  
Yi-Qin Lin

An accelerated multilevel aggregation method is presented for calculating the stationary probability vector of an irreducible stochastic matrix in PageRank computation, with a vector extrapolation method serving as the accelerator. We show how to periodically combine the extrapolation method with the multilevel aggregation method on the finest level to speed up the PageRank computation. Detailed numerical results are given to illustrate the behavior of this method, and comparisons with typical methods are also made.
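
As a rough illustration of the "periodically combine an accelerator with a fixed-point iteration" idea (a simplified stand-in for the authors' multilevel aggregation scheme), the sketch below applies a componentwise Aitken delta-squared extrapolation every few power-iteration steps of PageRank. The function name, period, and damping factor are my illustrative choices, not taken from the paper.

```python
import numpy as np

def pagerank_extrapolated(A, alpha=0.85, tol=1e-10, period=10, max_iter=2000):
    """A: column-stochastic link matrix (dense, for simplicity)."""
    n = A.shape[0]
    x = np.full(n, 1.0 / n)
    history = [x]
    for it in range(1, max_iter + 1):
        x_new = alpha * (A @ x) + (1.0 - alpha) / n     # power step
        if np.abs(x_new - x).sum() < tol:
            return x_new, it
        history.append(x_new)
        if it % period == 0 and len(history) >= 3:
            # Aitken delta-squared jump from the last three iterates.
            x0, x1, x2 = history[-3], history[-2], history[-1]
            denom = x2 - 2.0 * x1 + x0
            safe = np.abs(denom) > 1e-14                # guard the division
            x_new = x2.copy()
            x_new[safe] -= (x2[safe] - x1[safe]) ** 2 / denom[safe]
            x_new = np.clip(x_new, 0.0, None)
            x_new /= x_new.sum()                        # re-normalize
            history = []                                # restart the window
        history = history[-3:]
        x = x_new
    return x, max_iter

# Toy usage on a random 50-node "web graph".
rng = np.random.default_rng(1)
M = rng.random((50, 50))
A = M / M.sum(axis=0, keepdims=True)                    # column-stochastic
pi, iters = pagerank_extrapolated(A)
print(iters, pi.sum())
```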


2013 ◽  
Vol 2013 ◽  
pp. 1-7 ◽  
Author(s):  
Bing-Yuan Pu ◽  
Ting-Zhu Huang ◽  
Chun Wen

This paper presents a class of new accelerated restarted GMRES methods for calculating the stationary probability vector of an irreducible Markov chain. We focus on the mechanism of this new hybrid method, showing how to periodically combine the GMRES and vector extrapolation methods into a more efficient one that improves the convergence rate on Markov chain problems. Numerical experiments are carried out to demonstrate the efficiency of our new algorithm on several typical Markov chain problems.
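
A bare-bones version of the GMRES half of such a hybrid (without the extrapolation stage, and not the authors' algorithm) fits in a few lines: the stationarity equations pi^T P = pi^T are singular, so one of them is replaced by the normalization sum(pi) = 1, and the resulting nonsingular system is handed to SciPy's restarted GMRES.

```python
import numpy as np
from scipy.sparse.linalg import gmres

def stationary_gmres(P, restart=20):
    """Stationary vector of an irreducible stochastic matrix P via GMRES."""
    n = P.shape[0]
    A = P.T - np.eye(n)     # pi solves (P^T - I) pi = 0
    A[-1, :] = 1.0          # swap the last equation for sum(pi) = 1
    b = np.zeros(n)
    b[-1] = 1.0
    pi, info = gmres(A, b, restart=restart)
    if info != 0:
        raise RuntimeError("GMRES did not converge")
    return pi

# Toy usage: a random irreducible 100-state chain.
rng = np.random.default_rng(2)
P = rng.random((100, 100))
P /= P.sum(axis=1, keepdims=True)
pi = stationary_gmres(P)
print(np.abs(pi @ P - pi).max())   # residual near SciPy's default tolerance
```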


1990 ◽  
Vol 4 (1) ◽  
pp. 89-116 ◽  
Author(s):  
Ushio Sumita ◽  
Maria Rieders

A novel algorithm is developed which computes the ergodic probability vector for large Markov chains. Decomposing the state space into lumps, the algorithm generates a replacement process on each lump, where any exit from a lump is instantaneously replaced at some state in that lump. The replacement distributions are constructed recursively in such a way that, in the limit, the ergodic probability vector for a replacement process on one lump will be proportional to the ergodic probability vector of the original Markov chain restricted to that lump. Inverse matrices computed in the algorithm are of size (M – 1), where M is the number of lumps, thereby providing a substantial rank reduction. When a special structure is present, the procedure for generating the replacement distributions can be simplified. The relevance of the new algorithm to the aggregation-disaggregation algorithm of Takahashi [29] is also discussed.
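
The replacement-process construction itself is intricate, but its closest classical relative, an aggregation-disaggregation sweep in the spirit of Takahashi's algorithm, is short enough to sketch. The code below is that generic iteration, not the authors' replacement processes; the partition and sweep count are illustrative.

```python
import numpy as np

def agg_disagg(P, lumps, sweeps=30):
    """P: stochastic matrix; lumps: index arrays partitioning the states."""
    pi = np.full(P.shape[0], 1.0 / P.shape[0])
    M = len(lumps)
    for _ in range(sweeps):
        # Aggregate: M x M coupling chain, each row weighted by the
        # current conditional distribution within the corresponding lump.
        C = np.zeros((M, M))
        for I, sI in enumerate(lumps):
            w = pi[sI] / pi[sI].sum()
            for J, sJ in enumerate(lumps):
                C[I, J] = w @ P[np.ix_(sI, sJ)].sum(axis=1)
        vals, vecs = np.linalg.eig(C.T)
        xi = np.real(vecs[:, np.argmax(np.real(vals))])
        xi /= xi.sum()
        # Disaggregate, then take one power step to refresh the
        # within-lump shape of the iterate.
        for I, sI in enumerate(lumps):
            pi[sI] = xi[I] * pi[sI] / pi[sI].sum()
        pi = pi @ P
        pi /= pi.sum()
    return pi

# Toy usage: an 8-state chain split into two lumps of 4.
rng = np.random.default_rng(3)
P = rng.random((8, 8))
P /= P.sum(axis=1, keepdims=True)
pi = agg_disagg(P, [np.arange(4), np.arange(4, 8)])
print(np.abs(pi @ P - pi).max())   # small stationarity residual
```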


2005 ◽  
Vol 37 (2) ◽  
pp. 482-509 ◽  
Author(s):  
Quan-Lin Li ◽  
Yiqiang Q. Zhao

In this paper, we provide a novel approach to studying the heavy-tailed asymptotics of the stationary probability vector of a Markov chain of GI/G/1 type, whose transition matrix is constructed from two matrix sequences referred to as a boundary matrix sequence and a repeating matrix sequence, respectively. We first provide a necessary and sufficient condition under which the stationary probability vector is heavy tailed. Then we derive the long-tailed asymptotics of the R-measure in terms of the RG-factorization of the repeating matrix sequence, and a Wiener-Hopf equation for the boundary matrix sequence. Based on this, we are able to provide a detailed analysis of the subexponential asymptotics of the stationary probability vector.
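
The light- versus heavy-tailed dichotomy driving this analysis can be visualized with two synthetic probability vectors (a toy comparison, unrelated to the paper's GI/G/1 machinery): a geometric tail has successive ratios settling strictly below 1, while a power-law tail has ratios creeping up to 1, so no geometric decay rate exists.

```python
import numpy as np

k = np.arange(1, 201)
light = 0.5 ** k        # geometric (light-tailed) profile
light /= light.sum()
heavy = k ** -3.0       # power-law (heavy-tailed) profile
heavy /= heavy.sum()

print(light[150] / light[149])   # 0.5: ratios stay strictly below 1
print(heavy[150] / heavy[149])   # ~0.98, tending to 1: no geometric rate
```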


2018 ◽  
Vol 50 (2) ◽  
pp. 645-669 ◽  
Author(s):  
Yuanyuan Liu ◽  
Wendi Li

Let P be the transition matrix of a positive recurrent Markov chain on the integers with invariant probability vector π^T, and let (n)P̃ be a stochastic matrix formed by augmenting the entries of the (n+1) × (n+1) northwest corner truncation of P arbitrarily, with invariant probability vector (n)π^T. We derive computable V-norm bounds on the error between π^T and (n)π^T in terms of the perturbation method from three different aspects: the Poisson equation, the residual matrix, and the norm ergodicity coefficient, which we prove to be effective by showing that they converge to 0 as n tends to ∞ under suitable conditions. We illustrate our results through several examples. Comparing our error bounds with those of Tweedie (1998), we see that our bounds are more applicable and accurate. Moreover, we also consider possible extensions of our results to continuous-time Markov chains.
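
To make the truncation-augmentation setup concrete, the following sketch (a toy check, not the paper's V-norm bounds) truncates a random walk on the nonnegative integers whose stationary vector is known in closed form, repairs the northwest corner so each row is stochastic again, and watches the error vanish as the truncation level n grows. The walk's parameters and the choice of augmentation are illustrative.

```python
import numpy as np

p, q = 0.2, 0.5
r = p / q   # the untruncated chain has pi_k = (1 - r) * r**k exactly

def truncated_stationary(n):
    """Stationary vector of the augmented (n+1) x (n+1) truncation."""
    P = np.zeros((n + 1, n + 1))
    for k in range(n + 1):
        if k > 0:
            P[k, k - 1] = q
        if k < n:
            P[k, k + 1] = p
        # The probability mass cut off by the truncation is returned to
        # the diagonal, one common way to keep the matrix stochastic.
        P[k, k] = 1.0 - P[k].sum()
    vals, vecs = np.linalg.eig(P.T)
    v = np.real(vecs[:, np.argmax(np.real(vals))])
    return v / v.sum()

for n in (10, 20, 40):
    exact = (1 - r) * r ** np.arange(n + 1)   # restriction of the true pi
    print(n, np.abs(truncated_stationary(n) - exact).sum())  # error shrinks
```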

