A note on filtered Markov renewal processes

1981 ◽  
Vol 18 (3) ◽  
pp. 752-756
Author(s):  
Per Kragh Andersen

A Markov renewal theorem, needed to derive the moment formulas stated by Marcus (1974) for a filtered Markov renewal process, is proved and its applications are outlined.
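
By analogy with filtered Poisson processes, a filtered Markov renewal process can be read as Y(t) = Σ_{S_n ≤ t} w(t − S_n, J_n), a sum of state-dependent responses triggered at the renewal epochs (S_n, J_n). The sketch below simulates such a process under that reading only; the two-state kernel, the exponential sojourns, the response function w and all parameters are invented for illustration and are not Marcus's or Andersen's specification.

```python
# Hedged illustration: one common reading of a "filtered" Markov renewal
# process, Y(t) = sum over renewals S_n <= t of w(t - S_n, J_n).
# All model ingredients below are hypothetical.
import math
import random

STATES = [0, 1]
P = {0: [0.4, 0.6], 1: [0.7, 0.3]}   # embedded-chain transition probabilities
RATE = {0: 1.0, 1: 0.5}              # exponential sojourn rate in each state
AMPLITUDE = {0: 1.0, 1: 2.5}         # state-dependent response amplitude

def w(age, state):
    """Response contributed by a renewal of type `state` that occurred `age` ago."""
    return AMPLITUDE[state] * math.exp(-0.8 * age)

def filtered_value(t, start=0):
    """Simulate one path up to time t and return Y(t) = sum of w(t - S_n, J_n)."""
    state, clock, epochs = start, 0.0, [(0.0, start)]
    while True:
        clock += random.expovariate(RATE[state])
        if clock > t:
            break
        state = random.choices(STATES, weights=P[state])[0]
        epochs.append((clock, state))
    return sum(w(t - s, j) for s, j in epochs)

if __name__ == "__main__":
    random.seed(2)
    t, reps = 20.0, 5000
    mean = sum(filtered_value(t) for _ in range(reps)) / reps
    print(f"simulated E[Y({t})] ≈ {mean:.3f}")
```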


Mathematics ◽  
2020 ◽  
Vol 9 (1) ◽  
pp. 55
Author(s):  
P.-C.G. Vassiliou

For a G-inhomogeneous semi-Markov chain and a G-inhomogeneous Markov renewal process, we study the change from the real probability measure to a forward probability measure. For both processes, we value risky bonds using the forward probabilities that the bond will not default up to maturity. It is established in the form of a theorem that the forward probability measure does not alter the semi-Markov structure. In addition, the G-inhomogeneous Markov renewal process is founded, and a theorem is provided proving that the Markov renewal structure is maintained under the forward probability measure. We show that an inhomogeneous semi-Markov chain is characterized by martingales, and that the same is true for a Markov renewal process. We discuss in depth the calibration of the G-inhomogeneous semi-Markov chain model and propose an algorithm for it. We conclude with an application to risky bonds.
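
As a rough illustration of the kind of quantity being valued (not the paper's algorithm or its forward-measure machinery), the sketch below computes, for a small discrete-time inhomogeneous semi-Markov chain with an absorbing default state, the probability of no default up to a maturity T by recursing on the semi-Markov kernel. Only calendar-time inhomogeneity is modelled, not the "G" structure of the paper; the states, the kernel q(s, i) and the maturity are all hypothetical.

```python
# Hedged sketch: survival (no-default) probability up to maturity T for a
# discrete-time inhomogeneous semi-Markov chain with absorbing state "D".
# The kernel and all numbers are invented for illustration.
from functools import lru_cache

T = 10  # maturity, in periods

def kernel(s, i):
    """Hypothetical inhomogeneous kernel: P(next state j, sojourn m | state i entered at s)."""
    drift = min(0.2, 0.02 * s)  # default risk grows with calendar time s
    if i == "A":
        return {("A", 1): 0.55 - drift, ("B", 2): 0.35, ("D", 1): 0.10 + drift}
    if i == "B":
        return {("A", 2): 0.40, ("B", 1): 0.40 - drift, ("D", 1): 0.20 + drift}
    return {("D", 1): 1.0}      # default is absorbing

@lru_cache(maxsize=None)
def p_no_default(i, s):
    """P(no visit to 'D' during (s, T] | in state i, entered at time s)."""
    if i == "D":
        return 0.0
    if s >= T:
        return 1.0
    total = 0.0
    for (j, m), prob in kernel(s, i).items():
        if s + m > T:            # next jump happens after maturity
            total += prob
        elif j == "D":
            continue             # default at or before maturity contributes 0
        else:
            total += prob * p_no_default(j, s + m)
    return total

if __name__ == "__main__":
    for state in ("A", "B"):
        print(f"P(no default by T={T} | start in {state} at time 0) = "
              f"{p_no_default(state, 0):.4f}")
```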


2007 ◽  
Vol 44 (2) ◽  
pp. 366-378
Author(s):  
Steven P. Clark ◽  
Peter C. Kiessler

For a Markov renewal process where the time parameter is discrete, we present a novel method for calculating the asymptotic variance. Our approach is based on the key renewal theorem and is applicable even when the state space of the Markov chain is countably infinite.
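 
The authors' method rests on the key renewal theorem; as a purely numerical point of comparison, and not their approach, the sketch below estimates Var(N(t))/t at a large but finite t for a small discrete-time Markov renewal process by Monte Carlo, where N(t) counts renewals up to time t. The two-state embedded chain and the sojourn-time distributions, which here depend on the current state only, are invented.

```python
# Hedged sketch: crude Monte Carlo estimate of Var(N(t))/t for a small
# discrete-time Markov renewal process; not the key-renewal-theorem method.
import random

STATES = [0, 1]
P = {0: [0.3, 0.7], 1: [0.6, 0.4]}       # embedded-chain transition probabilities
SOJOURN = {0: ([1, 2], [0.5, 0.5]),      # discrete sojourn times: (values, probs),
           1: ([1, 3], [0.2, 0.8])}      # depending on the current state only

def count_renewals(horizon, start=0):
    """Simulate one path and return N(horizon), the number of renewals by then."""
    state, clock, n = start, 0, 0
    while True:
        values, probs = SOJOURN[state]
        clock += random.choices(values, weights=probs)[0]
        if clock > horizon:
            return n
        n += 1
        state = random.choices(STATES, weights=P[state])[0]

def var_over_t(horizon=500, reps=1000):
    counts = [count_renewals(horizon) for _ in range(reps)]
    mean = sum(counts) / reps
    var = sum((c - mean) ** 2 for c in counts) / (reps - 1)
    return var / horizon

if __name__ == "__main__":
    random.seed(0)
    print("simulated Var(N(t))/t ≈", round(var_over_t(), 4))
```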


1969 ◽  
Vol 18 (2) ◽  
pp. 61-72 ◽  
Author(s):  
A.M. Kshirsagar ◽  
Y. P. Gupta

The following results are obtained in this paper: (1) the probability generating function of the simultaneous distribution of all the N_j(t), where N_j(t) represents the number of times the j-th state (j = 1, 2, ..., m) is visited in time t in a Markov renewal process; (2) the covariance between N_j(t) and N_k(t); (3) the probability generating function and moments of the N_j(t) in a general Markov renewal process, i.e., a Markov renewal process with a random origin; (4) cumulative processes associated with a Markov renewal process, along with their first passage times; and (5) equilibrium Markov renewal processes.
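
As a numerical companion to result (2), and not the generating-function derivation itself, the sketch below estimates Cov(N_j(t), N_k(t)) by simulating a small Markov renewal process with exponential sojourn times. The three-state kernel, the rates and the horizon are invented; the initial state is counted as one visit.

```python
# Hedged sketch: Monte Carlo estimate of Cov(N_j(t), N_k(t)) for an invented
# three-state Markov renewal process with exponential sojourn times.
import random

STATES = [0, 1, 2]
P = {0: [0.0, 0.7, 0.3],
     1: [0.5, 0.0, 0.5],
     2: [0.4, 0.6, 0.0]}            # embedded-chain transition probabilities
RATE = {0: 1.0, 1: 2.0, 2: 0.5}     # exponential sojourn rate in each state

def visit_counts(t, start=0):
    """Return {j: N_j(t)}, counting entries into each state up to time t."""
    counts = {j: 0 for j in STATES}
    state, clock = start, 0.0
    counts[state] += 1               # the initial state counts as one visit
    while True:
        clock += random.expovariate(RATE[state])
        if clock > t:
            return counts
        state = random.choices(STATES, weights=P[state])[0]
        counts[state] += 1

def covariance(j, k, t=50.0, reps=4000):
    data = [visit_counts(t) for _ in range(reps)]
    mj = sum(d[j] for d in data) / reps
    mk = sum(d[k] for d in data) / reps
    return sum((d[j] - mj) * (d[k] - mk) for d in data) / (reps - 1)

if __name__ == "__main__":
    random.seed(1)
    print("estimated Cov(N_1(t), N_2(t)) at t = 50:", round(covariance(1, 2), 3))
```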


1999 ◽  
Vol 36 (4) ◽  
pp. 1045-1057 ◽  
Author(s):  
Yiqiang Q. Zhao ◽  
Wei Li ◽  
Attahiru Sule Alfa

In this paper, we consider a certain class of Markov renewal processes in which the matrix of the transition kernel governing the Markov renewal process possesses a block-structured property, including repeating rows. Duality conditions and properties are obtained for two probabilistic measures that often play a key role in the analysis and computation of such block-structured processes. The method used here unifies two different concepts of duality. Applications of duality are also provided, including a characteristic theorem concerning recurrence and transience of a transition matrix with repeating rows, and a batch arrival queueing model.
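
To make the "repeating rows" structure concrete, the sketch below assembles a finite truncation of a block-structured transition matrix whose boundary block row is (B0, B1, 0, ...) and whose subsequent block rows repeat the pattern (A2, A1, A0) shifted one level to the right. The 2x2 blocks themselves are arbitrary stochastic fragments chosen only so that each untruncated block row sums to one; they are not taken from the paper.

```python
# Hedged sketch: a truncated block-structured transition matrix with repeating
# block rows (A2, A1, A0); all block entries are invented.
import numpy as np

B0 = np.array([[0.5, 0.2], [0.3, 0.3]])
B1 = np.array([[0.2, 0.1], [0.2, 0.2]])
A0 = np.array([[0.1, 0.1], [0.2, 0.1]])   # up one level
A1 = np.array([[0.4, 0.2], [0.1, 0.3]])   # stay at the same level
A2 = np.array([[0.1, 0.1], [0.2, 0.1]])   # down one level

def build_truncated(num_levels):
    """Assemble the truncated transition matrix level by level."""
    m = B0.shape[0]
    T = np.zeros((num_levels * m, num_levels * m))
    T[0:m, 0:m], T[0:m, m:2 * m] = B0, B1
    for lvl in range(1, num_levels):
        T[lvl * m:(lvl + 1) * m, (lvl - 1) * m:lvl * m] = A2
        T[lvl * m:(lvl + 1) * m, lvl * m:(lvl + 1) * m] = A1
        if lvl + 1 < num_levels:
            T[lvl * m:(lvl + 1) * m, (lvl + 1) * m:(lvl + 2) * m] = A0
    return T

if __name__ == "__main__":
    T = build_truncated(5)
    print(T.round(2))
    # every block row sums to 1 except the last, truncated level
    print("row sums:", T.sum(axis=1).round(2))
```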


1992 ◽  
Vol 29 (1) ◽  
pp. 116-128 ◽  
Author(s):  
C. Y. Teresa Lam

In this paper, we study the new better than used in expectation (NBUE) and new worse than used in expectation (NWUE) properties of Markov renewal processes. We show that a Markov renewal process belongs to a more general class of stochastic processes encountered in reliability or maintenance applications. We present sufficient conditions such that the first-passage times of these processes are new better than used in expectation. The results are applied to the study of shock and repair models, random repair time processes, inventory, and queueing models.
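
For reference, a nonnegative random variable T is NBUE when its mean residual life never exceeds its mean, i.e. E[T − t | T > t] ≤ E[T] for all t with P(T > t) > 0, and NWUE when the inequality is reversed. The sketch below only checks this defining inequality numerically for a discrete first-passage-time distribution given as a pmf; it does not implement the paper's sufficient conditions, and both example distributions are made up.

```python
# Hedged sketch: numerical check of the NBUE defining inequality for a
# discrete distribution supplied as {value: probability}.
def is_nbue(pmf):
    """Return True if E[T - t | T > t] <= E[T] for every t with P(T > t) > 0."""
    total = sum(pmf.values())
    pmf = {v: p / total for v, p in pmf.items()}   # normalize, for safety
    mean = sum(v * p for v, p in pmf.items())
    for t in range(0, max(pmf)):
        tail = sum(p for v, p in pmf.items() if v > t)
        if tail <= 0:
            break
        residual = sum((v - t) * p for v, p in pmf.items() if v > t) / tail
        if residual > mean + 1e-12:
            return False
    return True

if __name__ == "__main__":
    geometric_like = {k: 0.5 ** k for k in range(1, 20)}   # roughly memoryless
    long_tailed = {1: 0.9, 100: 0.1}                       # NWUE-like behaviour
    print("geometric-like NBUE?", is_nbue(geometric_like))
    print("long-tailed NBUE?", is_nbue(long_tailed))
```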


1968 ◽  
Vol 5 (2) ◽  
pp. 387-400 ◽  
Author(s):  
Jozef L. Teugels

In [3], Kendall proved a solidarity theorem for irreducible denumerable discrete time Markov chains. Vere-Jones refined Kendall's theorem by obtaining uniform estimates [14], while Kingman proved analogous results for an irreducible continuous time Markov chain [4], [5]. We derive similar solidarity theorems for an irreducible Markov renewal process. The transient case is discussed in Section 3, and Section 4 deals with the positive recurrent case. Recently Cheong also proved solidarity theorems for semi-Markov processes [1]. His theorems use the Markovian structure, while our emphasis is on the renewal aspects of Markov renewal processes. An application to the M/G/1 queue is included in the last section.


2005 ◽  
Vol 42 (4) ◽  
pp. 1031-1043 ◽  
Author(s):  
Frank Ball ◽  
Robin K. Milne

A simple, widely applicable method is described for determining factorial moments of N̂_t, the number of occurrences in (0, t] of some event defined in terms of an underlying Markov renewal process, and asymptotic expressions for these moments as t → ∞. The factorial moment formulae combine to yield an expression for the probability generating function of N̂_t, and thereby further properties of such counts. The method is developed by considering counting processes associated with events that are determined by the states at two successive renewals of a Markov renewal process, for which it both simplifies and generalises existing results. More explicit results are given in the case of an underlying continuous-time Markov chain. The method is used to provide novel, probabilistically illuminating solutions to some problems arising in the stochastic modelling of ion channels.
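
As a simulation-based counterpart, and not the factorial-moment formulae of the paper, the sketch below counts occurrences in (0, t] of an event determined by the states at two successive renewals, here "a renewal that takes the chain from state 0 to state 1", and estimates the first two factorial moments of that count. The two-state kernel, the exponential sojourn rates and the horizon are invented.

```python
# Hedged sketch: Monte Carlo estimates of the first two factorial moments of
# the count of 0 -> 1 renewals in (0, t] for an invented Markov renewal process.
import random

STATES = [0, 1]
P = {0: [0.3, 0.7], 1: [0.6, 0.4]}   # embedded-chain transition probabilities
RATE = {0: 1.5, 1: 0.8}              # exponential sojourn rate in each state

def count_event(t, a=0, b=1, start=0):
    """Count occurrences in (0, t] of a renewal taking the chain from a to b."""
    state, clock, n = start, 0.0, 0
    while True:
        clock += random.expovariate(RATE[state])
        if clock > t:
            return n
        nxt = random.choices(STATES, weights=P[state])[0]
        if state == a and nxt == b:
            n += 1
        state = nxt

def factorial_moments(t=30.0, reps=5000):
    counts = [count_event(t) for _ in range(reps)]
    m1 = sum(counts) / reps                       # E[N_t]
    m2 = sum(c * (c - 1) for c in counts) / reps  # E[N_t (N_t - 1)]
    return m1, m2

if __name__ == "__main__":
    random.seed(3)
    m1, m2 = factorial_moments()
    print(f"estimated E[N_t] ≈ {m1:.2f}, E[N_t(N_t-1)] ≈ {m2:.2f} at t = 30")
```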

