On a Markov chain approach for the study of reliability structures

1996 ◽  
Vol 33 (2) ◽  
pp. 357-367 ◽  
Author(s):  
M. V. Koutras

In this paper we consider a class of reliability structures which can be efficiently described through (imbedded in) finite Markov chains. Some general results are provided for the reliability evaluation and generating functions of such systems. Finally, it is shown that a great variety of well-known reliability structures can be accommodated in this general framework, and certain properties of those structures are obtained by using their Markov chain imbedding description.
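As an illustrative sketch (not the paper's own derivation), the Markov chain imbedding idea can be demonstrated on a consecutive-k-out-of-n:F system: track the length of the current run of failed components as the state of a chain with one absorbing "system failed" state, and read off reliability as the probability of avoiding absorption after n steps.

```python
import numpy as np

def consec_k_out_of_n_F_reliability(n, k, p):
    """Reliability of a consecutive-k-out-of-n:F system with i.i.d.
    components (working probability p), via Markov chain imbedding.
    State i (0 <= i < k) is the current run of trailing failures;
    state k is the absorbing 'system failed' state."""
    q = 1.0 - p
    M = np.zeros((k + 1, k + 1))          # transition matrix on {0, ..., k}
    for i in range(k):
        M[i, 0] = p                        # component works: failure run resets
        M[i, i + 1] = q                    # component fails: run grows by one
    M[k, k] = 1.0                          # absorbing failed state
    pi0 = np.zeros(k + 1)
    pi0[0] = 1.0                           # start with no failures
    pin = pi0 @ np.linalg.matrix_power(M, n)
    return pin[:k].sum()                   # probability absorption was avoided
```

For k = 1 this reduces to a series system, so the result should equal p^n; generating functions of the waiting time to absorption follow from the same matrix by standard transform arguments.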


1968 ◽  
Vol 5 (2) ◽  
pp. 414-426 ◽  
Author(s):  
J. N. Darroch ◽  
K. W. Morris

Let S denote a subset of the states of a finite continuous-time Markov chain and let Y(a) denote the time that elapses until a weighted sum of a time units have been spent in S. Formulae are derived for the generating functions of Y(a) and of Y(a + b) – Y(b).


1967 ◽  
Vol 4 (3) ◽  
pp. 496-507 ◽  
Author(s):  
J. N. Darroch ◽  
K. W. Morris

Let T denote a subset of the possible transitions between the states of a finite Markov chain and let Yk denote the time of the kth occurrence of a T-transition. Formulae are derived for the generating functions of Yk, of Yj+k − Yj, and of Yj+k − Yj in the limit as j → ∞, for both discrete-time and continuous-time chains. Several particular cases are briefly discussed.


1981 ◽  
Vol 13 (2) ◽  
pp. 369-387 ◽  
Author(s):  
Richard D. Bourgin ◽  
Robert Cogburn

The general framework of a Markov chain in a random environment is presented and the problem of determining extinction probabilities is discussed. An efficient method for determining absorption probabilities and criteria for certain absorption are presented in the case that the environmental process is a two-state Markov chain. These results are then applied to birth and death, queueing and branching chains in random environments.
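A minimal sketch of the two-state-environment setting (a hypothetical gambler's-ruin example, not taken from the paper): a walk on {0, ..., N} absorbs at 0 and N, its up-step probability is modulated by a two-state environment chain, and absorption probabilities solve a linear system over (position, environment) pairs.

```python
import numpy as np

def win_probability(N, start, p_env, env_P):
    """Probability that a random walk on {0,...,N} (absorbing at 0 and N)
    hits N first, when the up-step probability p_env[e] depends on a
    two-state environment chain with transition matrix env_P.
    Convention (an assumption of this sketch): the environment steps
    first, then the walk moves; the walk starts in environment 0."""
    interior = [(x, e) for x in range(1, N) for e in range(2)]
    idx = {s: i for i, s in enumerate(interior)}
    A = np.eye(len(interior))
    b = np.zeros(len(interior))
    for (x, e), i in idx.items():
        for e2 in range(2):
            w = env_P[e][e2]
            up, down = p_env[e2], 1.0 - p_env[e2]
            if x + 1 == N:
                b[i] += w * up                     # absorbed at N: contributes 1
            else:
                A[i, idx[(x + 1, e2)]] -= w * up
            if x - 1 > 0:
                A[i, idx[(x - 1, e2)]] -= w * down # absorption at 0 contributes 0
    h = np.linalg.solve(A, b)
    return h[idx[(start, 0)]]
```

When both environments give a fair walk the answer collapses to the classical start/N, which provides a quick sanity check; criteria for certain absorption correspond to this system having a unique bounded solution.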


1968 ◽  
Vol 5 (2) ◽  
pp. 401-413 ◽  
Author(s):  
Paul J. Schweitzer

A perturbation formalism is presented which shows how the stationary distribution and fundamental matrix of a Markov chain containing a single irreducible set of states change as the transition probabilities vary. Expressions are given for the partial derivatives of the stationary distribution and fundamental matrix with respect to the transition probabilities. Semi-group properties of the generators of transformations from one Markov chain to another are investigated. It is shown that a perturbation formalism exists in the multiple subchain case if and only if the change in the transition probabilities does not alter the number of, or intermix the various subchains. The formalism is presented when this condition is satisfied.
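The single-subchain formalism can be sketched numerically (a simplified illustration, using the standard fundamental-matrix form Z = (I − P + 1π)⁻¹, rather than the paper's full development): for a row-sum-zero direction E, the stationary distribution of P + εE is, to first order, π + ε πEZ.

```python
import numpy as np

def stationary(P):
    """Stationary distribution of an irreducible stochastic matrix P,
    via the augmented linear system (pi P = pi, pi 1 = 1)."""
    n = P.shape[0]
    A = np.vstack([P.T - np.eye(n), np.ones(n)])
    b = np.zeros(n + 1)
    b[-1] = 1.0
    return np.linalg.lstsq(A, b, rcond=None)[0]

def perturbed_stationary_first_order(P, E, eps):
    """First-order estimate of the stationary distribution of P + eps*E:
        pi(eps) ~ pi + eps * pi E Z,   Z = (I - P + 1 pi)^{-1}.
    E must have zero row sums so that P + eps*E stays stochastic."""
    n = P.shape[0]
    pi = stationary(P)
    Z = np.linalg.inv(np.eye(n) - P + np.outer(np.ones(n), pi))
    return pi + eps * (pi @ E @ Z)
```

Comparing this estimate against the exact stationary distribution of the perturbed matrix shows an error of order ε², as the formalism predicts.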


1994 ◽  
Vol 31 (1) ◽  
pp. 59-75 ◽  
Author(s):  
Peter Buchholz

Exact and ordinary lumpability in finite Markov chains is considered. Both concepts naturally define an aggregation of the Markov chain yielding an aggregated chain that allows the exact determination of several stationary and transient results for the original chain. We show which quantities can be determined without an error from the aggregated process and describe methods to calculate bounds on the remaining results. Furthermore, the concept of lumpability is extended to near lumpability yielding approximative aggregation.
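As a small sketch of ordinary lumpability (the classical Kemeny–Snell condition, not the paper's bounding machinery): a partition is ordinarily lumpable when, for every pair of blocks, the probability of jumping into the target block is the same from every state of the source block, and the aggregated chain is then read off directly.

```python
import numpy as np

def is_ordinarily_lumpable(P, partition, tol=1e-12):
    """Kemeny-Snell ordinary lumpability: for every pair of blocks (B, C),
    sum_{j in C} P[i, j] must be identical for all states i in B."""
    for B in partition:
        for C in partition:
            sums = [P[i, list(C)].sum() for i in B]
            if max(sums) - min(sums) > tol:
                return False
    return True

def lump(P, partition):
    """Aggregated transition matrix; valid when the partition is
    ordinarily lumpable (any representative state of a block works)."""
    m = len(partition)
    Q = np.zeros((m, m))
    for a, B in enumerate(partition):
        for c, C in enumerate(partition):
            Q[a, c] = P[B[0], list(C)].sum()
    return Q
```

Stationary and transient quantities that depend only on block membership can then be computed exactly from the smaller matrix Q; near lumpability corresponds to the row sums agreeing only approximately, which is where the bounds come in.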


1984 ◽  
Vol 21 (3) ◽  
pp. 567-574 ◽  
Author(s):  
Atef M. Abdel-Moneim ◽  
Frederick W. Leysieffer

Conditions are given under which a function of a finite, discrete-time Markov chain X(t) is again Markov when X(t) is not irreducible. These conditions are given in terms of an interrelationship between two partitions of the state space of X(t): the partition induced by the minimal essential classes of X(t), and the partition with respect to which lumping is to be considered.


2019 ◽  
Vol 44 (3) ◽  
pp. 282-308 ◽  
Author(s):  
Brian G. Vegetabile ◽  
Stephanie A. Stout-Oswald ◽  
Elysia Poggi Davis ◽  
Tallie Z. Baram ◽  
Hal S. Stern

Predictability of behavior is an important characteristic in many fields including biology, medicine, marketing, and education. When a sequence of actions performed by an individual can be modeled as a stationary time-homogeneous Markov chain the predictability of the individual’s behavior can be quantified by the entropy rate of the process. This article compares three estimators of the entropy rate of finite Markov processes. The first two methods directly estimate the entropy rate through estimates of the transition matrix and stationary distribution of the process. The third method is related to the sliding-window Lempel–Ziv compression algorithm. The methods are compared via a simulation study and in the context of a study of interactions between mothers and their children.
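The first of the three approaches, the direct plug-in estimator, can be sketched as follows (an illustrative implementation, not the authors' code): estimate the transition matrix from observed transitions, recover its stationary distribution, and combine them as H = −Σᵢ πᵢ Σⱼ Pᵢⱼ log Pᵢⱼ.

```python
import numpy as np

def entropy_rate_plugin(seq, n_states, base=2):
    """Plug-in entropy-rate estimate for a finite Markov chain.
    Assumes every state of 0..n_states-1 is visited at least once,
    so every row of the empirical transition matrix is defined."""
    counts = np.zeros((n_states, n_states))
    for a, b in zip(seq[:-1], seq[1:]):
        counts[a, b] += 1
    P = counts / counts.sum(axis=1, keepdims=True)
    # stationary distribution: left eigenvector of P for eigenvalue 1
    w, v = np.linalg.eig(P.T)
    pi = np.real(v[:, np.argmax(np.real(w))])
    pi = pi / pi.sum()
    with np.errstate(divide="ignore", invalid="ignore"):
        logs = np.where(P > 0, np.log(P) / np.log(base), 0.0)
    return float(-(pi[:, None] * P * logs).sum())
```

A perfectly alternating sequence gives an entropy rate of zero (the next state is determined), while a sequence whose empirical transitions are uniform over two states gives one bit per step; the Lempel–Ziv-based estimator trades this parametric structure for a compression-length argument.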

