Explicit Forward Recursive Estimators for Markov Modulated Markov Processes

2012, Vol 28 (3), pp. 359-387
Author(s): Y. Ephraim, B. L. Mark

Author(s): W. P. Malcolm, Lakhdar Aggoun, Mohamed Al-Lawati

In this paper we develop a stochastic model incorporating a double-Markov modulated mean-reversion model. Unlike a price process, the basis process X can take positive or negative values. The model is based on an explicit discretisation of the corresponding continuous-time dynamics. The new feature of our model is that we suppose the mean-reverting level in the dynamics, as well as the noise coefficient, can change according to the states of finite-state Markov processes, which could represent the economy and some other unseen random phenomenon.
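The discretised dynamics are not spelled out in the abstract, but a minimal simulation sketch helps fix ideas: an Euler discretisation of a mean-reverting (Ornstein-Uhlenbeck-type) basis process whose level and noise coefficient are switched by two finite-state Markov chains. All names (P_econ, P_hid, kappa), transition matrices, and parameter values below are illustrative assumptions, not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical two-state modulating chains: the "economy" and an unseen phenomenon.
P_econ = np.array([[0.95, 0.05], [0.10, 0.90]])  # transition matrix of chain 1
P_hid  = np.array([[0.90, 0.10], [0.20, 0.80]])  # transition matrix of chain 2

mu    = np.array([0.5, -0.5])   # mean-reversion level, selected by chain 1
sigma = np.array([0.2, 0.6])    # noise coefficient, selected by chain 2
kappa, dt, n = 2.0, 0.01, 1000  # reversion speed, time step, number of steps

x = np.zeros(n)
s1 = s2 = 0
for k in range(1, n):
    s1 = rng.choice(2, p=P_econ[s1])  # next state of the economy chain
    s2 = rng.choice(2, p=P_hid[s2])   # next state of the unseen chain
    # Euler step of dX_t = kappa * (mu(s1) - X_t) dt + sigma(s2) dW_t;
    # the basis X may become negative, unlike a price process.
    x[k] = x[k - 1] + kappa * (mu[s1] - x[k - 1]) * dt \
           + sigma[s2] * np.sqrt(dt) * rng.standard_normal()
```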


2009, Vol 57 (2), pp. 463-470
Author(s): Y. Ephraim, W.J.J. Roberts

Author(s): Atilla Ay, Refik Soyer, Joshua Landon, Süleyman Özekici

Markov processes play an important role in reliability analysis, particularly in modeling the stochastic evolution of the survival/failure behavior of systems. The probability law of a Markov process is described by its generator, or transition rate matrix. In this paper, we suppose that the process is doubly stochastic in the sense that the generator is itself stochastic. In our model, the entries of the generator change with the states of yet another Markov process, which represents the random environment in which the stochastic model operates. The result is a Markov modulated Markov process, which can be modeled as a bivariate Markov process and analyzed probabilistically using Markovian analysis. In this setting, however, we are interested in Bayesian inference on the model parameters. We present a computationally tractable approach using Gibbs sampling and demonstrate it with numerical illustrations. We also discuss cases involving complete and partial data sets on both processes.
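As an illustration of how such a bivariate process can be assembled, the sketch below builds the generator of the joint process (Z, X) from an environment generator and environment-dependent generators for X, under the standard assumption that the two components never jump simultaneously. All matrices and names (Q_env, Q_x) are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Hypothetical environment process Z (2 states) and modulated process X (2 states).
Q_env = np.array([[-0.3,  0.3],
                  [ 0.5, -0.5]])          # generator of the environment Z
Q_x = [np.array([[-1.0,  1.0],
                 [ 2.0, -2.0]]),          # generator of X while Z = 0
       np.array([[-4.0,  4.0],
                 [ 0.5, -0.5]])]          # generator of X while Z = 1

# Generator of the bivariate process (Z, X) on the product state space,
# assuming Z and X never jump at the same instant:
#   Q = Q_env (kron) I  +  block-diag(Q_x[0], Q_x[1])
n_env, n_x = Q_env.shape[0], Q_x[0].shape[0]
Q = np.kron(Q_env, np.eye(n_x))
for z in range(n_env):
    Q[z * n_x:(z + 1) * n_x, z * n_x:(z + 1) * n_x] += Q_x[z]

# A valid generator has rows summing to zero.
assert np.allclose(Q.sum(axis=1), 0.0)
```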


2000, Vol 14 (3), pp. 299-315
Author(s): Taizhong Hu, Xiaoming Pan

Results and conditions that quantify the decrease in dependence with lag for a stationary Markov process, and that enable one to compare the dependence of two stationary Markov processes, are obtained. The notions of dependence used in this article are the supermodular ordering and the concordance ordering. Both discrete-time and continuous-time Markov processes are considered, and some applications of the main results are given. In queueing theory, monotonicity results for the waiting time of the nth customer, the stationary waiting time in an MR/GI/1 queue, and the stationary workload in a Markov-modulated queue are established, strengthening previous results while simplifying their derivations. This article is a continuation of those by Fang et al. [7] and Hu and Joe [10].
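For reference, the two dependence notions named in the abstract can be stated as follows; this is a sketch of the standard definitions, not text quoted from the article.

```latex
% A function f : R^n -> R is supermodular if, with componentwise min/max,
\[
  f(x \wedge y) + f(x \vee y) \;\ge\; f(x) + f(y),
  \qquad x, y \in \mathbb{R}^n .
\]
% The supermodular ordering of random vectors X and Y:
\[
  X \le_{\mathrm{sm}} Y
  \quad\Longleftrightarrow\quad
  \mathbb{E}\,f(X) \le \mathbb{E}\,f(Y)
  \ \text{ for every supermodular } f \text{ such that both expectations exist.}
\]
% The concordance ordering requires only
% P(X <= t) <= P(Y <= t) and P(X > t) <= P(Y > t) for all t,
% and is implied by the supermodular ordering.
```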


Author(s): M. Vidyasagar

This book explores important aspects of Markov and hidden Markov processes and the applications of these ideas to various problems in computational biology. It starts from first principles, so that no previous knowledge of probability is necessary. However, the work is rigorous and mathematical, making it useful to engineers and mathematicians, even those not interested in biological applications. A range of exercises is provided, including drills to familiarize the reader with concepts and more advanced problems that require deep thinking about the theory. Biological applications are taken from post-genomic biology, especially genomics and proteomics. The topics examined include standard material such as the Perron–Frobenius theorem, transient and recurrent states, hitting probabilities and hitting times, maximum likelihood estimation, the Viterbi algorithm, and the Baum–Welch algorithm. The book contains discussions of extremely useful topics not usually seen at the basic level, such as ergodicity of Markov processes, Markov chain Monte Carlo (MCMC), information theory, and large deviation theory for both i.i.d. and Markov processes. It also presents state-of-the-art realization theory for hidden Markov models. Among biological applications, it offers an in-depth look at the BLAST (Basic Local Alignment Search Tool) algorithm, including a comprehensive explanation of the underlying theory. Other applications, such as profile hidden Markov models, are also explored.
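As a small illustration of one of the listed topics, here is a log-domain Viterbi decoder for a discrete hidden Markov model; the toy initial, transition, and emission probabilities are made up for the example and are not taken from the book.

```python
import numpy as np

def viterbi(obs, pi, A, B):
    """Most likely hidden state path for a discrete HMM (log-domain, illustrative)."""
    T, n = len(obs), len(pi)
    logd = np.log(pi) + np.log(B[:, obs[0]])  # log-probabilities at time 0
    back = np.zeros((T, n), dtype=int)        # backpointers
    for t in range(1, T):
        cand = logd[:, None] + np.log(A)      # score of each transition i -> j
        back[t] = cand.argmax(axis=0)
        logd = cand.max(axis=0) + np.log(B[:, obs[t]])
    path = [int(logd.argmax())]
    for t in range(T - 1, 0, -1):             # trace back the best path
        path.append(int(back[t, path[-1]]))
    return path[::-1]

# Toy 2-state, 2-symbol HMM (all numbers are illustrative).
pi = np.array([0.6, 0.4])
A  = np.array([[0.7, 0.3], [0.4, 0.6]])
B  = np.array([[0.9, 0.1], [0.2, 0.8]])
print(viterbi([0, 1, 1, 0], pi, A, B))
```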

