Higher-Order Weak Approximation of Ito Diffusions by Markov Chains

1992 ◽  
Vol 6 (3) ◽  
pp. 391-408 ◽  
Author(s):  
Eckhard Platen

This paper proposes a method for constructing discrete-state Markov chains that approximate an Ito diffusion process. The transition probabilities of the Markov chains are chosen so that functionals converge with a desired weak order as the step size vanishes, under sufficient smoothness assumptions.
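The moment-matching idea behind such constructions can be illustrated with a minimal sketch (this is an illustration of the general technique, not the paper's construction): a two-point Markov chain on a lattice whose up/down probabilities match the first two conditional moments of the Ornstein-Uhlenbeck diffusion dX = -theta*X dt + sigma dW, giving weak order 1. All parameter names are illustrative.

```python
import numpy as np

# Hedged sketch: a binomial Markov chain on the lattice {x0 + k*dx} whose
# up/down probabilities match the first two conditional moments of the
# Ornstein-Uhlenbeck diffusion dX = -theta*X dt + sigma dW (weak order 1).
def ou_chain_mean(theta=1.0, sigma=0.5, x0=1.0, T=1.0,
                  n_steps=100, n_paths=20000, seed=0):
    rng = np.random.default_rng(seed)
    dt = T / n_steps
    dx = sigma * np.sqrt(dt)          # this spacing makes the "stay" state vanish
    x = np.full(n_paths, x0)
    for _ in range(n_steps):
        drift = -theta * x            # a(x) for the OU process
        p_up = np.clip(0.5 * (1.0 + drift * np.sqrt(dt) / sigma), 0.0, 1.0)
        x = x + np.where(rng.random(n_paths) < p_up, dx, -dx)
    return x.mean()                   # estimates E[X_T] = x0 * exp(-theta*T)
```

With this choice the chain's increment has mean a(x)dt exactly and variance sigma^2 dt + O(dt^2), so smooth functionals converge weakly at order 1 as the step size dt vanishes.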

2012 ◽  
Vol 87 (1) ◽  
pp. 27-36 ◽  
Author(s):  
PETER E. KLOEDEN ◽  
VICTOR S. KOZYAKIN

Abstract
Continuous-time discrete-state random Markov chains generated by a random linear differential equation with a random tridiagonal matrix are shown to have a random attractor consisting of singleton subsets, essentially a random path, in the simplex of probability vectors. The proof uses comparison theorems for Carathéodory random differential equations and the fact that the linear cocycle generated by the Markov chain is a uniformly contractive mapping of the positive cone into itself with respect to the Hilbert projective metric. It does not involve probabilistic properties of the sample path and is thus equally valid in the nonautonomous deterministic context of Markov chains with, say, periodically varying transition probabilities, in which case the attractor is a periodic path.
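The singleton-attractor mechanism can be sketched numerically (a simplification: the matrices below are strictly positive rather than tridiagonal as in the paper, so contraction in the Hilbert projective metric acts at every step): two different initial probability vectors driven by the same random matrix sequence collapse onto a common trajectory.

```python
import numpy as np

# Hedged sketch: two different initial probability vectors, multiplied by
# the SAME random sequence of row-stochastic matrices, synchronize onto a
# single random path -- the singleton random attractor in the simplex.
def synchronized_paths(n=4, steps=200, seed=0):
    rng = np.random.default_rng(seed)
    p = np.full(n, 1.0 / n)        # uniform start
    q = np.zeros(n)
    q[0] = 1.0                     # vertex start
    for _ in range(steps):
        M = rng.random((n, n)) + 0.1       # strictly positive entries
        M /= M.sum(axis=1, keepdims=True)  # normalize rows to sum to 1
        p = p @ M                  # both vectors see the same environment
        q = q @ M
    return p, q
```

After a moderate number of steps the two trajectories agree to machine-level precision, while each remains a probability vector.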


Genetics ◽  
1974 ◽  
Vol 76 (2) ◽  
pp. 367-377 ◽  
Author(s):  
Takeo Maruyama

ABSTRACT A Markov process (chain) of gene frequency change is derived for a geographically structured model of a population. The population consists of colonies which are connected by migration. Selection operates in each colony independently. It is shown that there exists a stochastic clock that transforms the originally complicated process of gene frequency change into a random walk which is independent of the geographical structure of the population. The time parameter is a local random time that is dependent on the sample path. In fact, if the alleles are selectively neutral, the time parameter is exactly equal to the sum of the average local genetic variation appearing in the population, and otherwise the two are approximately equal. The Kolmogorov forward and backward equations of the process are obtained. In the limit of large population size, a diffusion process is derived. The transition probabilities of the Markov chain and of the diffusion process are obtained explicitly. Certain quantities of biological interest are shown to be independent of the population structure. The quantities are the fixation probability of a mutant, the sum of the average local genetic variation, and the variation summed over the generations in which the gene frequency in the whole population assumes a specified value.
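The structure-independence of the fixation probability can be illustrated in the simplest setting (an unstructured, selectively neutral Wright-Fisher chain; parameters are illustrative and this is not the paper's geographically structured model): the fixation probability of a neutral mutant equals its initial frequency.

```python
import numpy as np

# Hedged sketch: neutral Wright-Fisher chain X' ~ Binomial(2N, X/(2N)).
# The probability that the mutant allele fixes equals its initial
# frequency p0, independent of further structural detail.
def fixation_probability(N=50, p0=0.2, n_runs=4000, seed=0):
    rng = np.random.default_rng(seed)
    fixed = 0
    for _ in range(n_runs):
        x = int(2 * N * p0)                # initial copy number
        while 0 < x < 2 * N:               # iterate until loss or fixation
            x = rng.binomial(2 * N, x / (2 * N))
        fixed += (x == 2 * N)
    return fixed / n_runs
```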


2004 ◽  
Vol 2004 (8) ◽  
pp. 421-429 ◽  
Author(s):  
Souad Assoudou ◽  
Belkheir Essebbar

This note is concerned with Bayesian estimation of the transition probabilities of a binary Markov chain observed from heterogeneous individuals. The model is founded on Jeffreys' prior, which allows the transition probabilities to be correlated. The Bayesian estimator is approximated by means of Markov chain Monte Carlo (MCMC) techniques. The performance of the Bayesian estimates is illustrated by analyzing a small simulated data set.
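For intuition, here is a hedged simplification of the estimation problem: unlike the paper's Jeffreys prior, which correlates the transition probabilities, this sketch places an independent Jeffreys prior Beta(1/2, 1/2) on each row of the transition matrix, so the posterior mean is available in closed form and no MCMC is needed. All names are illustrative.

```python
import numpy as np

def simulate_chain(P, n, seed=0):
    """Simulate n transitions of a 2-state chain with transition matrix P."""
    rng = np.random.default_rng(seed)
    s, out = 0, [0]
    for _ in range(n):
        s = int(rng.random() < P[s][1])    # move to state 1 with prob P[s][1]
        out.append(s)
    return out

def posterior_transition_means(states):
    """Posterior mean of each row under independent Beta(1/2, 1/2) priors."""
    counts = np.zeros((2, 2))
    for a, b in zip(states[:-1], states[1:]):
        counts[a, b] += 1                  # observed i -> j transition counts
    return (counts + 0.5) / (counts.sum(axis=1, keepdims=True) + 1.0)
```

The Beta-Dirichlet conjugacy makes the posterior row means simple shifted transition frequencies; the correlated prior of the paper is what forces the resort to MCMC.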


Author(s):  
Peter L. Chesson

Abstract
Random transition probability matrices with stationary independent factors define "white noise" environment processes for Markov chains. Two examples are considered in detail. Such environment processes can be used to construct several Markov chains which are dependent, have the same transition probabilities and are jointly a Markov chain. Transition rates for such processes are evaluated. These results have application to the study of animal movements.
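A minimal sketch of the shared-environment construction (the matrices A, B and the mixing probability are illustrative, not the paper's examples): at every step an i.i.d. random transition matrix is drawn, and two chains both move according to it. Marginally each chain is Markov with transition matrix E[M] = (A + B)/2, while the shared environment makes the chains dependent.

```python
import numpy as np

# Hedged sketch of a "white noise" environment: the environment draws an
# i.i.d. transition matrix each step, and both chains x and y use it.
def environment_driven_marginal(steps=20000, seed=0):
    rng = np.random.default_rng(seed)
    A = np.array([[0.8, 0.2], [0.4, 0.6]])
    B = np.array([[0.2, 0.8], [0.6, 0.4]])
    x = y = 0
    counts = np.zeros((2, 2))              # empirical transitions of chain x
    for _ in range(steps):
        M = A if rng.random() < 0.5 else B  # i.i.d. environment draw
        nx = int(rng.random() < M[x, 1])    # both chains use the same M
        ny = int(rng.random() < M[y, 1])
        counts[x, nx] += 1
        x, y = nx, ny
    return counts / counts.sum(axis=1, keepdims=True)
```

The empirical marginal transition matrix approaches (A + B)/2, even though the joint evolution of (x, y) is correlated through the common environment.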


1968 ◽  
Vol 5 (2) ◽  
pp. 401-413 ◽  
Author(s):  
Paul J. Schweitzer

A perturbation formalism is presented which shows how the stationary distribution and fundamental matrix of a Markov chain containing a single irreducible set of states change as the transition probabilities vary. Expressions are given for the partial derivatives of the stationary distribution and fundamental matrix with respect to the transition probabilities. Semigroup properties of the generators of transformations from one Markov chain to another are investigated. It is shown that a perturbation formalism exists in the multiple-subchain case if and only if the change in the transition probabilities neither alters the number of subchains nor intermixes them. The formalism is presented for the case in which this condition is satisfied.
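The single-chain sensitivity result can be sketched numerically (the chain and perturbation below are illustrative): for a perturbation P -> P + E with rows of E summing to zero, the first-order expansion pi(P + E) ≈ pi(P) + pi(P) E Z holds, where Z = (I - P + 1 pi)^{-1} is the fundamental matrix.

```python
import numpy as np

def stationary(P):
    """Stationary distribution of an irreducible row-stochastic matrix:
    solve pi (I - P) = 0 together with sum(pi) = 1."""
    n = P.shape[0]
    A = np.vstack([(np.eye(n) - P).T, np.ones(n)])
    b = np.zeros(n + 1)
    b[-1] = 1.0
    pi, *_ = np.linalg.lstsq(A, b, rcond=None)
    return pi

def fundamental_matrix(P, pi):
    """Z = (I - P + 1 pi)^{-1}; each row of the outer product equals pi."""
    n = P.shape[0]
    return np.linalg.inv(np.eye(n) - P + np.outer(np.ones(n), pi))
```

For a small perturbation E the predicted pi + pi @ E @ Z agrees with the recomputed stationary distribution of P + E up to second-order terms in E.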


1976 ◽  
Vol 24 (1) ◽  
pp. 138-144 ◽  
Author(s):  
N J Pressman

Markovian analysis is a method for measuring optical texture based on gray-level transition probabilities in digitized images. Experiments are described that investigate the classification performance of parameters generated by Markovian analysis. Results using Markov texture parameters show that the choice of Markov step size strongly affects classification error rates and the number of parameters required to achieve the maximum correct classification rate. Markov texture parameters are shown to achieve high rates of correct classification in discriminating images of normal from abnormal cervical cell nuclei.
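The core computation can be sketched as follows (function and parameter names are illustrative): quantize an image to a few gray levels and estimate the gray-level transition probability matrix at a horizontal step size; texture parameters such as entropy or contrast are then derived from this matrix, and the step size changes what the matrix captures.

```python
import numpy as np

# Hedged sketch: gray-level transition probability matrix of an image at a
# given horizontal step size.  Rows are normalized to transition
# probabilities; empty rows stay zero.
def markov_transition_matrix(img, step=1, levels=4):
    q = np.minimum((img * levels).astype(int), levels - 1)  # img in [0, 1]
    src, dst = q[:, :-step].ravel(), q[:, step:].ravel()
    counts = np.zeros((levels, levels))
    np.add.at(counts, (src, dst), 1)       # accumulate transition counts
    row = counts.sum(axis=1, keepdims=True)
    return np.divide(counts, row, out=np.zeros_like(counts), where=row > 0)
```

On a two-level image of alternating columns, step size 1 yields the purely off-diagonal matrix while step size 2 yields the identity, illustrating the abstract's point that the step size strongly affects the extracted parameters.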

