Event-Chain Monte Carlo: Foundations, Applications, and Prospects

2021 ◽  
Vol 9 ◽  
Author(s):  
Werner Krauth

This review treats the mathematical and algorithmic foundations of non-reversible Markov chains in the context of event-chain Monte Carlo (ECMC), a continuous-time lifted Markov chain that employs the factorized Metropolis algorithm. It analyzes a number of model applications and then reviews the formulation as well as the performance of ECMC in key models in statistical physics. Finally, the review reports on an ongoing initiative to apply ECMC to the sampling problem in molecular simulation, i.e., to real-world models of peptides, proteins, and polymers in aqueous solution.
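To fix ideas, the following Python fragment sketches a single event-chain move for one-dimensional hard spheres on a periodic ring: a sphere moves in one direction until it collides with its neighbour, the motion is passed on, and the chain stops when a total displacement budget is used up. This is a minimal orientation sketch only; the function name event_chain_step, the displacement budget chain_length, and the one-dimensional geometry are assumptions of this sketch, not details taken from the review.

# Minimal sketch of a one-dimensional hard-sphere event-chain move
# (illustrative only; not the implementation described in the review).
import random

def event_chain_step(x, sigma, box, chain_length):
    """Advance hard spheres of diameter sigma on a ring of length box.

    x            -- list of sphere positions, ordered around the ring
    chain_length -- total displacement budget of one event chain
    """
    n = len(x)
    k = random.randrange(n)                    # sphere that starts the chain
    budget = chain_length
    while budget > 0.0:
        nxt = (k + 1) % n                      # neighbour in the direction of motion
        gap = (x[nxt] - x[k] - sigma) % box    # free distance before contact
        step = min(budget, gap)
        x[k] = (x[k] + step) % box             # move until collision or budget used up
        budget -= step
        if budget > 0.0:                       # collision event: pass the motion on
            k = nxt
    return x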

1995 ◽  
Vol 9 (2) ◽  
pp. 227-237 ◽  
Author(s):  
Taizhong Hu ◽  
Harry Joe

Let (X1, X2) and (Y1, Y2) be bivariate random vectors with a common marginal distribution. (X1, X2) is said to be more positively dependent than (Y1, Y2) if E[h(X1)h(X2)] ≥ E[h(Y1)h(Y2)] for all functions h for which the expectations exist. The purpose of this paper is to study the monotonicity of positive dependence with time for a stationary reversible Markov chain {Xt}; that is, (Xs, Xt+s) becomes less positively dependent as t increases. Both discrete and continuous time and both a denumerable set and a subset of the real line for the state space are considered. Some examples are given to show that the assertions established for reversible Markov chains are not true for nonreversible chains.
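The following numerical sketch (not taken from the paper) illustrates the assertion in the simplest reversible case: for a stationary two-state chain, the quantity E[h(X_0)h(X_t)] decreases as t grows. The transition matrix P and function h below are arbitrary illustrative choices.

# Numerical illustration: for a reversible two-state chain,
# E[h(X_0) h(X_t)] computed at stationarity decreases as t grows.
import numpy as np

P = np.array([[0.9, 0.1],
              [0.2, 0.8]])            # every two-state chain is reversible
pi = np.array([2/3, 1/3])             # stationary distribution of P
h = np.array([1.0, -1.0])             # an arbitrary bounded function of the state

M = np.eye(2)
for t in range(6):
    # E[h(X_0) h(X_t)] = sum_{i,j} pi_i h_i (P^t)_{ij} h_j
    print(t, float(pi @ (h * (M @ h))))
    M = M @ P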


2004 ◽  
Vol 2004 (8) ◽  
pp. 421-429 ◽  
Author(s):  
Souad Assoudou ◽  
Belkheir Essebbar

This note is concerned with Bayesian estimation of the transition probabilities of a binary Markov chain observed from heterogeneous individuals. The model is founded on Jeffreys' prior, which allows the transition probabilities to be correlated. The Bayesian estimator is approximated by means of Markov chain Monte Carlo (MCMC) techniques. The performance of the Bayesian estimates is illustrated by analyzing a small simulated data set.
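A minimal sketch of the estimation step, under the simplifying assumption of an independent Beta(1/2, 1/2) (Jeffreys-type) prior on each row of the transition matrix; in that case the posterior is conjugate and no MCMC is needed, whereas the joint Jeffreys prior of the paper correlates the probabilities and is approximated by MCMC. The function posterior_rows and the toy data are assumptions of this sketch.

# Hedged sketch: Bayesian estimation of binary-chain transition probabilities
# with an independent Beta(1/2, 1/2) prior on each row (a simplification of
# the paper's joint Jeffreys prior, which correlates the rows).
import numpy as np

def posterior_rows(chain):
    """Return Beta posterior parameters for p01 = P(1|0) and p10 = P(0|1)."""
    n = np.zeros((2, 2))
    for a, b in zip(chain[:-1], chain[1:]):   # count observed transitions
        n[a, b] += 1
    # Beta(1/2 + transitions out of the state, 1/2 + stays) for each row
    return {"p01": (0.5 + n[0, 1], 0.5 + n[0, 0]),
            "p10": (0.5 + n[1, 0], 0.5 + n[1, 1])}

chain = [0, 0, 1, 1, 0, 1, 0, 0, 0, 1]        # toy observed binary chain
for name, (a, b) in posterior_rows(chain).items():
    print(name, "posterior mean:", a / (a + b))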


1982 ◽  
Vol 19 (3) ◽  
pp. 692-694 ◽  
Author(s):  
Mark Scott ◽  
Barry C. Arnold ◽  
Dean L. Isaacson

Characterizations of strong ergodicity for Markov chains using mean visit times have been found by several authors (Huang and Isaacson (1977), Isaacson and Arnold (1978)). In this paper a characterization of uniform strong ergodicity for a continuous-time non-homogeneous Markov chain is given. This extends the characterization, using mean visit times, that was given by Isaacson and Arnold.
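For orientation, one standard formulation of the two notions (notation assumed here rather than quoted from the paper), writing P^{(m,n)} for the transition matrix from time m to time n and π for a fixed probability vector:

\[
\text{strong ergodicity:}\qquad
\forall m:\ \lim_{n\to\infty}\ \sup_i \bigl\| P^{(m,n)}(i,\cdot) - \pi \bigr\|_{\mathrm{TV}} = 0,
\]
\[
\text{uniform strong ergodicity:}\qquad
\lim_{n\to\infty}\ \sup_m\ \sup_i \bigl\| P^{(m,m+n)}(i,\cdot) - \pi \bigr\|_{\mathrm{TV}} = 0.
\]

The continuous-time setting of the paper replaces the discrete indices m, n by real times s ≤ t.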


1996 ◽  
Vol 33 (3) ◽  
pp. 640-653 ◽  
Author(s):  
Tobias Rydén

An aggregated Markov chain is a Markov chain for which some states cannot be distinguished from each other by the observer. In this paper we consider the identifiability problem for such processes in continuous time, i.e. the problem of determining whether two parameters induce identical laws for the observable process or not. We also study the order of a continuous-time aggregated Markov chain, which is the minimum number of states needed to represent it. In particular, we give a lower bound on the order. As a by-product, we obtain results of this kind also for Markov-modulated Poisson processes, i.e. doubly stochastic Poisson processes whose intensities are directed by continuous-time Markov chains, and phase-type distributions, which are hitting times in finite-state Markov chains.
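As a reminder of the phase-type construction mentioned above, the following sketch evaluates the distribution function F(t) = 1 − α exp(Tt)1 of the absorption time of a small transient chain. The sub-intensity matrix T and initial vector α are arbitrary illustrative choices, not taken from the paper.

# Illustrative sketch (standard construction, not the paper's analysis):
# a phase-type distribution is the hitting time of an absorbing state in a
# finite continuous-time Markov chain, with CDF F(t) = 1 - alpha exp(T t) 1.
import numpy as np
from scipy.linalg import expm

alpha = np.array([1.0, 0.0])           # start in the first transient state
T = np.array([[-2.0, 1.0],             # sub-intensity matrix over the two
              [ 0.0, -1.0]])           # transient states (absorption rate 1 each)
ones = np.ones(2)

for t in [0.5, 1.0, 2.0, 4.0]:
    cdf = 1.0 - alpha @ expm(T * t) @ ones
    print(f"F({t}) = {cdf:.4f}")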


1968 ◽  
Vol 5 (03) ◽  
pp. 669-678 ◽  
Author(s):  
Jozef L. Teugels

A general proposition is proved stating that the exponential ergodicity of a stationary Markov chain is preserved for derived Markov chains as defined by Cohen [2], [3]. An application to a certain type of continuous time Markov chains is included.


1993 ◽  
Vol 30 (3) ◽  
pp. 518-528 ◽  
Author(s):  
Frank Ball ◽  
Geoffrey F. Yeo

We consider lumpability for continuous-time Markov chains and provide a simple probabilistic proof of necessary and sufficient conditions for strong lumpability, valid in circumstances not covered by known theory. We also consider the following marginalisability problem. Let {X(t)} = {(X1(t), X2(t), …, Xm(t))} be a continuous-time Markov chain. Under what conditions are the marginal processes {X1(t)}, {X2(t)}, …, {Xm(t)} also continuous-time Markov chains? We show that this is related to lumpability and, if no two of the marginal processes can jump simultaneously, then they are continuous-time Markov chains if and only if they are mutually independent. Applications to ion channel modelling and birth–death processes are discussed briefly.
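A minimal sketch of the textbook strong-lumpability check for a generator Q (the paper's probabilistic proof covers situations beyond this check): for every pair of distinct lumps A and B, the total rate from a state into B must not depend on which state of A it is. The function name is_strongly_lumpable and the example generator are assumptions of this sketch.

# Textbook strong-lumpability check for a CTMC generator Q and a partition
# of the state space (illustrative only).
import numpy as np

def is_strongly_lumpable(Q, partition):
    for A in partition:
        for B in partition:
            if A is B:
                continue
            # total rate from each state of A into the lump B
            rates = [sum(Q[i, j] for j in B) for i in A]
            if not np.allclose(rates, rates[0]):
                return False
    return True

Q = np.array([[-3.0, 1.0, 2.0],
              [ 1.0, -3.0, 2.0],
              [ 1.0, 1.0, -2.0]])
print(is_strongly_lumpable(Q, [[0, 1], [2]]))   # True: states 0 and 1 lump together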


1998 ◽  
Vol 35 (01) ◽  
pp. 1-11 ◽  
Author(s):  
Gareth O. Roberts ◽  
Jeffrey S. Rosenthal ◽  
Peter O. Schwartz

In this paper, we consider the question of which convergence properties of Markov chains are preserved under small perturbations. Properties considered include geometric ergodicity and rates of convergence. Perturbations considered include roundoff error from computer simulation. We are motivated primarily by interest in Markov chain Monte Carlo algorithms.
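A small numerical illustration (not the paper's quantitative results): perturbing a transition matrix by terms of roundoff size, chosen so that the rows still sum to one, moves its stationary distribution only slightly. The helper stationary and the matrices below are arbitrary choices for this sketch.

# Illustrative check: a roundoff-sized perturbation of a transition matrix
# produces only a small change in the stationary distribution.
import numpy as np

def stationary(P):
    """Left eigenvector of P for eigenvalue 1, normalised to a probability vector."""
    vals, vecs = np.linalg.eig(P.T)
    v = np.real(vecs[:, np.argmin(np.abs(vals - 1.0))])
    return v / v.sum()

P = np.array([[0.5, 0.5, 0.0],
              [0.2, 0.6, 0.2],
              [0.0, 0.3, 0.7]])
E = 1e-6 * np.array([[ 1.0, -1.0, 0.0],     # perturbation with zero row sums,
                     [ 0.0,  1.0, -1.0],    # so P + E is still stochastic
                     [ 1.0, -1.0, 0.0]])
print(np.abs(stationary(P) - stationary(P + E)).max())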


1998 ◽  
Vol 35 (3) ◽  
pp. 545-556 ◽  
Author(s):  
Masaaki Kijima

A continuous-time Markov chain on the non-negative integers is called skip-free to the right (left) if only unit increments to the right (left) are permitted. If a Markov chain is skip-free both to the right and to the left, it is called a birth–death process. Karlin and McGregor (1959) showed that if a continuous-time Markov chain is monotone in the sense of likelihood ratio ordering then it must be an (extended) birth–death process. This paper proves that if an irreducible Markov chain in continuous time is monotone in the sense of hazard rate (reversed hazard rate) ordering then it must be skip-free to the right (left). A birth–death process is then characterized as a continuous-time Markov chain that is monotone in the sense of both hazard rate and reversed hazard rate orderings. As an application, the first-passage-time distributions of such Markov chains are also studied.
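The skip-free property itself is easy to read off a finite generator. The following sketch (illustrative only, not part of the paper) classifies a generator on {0, …, N} and recognises a birth–death process as a chain that is skip-free in both directions; the function name skip_free and the example generator are assumptions of this sketch.

# Classify a finite CTMC generator as skip-free to the right and/or left,
# i.e. whether only unit jumps in that direction are permitted.
import numpy as np

def skip_free(Q):
    n = Q.shape[0]
    right = all(Q[i, j] == 0 for i in range(n) for j in range(i + 2, n))
    left  = all(Q[i, j] == 0 for i in range(n) for j in range(0, i - 1))
    return right, left

# A birth-death generator is skip-free in both directions.
Q = np.array([[-1.0, 1.0, 0.0],
              [ 2.0, -5.0, 3.0],
              [ 0.0, 4.0, -4.0]])
print(skip_free(Q))   # (True, True): a birth-death process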


1989 ◽  
Vol 26 (3) ◽  
pp. 643-648 ◽  
Author(s):  
A. I. Zeifman

We consider a non-homogeneous continuous-time Markov chain X(t) with countable state space. Definitions of uniform and strong quasi-ergodicity are introduced. The forward Kolmogorov system for X(t) is considered as a differential equation in the space of sequences l1. Sufficient conditions for uniform quasi-ergodicity are deduced from this equation. We consider conditions of uniform and strong ergodicity in the case of proportional intensities.
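A hedged numerical sketch of the forward Kolmogorov system p'(t) = p(t)Q(t) for a two-state chain with proportional intensities Q(t) = a(t)Q0, integrated over a finite horizon. The factor a(t) and the generator Q0 are arbitrary choices for this sketch, whereas the paper treats the infinite system as a differential equation in the sequence space l1.

# Forward Kolmogorov system p'(t) = p(t) Q(t) for a small non-homogeneous
# chain with proportional intensities (illustrative truncated example).
import numpy as np
from scipy.integrate import solve_ivp

def Q(t):
    """Time-dependent generator: common factor a(t) times a fixed generator."""
    a = 1.0 + 0.5 * np.sin(t)              # positive, time-varying intensity factor
    return a * np.array([[-1.0, 1.0],
                         [ 2.0, -2.0]])

sol = solve_ivp(lambda t, p: p @ Q(t), (0.0, 10.0), [1.0, 0.0], rtol=1e-8)
print(sol.y[:, -1])                        # close to the ergodic limit (2/3, 1/3)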

