Markov Processes and Random Sampling

Author(s):  
Ulf Grenander ◽  
Michael I. Miller

The parameter spaces of natural patterns are so complex that inference must often proceed compositionally, successively building up more and more complex structures, as well as back-tracking to create simpler structures from more complex ones. Inference is transformational in nature. The philosophical approach studied in this chapter is that the posterior distribution describing the patterns contains all of the information about the underlying regular structure. The transformations of inference are therefore guided by the posterior, in the sense that the algorithm for changing the regular structures corresponds to the sample path of a Markov process. The Markov process is constructed to push towards the posterior distribution in which the information about the patterns is stored. This provides the deep connection between the transformational paradigm of regular-structure creation and random sampling algorithms.
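
As a concrete illustration of sampling a posterior by running a Markov process, here is a minimal Python sketch using a random-walk Metropolis kernel; the Gaussian target, proposal scale, and function names are illustrative assumptions, not the chapter's construction.

```python
import numpy as np

def metropolis_sampler(log_post, x0, n_steps=10_000, step=0.5, seed=0):
    """Random-walk Metropolis: a Markov process whose sample path is
    pushed toward, and in the limit samples from, exp(log_post)."""
    rng = np.random.default_rng(seed)
    x = np.asarray(x0, dtype=float)
    lp = log_post(x)
    samples = np.empty((n_steps, x.size))
    for i in range(n_steps):
        prop = x + step * rng.standard_normal(x.size)  # propose a transformation
        lp_prop = log_post(prop)
        if np.log(rng.uniform()) < lp_prop - lp:       # Metropolis accept/reject
            x, lp = prop, lp_prop
        samples[i] = x
    return samples

# Toy posterior: a standard bivariate Gaussian standing in for a pattern posterior.
draws = metropolis_sampler(lambda x: -0.5 * x @ x, x0=np.zeros(2))
print(draws.mean(axis=0), draws.var(axis=0))           # ~0 and ~1
```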

1969 ◽  
Vol 36 ◽  
pp. 1-26 ◽  
Author(s):  
Hiroshi Kunita

Let $(x_t, \zeta, \mathcal{B}_t, P_x)$ be a (standard) Markov process with state space $S$ defined on the abstract space $\Omega$. Here, $x_t$ is the sample path, $\zeta$ is the terminal time, and $\mathcal{B}_t$ is the smallest $\sigma$-field of $\Omega$ with respect to which $x_s$, $s \le t$, are measurable. Let $P'_x$, $x \in S$, be another family of Markovian measures defined on $(\Omega, \mathcal{B}_t)$.


Genetics ◽  
1974 ◽  
Vol 76 (2) ◽  
pp. 367-377
Author(s):  
Takeo Maruyama

A Markov process (chain) of gene frequency change is derived for a geographically structured model of a population. The population consists of colonies which are connected by migration. Selection operates in each colony independently. It is shown that there exists a stochastic clock that transforms the originally complicated process of gene frequency change into a random walk which is independent of the geographical structure of the population. The time parameter is a local random time that depends on the sample path. In fact, if the alleles are selectively neutral, the time parameter is exactly equal to the sum of the average local genetic variation appearing in the population; otherwise the two are approximately equal. The Kolmogorov forward and backward equations of the process are obtained. In the limit of large population size, a diffusion process is derived. The transition probabilities of the Markov chain and of the diffusion process are obtained explicitly. Certain quantities of biological interest are shown to be independent of the population structure. These quantities are the fixation probability of a mutant, the sum of the average local genetic variation, and the variation summed over the generations in which the gene frequency in the whole population assumes a specified value.
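
One of the structure-independent quantities named in the abstract, the fixation probability of a mutant, is easy to probe numerically. The following Python sketch assumes a deliberately simplified instance (two colonies, neutral alleles, symmetric migration at rate m, Wright–Fisher binomial resampling); all names and parameter values are placeholders, not Maruyama's general model.

```python
import numpy as np

rng = np.random.default_rng(1)

def fixation_prob(m, N=100, p0=0.1, reps=2000):
    """Two-colony neutral Wright-Fisher chain, colony size N, symmetric
    migration rate m. Returns the Monte Carlo fixation probability."""
    fixed = 0
    for _ in range(reps):
        p = np.array([p0, p0])
        while 0.0 < p.mean() < 1.0:
            p_mig = p + m * (p.mean() - p)            # migration pulls toward the mean
            p = rng.binomial(2 * N, p_mig) / (2 * N)  # binomial sampling drift
        fixed += p.mean() == 1.0
    return fixed / reps

# Fixation probability ~ p0 regardless of the migration rate: one of the
# structure-independent quantities of the abstract.
print(fixation_prob(0.01), fixation_prob(0.5))        # both ~ 0.1
```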


Author(s):  
UWE FRANZ

We show how classical Markov processes can be obtained from quantum Lévy processes. It is shown that quantum Lévy processes are quantum Markov processes, and sufficient conditions for restrictions to subalgebras to remain quantum Markov processes are given. A classical Markov process (which has the same time-ordered moments as the quantum process in the vacuum state) exists whenever we can restrict to a commutative subalgebra without losing the quantum Markov property. Several examples, including the Azéma martingale, are presented with explicit calculations. In particular, the action of the generator of the classical Markov processes on polynomials or their moments is calculated using Hopf algebra duality.


2020 ◽  
Vol 57 (4) ◽  
pp. 1045-1069
Author(s):  
Matija Vidmar

For a spectrally negative self-similar Markov process on $[0,\infty)$ with an a.s. finite overall supremum, we provide, in tractable detail, a kind of conditional Wiener–Hopf factorization, at the maximum, of the absorption time at zero, the conditioning being on the overall supremum and the jump at the overall supremum. In a companion result the Laplace transform of this absorption time (on the event that the process does not go above a given level) is identified under no other assumptions (such as the process admitting a recurrent extension and/or hitting zero continuously), generalizing some existing results in the literature.
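
Via the Lamperti representation, the absorption time at zero of such a process started at $x_0$ can be written as the exponential functional $T_0 = x_0^{\alpha}\int_0^{\infty} e^{\alpha \xi_s}\,ds$ of the underlying Lévy process $\xi$. The rough Monte Carlo sketch below assumes the simplest spectrally negative case, Brownian motion with negative drift, with $\alpha = 1$ and a truncated horizon; it merely illustrates the object whose conditional Laplace transform the paper identifies.

```python
import numpy as np

rng = np.random.default_rng(2)

def absorption_times(x0=1.0, alpha=1.0, mu=-1.0, sigma=1.0,
                     T=50.0, dt=1e-2, reps=1000):
    """Monte Carlo draws of T0 = x0**alpha * integral_0^T exp(alpha*xi_s) ds,
    with xi a Brownian motion with drift mu < 0 (so the full integral is
    a.s. finite and the truncation at T loses little mass)."""
    n = int(T / dt)
    steps = mu * dt + sigma * np.sqrt(dt) * rng.standard_normal((reps, n))
    xi = np.cumsum(steps, axis=1)
    return x0**alpha * np.exp(alpha * xi).sum(axis=1) * dt

T0 = absorption_times()
q = 1.0
print("E[exp(-q*T0)] ~", np.exp(-q * T0).mean())  # Laplace transform of the (truncated) absorption time
```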


1999 ◽  
Vol 36 (01) ◽  
pp. 48-59 ◽  
Author(s):  
George V. Moustakides

Let $\xi_0, \xi_1, \xi_2, \ldots$ be a homogeneous Markov process and let $S_n$ denote the partial sum $S_n = \theta(\xi_1) + \cdots + \theta(\xi_n)$, where $\theta(\xi)$ is a scalar nonlinearity. If $N$ is a stopping time with $\mathbb{E}N < \infty$ and the Markov process satisfies certain ergodicity properties, we then show that $\mathbb{E}S_N = [\lim_{n\to\infty}\mathbb{E}\theta(\xi_n)]\,\mathbb{E}N + \mathbb{E}\omega(\xi_0) - \mathbb{E}\omega(\xi_N)$. The function $\omega(\xi)$ is a well-defined scalar nonlinearity directly related to $\theta(\xi)$ through a Poisson integral equation, with the characteristic that $\omega(\xi)$ becomes zero in the i.i.d. case. Consequently, our result constitutes an extension of Wald's first lemma to the case of Markov processes. We also show that, as $\mathbb{E}N \to \infty$, the correction term is negligible compared to $\mathbb{E}N$, in the sense that $\mathbb{E}\omega(\xi_0) - \mathbb{E}\omega(\xi_N) = o(\mathbb{E}N)$.
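
The identity is easy to verify numerically. The sketch below assumes a two-state chain with an illustrative transition matrix and nonlinearity, solves one form of the Poisson equation consistent with the identity, $\omega - P\omega = P\theta - \lambda$ with $\lambda$ the stationary mean of $\theta$, and compares both sides for the stopping time $N = \inf\{n \ge 1 : \xi_n = 1\}$; additive constants in $\omega$ cancel, so any particular solution will do.

```python
import numpy as np

rng = np.random.default_rng(3)

# Two-state chain: a tiny, fully explicit instance of the identity
# E[S_N] = lam * E[N] + E[w(xi_0)] - E[w(xi_N)].
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
theta = np.array([1.0, -0.5])

# Stationary distribution pi (left eigenvector for eigenvalue 1), lam = pi . theta.
evals, evecs = np.linalg.eig(P.T)
pi = np.real(evecs[:, np.argmax(np.real(evals))])
pi /= pi.sum()
lam = pi @ theta

# (I - P) w = P theta - lam is singular, but additive constants in w
# cancel in the identity, so the least-squares solution suffices.
w = np.linalg.lstsq(np.eye(2) - P, P @ theta - lam, rcond=None)[0]

# Stopping time N = first n >= 1 with xi_n = 1, starting from xi_0 = 0.
S_N, N = [], []
for _ in range(50_000):
    x, s, n = 0, 0.0, 0
    while x != 1:
        x = rng.choice(2, p=P[x])
        s += theta[x]
        n += 1
    S_N.append(s); N.append(n)

print(np.mean(S_N))                    # left-hand side
print(lam * np.mean(N) + w[0] - w[1])  # right-hand side; agrees up to Monte Carlo error
```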


1970 ◽  
Vol 7 (2) ◽  
pp. 400-410 ◽  
Author(s):  
Tore Schweder

Many phenomena studied in the social sciences and elsewhere are complexes of more or less independent characteristics which develop simultaneously. Such phenomena may often be realistically described by time-continuous finite Markov processes. In order to define such a model in a way that incorporates all the relevant a priori information, there ought to be a way of defining a Markov process as a vector of components representing the various characteristics constituting the phenomenon, such that the dependences between the characteristics are represented by explicit requirements on the Markov process, preferably on its infinitesimal generator.
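
As a small illustration of writing the dependence structure directly into the generator, the Python sketch below assembles the rate matrix of a two-component process (A, B) in which A develops without regard to B while B's flip rates depend on A; all rates are invented placeholders, and the requirement that components do not jump simultaneously is built into the off-diagonal pattern.

```python
import numpy as np

# Two binary characteristics (A, B). All rates are invented placeholders.
# A's flip rates ignore B; B's flip rate depends on the current state of A.
a_rate = {0: 1.0, 1: 0.5}                # rate of A flipping out of state a
b_rate = {(0, 0): 0.2, (0, 1): 0.1,      # (a, b): rate of B flipping out of b given A = a
          (1, 0): 2.0, (1, 1): 0.3}

states = [(a, b) for a in (0, 1) for b in (0, 1)]
Q = np.zeros((4, 4))
for i, (a, b) in enumerate(states):
    Q[i, states.index((1 - a, b))] = a_rate[a]       # A moves, B unchanged
    Q[i, states.index((a, 1 - b))] = b_rate[(a, b)]  # B moves, rate depends on A
    Q[i, i] = -Q[i].sum()                            # generator rows sum to zero

print(Q)   # the infinitesimal generator of the composite process
```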


1993 ◽  
Vol 6 (4) ◽  
pp. 385-406 ◽  
Author(s):  
N. U. Ahmed ◽  
Xinhong Ding

We consider a nonlinear (in the sense of McKean) Markov process described by a stochastic differential equation in $\mathbb{R}^d$. We prove the existence and uniqueness of invariant measures for such processes.
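
A standard way to approximate such a nonlinear process numerically is an interacting-particle system in which the law of the process is replaced by the empirical measure. The sketch below assumes a toy one-dimensional drift depending on the law only through its mean; the specific coefficients are illustrative, not the class treated in the paper.

```python
import numpy as np

rng = np.random.default_rng(4)

# Particle approximation of a toy McKean-Vlasov SDE in one dimension:
#   dX_t = -(X_t - E[X_t]) dt - X_t dt + dW_t,
# the drift depending on the law of X_t through its mean, which the
# particle system replaces by the empirical mean.
n_particles, T, dt = 2000, 20.0, 1e-2
x = 3.0 * rng.standard_normal(n_particles)   # an arbitrary initial law
for _ in range(int(T / dt)):
    drift = -(x - x.mean()) - x              # mean-field + confining terms
    x += drift * dt + np.sqrt(dt) * rng.standard_normal(n_particles)

# The empirical law settles near the invariant measure N(0, 1/4) of this toy model.
print(x.mean(), x.var())
```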


Author(s):  
Ulf Grenander ◽  
Michael I. Miller

This chapter explores random sampling algorithms for generating conditional expectations in hypothesis spaces in which there is a mixture of discrete, disconnected subsets. Random samples are generated via direct simulation of a Markov process whose state moves through the hypothesis space, with the ergodic property that the transition distribution of the Markov process converges to the posterior distribution. This allows for the empirical generation of conditional expectations under the posterior. To accommodate the connected and disconnected parts of the state space, the Markov process is forced to satisfy jump–diffusion dynamics. Through the connected parts of the parameter space (Lie manifolds) the algorithm searches continuously, with sample paths corresponding to solutions of standard diffusion equations. Across the disconnected parts of the parameter space the jump process determines the dynamics. The infinitesimal properties of these jump–diffusion processes are selected so that various sample statistics converge to their expectations under the posterior.
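
A minimal sketch of jump–diffusion sampling on a hybrid space, assuming a toy target of two equally weighted, well-separated Gaussian components indexed by a discrete label: within a component the sample path follows a discretized Langevin diffusion, and at a small rate a jump to the other component is proposed and accepted Metropolis-style. The jump proposal, rates, and densities are all invented for illustration, not the infinitesimal characteristics constructed in the chapter.

```python
import numpy as np

rng = np.random.default_rng(5)

means = np.array([-3.0, 3.0])     # two disconnected hypotheses, equal weight

def log_dens(k, x):               # unnormalized log posterior on {0,1} x R
    return -0.5 * (x - means[k]) ** 2

k, x = 0, -3.0
dt, jump_rate = 0.01, 0.1
visits = np.zeros(2)
for _ in range(200_000):
    if rng.uniform() < jump_rate * dt:            # jump across components
        k_new = 1 - k
        x_new = x - means[k] + means[k_new]       # carry the local offset along
        if np.log(rng.uniform()) < log_dens(k_new, x_new) - log_dens(k, x):
            k, x = k_new, x_new
    else:                                         # diffuse within the component
        x += -(x - means[k]) * dt + np.sqrt(2 * dt) * rng.standard_normal()
    visits[k] += 1

print(visits / visits.sum())      # long-run occupancy ~ (0.5, 0.5)
```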


1966 ◽  
Vol 3 (1) ◽  
pp. 48-54 ◽  
Author(s):  
William F. Massy

Most empirical work on Markov processes for brand choice has been based on aggregative data. This article explores the validity of the crucial assumption that underlies such analyses, i.e., that all the families in the sample follow a Markov process with the same or similar transition probability matrices. The results show that there is a great deal of diversity among families’ switching processes, and that many of them are of zero rather than first order.
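
The zero-versus-first-order question in the last sentence can be posed, for a single family's purchase sequence, as a likelihood-ratio test of independence against a first-order Markov chain, with $(k-1)^2$ degrees of freedom for $k$ brands. The sketch below is an illustrative Python implementation on simulated sequences, not Massy's procedure.

```python
import numpy as np
from scipy.stats import chi2

rng = np.random.default_rng(6)

def order_test(seq, k):
    """Likelihood-ratio test of zero-order (independent purchases) against
    first-order Markov switching, for one family's sequence over k brands."""
    n = np.zeros((k, k))
    for a, b in zip(seq[:-1], seq[1:]):
        n[a, b] += 1
    row = n.sum(axis=1, keepdims=True)            # first-order: p_ij = n_ij / n_i.
    col = n.sum(axis=0) / n.sum()                 # zero-order:  p_j  = n_.j / n_..
    with np.errstate(divide="ignore", invalid="ignore"):
        g2 = 2 * np.nansum(n * np.log((n / row) / col))
    return g2, chi2.sf(g2, df=(k - 1) ** 2)

iid = rng.choice(3, size=300, p=[0.5, 0.3, 0.2])  # a zero-order family
P = np.array([[0.8, 0.1, 0.1],
              [0.1, 0.8, 0.1],
              [0.1, 0.1, 0.8]])
sticky = [0]                                      # a first-order, loyal family
for _ in range(299):
    sticky.append(rng.choice(3, p=P[sticky[-1]]))

print(order_test(iid, 3))      # large p-value: zero order not rejected
print(order_test(sticky, 3))   # tiny p-value: first-order structure detected
```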

