Multivariate normal approximation with Stein’s method of exchangeable pairs under a general linearity condition

2009 · Vol 37 (6) · pp. 2150-2173 · Author(s): Gesine Reinert, Adrian Röllin

1996 · Vol 33 (1) · pp. 1-17 · Author(s): Larry Goldstein, Yosef Rinott

Stein's method is used to obtain two theorems on multivariate normal approximation. Our main theorem, Theorem 1.2, provides a bound on the distance to normality for any non-negative random vector. Theorem 1.2 requires a multivariate size bias coupling, which we discuss in studying the approximation of distributions of sums of dependent random vectors. In the univariate case, we briefly illustrate this approach for certain sums of nonlinear functions of multivariate normal variables. As a second illustration, we show that the multivariate distribution counting the number of vertices with given degrees in certain random graphs is asymptotically multivariate normal and obtain a bound on the rate of convergence. Both examples demonstrate that this approach may be suitable for situations involving non-local dependence. We also present Theorem 1.4 for sums of vectors having a local type of dependence. We apply this theorem to obtain a multivariate normal approximation for the distribution of the random p-vector which counts the number of edges in a fixed graph both of whose vertices have the same given color, when each vertex is colored by one of p colors independently. None of the normal approximation results presented here requires an ordering of the summands related to the dependence structure, in contrast to the hypotheses of classical central limit theorems and examples, which involve, for example, martingale, Markov chain, or various mixing assumptions.
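For context, the following is the standard formulation of size biasing, not necessarily the exact multivariate construction used in the paper: a non-negative random variable W with mean \mu = \mathbb{E}W > 0 has a size-biased version W^s if

\[
\mathbb{E}\bigl[W f(W)\bigr] = \mu\, \mathbb{E}\bigl[f(W^s)\bigr]
\qquad \text{for all } f \text{ with } \mathbb{E}\lvert W f(W)\rvert < \infty .
\]

In the multivariate setting a non-negative random vector is size biased in each coordinate direction separately, and a coupling of the vector with these size-biased versions is what enters a bound of the kind described in Theorem 1.2.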


2009 · Vol 147 (1) · pp. 95-114 · Author(s): Adam J. Harper

In this paper we apply Stein's method for distributional approximations to prove a quantitative form of the Erdős–Kac Theorem. We obtain our best bound on the rate of convergence, on the order of log log log n (log log n)^{-1/2}, by making an intermediate Poisson approximation; we believe that this approach is simpler and more probabilistic than others, and we also obtain an explicit numerical value for the constant implicit in the bound. Different ways of applying Stein's method to prove the Erdős–Kac Theorem are discussed, including a normal approximation argument via exchangeable pairs, where the suitability of a Poisson approximation naturally suggests itself.
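For reference, the qualitative statement being quantified here is the Erdős–Kac Theorem: writing \omega(m) for the number of distinct prime factors of m and taking m uniformly distributed on \{1, \dots, n\},

\[
\frac{\omega(m) - \log\log n}{\sqrt{\log\log n}} \;\xrightarrow{\;d\;}\; \mathcal{N}(0,1)
\qquad \text{as } n \to \infty ,
\]

and the bound of order log log log n (log log n)^{-1/2} above quantifies the rate of this convergence.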


2010 · Vol 47 (2) · pp. 378-393 · Author(s): Gesine Reinert, Adrian Röllin

In Reinert and Röllin (2009) a new approach, called the ‘embedding method’, was introduced, which allows us to make use of exchangeable pairs for normal and multivariate normal approximations with Stein's method in cases where the corresponding couplings do not satisfy a certain linearity condition. The key idea is to embed the problem into a higher-dimensional space in such a way that the linearity condition is then satisfied. Here we apply the embedding to U-statistics as well as to subgraph counts in random graphs.
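The linearity condition in question is, roughly, the requirement in Reinert and Röllin (2009) that an exchangeable pair (W, W') of d-dimensional random vectors satisfies

\[
\mathbb{E}\bigl[W' - W \mid W\bigr] = -\Lambda W + R
\]

for some invertible d \times d matrix \Lambda and a remainder term R that is small in an appropriate sense. The embedding method enlarges W so that an identity of this form holds for the enlarged pair.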


2009 · Vol 18 (6) · pp. 979-1017 · Author(s): Nathan Ross

Stein's method of exchangeable pairs is examined through five examples in relation to Poisson and normal distribution approximation. In particular, in the case where the exchangeable pair is constructed from a reversible Markov chain, we analyse how modifying the step size of the chain in a natural way affects the error term in the approximation obtained through Stein's method. It has been noted for the normal approximation that smaller step sizes may yield better bounds, and we obtain the first rigorous results that verify this intuition. For the examples associated with the normal distribution, the bound on the error is expressed in terms of the spectrum of the underlying chain, a characteristic of the chain related to convergence rates. The Poisson approximation using exchangeable pairs is less studied than the normal, but in the examples presented here the same principles hold.
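To make the construction concrete, here is a standard example stated independently of the paper: if X is a stationary reversible Markov chain, X' is obtained from X by one step of the chain, and W = f(X), W' = f(X') for an eigenfunction f of the transition kernel with eigenvalue 1 - \lambda, then (W, W') is an exchangeable pair satisfying

\[
\mathbb{E}\bigl[W' - W \mid W\bigr] = -\lambda\, W .
\]

Changing the step size of the chain changes \lambda, which is one way the spectrum of the chain enters the resulting error bounds.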

