High-Dimensional Filtering Using Nested Sequential Monte Carlo

2019 ◽  
Vol 67 (16) ◽  
pp. 4177-4188 ◽  
Author(s):  
Christian A. Naesseth ◽  
Fredrik Lindsten ◽  
Thomas B. Schön

2020 ◽  
Author(s):  
Sangeetika Ruchi ◽  
Svetlana Dubinkina ◽  
Jana de Wiljes

Abstract. Identification of unknown parameters on the basis of partial and noisy data is a challenging task, in particular in high-dimensional and nonlinear settings. Gaussian approximations to the problem, such as ensemble Kalman inversion, tend to be robust and computationally cheap and often produce astonishingly accurate estimates despite the inherently wrong underlying assumptions. Yet there is a lot of room for improvement, specifically regarding the description of the associated statistics. The tempered ensemble transform particle filter is an adaptive sequential Monte Carlo method, where resampling is based on an optimal transport mapping. Unlike ensemble Kalman inversion, it does not require any assumptions regarding the posterior distribution and hence has been shown to provide promising results for non-linear non-Gaussian inverse problems. However, the improved accuracy comes at the price of much higher computational complexity, and the method is not as robust as ensemble Kalman inversion in high-dimensional problems. In this work, we add an entropy-inspired regularisation factor to the underlying optimal transport problem that allows the high computational cost to be reduced considerably via Sinkhorn iterations. Further, the robustness of the method is increased via an ensemble Kalman inversion proposal step before each update of the samples, which is also referred to as a hybrid approach. The promising performance of the introduced method is numerically verified by testing it on a steady-state single-phase Darcy flow model with two different permeability configurations. The results are compared to the output of ensemble Kalman inversion, and Markov chain Monte Carlo results are computed as a benchmark.
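
The entropy-regularised transport step mentioned in the abstract can be sketched as follows. This is a minimal illustration, not the authors' implementation: the function name, the squared-Euclidean transport cost, the cost-scaled regularisation strength and the toy dimensions are all assumptions made for the example.

```python
# Illustrative sketch: entropy-regularised optimal transport resampling in the
# spirit of the ensemble transform particle filter, solved with Sinkhorn
# iterations. All names and parameter choices are assumptions for this example.
import numpy as np

def sinkhorn_etpf_transform(X, w, eps=0.1, n_iter=200):
    """X: (N, d) particle ensemble, w: (N,) normalised importance weights."""
    N = X.shape[0]
    # Pairwise squared-Euclidean transport cost between particles.
    C = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    K = np.exp(-C / (eps * np.median(C)))     # Gibbs kernel, cost-scaled regularisation
    a, b = w, np.full(N, 1.0 / N)             # row / column marginals
    u = np.ones(N) / N
    for _ in range(n_iter):                   # Sinkhorn fixed-point updates
        v = b / (K.T @ u)
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]           # approximate optimal coupling
    # Deterministic ETPF-style transform: each new particle is a weighted
    # average of the old ones; the columns of N*P sum (approximately) to 1.
    return (N * P).T @ X

# Toy usage: 100 particles in 50 dimensions with random importance weights.
rng = np.random.default_rng(0)
X = rng.normal(size=(100, 50))
w = rng.random(100); w /= w.sum()
X_new = sinkhorn_etpf_transform(X, w)
```

Larger regularisation makes the Sinkhorn iterations cheaper and more stable at the cost of a more diffuse coupling; the exact (unregularised) transport problem is recovered in the limit of vanishing regularisation.
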


2021 ◽  
Vol 28 (1) ◽  
pp. 23-41
Author(s):  
Sangeetika Ruchi ◽  
Svetlana Dubinkina ◽  
Jana de Wiljes

Abstract. Identification of unknown parameters on the basis of partial and noisy data is a challenging task, in particular in high-dimensional and non-linear settings. Gaussian approximations to the problem, such as ensemble Kalman inversion, tend to be robust and computationally cheap and often produce astonishingly accurate estimates despite the simplifying underlying assumptions. Yet there is a lot of room for improvement, specifically regarding a correct approximation of a non-Gaussian posterior distribution. The tempered ensemble transform particle filter is an adaptive sequential Monte Carlo (SMC) method, whereby resampling is based on an optimal transport mapping. Unlike ensemble Kalman inversion, it does not require any assumptions regarding the posterior distribution and hence has been shown to provide promising results for non-linear non-Gaussian inverse problems. However, the improved accuracy comes at the price of much higher computational complexity, and the method is not as robust as ensemble Kalman inversion in high-dimensional problems. In this work, we add an entropy-inspired regularisation factor to the underlying optimal transport problem that allows the high computational cost to be reduced considerably via Sinkhorn iterations. Further, the robustness of the method is increased via an ensemble Kalman inversion proposal step before each update of the samples, which is also referred to as a hybrid approach. The promising performance of the introduced method is numerically verified by testing it on a steady-state single-phase Darcy flow model with two different permeability configurations. The results are compared to the output of ensemble Kalman inversion, and Markov chain Monte Carlo results are computed as a benchmark.
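
The ensemble Kalman inversion proposal step used in the hybrid scheme can be sketched as a single tempered update with perturbed observations. This is an illustrative sketch only: the forward model, the noise level, the tempering increment h and all names are assumptions, not the authors' code.

```python
# Illustrative sketch (assumptions throughout): one tempered ensemble Kalman
# inversion (EKI) update with perturbed observations, of the kind used as a
# proposal before each particle update in a hybrid scheme.
import numpy as np

def eki_update(U, G, y, R, h=1.0, rng=None):
    """U: (N, p) parameter ensemble, G: forward model mapping R^p -> R^m,
    y: (m,) observations, R: (m, m) noise covariance, h: tempering increment."""
    rng = rng or np.random.default_rng()
    N = U.shape[0]
    Gu = np.array([G(u) for u in U])              # forward-model evaluations
    U_c = U - U.mean(axis=0)                      # centred parameter ensemble
    G_c = Gu - Gu.mean(axis=0)                    # centred output ensemble
    C_ug = U_c.T @ G_c / (N - 1)                  # cross-covariance, shape (p, m)
    C_gg = G_c.T @ G_c / (N - 1)                  # output covariance, shape (m, m)
    # Perturbed observations with noise covariance inflated by 1/h.
    Y = y + rng.multivariate_normal(np.zeros(len(y)), R / h, size=N)
    K = C_ug @ np.linalg.inv(C_gg + R / h)        # Kalman-type gain
    return U + (Y - Gu) @ K.T                     # updated ensemble

# Toy usage: linear forward model with 10 parameters and 5 observations.
rng = np.random.default_rng(1)
A = rng.normal(size=(5, 10))
G = lambda u: A @ u
U = rng.normal(size=(50, 10))
y = A @ np.ones(10)
U_new = eki_update(U, G, y, R=0.1 * np.eye(5), h=0.5, rng=rng)
```
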


2014 ◽  
Vol 46 (1) ◽  
pp. 279-306 ◽  
Author(s):  
Alexandros Beskos ◽  
Dan O. Crisan ◽  
Ajay Jasra ◽  
Nick Whiteley

In this paper we develop a collection of results associated with the analysis of the sequential Monte Carlo (SMC) samplers algorithm, in the context of high-dimensional independent and identically distributed target probabilities. The SMC samplers algorithm can be designed to sample from a single probability distribution, using Monte Carlo to approximate expectations with respect to this law. Given a target density in d dimensions, our results are concerned with d → ∞, while the number of Monte Carlo samples, N, remains fixed. We deduce an explicit bound on the Monte Carlo error for estimates derived using the SMC sampler and the exact asymptotic relative L²-error of the estimate of the normalising constant associated with the target. We also establish marginal propagation of chaos properties of the algorithm. These results are deduced when the cost of the algorithm is O(Nd²).
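
A generic SMC sampler of the kind analysed here targets a single distribution through a tempering sequence, alternating reweighting, resampling and an MCMC move. The sketch below is illustrative only: the i.i.d. product target, the linear temperature schedule and the random-walk step size are assumptions for the example, not the paper's setting.

```python
# Minimal SMC sampler sketch: bridge from a N(0, I_d) reference to an i.i.d.
# product target via tempering, with multinomial resampling and one
# random-walk Metropolis move per temperature. Parameters are illustrative.
import numpy as np

def log_prior(x):                       # N(0, I_d) reference, up to a constant
    return -0.5 * np.sum(x ** 2, axis=-1)

def log_target(x):                      # i.i.d. product target: N(1, 0.5^2) per coordinate
    return -0.5 * np.sum((x - 1.0) ** 2 / 0.25, axis=-1)

def smc_sampler(N=200, d=50, n_temps=21, step=0.1, seed=0):
    rng = np.random.default_rng(seed)
    temps = np.linspace(0.0, 1.0, n_temps)      # tempering schedule
    X = rng.normal(size=(N, d))                 # particles drawn from the reference
    log_Z = 0.0                                 # running log normalising-constant estimate
    for t_prev, t in zip(temps[:-1], temps[1:]):
        # Reweight: incremental weights for pi_t proportional to prior^(1-t) * target^t.
        logw = (t - t_prev) * (log_target(X) - log_prior(X))
        m = logw.max()
        log_Z += m + np.log(np.mean(np.exp(logw - m)))
        w = np.exp(logw - m); w /= w.sum()
        # Resample (multinomial), returning the weights to uniform.
        X = X[rng.choice(N, size=N, p=w)]
        # Move: one random-walk Metropolis step that leaves pi_t invariant.
        prop = X + step * rng.normal(size=X.shape)
        log_acc = ((1 - t) * (log_prior(prop) - log_prior(X))
                   + t * (log_target(prop) - log_target(X)))
        accept = np.log(rng.random(N)) < log_acc
        X[accept] = prop[accept]
    return X, log_Z

# log_Z estimates the log-ratio of normalising constants of the two
# unnormalised densities used above.
X, log_Z = smc_sampler()
```
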


2017 ◽  
Vol 49 (1) ◽  
pp. 24-48 ◽  
Author(s):  
Alexandros Beskos ◽  
Dan Crisan ◽  
Ajay Jasra ◽  
Kengo Kamatani ◽  
Yan Zhou

Abstract We consider the numerical approximation of the filtering problem in high dimensions, that is, when the hidden state lies in ℝ^d with large d. For low-dimensional problems, one of the most popular numerical procedures for consistent inference is the class of approximations termed particle filters or sequential Monte Carlo methods. However, in high dimensions, standard particle filters (e.g. the bootstrap particle filter) can have a cost that is exponential in d for the algorithm to be stable in an appropriate sense. We develop a new particle filter, called the space-time particle filter, for a specific family of state-space models in discrete time. This new class of particle filters provides consistent Monte Carlo estimates for any fixed d, as do standard particle filters. Moreover, when there is a spatial mixing element in the dimension of the state vector, the space-time particle filter will scale much better with d than the standard filter for a class of filtering problems. We illustrate this analytically for a model of a simple independent and identically distributed structure and a model of an L-Markovian structure (L ≥ 1, L independent of d) in the d-dimensional space direction, where we show that the algorithm exhibits certain stability properties as d increases at a cost 𝒪(nNd²), where n is the time parameter and N is the number of Monte Carlo samples, which are fixed and independent of d. Our theoretical results are also supported by numerical simulations on practical models of complex structures. The results suggest that it is indeed possible to tackle some high-dimensional filtering problems using the space-time particle filter that standard particle filters cannot handle.
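
For reference, the standard bootstrap particle filter that serves as the baseline here can be sketched as follows. The linear-Gaussian AR(1) model, dimensions and noise levels are illustrative assumptions; the weight-degeneracy noted in the comments is the cost-versus-dimension issue the abstract refers to.

```python
# Minimal bootstrap particle filter on a toy d-dimensional linear-Gaussian
# state-space model. Model and parameters are assumptions for illustration.
import numpy as np

def bootstrap_pf(y, N=500, d=10, sigma_x=1.0, sigma_y=0.5, phi=0.9, seed=0):
    """y: (n, d) observations of a d-dimensional AR(1) latent state."""
    rng = np.random.default_rng(seed)
    n = y.shape[0]
    X = rng.normal(size=(N, d))                     # initial particle cloud
    log_lik = 0.0                                   # log-likelihood estimate (up to a constant)
    means = np.empty((n, d))
    for k in range(n):
        # Propagate through the state transition (the 'bootstrap' proposal).
        X = phi * X + sigma_x * rng.normal(size=(N, d))
        # Weight by the observation density; as d grows these weights
        # degenerate, which is why the cost must grow with dimension.
        logw = -0.5 * np.sum((y[k] - X) ** 2, axis=1) / sigma_y ** 2
        m = logw.max()
        log_lik += m + np.log(np.mean(np.exp(logw - m)))
        w = np.exp(logw - m); w /= w.sum()
        means[k] = w @ X                            # filtering mean estimate
        X = X[rng.choice(N, size=N, p=w)]           # multinomial resampling
    return means, log_lik

# Toy usage: simulate data from the same model and filter it.
rng = np.random.default_rng(1)
d, n = 10, 50
x = np.zeros(d); Y = np.empty((n, d))
for k in range(n):
    x = 0.9 * x + rng.normal(size=d)
    Y[k] = x + 0.5 * rng.normal(size=d)
means, log_lik = bootstrap_pf(Y)
```
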


Author(s):  
Edward P. Herbst ◽  
Frank Schorfheide

Dynamic stochastic general equilibrium (DSGE) models have become one of the workhorses of modern macroeconomics and are extensively used for academic research as well as forecasting and policy analysis at central banks. This book introduces readers to state-of-the-art computational techniques used in the Bayesian analysis of DSGE models. The book covers Markov chain Monte Carlo techniques for linearized DSGE models, novel sequential Monte Carlo methods that can be used for parameter inference, and the estimation of nonlinear DSGE models based on particle filter approximations of the likelihood function. The theoretical foundations of the algorithms are discussed in depth, and detailed empirical applications and numerical illustrations are provided. The book also gives invaluable advice on how to tailor these algorithms to specific applications and assess the accuracy and reliability of the computations. The book is essential reading for graduate students, academic researchers, and practitioners at policy institutions.

