Low-Rank Independence Samplers in Hierarchical Bayesian Inverse Problems

2018 ◽ Vol 6 (3) ◽ pp. 1076-1100
Author(s): D. Andrew Brown, Arvind Saibaba, Sarah Vallélian
2021 ◽ Vol 47 (2) ◽ pp. 1-34
Author(s): Umberto Villa, Noemi Petra, Omar Ghattas

We present an extensible software framework, hIPPYlib, for the solution of large-scale deterministic and Bayesian inverse problems governed by partial differential equations (PDEs) with (possibly) infinite-dimensional parameter fields (which are high-dimensional after discretization). hIPPYlib overcomes the prohibitively expensive nature of Bayesian inversion for this class of problems by implementing state-of-the-art scalable algorithms for PDE-based inverse problems that exploit the structure of the underlying operators, notably the Hessian of the log-posterior. The key property of the algorithms implemented in hIPPYlib is that the solution of the inverse problem is computed at a cost, measured in linearized forward PDE solves, that is independent of the parameter dimension. The mean of the posterior is approximated by the maximum a posteriori (MAP) point, which is found by minimizing the negative log-posterior with an inexact matrix-free Newton-CG method. The posterior covariance is approximated by the inverse of the Hessian of the negative log-posterior evaluated at the MAP point; its construction is made tractable by invoking a low-rank approximation of the Hessian of the log-likelihood. Scalable tools for sample generation are also discussed. hIPPYlib makes all of these advanced algorithms easily accessible to domain scientists and provides an environment that expedites the development of new algorithms.
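The low-rank posterior-covariance construction described in this abstract can be illustrated with a small NumPy sketch. This is not hIPPYlib's API; the identity prior, the synthetic rank-r data-misfit Hessian, and all variable names are illustrative assumptions. With prior covariance equal to the identity, the eigenpairs of the data-misfit Hessian give the posterior covariance exactly through a rank-r Sherman-Morrison-Woodbury update, which is the structure the low-rank approximation exploits.

```python
import numpy as np

rng = np.random.default_rng(0)
n, r = 50, 5  # parameter dimension, numerical rank of the data misfit

# Synthetic rank-deficient data-misfit Hessian H_misfit = J^T J,
# with a Jacobian J of rank r (the data inform only r directions).
J = rng.standard_normal((r, n))
H_misfit = J.T @ J

# Dense reference: for a linear-Gaussian problem with identity prior,
# Gamma_post = (H_misfit + I)^{-1}.
post_dense = np.linalg.inv(H_misfit + np.eye(n))

# Low-rank route: keep only the r nonzero eigenpairs of H_misfit.
lam, V = np.linalg.eigh(H_misfit)
order = np.argsort(lam)[::-1][:r]
lam, V = lam[order], V[:, order]

# Sherman-Morrison-Woodbury: Gamma_post = I - V diag(lam/(1+lam)) V^T,
# i.e. the prior covariance minus a rank-r correction.
post_lowrank = np.eye(n) - V @ np.diag(lam / (1.0 + lam)) @ V.T

print(np.allclose(post_dense, post_lowrank))
```

Because only the r data-informed eigenpairs are needed, the update costs O(n r^2) storage and flops instead of the O(n^3) of the dense inverse, which is what makes the construction tractable when n is the dimension of a discretized parameter field.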


2019 ◽ Vol 7 (3) ◽ pp. 1105-1131
Author(s): Arvind K. Saibaba, Johnathan Bardsley, D. Andrew Brown, Alen Alexanderian

Author(s): Yuming Ba, Jana de Wiljes, Dean S. Oliver, Sebastian Reich

Minimization of a stochastic cost function is commonly used for approximate sampling in high-dimensional Bayesian inverse problems with Gaussian prior distributions and multimodal posterior distributions. The density of the samples generated by minimization is not the desired target density, unless the observation operator is linear, but the distribution of samples is useful as a proposal density for importance sampling or for Markov chain Monte Carlo methods. In this paper, we focus on applications to sampling from multimodal posterior distributions in high dimensions. We first show that sampling from multimodal distributions is improved by computing all critical points instead of only minimizers of the objective function. For applications to high-dimensional geoscience inverse problems, we demonstrate an efficient approximate weighting that uses a low-rank Gauss-Newton approximation of the determinant of the Jacobian. The method is applied to two toy problems with known posterior distributions and a Darcy flow problem with multiple modes in the posterior.
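The low-rank Gauss-Newton determinant approximation mentioned in this abstract can be sketched in a few lines of NumPy. This is a hedged illustration, not the paper's code; the synthetic Jacobian with fast singular-value decay and the retained rank k are assumptions. The idea is that log det(I + J^T J) = Σ_i log(1 + s_i²) over the singular values s_i of J, so when the spectrum decays quickly the k leading singular values capture the log-determinant needed for the importance weights.

```python
import numpy as np

rng = np.random.default_rng(1)
m, n, k = 200, 100, 10  # observations, parameters, retained rank

# Synthetic Jacobian with rapidly decaying singular values s_i = 10^{-i},
# mimicking the smoothing observation operators of geoscience problems.
U, _ = np.linalg.qr(rng.standard_normal((m, n)))
W, _ = np.linalg.qr(rng.standard_normal((n, n)))
s = 10.0 ** (-np.arange(n, dtype=float))
J = U @ np.diag(s) @ W.T

# Exact log-determinant of the Gauss-Newton matrix I + J^T J.
exact = np.linalg.slogdet(np.eye(n) + J.T @ J)[1]

# Low-rank approximation: log det(I + J^T J) = sum_i log(1 + s_i^2),
# truncated to the k largest singular values of J.
sv = np.linalg.svd(J, compute_uv=False)
approx = np.sum(np.log1p(sv[:k] ** 2))

print(abs(exact - approx))
```

In practice the leading singular triplets would be obtained matrix-free (e.g. by a randomized or Lanczos method) rather than from a dense SVD, so each importance weight costs only O(k) Jacobian-vector products instead of a full determinant evaluation.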


2020 ◽ Vol 42 (6) ◽ pp. A3761-A3784
Author(s): Daniela Calvetti, Monica Pragliola, Erkki Somersalo

2021 ◽ Vol 427 ◽ pp. 110055
Author(s): Aaron Myers, Alexandre H. Thiéry, Kainan Wang, Tan Bui-Thanh
