Bifidelity Gradient-Based Approach for Nonlinear Well-Logging Inverse Problems

Author(s):  
Han Lu ◽  
Qiuyang Shen ◽  
Jiefu Chen ◽  
Xuqing Wu ◽  
Xin Fu ◽  
...  
2021 ◽  
Author(s):  
Lars Gebraad ◽  
Sölvi Thrastarson ◽  
Andrea Zunino ◽  
Andreas Fichtner

Uncertainty quantification is an essential part of many studies in Earth science. It allows us, for example, to assess the quality of tomographic reconstructions, quantitatively test hypotheses, and make physics-based risk assessments. In recent years there has been a surge in applications of uncertainty quantification to seismological inverse problems, mainly due to increasing computational power and the 'discovery' of optimal use cases for many algorithms (e.g., gradient-based Markov chain Monte Carlo, MCMC). Performing Bayesian inference with these methods allows seismologists to carry out advanced uncertainty quantification. However, Bayesian inference often remains prohibitively expensive due to large parameter spaces and computationally expensive physics.

Simultaneously, machine learning has found its way into parameter estimation in the geosciences. Recent works show that machine learning allows one both to accelerate repetitive inferences [e.g. Shahraeeni & Curtis 2011, Cao et al. 2020] and to speed up single-instance Monte Carlo algorithms using surrogate networks [Aleardi 2020]. These advances allow seismologists to use machine learning as a tool to bring accurate inference on the subsurface to scale.

In this work, we propose the novel inclusion of adjoint modelling in machine-learning-accelerated inverse problems. The aforementioned references train machine learning models on observations of the misfit function, with the aim of creating surrogate but accelerated models for the misfit computations, which in turn allows this function and its gradients to be evaluated much faster. This approach ignores the fact that many physical models have an adjoint state, which allows gradients to be computed with only one additional simulation.

Including this information in gradient-based sampling creates performance gains both in training the surrogate and in sampling the true posterior. We show how machine learning models that approximate misfits and gradients, trained specifically using adjoint methods, accelerate various types of inversions and bring Bayesian inference to scale. In practice, the proposed method allows us to reuse information from previous MCMC samples in the proposal step of the algorithm.

The proposed machinery applies to settings where models are run extensively and repeatedly. Markov chain Monte Carlo algorithms, which may require millions of evaluations of the forward modelling equations, can be accelerated by off-loading these simulations to neural networks. The approach is also promising for tomographic monitoring, where experiments are repeatedly performed. Lastly, the efficiently trained neural networks can be used to learn a likelihood for a given dataset, to which different priors can subsequently be applied. We show examples of all these use cases.

Lars Gebraad, Christian Boehm and Andreas Fichtner, 2020: Bayesian Elastic Full-Waveform Inversion Using Hamiltonian Monte Carlo.

Ruikun Cao, Stephanie Earp, Sjoerd A. L. de Ridder, Andrew Curtis, and Erica Galetti, 2020: Near-real-time near-surface 3D seismic velocity and uncertainty models by wavefield gradiometry and neural network inversion of ambient seismic noise.

Mohammad S. Shahraeeni and Andrew Curtis, 2011: Fast probabilistic nonlinear petrophysical inversion.

Mattia Aleardi, 2020: Combining discrete cosine transform and convolutional neural networks to speed up the Hamiltonian Monte Carlo inversion of pre-stack seismic data.
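As a purely illustrative complement to the abstract above (not the authors' implementation), the sketch below trains a small neural-network surrogate of the misfit on models, misfit values, and adjoint-derived gradients, using a loss that penalizes both value and gradient mismatch. The forward and adjoint stand-ins (toy_misfit, toy_adjoint_gradient), the network architecture, and all sizes are hypothetical placeholders.

```python
# Hypothetical sketch: training a misfit surrogate on (model, misfit, adjoint gradient)
# triples. All function names, sizes, and hyperparameters are illustrative assumptions.
import torch

n_params = 16  # toy model-space dimension


def toy_misfit(m):
    # stand-in for an expensive forward simulation plus data misfit
    return 0.5 * (m ** 2).sum(dim=-1)


def toy_adjoint_gradient(m):
    # stand-in for the gradient obtained from one adjoint simulation
    return m.clone()


# simple fully connected surrogate of the misfit
surrogate = torch.nn.Sequential(
    torch.nn.Linear(n_params, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 64), torch.nn.Tanh(),
    torch.nn.Linear(64, 1),
)
opt = torch.optim.Adam(surrogate.parameters(), lr=1e-3)

# training set: e.g. models visited by earlier MCMC samples
models = torch.randn(512, n_params)
misfits = toy_misfit(models).unsqueeze(-1)
gradients = toy_adjoint_gradient(models)

for epoch in range(200):
    opt.zero_grad()
    m = models.clone().requires_grad_(True)
    pred = surrogate(m)
    # gradient of the surrogate w.r.t. its input, matched against the adjoint gradient
    pred_grad, = torch.autograd.grad(pred.sum(), m, create_graph=True)
    loss = (torch.mean((pred - misfits) ** 2)
            + torch.mean((pred_grad - gradients) ** 2))
    loss.backward()
    opt.step()
```

Once trained, the surrogate and its input gradients (obtained by automatic differentiation) could stand in for the expensive misfit and adjoint evaluations inside a gradient-based sampler.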


2020 ◽  
Author(s):  
Lars Gebraad ◽  
Andrea Zunino ◽  
Andreas Fichtner ◽  
Klaus Mosegaard

We present a framework to solve geophysical inverse problems using the Hamiltonian Monte Carlo (HMC) method, with a focus on Bayesian tomography. Recent work in the geophysical community has shown the potential of gradient-based Monte Carlo sampling for a wide range of inverse problems across several fields.

Many high-dimensional (non-linear) problems in geophysics have readily accessible gradient information which is unused in classical probabilistic inversions. HMC improves on traditional Monte Carlo sampling and increases the scalability of inference, giving access to uncertainty quantification for problems with many free parameters (more than 10,000). The result of HMC sampling is a collection of models representing the posterior probability density function, from which not only "best" models can be inferred, but also uncertainties and potentially different plausible scenarios, all compatible with the observed data. However, the number of tuning parameters required by HMC, as well as the complexity of existing statistical modelling software, has kept the geophysical community from widely adopting a specific tool for performing efficient large-scale Bayesian inference.

This work takes a step towards filling that gap by providing an HMC sampler tailored to geophysical inverse problems (e.g. by supplying relevant priors and visualizations), combined with a set of different forward models ranging from elastic and acoustic wave propagation to magnetic anomaly modelling and travel times. The framework is written in the didactic yet performant languages Julia and Python, and users can plug in their own forward models, which are linked to the sampler routines through well-defined interfaces. In this way, we hope to illustrate the usefulness and potential of HMC in Bayesian inference. Tutorials featuring an array of physical experiments are written with the aim of showcasing both Bayesian inference and successful HMC usage. The framework additionally includes examples of how to speed up HMC, e.g. with automated tuning techniques and GPU computations.
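For readers unfamiliar with the algorithm itself, the following minimal sketch (not the framework described above) shows the standard HMC building block: a leapfrog trajectory followed by a Metropolis accept/reject step, applied to a toy Gaussian target. In a geophysical application, neg_log_post and its gradient would come from a forward solver and its adjoint; all names and tuning values here are assumptions.

```python
# Minimal, generic HMC step on a toy Gaussian posterior (NumPy only).
import numpy as np

rng = np.random.default_rng(0)


def neg_log_post(m):
    # toy target: standard Gaussian in n dimensions
    return 0.5 * np.dot(m, m)


def grad_neg_log_post(m):
    # gradient of the toy target; in practice supplied by an adjoint solver
    return m


def hmc_step(m, step_size=0.1, n_leapfrog=20):
    p = rng.standard_normal(m.size)  # sample auxiliary momentum
    m_new, p_new = m.copy(), p.copy()
    # leapfrog integration of Hamiltonian dynamics
    p_new -= 0.5 * step_size * grad_neg_log_post(m_new)
    for _ in range(n_leapfrog - 1):
        m_new += step_size * p_new
        p_new -= step_size * grad_neg_log_post(m_new)
    m_new += step_size * p_new
    p_new -= 0.5 * step_size * grad_neg_log_post(m_new)
    # Metropolis accept/reject on the total Hamiltonian
    h_old = neg_log_post(m) + 0.5 * np.dot(p, p)
    h_new = neg_log_post(m_new) + 0.5 * np.dot(p_new, p_new)
    return m_new if rng.random() < np.exp(h_old - h_new) else m


samples = []
m = np.zeros(10)
for _ in range(5000):
    m = hmc_step(m)
    samples.append(m.copy())
```

The step size and number of leapfrog steps are exactly the tuning parameters that the abstract notes can hamper adoption; automated tuning strategies address this.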


2017 ◽  
Vol 40 (1) ◽  
pp. 27-50 ◽  
Author(s):  
P. Landkamer ◽  
B. Söhngen ◽  
P. Steinmann ◽  
K. Willner

2017 ◽  
Vol 8 (2) ◽  
pp. 219-239 ◽  
Author(s):  
Sergey Voronin ◽  
Christophe Zaroli ◽  
Naresh P. Cuntoor
