Evaluation of Mineralogy per Geological Layers by Approximate Bayesian Computation

SPE Journal ◽  
2020 ◽  
Vol 25 (05) ◽  
pp. 2418-2432
Author(s):  
Vianney Bruned ◽  
Alice Cleynen ◽  
André Mas ◽  
Sylvain Wlodarczyk

Summary: We propose a new three-step methodology to perform an automated mineralogical inversion from wellbore logs. The approach is derived from a Bayesian linear-regression model with no prior knowledge of the mineral composition of the rock. The first step makes use of approximate Bayesian computation (ABC) for each depth sample to evaluate all the possible mineral proportions that are consistent with the measured log responses. The second step gathers these candidates for a given stratum and computes through a density-based clustering algorithm the most probable mineralogical compositions. Finally, for each stratum and for the most probable combinations, a mineralogical inversion is performed with an associated confidence estimate. The advantage of this approach is to explore all possible mineralogy hypotheses that match the wellbore data. This pipeline is tested on both synthetic and real data sets.
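The ABC step of such a pipeline can be sketched as a minimal rejection sampler over mineral proportions. The components, tool responses, and proportions below are invented for illustration; the paper's forward model, priors, and tolerance may differ:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear forward model: each column of R holds the responses
# of one pure component (here quartz, clay, water) on two logging tools
# (bulk density in g/cm^3 and neutron porosity). All numbers are
# illustrative, not taken from the paper.
R = np.array([[2.65, 2.50, 1.00],
              [-0.02, 0.40, 1.00]])
true_props = np.array([0.55, 0.25, 0.20])
observed = R @ true_props  # noise-free "measured" logs for the sketch

def abc_candidates(observed, n_draws=50_000, eps=0.02):
    """Step 1 of the pipeline as rejection ABC: keep every proportion
    vector whose simulated log responses fall within eps of the
    measured ones."""
    props = rng.dirichlet(np.ones(R.shape[1]), size=n_draws)  # flat prior on the simplex
    dist = np.linalg.norm(props @ R.T - observed, axis=1)
    return props[dist < eps]

candidates = abc_candidates(observed)
# The accepted cloud approximates the set of mineral proportions
# consistent with the logs; steps 2 and 3 would then cluster these
# candidates per stratum and invert the most probable combinations.
```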

2013 ◽  
Vol 2013 ◽  
pp. 1-10 ◽  
Author(s):  
Tom Burr ◽  
Alexei Skurikhin

Approximate Bayesian computation (ABC) is an approach for using measurement data to calibrate stochastic computer models, which are common in biology applications. ABC is becoming the "go-to" option when the data and/or parameter dimension is large because it relies on user-chosen summary statistics rather than the full data and is therefore computationally feasible. One technical challenge with ABC is that the quality of the approximation to the posterior distribution of model parameters depends on the user-chosen summary statistics. In this paper, we investigate and illustrate by example the requirement that users choose effective summary statistics in order to accurately estimate the posterior distribution of model parameters, using a model and corresponding real data of mitochondrial DNA population dynamics. We show that some choices of summary statistics closely approximate the posterior distribution, while others do not. A strategy for choosing effective summary statistics is suggested for cases where the stochastic computer model can be run at many trial parameter settings, as in the example.
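The effect of the summary-statistic choice is easy to reproduce on a toy Gaussian model, where the sample mean is sufficient and the sample maximum is a deliberately poor summary. All settings below are illustrative and not taken from the paper's mitochondrial-DNA model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Toy calibration problem: the data are 100 draws from N(theta, 1),
# and we infer theta with rejection ABC under a flat prior.
theta_true = 2.0
data = rng.normal(theta_true, 1.0, size=100)

def abc_posterior(summary, eps, n_draws=100_000):
    """Rejection ABC: draw theta from a flat prior, simulate a data set
    of the same size, and accept theta when the chosen summary statistic
    of the simulation is within eps of the observed one. `summary` must
    accept an `axis` keyword (np.mean and np.max both do)."""
    s_obs = summary(data)
    thetas = rng.uniform(-5, 5, size=n_draws)
    sims = rng.normal(thetas[:, None], 1.0, size=(n_draws, data.size))
    keep = np.abs(summary(sims, axis=1) - s_obs) < eps
    return thetas[keep]

post_mean = abc_posterior(np.mean, eps=0.05)  # sufficient statistic
post_max = abc_posterior(np.max, eps=0.05)    # poor summary statistic
# The mean-based ABC posterior is much tighter around theta_true than
# the max-based one, illustrating the paper's point.
```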


Author(s):  
Cecilia Viscardi ◽  
Michele Boreale ◽  
Fabio Corradi

Abstract: We consider the problem of sample degeneracy in Approximate Bayesian Computation. It arises when proposed values of the parameters, once given as input to the generative model, rarely lead to simulations resembling the observed data and are hence discarded. Such "poor" parameter proposals do not contribute at all to the representation of the parameter's posterior distribution. This leads to a very large number of required simulations and/or a waste of computational resources, as well as to distortions in the computed posterior distribution. To mitigate this problem, we propose an algorithm, referred to as the Large Deviations Weighted Approximate Bayesian Computation algorithm, where, via Sanov's Theorem, strictly positive weights are computed for all proposed parameters, thus avoiding the rejection step altogether. In order to derive a computable asymptotic approximation from Sanov's result, we adopt the information theoretic "method of types" formulation of the method of Large Deviations, thus restricting our attention to models for i.i.d. discrete random variables. Finally, we experimentally evaluate our method through a proof-of-concept implementation.
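The no-rejection structure can be illustrated on a toy Bernoulli model. A simple Gaussian kernel on the data mismatch stands in for the paper's Sanov-based large-deviation weights (which require the method-of-types machinery); only the idea of giving every proposal a strictly positive weight is sketched here:

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy discrete model: n i.i.d. Bernoulli(theta) draws, summarized by the
# success count. Rejection ABC would discard most proposals; here every
# proposal instead receives a strictly positive weight. All settings
# are illustrative.
n, k_obs = 50, 31  # 31 successes observed out of 50 trials

def weighted_abc(n_draws=20_000, scale=0.05):
    thetas = rng.uniform(0.0, 1.0, size=n_draws)      # flat prior
    k_sim = rng.binomial(n, thetas)                   # one simulation per proposal
    mismatch = (k_sim - k_obs) / n                    # empirical-frequency gap
    weights = np.exp(-0.5 * (mismatch / scale) ** 2)  # > 0 for every proposal
    return thetas, weights / weights.sum()

thetas, w = weighted_abc()
posterior_mean = np.sum(w * thetas)  # every proposal contributes, none rejected
```

Because no simulation is wasted, the whole proposal cloud contributes to the posterior estimate, which is the degeneracy problem the paper addresses.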


2021 ◽  
Vol 62 (2) ◽  
Author(s):  
Jason D. Christopher ◽  
Olga A. Doronina ◽  
Dan Petrykowski ◽  
Torrey R. S. Hayden ◽  
Caelan Lapointe ◽  
...  

Entropy ◽  
2021 ◽  
Vol 23 (3) ◽  
pp. 312
Author(s):  
Ilze A. Auzina ◽  
Jakub M. Tomczak

Many real-life processes are black-box problems, i.e., the internal workings are inaccessible or a closed-form mathematical expression of the likelihood function cannot be defined. For continuous random variables, likelihood-free inference problems can be solved via Approximate Bayesian Computation (ABC). However, an optimal alternative for discrete random variables is yet to be formulated. Here, we aim to fill this research gap. We propose an adjusted population-based MCMC ABC method by re-defining the standard ABC parameters to discrete ones and by introducing a novel Markov kernel that is inspired by differential evolution. We first assess the proposed Markov kernel on a likelihood-based inference problem, namely discovering the underlying diseases based on a QMR-DT network, and subsequently the entire method on three likelihood-free inference problems: (i) the QMR-DT network with an unknown likelihood function, (ii) learning a binary neural network, and (iii) neural architecture search. The obtained results indicate the high potential of the proposed framework and the superiority of the new Markov kernel.
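A population-based kernel in this spirit can be sketched on an invented bit-vector recovery problem: the proposal flips the bits where two other population members disagree (a differential-evolution-style move), with a greedy acceptance rule standing in for the full MCMC acceptance of the paper:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy likelihood-free problem with a discrete parameter: recover a hidden
# binary vector when the simulator only reports a discrepancy with the
# observed data. Problem size and tuning are illustrative.
D = 16
x_true = rng.integers(0, 2, size=D)

def discrepancy(x):
    """Black-box discrepancy: number of bits disagreeing with the data."""
    return int(np.sum(x != x_true))

def de_mcmc_abc(pop_size=20, sweeps=300):
    """Population sampler with a differential-evolution-style binary
    kernel (an assumed stand-in for the paper's Markov kernel): flip the
    bits where two other members disagree, plus one random bit to keep
    the kernel irreducible; keep moves that do not increase the
    discrepancy (a greedy simplification of MCMC acceptance)."""
    pop = rng.integers(0, 2, size=(pop_size, D))
    dist = np.array([discrepancy(x) for x in pop])
    for _ in range(sweeps):
        for i in range(pop_size):
            j, k = rng.choice(pop_size, size=2, replace=False)
            prop = pop[i] ^ pop[j] ^ pop[k]  # DE-style difference move
            prop[rng.integers(D)] ^= 1       # random single-bit flip
            d = discrepancy(prop)
            if d <= dist[i]:
                pop[i], dist[i] = prop, d
    return pop, dist

pop, dist = de_mcmc_abc()
# The population drifts toward bit vectors consistent with the data.
```

The difference move is what makes the kernel population-based: as members cluster in good regions, their disagreements shrink, so proposals automatically become more local.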


Author(s):  
Cesar A. Fortes‐Lima ◽  
Romain Laurent ◽  
Valentin Thouzeau ◽  
Bruno Toupance ◽  
Paul Verdu

2014 ◽  
Vol 64 (3) ◽  
pp. 416-431 ◽  
Author(s):  
C. Baudet ◽  
B. Donati ◽  
B. Sinaimeri ◽  
P. Crescenzi ◽  
C. Gautier ◽  
...  
