loopUI-0.1: uncertainty indicators to support needs and practices in 3D geological modelling uncertainty quantification

2022 ◽  
Author(s):  
Guillaume Pirot ◽  
Ranee Joshi ◽  
Jérémie Giraud ◽  
Mark Douglas Lindsay ◽  
Mark Walter Jessell

Abstract. To support the needs of practitioners regarding 3D geological modelling and uncertainty quantification in the field, in particular in the mining industry, we propose a Python package called loopUI-0.1 that provides a set of local and global indicators to measure uncertainty and feature dissimilarities among an ensemble of voxet models. We present the results of a survey of practitioners in the mineral industry about their modelling and uncertainty quantification practices and needs. The survey reveals that practitioners acknowledge the importance of uncertainty quantification even if they do not perform it. Four main factors preventing practitioners from performing uncertainty quantification were identified: lack of data uncertainty quantification, the (computing) time required to generate a single model, poor tracking of assumptions and interpretations, and the relative complexity of uncertainty quantification. The paper reviews these issues and proposes solutions to alleviate them. Elements of an answer to these problems are already provided in the special issue hosting this paper, and more are expected to come.
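As an illustration of the kind of local indicator such a package exposes, the sketch below (not the loopUI-0.1 API; the function name and the synthetic ensemble are assumptions for illustration) computes per-voxel cardinality and Shannon entropy across an ensemble of categorical voxet models:

    import numpy as np

    def local_indicators(ensemble):
        """Per-voxel uncertainty indicators for an ensemble of categorical voxets.

        ensemble : array of shape (n_models, nx, ny, nz) holding integer rock-unit codes.
        Returns per-voxel cardinality (number of distinct units) and Shannon entropy.
        """
        units = np.unique(ensemble)
        # Probability of each unit at each voxel, estimated from the ensemble.
        probs = np.stack([(ensemble == u).mean(axis=0) for u in units])
        cardinality = (probs > 0).sum(axis=0)
        with np.errstate(divide="ignore", invalid="ignore"):
            entropy = -np.sum(np.where(probs > 0, probs * np.log(probs), 0.0), axis=0)
        return cardinality, entropy

    # Example: 20 random 3-unit voxets of size 10x10x10 (purely synthetic).
    rng = np.random.default_rng(0)
    ensemble = rng.integers(0, 3, size=(20, 10, 10, 10))
    card, ent = local_indicators(ensemble)

High-entropy voxels are those on which the ensemble members disagree most.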

2021 ◽  
Vol 11 (14) ◽  
pp. 6499
Author(s):  
Matthias Frankl ◽  
Mathieu Hursin ◽  
Dimitri Rochman ◽  
Alexander Vasiliev ◽  
Hakim Ferroukhi

Presently, a criticality safety evaluation methodology for the final geological disposal of Swiss spent nuclear fuel is under development at the Paul Scherrer Institute in collaboration with the Swiss National Technical Competence Centre in the field of deep geological disposal of radioactive waste. This method in essence pursues a best-estimate-plus-uncertainty approach and includes burnup credit. Burnup credit is applied by means of a computational scheme called BUCSS-R (Burnup Credit System for the Swiss Reactors–Repository case), which is complemented by the quantification of uncertainties from various sources. BUCSS-R consists of depletion, decay and criticality calculations with CASMO5, SERPENT2 and MCNP6, respectively, determining the keff eigenvalues of the disposal canister loaded with the Swiss spent nuclear fuel assemblies. However, the depletion calculation in the first step and the criticality calculation in the third step, in particular, are subject to uncertainties in the nuclear data input. In previous studies, the effects of these nuclear data-related uncertainties on the obtained keff values, stemming from each of the two steps, have been quantified independently. Both contributions to the overall uncertainty in the calculated keff values have, therefore, been treated as fully correlated, leading to an overly conservative estimation of the total uncertainties. This study presents a consistent approach that eliminates the need to assume unrealistically strong correlations in the keff results. The nuclear data uncertainty quantification for both the depletion and criticality calculations is now performed at once, using one and the same set of perturbation factors for uncertainty propagation through the corresponding calculation steps of the evaluation method. The present results reveal the overestimation of nuclear data-related uncertainties by the previous approach, in particular for spent nuclear fuel with a high burnup, and underline the importance of consistent nuclear data uncertainty quantification methods. However, only canister loadings with UO2 fuel assemblies are considered, so the study offers no insight into potentially different trends in nuclear data-related uncertainties for mixed oxide fuel assemblies.
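The consistency argument can be made concrete with a small sketch. The callables below are placeholders standing in for the depletion step (CASMO5/SERPENT2) and the criticality step (MCNP6); the point is only that each sample reuses one perturbed nuclear-data set for both steps, so the correlation between them is carried implicitly rather than assumed:

    import numpy as np

    def propagate_consistently(n_samples, sample_perturbation, run_depletion, run_criticality):
        """Sketch of consistent nuclear-data uncertainty propagation.

        The same set of perturbation factors is applied to both the depletion
        and the criticality calculation of each sample. All three callables
        are hypothetical stand-ins for the actual code interfaces.
        """
        keff = np.empty(n_samples)
        for i in range(n_samples):
            factors = sample_perturbation(seed=i)          # one perturbed nuclear-data set
            inventory = run_depletion(factors)             # depletion step stand-in
            keff[i] = run_criticality(inventory, factors)  # criticality step stand-in
        return keff.mean(), keff.std(ddof=1)               # mean and uncertainty of keff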


Author(s):  
Zhen Hu ◽  
Sankaran Mahadevan ◽  
Xiaoping Du

Limited data on stochastic load processes and system random variables result in uncertainty in the results of time-dependent reliability analysis. An uncertainty quantification (UQ) framework is developed in this paper for time-dependent reliability analysis in the presence of data uncertainty. The Bayesian approach is employed to model the epistemic uncertainty sources in the random variables and stochastic processes. A straightforward formulation of UQ in time-dependent reliability analysis results in a double-loop implementation procedure, which is computationally expensive. This paper proposes an efficient method for the UQ of time-dependent reliability analysis by integrating the fast integration method and the surrogate model method with time-dependent reliability analysis. A surrogate model is first built for the time-instantaneous conditional reliability index as a function of the variables with imprecise parameters. For different realizations of the epistemic uncertainty, the associated time-instantaneous most probable points (MPPs) are then identified using the fast integration method based on the conditional reliability index surrogate, without evaluating the original limit-state function. With the obtained time-instantaneous MPPs, the uncertainty in the time-dependent reliability analysis is quantified. The effectiveness of the proposed method is demonstrated using a mathematical example and an engineering application example.
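A heavily simplified sketch of the single-loop idea follows; surrogate_beta and sample_epistemic are hypothetical callables, and the time-dependent failure probability is approximated crudely from the instantaneous reliability indices rather than with the paper's fast-integration/MPP machinery:

    import numpy as np
    from scipy.stats import norm

    def reliability_uq(surrogate_beta, sample_epistemic, t_grid, n_epistemic=500):
        """Simplified single-loop sketch of UQ for time-dependent reliability.

        surrogate_beta(theta, t) : surrogate of the time-instantaneous conditional
            reliability index given imprecise distribution parameters theta.
        sample_epistemic()       : draws one realization of those parameters.
        Returns samples of an approximate time-dependent failure probability,
        here taken as the largest instantaneous failure probability on t_grid.
        """
        pf = np.empty(n_epistemic)
        for i in range(n_epistemic):
            theta = sample_epistemic()
            beta_t = np.array([surrogate_beta(theta, t) for t in t_grid])
            pf[i] = norm.cdf(-beta_t).max()  # lower bound on Pf(0, T); exact only without outcrossings
        return pf  # its spread reflects the epistemic (data) uncertainty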


2020 ◽  
Author(s):  
Zhouji Liang ◽  
Florian Wellmann

Uncertainty quantification is an important aspect of geological modelling and model interpretation. Recent developments in geological modelling allow us to view the inversion as a problem in Bayesian inference, incorporating the uncertainties in the observations, the forward models and the prior knowledge from geologists. The sampling method Markov chain Monte Carlo (MCMC) is then often applied to solve this inference problem. However, this stochastic modelling approach becomes limited as the number of parameters increases to higher dimensions. To ensure efficient sampling in high-dimensional problems, we take advantage of recent advances in Hessian-based MCMC methods in this work. The Hessian of the negative log posterior with respect to the input parameters is evaluated at the Maximum a Posteriori (MAP) point. A Laplace approximation of the posterior at the MAP is then given by the inverse of the local Hessian. This sampling approach provides a potentially less computationally expensive and more efficient way of performing high-dimensional geological inverse modelling, especially in cases where parameters are highly correlated, a situation that commonly arises in geological modelling.
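A minimal sketch of the Laplace step, assuming a generic negative log posterior and using a finite-difference Hessian purely for illustration (the example posterior is synthetic):

    import numpy as np
    from scipy.optimize import minimize

    def laplace_samples(neg_log_posterior, x0, n_samples=1000, eps=1e-4, seed=0):
        """Laplace approximation of a posterior: a Gaussian centred at the MAP with
        covariance equal to the inverse Hessian of the negative log posterior."""
        res = minimize(neg_log_posterior, x0, method="L-BFGS-B")
        x_map = res.x
        d = x_map.size
        hess = np.zeros((d, d))
        for i in range(d):      # central finite differences of the objective
            for j in range(d):
                e_i, e_j = np.eye(d)[i] * eps, np.eye(d)[j] * eps
                hess[i, j] = (neg_log_posterior(x_map + e_i + e_j)
                              - neg_log_posterior(x_map + e_i - e_j)
                              - neg_log_posterior(x_map - e_i + e_j)
                              + neg_log_posterior(x_map - e_i - e_j)) / (4 * eps**2)
        cov = np.linalg.inv(hess)                    # Laplace covariance at the MAP
        rng = np.random.default_rng(seed)
        return rng.multivariate_normal(x_map, cov, size=n_samples)

    # Example: 2-D Gaussian posterior with correlated parameters.
    neg_log_post = lambda x: 0.5 * (x[0]**2 + 4 * (x[1] - x[0])**2)
    samples = laplace_samples(neg_log_post, x0=np.array([1.0, 1.0]))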


SPE Journal ◽  
2011 ◽  
Vol 16 (02) ◽  
pp. 429-439 ◽  
Author(s):  
Heng Li ◽  
Pallav Sarma ◽  
Dongxiao Zhang

Summary. Reservoir modeling and simulation are subject to significant uncertainty, which usually arises from heterogeneity of the geological formation and deficiency of measured data. Uncertainty quantification thus plays an important role in reservoir simulation. In order to perform accurate uncertainty analysis, a large number of simulations is often required. However, it is usually prohibitive to do so because even a single run of a practical large-scale simulation model may be quite time consuming. Therefore, efficient approaches for uncertainty quantification are a necessity. The experimental-design (ED) method is applied widely in the petroleum industry for assessing uncertainties in reservoir production and economic appraisal. However, a key disadvantage of this approach is that it does not take the full probability-density functions (PDFs) of the input random parameters into account consistently; that is, the full PDFs are not used for sampling and design but only during post-processing, and there is an inherent assumption that the distributions of these parameters are uniform (during sampling), which is rarely the case in reality. In this paper, we propose an approach to deal with arbitrary input probability distributions using the probabilistic-collocation method (PCM). Orthogonal polynomials for arbitrary distributions are first constructed numerically, and PCM is then used for uncertainty propagation. As a result, PCM can be applied efficiently for any arbitrary numerical or analytical distribution of the input parameters. It can be shown that PCM provides optimal convergence rates for linear models, whereas no such guarantees are provided by ED. The approach is also applicable to discrete distributions. PCM and ED are compared on a few synthetic and realistic reservoir models. Different types of PDFs are considered for a number of reservoir parameters. Results indicate that, while the computational effort is greatly reduced compared to Monte Carlo (MC) simulation, PCM is able to accurately quantify the uncertainty of various reservoir performance parameters. Results also reveal that PCM is more robust, more accurate, and more efficient than ED for uncertainty analysis.
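The key ingredient, orthogonal polynomials built numerically for an arbitrary input distribution, can be sketched as follows. This regression-based toy uses a Monte Carlo sample of the input to define the inner product and is not the authors' collocation code; the lognormal input and the test model are illustrative assumptions:

    import numpy as np

    def orthonormal_polys(samples, degree):
        """Numerically build polynomials orthonormal w.r.t. an arbitrary input
        distribution, represented here by a large Monte Carlo sample."""
        V = np.vander(samples, degree + 1, increasing=True)   # monomials at the samples
        Q, R = np.linalg.qr(V / np.sqrt(len(samples)))        # orthonormal under the empirical measure
        return np.linalg.inv(R)                               # column k = monomial coefficients of degree-k polynomial

    def pce_surrogate(model, samples, degree):
        """Least-squares chaos expansion of `model` in the custom polynomial basis."""
        coeffs = orthonormal_polys(samples, degree)
        Psi = np.vander(samples, degree + 1, increasing=True) @ coeffs  # basis at sample points
        c, *_ = np.linalg.lstsq(Psi, model(samples), rcond=None)
        mean, variance = c[0], np.sum(c[1:] ** 2)   # orthonormality gives the moments directly
        return c, mean, variance

    # Example with a skewed (lognormal) input, which uniform-based designs handle poorly.
    rng = np.random.default_rng(1)
    x = rng.lognormal(mean=0.0, sigma=0.5, size=20000)
    c, m, v = pce_surrogate(lambda s: np.exp(-s) + 0.1 * s**2, x, degree=4)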


Solid Earth ◽  
2019 ◽  
Vol 10 (1) ◽  
pp. 193-210 ◽  
Author(s):  
Jeremie Giraud ◽  
Mark Lindsay ◽  
Vitaliy Ogarko ◽  
Mark Jessell ◽  
Roland Martin ◽  
...  

Abstract. We introduce a workflow integrating geological modelling uncertainty information to constrain gravity inversions. We test and apply this approach to the Yerrida Basin (Western Australia), where we focus on prospective greenstone belts beneath sedimentary cover. Geological uncertainty information is extracted from the results of a probabilistic geological modelling process using geological field data and their inferred accuracy as inputs. The uncertainty information is utilized to locally adjust the weights of a minimum-structure, gradient-based regularization function constraining the geophysical inversion. Our results demonstrate that this technique allows the geophysical inversion to update the model preferentially in geologically less certain areas. They also indicate that the inverted models are consistent with both the probabilistic geological model and the geophysical data of the area, reducing interpretation uncertainty. The interpretation of the inverted models reveals that the recovered greenstone belts may be shallower and thinner than previously thought.
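A schematic of the weighting idea (not the inversion code used in the paper): per-cell information entropy from the probabilistic geological model scales down the minimum-structure penalty where the geology is uncertain, so the inversion is freer to update those cells. The operator G, the entropy map, the weighting choice and the 1-D structural gradient are all illustrative assumptions:

    import numpy as np

    def weighted_smoothness_inversion(G, d, m0, entropy, n_iter=200, step=1e-3, alpha=1.0):
        """Toy gradient-descent inversion with a locally weighted minimum-structure term.

        G, d    : linear forward operator and observed gravity data (model flattened).
        entropy : per-cell geological information entropy from the probabilistic model;
                  high entropy -> low regularization weight -> the cell is freer to change.
        """
        w = 1.0 / (1.0 + entropy.ravel())     # one simple choice of uncertainty-based weight
        m = m0.ravel().copy()
        for _ in range(n_iter):
            grad_data = G.T @ (G @ m - d)                # data-misfit gradient
            dm = np.diff(m, prepend=m[:1])               # crude 1-D structural gradient
            grad_reg = 2 * w * dm                        # d/dm_i of w_i * (m_i - m_{i-1})^2
            grad_reg[:-1] -= 2 * (w * dm)[1:]            # contribution from the next cell
            m -= step * (grad_data + alpha * grad_reg)
        return m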


2013 ◽  
Vol 62 (4) ◽  
pp. 637-653 ◽  
Author(s):  
V. Kalantzis ◽  
C. Bekas ◽  
A. Curioni ◽  
E. Gallopoulos

2016 ◽  
Vol 17 (04) ◽  
pp. 1750062 ◽  
Author(s):  
TIEN TUAN DAO ◽  
MARIE-CHRISTINE HO BA THO

Uncertainty quantification in rigid musculoskeletal modeling is essential to analyze the risks related to the simulation outcomes. Data fusion from multiple sources is a potential solution to reduce data uncertainties. The present study aimed to propose a new data fusion rule leading to more consistent and coherent data for uncertainty quantification. Moreover, a new uncertainty representation was developed using an imprecise probability approach. A biggest maximal coherent subsets (BMCS) operator was defined to fuse interval-valued data ranges from multiple sources. A fusion-based probability-box structure was developed to represent the data uncertainty. Case studies were performed for uncertainty propagation through inverse dynamics and static optimization algorithms. Hip joint moment and muscle force estimation were computed under the effect of the uncertainties in thigh mass and muscle properties. The respective p-boxes of these properties were generated. Regarding the uncertainty propagation analysis, correlation coefficients showed a very good value ([Formula: see text]) for the proposed fusion operator compared with classical operators. The muscle force variation of the rectus femoris was computed. Peak-to-peak (i.e., difference between maximal values) rectus femoris forces showed deviations of 55 N and 40 N for the first and second peaks, respectively. The development of the new fusion operator and fusion-based probability box leads to more consistent uncertainty quantification. This allows the estimation of risks associated with the simulation outcomes under input data uncertainties for rigid musculoskeletal modeling and simulation.
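A schematic reading of the fusion step, under the assumption that each source provides a plain interval for the quantity of interest: the largest group of mutually overlapping intervals is intersected, and a probability box is built from the interval bounds. The function names and the example thigh-mass values are hypothetical, not the authors' exact operator:

    import numpy as np

    def bmcs_fuse(intervals):
        """Fuse interval-valued data from several sources by intersecting the
        biggest maximal coherent subset (the largest group of mutually
        overlapping intervals)."""
        intervals = sorted(intervals)            # sweep over lower bounds
        best, current = [], []
        for lo, hi in intervals:
            current = [(l, h) for l, h in current if h >= lo] + [(lo, hi)]
            if len(current) > len(best):
                best = list(current)
        return max(l for l, _ in best), min(h for _, h in best)

    def p_box(intervals, grid):
        """Empirical probability box from interval-valued samples: the upper CDF
        treats each sample at its lower bound, the lower CDF at its upper bound."""
        lows = np.array([l for l, _ in intervals])
        highs = np.array([h for _, h in intervals])
        upper_cdf = np.array([(lows <= x).mean() for x in grid])
        lower_cdf = np.array([(highs <= x).mean() for x in grid])
        return lower_cdf, upper_cdf

    # Example: thigh-mass intervals (kg) from three hypothetical sources.
    fused = bmcs_fuse([(7.5, 8.6), (8.0, 9.1), (8.3, 9.0)])   # -> (8.3, 8.6)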


Author(s):  
Thomas Fisher ◽  
Harry Gibson ◽  
Gholamreza Salimi-Khorshidi ◽  
Abdelaali Hassaine ◽  
Yutong Cai ◽  
...  

Over a billion people live in slums, with poor sanitation, education, property rights and working conditions having a direct impact on current residents and future generations. A key problem in relation to slums is slum mapping. Without delineations of where all slum settlements are, policymakers cannot make informed decisions to benefit those most in need. Satellite images have been used in combination with machine learning models to try to fill the gap in data availability on slum locations. Deep learning has been used on RGB images with some success, but since labelled satellite images of slums are of relatively low quality and the physical/visual manifestation of slums varies significantly within and across countries, it is important to quantify the uncertainty of predictions for reliable application in downstream tasks. Our solution is to train Monte Carlo dropout U-Net models on multispectral 13-band Sentinel-2 images, from which we can calculate pixelwise epistemic (model) and aleatoric (data) uncertainty in our predictions. We trained our model on labelled images of Mumbai and verified our epistemic and aleatoric uncertainty quantification approach using altered models trained on modified datasets. We also used SHAP values to investigate how the different features contribute towards the model's predictions, and this showed that certain short-wave infrared and red-edge image bands are powerful features for determining the locations of slums within images. Having created our model with uncertainty quantification, it can in the future be applied to downstream tasks, and decision-makers will know where predictions have been made with low uncertainty, giving them greater confidence in its deployment.
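The uncertainty decomposition itself is straightforward once the stochastic forward passes exist. The sketch below assumes the per-pass softmax maps have already been produced by a dropout-enabled U-Net kept stochastic at test time (the random example data stands in for those passes); it splits predictive entropy into aleatoric and epistemic parts:

    import numpy as np

    def mc_dropout_uncertainty(prob_maps, eps=1e-12):
        """Decompose predictive uncertainty from T stochastic forward passes.

        prob_maps : array (T, H, W, C) of per-pass softmax probabilities.
        Returns the mean prediction plus pixelwise epistemic (mutual information)
        and aleatoric (expected entropy) maps.
        """
        mean_p = prob_maps.mean(axis=0)                                # (H, W, C)
        total = -(mean_p * np.log(mean_p + eps)).sum(axis=-1)          # predictive entropy
        aleatoric = -(prob_maps * np.log(prob_maps + eps)).sum(axis=-1).mean(axis=0)
        epistemic = total - aleatoric                                  # BALD-style mutual information
        return mean_p, epistemic, aleatoric

    # Example with random "slum / non-slum" probabilities for 20 passes on a 64x64 tile.
    rng = np.random.default_rng(2)
    p1 = rng.uniform(size=(20, 64, 64, 1))
    maps = np.concatenate([p1, 1 - p1], axis=-1)
    mean_p, epi, ale = mc_dropout_uncertainty(maps)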

