Improving High-Dimensional Physics Models Through Bayesian Calibration With Uncertain Data

Author(s):  
Natarajan Chennimalai Kumar ◽  
Arun K. Subramaniyan ◽  
Liping Wang

We address the problem of calibrating model parameters in computational models to match uncertain and limited experimental data using a Bayesian framework. We employ a modified version of the Bayesian calibration framework proposed by Kennedy and O’Hagan [15] to calibrate high-dimensional industrial problems. Results for two nonlinear industrial problems with 15 and 100 calibration parameters are presented. The unique advantages of the Bayesian framework are presented, along with a discussion of the challenges in calibrating a large number of parameters with uncertain and limited data.
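
As a minimal sketch of how such a Bayesian calibration might look in code, the snippet below fits two parameters of a toy simulator to noisy data with random-walk Metropolis sampling. The `simulator` function, prior bounds, and noise level are hypothetical placeholders, and the Kennedy–O’Hagan discrepancy term is crudely absorbed into the noise variance rather than modeled as a Gaussian process.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulator(x, theta):
    # Hypothetical cheap stand-in for an expensive physics model.
    return theta[0] * np.sin(x) + theta[1] * x

# Synthetic "experimental" data with noise (placeholder for real measurements).
x_obs = np.linspace(0.0, 3.0, 20)
y_obs = simulator(x_obs, np.array([1.5, 0.7])) + rng.normal(0.0, 0.1, x_obs.size)

def log_posterior(theta, sigma=0.1):
    # Flat prior on a box; Gaussian likelihood with the model-discrepancy
    # term absorbed into the noise variance for simplicity.
    if np.any(theta < -5.0) or np.any(theta > 5.0):
        return -np.inf
    resid = y_obs - simulator(x_obs, theta)
    return -0.5 * np.sum(resid**2) / sigma**2

# Random-walk Metropolis over the calibration parameters.
theta = np.zeros(2)
lp = log_posterior(theta)
samples = []
for _ in range(20000):
    prop = theta + rng.normal(0.0, 0.05, size=2)
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        theta, lp = prop, lp_prop
    samples.append(theta.copy())

posterior = np.array(samples[5000:])   # discard burn-in
print(posterior.mean(axis=0), posterior.std(axis=0))
```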

2013 ◽  
Vol 4 (1) ◽  
pp. 1-30 ◽  
Author(s):  
Peter C. R. Lane ◽  
Fernand Gobet

Abstract Creating robust, reproducible and optimal computational models is a key challenge for theorists in many sciences. Psychology and cognitive science face particular challenges, as large amounts of data are collected and many models are not amenable to analytical techniques for calculating parameter sets. Two problems in particular are to locate the full range of acceptable model parameters for a given dataset, and to confirm the consistency of model parameters across different datasets. Resolving these problems will provide a better understanding of the behaviour of computational models, and so support the development of general and robust models. In this article, we address these problems using evolutionary algorithms to develop parameters for computational models against multiple sets of experimental data; in particular, we propose the ‘speciated non-dominated sorting genetic algorithm’ for evolving models in several theories. We discuss the problem of developing a model of categorisation using twenty-nine sets of data and models drawn from four different theories. We find that the evolutionary algorithms generate high-quality models, adapted to provide a good fit to all available data.
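
A core ingredient of any non-dominated sorting genetic algorithm is the sorting of candidate parameter sets into Pareto fronts; the sketch below shows a naive version over hypothetical two-objective fit errors (e.g., error against two datasets). It illustrates the sorting step only, not the authors' speciated algorithm itself.

```python
import numpy as np

def non_dominated_fronts(costs):
    """Naive non-dominated sorting: costs is (n, m), lower is better.
    Returns a list of fronts, each a list of row indices."""
    remaining = set(range(costs.shape[0]))
    fronts = []
    while remaining:
        front = []
        for i in remaining:
            dominated = any(
                np.all(costs[j] <= costs[i]) and np.any(costs[j] < costs[i])
                for j in remaining if j != i
            )
            if not dominated:
                front.append(i)
        remaining -= set(front)
        fronts.append(front)
    return fronts

# Hypothetical fit errors on two datasets for six candidate parameter sets.
errors = np.array([[0.10, 0.90], [0.20, 0.40], [0.30, 0.30],
                   [0.50, 0.50], [0.90, 0.10], [0.60, 0.80]])
print(non_dominated_fronts(errors))  # first front: mutually non-dominated models
```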


Acta Numerica ◽  
2018 ◽  
Vol 27 ◽  
pp. 353-450 ◽  
Author(s):  
J. Tinsley Oden

The use of computational models and simulations to predict events that take place in our physical universe, or to predict the behaviour of engineered systems, has significantly advanced the pace of scientific discovery and the creation of new technologies for the benefit of humankind over recent decades, at least up to a point. That ‘point’ in recent history occurred around the time that the scientific community began to realize that true predictive science must deal with many formidable obstacles, including the determination of the reliability of the models in the presence of many uncertainties. To develop meaningful predictions one needs relevant data, itself possessing uncertainty due to experimental noise; in addition, one must determine model parameters, and, concomitantly, there is the overriding need to select and validate models given the data and the goals of the simulation. This article provides a broad overview of predictive computational science within the framework of what is often called the science of uncertainty quantification. The exposition is divided into three major parts. In Part 1, philosophical and statistical foundations of predictive science are developed within a Bayesian framework. There the case is made that the Bayesian framework provides, perhaps, a unique setting for handling all of the uncertainties encountered in scientific prediction. In Part 2, general frameworks and procedures for the calibration and validation of mathematical models of physical realities are given, all in a Bayesian setting. But beyond Bayes, an introduction to information theory, the maximum entropy principle, model sensitivity analysis, and sampling methods such as MCMC is presented. In Part 3, the central problem of predictive computational science is addressed: the selection, adaptive control and validation of mathematical and computational models of complex systems. The Occam Plausibility Algorithm, OPAL, is introduced as a framework for model selection, calibration and validation. Applications to complex models of tumour growth are discussed.
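
To make the Bayesian model-selection idea concrete, the sketch below compares two toy one-parameter models by brute-force numerical computation of the evidence (the integral of likelihood times prior) and converts the evidences into posterior model plausibilities. The data, models, and priors are illustrative placeholders, not OPAL itself.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.linspace(0.0, 1.0, 30)
y = 2.0 * x + rng.normal(0.0, 0.1, x.size)       # synthetic data (placeholder)

def log_like(y_model, sigma=0.1):
    r = y - y_model
    return -0.5 * np.sum(r**2) / sigma**2 - y.size * np.log(sigma * np.sqrt(2 * np.pi))

# 1-D quadrature of likelihood * prior over each model's single parameter.
theta = np.linspace(-5.0, 5.0, 2001)
prior = np.full(theta.size, 1.0 / 10.0)          # uniform prior on [-5, 5]

def log_evidence(model):
    ll = np.array([log_like(model(t)) for t in theta])
    m = ll.max()
    f = np.exp(ll - m) * prior
    integral = np.sum(0.5 * (f[1:] + f[:-1]) * np.diff(theta))  # trapezoid rule
    return m + np.log(integral)

le_linear = log_evidence(lambda t: t * x)                 # M1: y = theta * x
le_const = log_evidence(lambda t: np.full(x.size, t))     # M2: y = theta

# Posterior model plausibilities under equal model priors.
w = np.exp(np.array([le_linear, le_const]) - max(le_linear, le_const))
print(w / w.sum())
```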


1992 ◽  
Vol 23 (2) ◽  
pp. 89-104 ◽  
Author(s):  
Ole H. Jacobsen ◽  
Feike J. Leij ◽  
Martinus Th. van Genuchten

Breakthrough curves of Cl⁻ and ³H₂O were obtained during steady unsaturated flow in five lysimeters containing an undisturbed coarse sand (Orthic Haplohumod). The experimental data were analyzed in terms of the classical two-parameter convection-dispersion equation and a four-parameter two-region-type physical nonequilibrium solute transport model. Model parameters were obtained by both curve fitting and time-moment analysis. The four-parameter model provided a much better fit to the data for three soil columns, but performed only slightly better for the two remaining columns. The retardation factor for Cl⁻ was about 10% less than for ³H₂O, indicating some anion exclusion. For the four-parameter model, the average immobile water fraction was 0.14 and the Peclet numbers of the mobile region varied between 50 and 200. Time-moment analysis proved to be a useful tool for quantifying the breakthrough curve (BTC), although the moments were found to be sensitive to experimental scatter in the measured data at larger times. Also, fitted parameters described the experimental data better than moment-generated parameter values.
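
A minimal sketch of the time-moment analysis used here: the zeroth, first, and second central temporal moments of a BTC give the mean breakthrough time and spread, from which a Peclet number can be estimated. The curve below is a synthetic placeholder, and the moment-to-Peclet relation is a large-P approximation assumed for illustration.

```python
import numpy as np

# Synthetic breakthrough curve: effluent concentration vs time (placeholder data).
t = np.linspace(0.0, 10.0, 200)
c = np.exp(-0.5 * (t - 4.0) ** 2 / 0.8)          # bell-shaped pulse response

def trapz(y, x):
    return np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x))

# Normalized temporal moments of the BTC.
m0 = trapz(c, t)
mu1 = trapz(t * c, t) / m0                        # mean breakthrough time
mu2c = trapz((t - mu1) ** 2 * c, t) / m0          # central second moment (spread)

# For the classical CDE, P ~ 2 * mu1**2 / mu2c at large Peclet numbers
# (an approximation assumed for this sketch).
peclet = 2.0 * mu1 ** 2 / mu2c
print(f"mu1 = {mu1:.3f}, var = {mu2c:.3f}, Peclet ~ {peclet:.1f}")
```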


Energies ◽  
2020 ◽  
Vol 13 (17) ◽  
pp. 4290 ◽  
Author(s):  
Dongmei Zhang ◽  
Yuyang Zhang ◽  
Bohou Jiang ◽  
Xinwei Jiang ◽  
Zhijiang Kang

Reservoir history matching is a well-known inverse problem for production prediction, in which numerous uncertain parameters of a reservoir numerical model are optimized by minimizing the misfit between simulated and historical production data. The Gaussian Process (GP) has shown promising performance for assisted history matching, being an efficient nonparametric and nonlinear model with few parameters to be tuned automatically. The recently introduced approach combining Gaussian Process proxy models with Variogram Analysis of Response Surface-based sensitivity analysis (GP-VARS) uses forward and inverse GP proxy models together with VARS-based sensitivity analysis to optimize the high-dimensional reservoir parameters. However, the inverse GP solution (GPIS) in GP-VARS is unsatisfactory, especially when there are numerous reservoir parameters, because the mapping from low-dimensional misfits to high-dimensional uncertain reservoir parameters can be poorly modeled by a GP. To improve the performance of GP-VARS, in this paper we propose Gaussian Process proxy models with Latent Variable Models and VARS-based sensitivity analysis (GPLVM-VARS), in which a Gaussian Process Latent Variable Model (GPLVM)-based inverse solution (GPLVMIS), with the inputs and outputs of GPIS reversed, replaces the GP-based GPIS. The experimental results demonstrate the effectiveness of the proposed GPLVM-VARS in terms of accuracy and complexity. The source code of the proposed GPLVM-VARS is available at https://github.com/XinweiJiang/GPLVM-VARS.
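
As a rough illustration of the forward-proxy idea (not the GP-VARS or GPLVM-VARS code at the repository above), the sketch below trains a scikit-learn GP on a handful of runs of a hypothetical `misfit` function and then uses the cheap proxy to screen candidate parameter sets before committing to full simulations.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

# Hypothetical stand-in for a reservoir simulator: maps uncertain
# parameters to a scalar misfit against historical production data.
def misfit(theta):
    return np.sum((theta - 0.3) ** 2, axis=-1)

# Train a forward GP proxy on a small design of simulator runs.
X = rng.uniform(0.0, 1.0, size=(50, 5))           # 5 uncertain parameters
y = misfit(X)
gp = GaussianProcessRegressor(ConstantKernel() * RBF(length_scale=np.ones(5)),
                              normalize_y=True).fit(X, y)

# Screen many candidate parameter sets with the cheap proxy and keep
# the most promising ones for full simulation (the assisted-matching loop).
cand = rng.uniform(0.0, 1.0, size=(10000, 5))
pred, std = gp.predict(cand, return_std=True)
best = cand[np.argsort(pred)[:10]]
print(best.round(2))
```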


Author(s):  
Afshin Anssari-Benam ◽  
Andrea Bucchi ◽  
Giuseppe Saccomandi

Abstract The application of a newly proposed generalised neo-Hookean strain energy function to the inflation of incompressible rubber-like spherical and cylindrical shells is demonstrated in this paper. The pressure (P) – inflation (λ or v) relationships are derived and presented for four shells: thin- and thick-walled spherical balloons, and thin- and thick-walled cylindrical tubes. Characteristics of the inflation curves predicted by the model for the four considered shells are analysed, and the critical values of the model parameters for exhibiting the limit-point instability are established. The application of the model to extant experimental datasets procured from studies spanning the 19th to the 21st century is demonstrated, showing favourable agreement between the model and the experimental data. The capability of the model to capture the two characteristic instability phenomena in the inflation of rubber-like materials, namely the limit-point and inflation-jump instabilities, is made evident from both the theoretical analysis and the curve-fitting approaches presented in this study. A comparison with the predictions of the Gent model for the considered data is also presented, and it is shown that our model provides improved fits. Given the simplicity of the model, its ability to fit a wide range of experimental data and to capture both limit-point and inflation-jump instabilities, we propose the application of our model to the inflation of rubber-like materials.
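
For intuition on the limit-point instability, the sketch below uses the classical (not the paper's generalised) neo-Hookean result for a thin-walled spherical balloon, P(λ) = 2μ(H/R)(λ⁻¹ − λ⁻⁷), and locates the pressure maximum numerically; the modulus and geometry values are placeholders.

```python
import numpy as np

# Thin-walled spherical balloon, classical neo-Hookean material:
# P(l) = 2*mu*(H/R)*(l**-1 - l**-7), with l the circumferential stretch.
mu, H, R = 1.0, 0.01, 1.0          # hypothetical shear modulus and geometry

def pressure(l):
    return 2.0 * mu * (H / R) * (l ** -1 - l ** -7)

lam = np.linspace(1.0, 4.0, 4000)
P = pressure(lam)

# Limit-point instability: the first interior maximum of the P(l) curve.
i = np.argmax(P)
print(f"limit point at stretch ~ {lam[i]:.3f}, pressure ~ {P[i]:.5f}")
# The analytical limit point for this model is l = 7**(1/6) ~ 1.383.
```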


1978 ◽  
Vol 100 (1) ◽  
pp. 20-24 ◽  
Author(s):  
R. H. Rand

A one-dimensional, steady-state, constant-temperature model of diffusion and absorption of CO₂ in the intercellular air spaces of a leaf is presented. The model includes two geometrically distinct regions of the leaf interior, corresponding to palisade and spongy mesophyll tissue, respectively. Sun, shade, and intermediate light leaves are modeled by varying the thicknesses of these two regions. Values of the geometric model parameters are obtained by comparing geometric properties of the model with experimental data of other investigators found from dissection of real leaves. The model provides a quantitative estimate of the extent to which the concentration of gaseous CO₂ varies locally within the leaf interior.
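
A minimal finite-difference sketch of the kind of balance such a model solves: steady one-dimensional diffusion with first-order CO₂ absorption, D c'' = k c, on a single homogeneous region rather than the paper's distinct palisade and spongy regions; all coefficients below are illustrative placeholders.

```python
import numpy as np

# Steady 1-D diffusion with first-order absorption: D * c'' = k * c,
# c(0) = c0 at the stomatal side, zero flux c'(L) = 0 at the far wall.
# Single homogeneous region; coefficients are illustrative placeholders.
D, k, L, c0, n = 1.5e-5, 50.0, 3.0e-4, 1.0, 101   # m^2/s, 1/s, m, -, grid points
x = np.linspace(0.0, L, n)
h = x[1] - x[0]

A = np.zeros((n, n))
b = np.zeros(n)
A[0, 0] = 1.0; b[0] = c0                      # Dirichlet: c(0) = c0
for i in range(1, n - 1):                     # interior nodes: D c'' - k c = 0
    A[i, i - 1] = D / h**2
    A[i, i] = -2.0 * D / h**2 - k
    A[i, i + 1] = D / h**2
A[-1, -1] = 1.0; A[-1, -2] = -1.0             # Neumann: c'(L) = 0

c = np.linalg.solve(A, b)
print(f"relative CO2 concentration at far wall: {c[-1]:.4f}")
```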


Author(s):  
Jean Brunette ◽  
Rosaire Mongrain ◽  
Adrian Ranga ◽  
...  

Myocardial infarction, also known as a heart attack, is the single leading cause of death in North America. It results from the rupture of an atherosclerotic plaque, which occurs in response to both mechanical stress and inflammatory processes. In order to validate computational models of atherosclerotic coronary arteries, a novel technique has been developed for molding realistic compliant phantoms featuring injection-molded inclusions and multiple layers. These transparent phantoms allow for particle image velocimetry (PIV) flow analysis and can supply experimental data to validate computational fluid dynamics algorithms and hypotheses.


Author(s):  
Feng Zhou ◽  
Jianxin (Roger) Jiao

Traditional user experience (UX) models are mostly qualitative in their measurement and structure. This paper proposes a quantitative UX model based on cumulative prospect theory, taking a decision-making perspective on the choice between two alternative design profiles. Although affective elements are well known to influence human decision making, the prevailing computational models for analyzing and simulating human perception of UX are mainly cognition-based. In order to incorporate both affective and cognitive factors in the decision-making process, we manipulate the parameters of the cumulative prospect model to capture affective influence. Specifically, three different affective states are induced to shape the model parameters. A hierarchical Bayesian model, estimated with Markov chain Monte Carlo sampling, is used to fit the parameters. A case study of aircraft cabin interior design illustrates the proposed methodology.
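
For reference, the sketch below evaluates the standard Tversky–Kahneman value and probability-weighting functions and scores a simple two-outcome gamble in a simplified (non-cumulative) way; the parameter values are illustrative stand-ins for the affect-dependent parameters the paper estimates with hierarchical Bayes and MCMC.

```python
import numpy as np

# Tversky-Kahneman functional forms with their commonly cited parameter
# estimates; in the paper, parameters vary per induced affective state.
alpha, beta, lam, gamma = 0.88, 0.88, 2.25, 0.61

def value(x):
    # S-shaped value function: concave for gains, loss-averse for losses.
    x = np.asarray(x, dtype=float)
    v = np.empty_like(x)
    pos = x >= 0
    v[pos] = x[pos] ** alpha
    v[~pos] = -lam * (-x[~pos]) ** beta
    return v

def weight(p):
    # Inverse-S probability weighting: overweights small probabilities.
    return p ** gamma / (p ** gamma + (1 - p) ** gamma) ** (1 / gamma)

# Score a two-outcome design gamble (simplified, non-cumulative weighting).
outcomes = np.array([10.0, -5.0])
probs = np.array([0.3, 0.7])
cpt_score = np.sum(weight(probs) * value(outcomes))
print(f"prospect score: {cpt_score:.3f}")
```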


2017 ◽  
Vol 14 (18) ◽  
pp. 4295-4314 ◽  
Author(s):  
Dan Lu ◽  
Daniel Ricciuto ◽  
Anthony Walker ◽  
Cosmin Safta ◽  
William Munger

Abstract. Calibration of terrestrial ecosystem models is important but challenging. Bayesian inference implemented via Markov chain Monte Carlo (MCMC) sampling provides a comprehensive framework to estimate model parameters and associated uncertainties from their posterior distributions. The effectiveness and efficiency of the method strongly depend on the MCMC algorithm used. In this work, a differential evolution adaptive Metropolis (DREAM) algorithm is used to estimate posterior distributions of 21 parameters of the data assimilation linked ecosystem carbon (DALEC) model using 14 years of daily net ecosystem exchange data collected at the Harvard Forest Environmental Measurement Site eddy-flux tower. Calibration with DREAM results in a better model fit and predictive performance than the popular adaptive Metropolis (AM) scheme. Moreover, DREAM indicates that two parameters controlling autumn phenology have multiple modes in their posterior distributions, while AM identifies only one mode. The application suggests that DREAM is well suited to calibrating complex terrestrial ecosystem models, where the number of uncertain parameters is usually large and the existence of local optima is always a concern. In addition, this effort tests the assumptions of the error model used in Bayesian calibration through residual analysis. The results indicate that a heteroscedastic, correlated Gaussian error model is appropriate for the problem, and that the likelihood function constructed from it can alleviate the underestimation of parameter uncertainty that typically results from using uncorrelated error models.
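
The core move that distinguishes DREAM-family samplers from adaptive Metropolis is the differential-evolution proposal, sketched below on a placeholder 21-dimensional Gaussian posterior. This is plain DE-MC, not the full DREAM algorithm with its subspace crossover and outlier-chain handling.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_post(theta):
    # Placeholder posterior: standard Gaussian in d dimensions.
    return -0.5 * np.sum(theta ** 2)

# Differential-evolution MCMC: each chain proposes a jump along the
# difference of two other randomly chosen chains.
d, n_chains, n_iter = 21, 10, 2000
chains = rng.normal(0.0, 1.0, size=(n_chains, d))
logp = np.array([log_post(c) for c in chains])
step = 2.38 / np.sqrt(2 * d)                     # standard DE-MC step size

for _ in range(n_iter):
    for i in range(n_chains):
        a, b = rng.choice([j for j in range(n_chains) if j != i], 2, replace=False)
        prop = chains[i] + step * (chains[a] - chains[b]) \
               + rng.normal(0.0, 1e-6, d)        # small jitter for ergodicity
        lp = log_post(prop)
        if np.log(rng.uniform()) < lp - logp[i]:
            chains[i], logp[i] = prop, lp

print(chains.mean(axis=0).round(2))              # should be near zero
```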

