Towards Uncertainty Quantification of LES and URANS for the Buoyancy-Driven Mixing Process between Two Miscible Fluids—Differentially Heated Cavity of Aspect Ratio 4

Fluids ◽  
2021 ◽  
Vol 6 (4) ◽  
pp. 161
Author(s):  
Philipp J. Wenig ◽  
Ruiyun Ji ◽  
Stephan Kelm ◽  
Markus Klein

Numerical simulations are subject to uncertainties due to imprecise knowledge of physical properties, model parameters, and initial and boundary conditions. The assessment of these uncertainties is required for some applications. In the field of Computational Fluid Dynamics (CFD), the reliable prediction of hydrogen distribution and pressure build-up in a nuclear reactor containment after a severe reactor accident is a representative application where the assessment of these uncertainties is of essential importance. The initial and boundary conditions that significantly influence the present buoyancy-driven flow are themselves subject to uncertainties. Therefore, the aim is to investigate the propagation of uncertainties from input parameters to the result variables. As a basis for the examination of a representative reactor test containment, the investigations are initially carried out using the Differentially Heated Cavity (DHC) of aspect ratio 4 with Ra = 2×10⁹ as a test case from the literature. This allows for gradual method development towards guidelines for quantifying the uncertainty of natural convection flows in large-scale industrial applications. A dual approach is applied, in which Large Eddy Simulation (LES) serves as the reference for the Unsteady Reynolds-Averaged Navier–Stokes (URANS) computations. A methodology for uncertainty quantification in engineering applications, with a preceding mesh convergence study and sensitivity analysis, is presented. By taking the LES as a reference, the results indicate that URANS is able to predict the underlying mixing process at Ra = 2×10⁹ and the variability of the result variables due to parameter uncertainties.
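The propagation step described above can be sketched with plain Monte Carlo sampling. The `simulate` function below is a hypothetical, cheap stand-in for a URANS or LES run (the real case propagates uncertain boundary conditions through CFD); the wall temperatures and their spreads are illustrative only:

```python
import numpy as np

rng = np.random.default_rng(42)

def simulate(t_hot, t_cold):
    """Hypothetical cheap stand-in for a CFD run: maps the uncertain
    wall temperatures to a scalar result variable (e.g. a mixing metric)."""
    dt = t_hot - t_cold
    return 0.5 * dt + 0.01 * dt ** 2

# Uncertain boundary conditions: wall temperatures with a 1 K spread.
n = 10_000
t_hot = rng.normal(320.0, 1.0, n)
t_cold = rng.normal(300.0, 1.0, n)

# Propagate the input uncertainty to the result variable.
results = simulate(t_hot, t_cold)
mean, std = results.mean(), results.std(ddof=1)
```

The mean and standard deviation of `results` then summarize the variability of the result variable induced by the input uncertainties; in the actual workflow each `simulate` call would be a full simulation, which is why a preceding sensitivity analysis is used to keep the number of uncertain inputs small.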

Author(s):  
Georg A. Mensah ◽  
Luca Magri ◽  
Jonas P. Moeck

Thermoacoustic instabilities are a major threat for modern gas turbines. Frequency-domain stability methods, such as network models and Helmholtz solvers, are common design tools because they are fast compared to compressible flow computations. They result in an eigenvalue problem that is nonlinear with respect to the eigenvalue. Thus, the influence of the relevant parameters on mode stability is only given implicitly. Small changes in some model parameters may have a great impact on stability. The assessment of how parameter uncertainties propagate to system stability is therefore crucial for safe gas turbine operation. This question is addressed by uncertainty quantification. A common strategy for uncertainty quantification in thermoacoustics is risk factor analysis. One general challenge regarding uncertainty quantification is the sheer number of uncertain parameter combinations to be quantified. For instance, uncertain parameters in an annular combustor might be the equivalence ratio, convection times, geometrical parameters, boundary impedances, flame response model parameters, etc. A new and fast way to obtain algebraic parameter models that tackle the implicit nature of the problem is adjoint perturbation theory. This paper aims to further utilize adjoint methods for the quantification of uncertainties. This analytical method avoids the usual random Monte Carlo (MC) simulations, making it particularly attractive for industrial purposes. Using network models and the open-source Helmholtz solver PyHoltz, it is also discussed how to apply the method with standard modeling techniques. The theory is exemplified based on a simple ducted flame and a combustor of the EM2C laboratory, for which experimental data are available.
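The adjoint idea can be illustrated on a toy linear eigenvalue problem; the thermoacoustic problem is nonlinear in the eigenvalue, so this is only the first-order linear sketch, and the matrix `A` and parameter `p` below are invented for illustration:

```python
import numpy as np

# Toy linear eigenvalue problem A(p) v = lambda v with one parameter p.
def A(p):
    return np.array([[2.0 + p, 1.0],
                     [0.5,     3.0]])

p0, dp = 0.1, 1e-6
dA_dp = np.array([[1.0, 0.0],
                  [0.0, 0.0]])   # derivative of A with respect to p

lam, V = np.linalg.eig(A(p0))
k = np.argmax(lam.real)          # track the dominant mode
v = V[:, k]

lamL, W = np.linalg.eig(A(p0).T) # left eigenvectors = eigenvectors of A^T
w = W[:, np.argmin(abs(lamL - lam[k]))]

# Adjoint sensitivity: d(lambda)/dp = w^H (dA/dp) v / (w^H v).
sens = (w.conj() @ dA_dp @ v) / (w.conj() @ v)

# Finite-difference verification.
lam_fd = np.linalg.eigvals(A(p0 + dp))
fd = (lam_fd[np.argmin(abs(lam_fd - lam[k]))] - lam[k]) / dp
```

The attraction for uncertainty quantification is that one adjoint solve yields the sensitivity of the eigenvalue to every parameter at once, so algebraic models of the growth rate can be built without rerunning the solver for each parameter combination.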


2020 ◽  
Author(s):  
Jonas Sukys ◽  
Marco Bacci

SPUX (Scalable Package for Uncertainty Quantification in "X") is a modular framework for Bayesian inference and uncertainty quantification. The SPUX framework aims at harnessing high-performance scientific computing to tackle complex aquatic dynamical systems rich in intrinsic uncertainties, such as ecological ecosystems, hydrological catchments, lake dynamics, subsurface flows, urban floods, etc. The challenging task of quantifying input, output and/or parameter uncertainties in such stochastic models is tackled using Bayesian inference techniques, where numerical sampling and filtering algorithms assimilate prior expert knowledge and available experimental data. The SPUX framework greatly simplifies uncertainty quantification for realistic, computationally costly models and provides an accessible, modular, portable, scalable, interpretable and reproducible scientific workflow. To achieve this, SPUX can be coupled to any serial or parallel model written in any programming language (e.g. Python, R, C/C++, Fortran, Java), can be installed either on a laptop or on a parallel cluster, and has built-in support for automatic reports, including algorithmic and computational performance metrics. I will present key SPUX concepts using a simple random walk example, and showcase recent realistic applications for catchment and lake models. In particular, uncertainties in model parameters, meteorological inputs, and data observation processes are inferred by assimilating available in-situ and remotely sensed datasets.
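The "simple random walk example" mentioned above can be sketched as follows. This is a generic Metropolis sampler written for illustration, not the SPUX API: the unknown step scale of a random walk is inferred from synthetic observations.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "observations": a 1-D random walk with unknown step scale.
true_sigma = 2.0
walk = np.cumsum(rng.normal(0.0, true_sigma, 200))

def log_likelihood(sigma):
    """Increments of the walk are i.i.d. N(0, sigma^2)."""
    if sigma <= 0.0:
        return -np.inf
    inc = np.diff(walk, prepend=0.0)
    return float(np.sum(-0.5 * (inc / sigma) ** 2 - np.log(sigma)))

# Plain Metropolis sampler for sigma (flat prior on sigma > 0).
sigma, logp = 1.0, log_likelihood(1.0)
trace = []
for _ in range(5_000):
    prop = sigma + rng.normal(0.0, 0.2)
    logp_prop = log_likelihood(prop)
    if np.log(rng.uniform()) < logp_prop - logp:
        sigma, logp = prop, logp_prop
    trace.append(sigma)

posterior_mean = float(np.mean(trace[1_000:]))
```

In a framework such as SPUX the likelihood evaluation would be delegated to the (possibly parallel) model, and filtering algorithms would handle stochastic model trajectories; the sampling loop above only conveys the inference concept.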


2021 ◽  
Author(s):  
Bruno V Rego ◽  
Dar Weiss ◽  
Matthew R Bersi ◽  
Jay D Humphrey

Quantitative estimation of local mechanical properties remains critically important in the ongoing effort to elucidate how blood vessels establish, maintain, or lose mechanical homeostasis. Recent advances based on panoramic digital image correlation (pDIC) have made high-fidelity 3D reconstructions of small-animal (e.g., murine) vessels possible when imaged in a variety of quasi-statically loaded configurations. While we have previously developed and validated inverse modeling approaches to translate pDIC-measured surface deformations into biomechanical metrics of interest, our workflow did not heretofore include a methodology to quantify uncertainties associated with local point estimates of mechanical properties. This limitation has compromised our ability to infer biomechanical properties on a subject-specific basis, such as whether stiffness differs significantly between multiple material locations on the same vessel or whether stiffness differs significantly between multiple vessels at a corresponding material location. In the present study, we have integrated a novel uncertainty quantification and propagation pipeline within our inverse modeling approach, relying on empirical and analytic Bayesian techniques. To demonstrate the approach, we present illustrative results for the ascending thoracic aorta from three mouse models, quantifying uncertainties in constitutive model parameters as well as circumferential and axial tangent stiffness. Our extended workflow not only allows parameter uncertainties to be systematically reported, but also facilitates both subject-specific and group-level statistical analyses of the mechanics of the vessel wall.
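The kind of subject-specific comparison that such a pipeline enables can be sketched with synthetic posterior samples; the stiffness values and spreads below are invented for illustration, not mouse data:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical posterior samples of circumferential stiffness (MPa)
# at two material locations on the same vessel.
stiff_a = rng.normal(1.80, 0.10, 5_000)
stiff_b = rng.normal(2.10, 0.12, 5_000)

def credible_interval(samples, level=0.95):
    """Equal-tailed credible interval from posterior samples."""
    lo = np.quantile(samples, (1.0 - level) / 2.0)
    hi = np.quantile(samples, 1.0 - (1.0 - level) / 2.0)
    return lo, hi

ci_a = credible_interval(stiff_a)
ci_b = credible_interval(stiff_b)

# Posterior probability that location B is stiffer than location A.
p_b_stiffer = float(np.mean(stiff_b - stiff_a > 0.0))
```

With per-location uncertainties reported this way, statements such as "stiffness differs significantly between locations" become posterior probabilities rather than bare point-estimate comparisons.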


2020 ◽  
Author(s):  
Lucie Pheulpin ◽  
Vito Bacchi

Hydraulic models are increasingly used to assess flood hazard. However, all numerical models are affected by uncertainties related to model parameters, which can be quantified through Uncertainty Quantification (UQ) and Global Sensitivity Analysis (GSA). In traditional methods of UQ and GSA, the input parameters of the numerical models are considered to be independent, which is actually rarely the case. The objective of this work is to carry out UQ and GSA considering dependent inputs, comparing different methodologies. To our knowledge, there is no such application in the field of 2D hydraulic modelling.

First, the uncertain parameters of the hydraulic model are classified into groups of dependent parameters. It is then necessary to define the copulas that best represent these groups. Finally, UQ and GSA based on copulas are performed. The proposed methodology is applied to a large-scale 2D hydraulic model of the Loire River. However, as the model is computationally expensive, we used a meta-model instead of the initial model. We compared results from the traditional methods of UQ and GSA (i.e. without taking into account the dependencies between inputs) with those from the new copula-based methods. The results show that the dependence between inputs should not always be neglected in UQ and GSA.
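A minimal sketch of the copula step, assuming a Gaussian copula and invented uniform marginals for two dependent roughness coefficients (the study's actual parameter groups and copula families may differ):

```python
import math
import numpy as np

rng = np.random.default_rng(7)

def norm_cdf(x):
    """Standard normal CDF, applied elementwise."""
    return np.array([0.5 * (1.0 + math.erf(v / math.sqrt(2.0))) for v in x])

# Gaussian copula: correlated standard normals -> correlated uniforms.
rho = 0.8
z = rng.multivariate_normal([0.0, 0.0], [[1.0, rho], [rho, 1.0]], size=20_000)
u1, u2 = norm_cdf(z[:, 0]), norm_cdf(z[:, 1])

# Map the uniforms onto assumed marginals, here uniform ranges for two
# hypothetical Strickler roughness coefficients of neighbouring reaches.
k1 = 20.0 + 20.0 * u1   # reach 1 roughness in [20, 40]
k2 = 25.0 + 15.0 * u2   # reach 2 roughness in [25, 40]

sample_corr = float(np.corrcoef(k1, k2)[0, 1])
```

The copula fixes the dependence structure while each marginal keeps its own distribution; samples drawn this way feed the meta-model in place of independently sampled inputs.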


2021 ◽  
Vol 247 ◽  
pp. 20005
Author(s):  
Dan G. Cacuci

This invited presentation summarizes new methodologies developed by the author for performing high-order sensitivity analysis, uncertainty quantification and predictive modeling. The presentation commences by summarizing the newly developed 3rd-Order Adjoint Sensitivity Analysis Methodology (3rd-ASAM) for linear systems, which overcomes the "curse of dimensionality" for sensitivity analysis and uncertainty quantification of a large variety of model responses of interest in reactor physics systems. The use of the exact expressions of the 2nd- and 3rd-order sensitivities computed using the 3rd-ASAM is subsequently illustrated by presenting 3rd-order formulas for the first three cumulants of the response distribution, for quantifying response uncertainties (covariance, skewness) stemming from model parameter uncertainties. The 1st-, 2nd-, and 3rd-order sensitivities, together with the formulas for the first three cumulants of the response distribution, are subsequently used in the newly developed 2nd/3rd-BERRU-PM ("Second/Third-Order Best-Estimated Results with Reduced Uncertainties Predictive Modeling"), which aims at overcoming the curse of dimensionality in predictive modeling. The 2nd/3rd-BERRU-PM uses the maximum entropy principle to eliminate the need for introducing a subjective user-defined "cost functional quantifying the discrepancies between measurements and computations." By utilizing the 1st-, 2nd- and 3rd-order response sensitivities to combine experimental and computational information in the joint phase-space of responses and model parameters, the 2nd/3rd-BERRU-PM generalizes the current data adjustment/assimilation methodologies.
Even though all of the 2nd- and 3rd-order sensitivities are comprised in the mathematical framework of the 2nd/3rd-BERRU-PM formalism, the computations underlying the 2nd/3rd-BERRU-PM require the inversion of a single matrix of dimensions equal to the number of considered responses, thus overcoming the curse of dimensionality that would affect the inversion of Hessian and higher-order matrices in the parameter space.
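To make the cumulant formulas concrete, the standard Taylor-based moment propagation reads, to the orders shown (a sketch using \(S_i = \partial R/\partial\alpha_i\), \(S_{ij} = \partial^2 R/\partial\alpha_i\partial\alpha_j\), parameter covariances \(c_{ij}\), and normally distributed parameters; the 3rd-ASAM expressions extend these with third-order sensitivity terms):

```latex
\begin{aligned}
E[R] &\approx R^{0} + \frac{1}{2}\sum_{i,j} S_{ij}\,c_{ij},\\
\operatorname{var}(R) &\approx \sum_{i,j} S_i S_j\,c_{ij}
  + \frac{1}{2}\sum_{i,j,k,l} S_{ij}S_{kl}\,c_{ik}c_{jl},\\
\mu_3(R) &\approx 3\sum_{i,j,k,l} S_i S_j S_{kl}\,c_{ik}c_{jl} + \dots
\end{aligned}
```

The third cumulant \(\mu_3\) is what quantifies the skewness of the response distribution; without at least second-order sensitivities it vanishes identically for normally distributed parameters.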


2021 ◽  
Author(s):  
Dan Lunt ◽  
...

We present results from an ensemble of eight climate models, each of which has carried out simulations of the early Eocene climatic optimum (EECO, ~50 million years ago). These simulations have been carried out in the framework of DeepMIP (www.deepmip.org), and as such all models have been configured with the same paleogeographic and vegetation boundary conditions. The results indicate that these non-CO2 boundary conditions contribute between 3 and 5 °C to Eocene warmth. Compared to results from previous studies, the DeepMIP simulations show in general a reduced spread of the global mean surface temperature response across the ensemble for a given atmospheric CO2 concentration, and an increased climate sensitivity on average. An energy balance analysis of the model ensemble indicates that global mean warming in the Eocene compared with the preindustrial arises mostly from decreases in emissivity due to the elevated CO2 (and associated water vapour and long-wave cloud feedbacks), whereas the reduction of the meridional temperature gradient in the Eocene is primarily due to emissivity and albedo changes caused by the non-CO2 boundary conditions (i.e. removal of the Antarctic ice sheet and changes in vegetation). Three of the models (CESM, GFDL, and NorESM) show results that are consistent with the proxies in terms of global mean temperature, meridional SST gradient, and CO2, without prescribing changes to model parameters. In addition, many of the models agree well with the first-order spatial patterns in the SST proxies. However, at a more regional scale the models lack skill. In particular, in the southwest Pacific, the modelled anomalies are substantially less than indicated by the proxies; here, modelled continental surface air temperature anomalies are more consistent with surface air temperature proxies, implying a possible inconsistency between marine and terrestrial temperatures in either the proxies or the models in this region. Our aim is that the documentation of the large-scale features and model-data comparison presented herein will pave the way for further studies exploring aspects of the model simulations in more detail, for example the ocean circulation, hydrological cycle, and modes of variability, and will encourage sensitivity studies on aspects such as paleogeography, orbital configuration, and aerosols.
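The emissivity-versus-albedo attribution described above can be caricatured with a zero-dimensional energy balance model; the albedo and effective emissivity values below are illustrative placeholders, not DeepMIP diagnostics:

```python
import numpy as np

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W m^-2 K^-4
S0 = 1361.0        # solar constant, W m^-2

def surface_temp(albedo, emissivity):
    """Zero-dimensional energy balance: emitted flux equals absorbed flux."""
    return (S0 * (1.0 - albedo) / (4.0 * emissivity * SIGMA)) ** 0.25

# Illustrative values: a preindustrial-like state vs a warmer state with
# lower planetary albedo and lower effective emissivity.
t_pi = surface_temp(0.30, 0.61)
t_warm = surface_temp(0.28, 0.57)

# Decompose the warming into albedo-only and emissivity-only contributions.
dT_albedo = surface_temp(0.28, 0.61) - t_pi
dT_emiss = surface_temp(0.30, 0.57) - t_pi
```

The full energy balance analysis works with spatially resolved fluxes and separates feedbacks by process, but the decomposition principle, perturbing one term at a time about a reference state, is the same.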


2020 ◽  
Vol 10 (16) ◽  
pp. 5526
Author(s):  
Chengxin Feng ◽  
Bin Tian ◽  
Xiaochun Lu ◽  
Michael Beer ◽  
Matteo Broggi ◽  
...  

It is important to determine the soil–water characteristic curve (SWCC) for analyzing landslide seepage under varying hydrodynamic conditions. However, the SWCC exhibits high uncertainty due to the variability inherent in soil. To this end, a Bayesian updating framework based on the experimental data was developed to investigate the uncertainty of the SWCC parameters in this study. The objectives of this research were to quantify the uncertainty embedded within the SWCC and determine the critical factors affecting an unsaturated soil landslide under hydrodynamic conditions. For this purpose, a large-scale landslide experiment was conducted, and the monitored water content data were collected. Steady-state seepage analysis was carried out using the finite element method (FEM) to simulate the slope behavior during water level change. In the proposed framework, the parameters of the SWCC model were treated as random variables and parameter uncertainties were evaluated using the Bayesian approach based on the Markov chain Monte Carlo (MCMC) method. Observed data from large-scale landslide experiments were used to calculate the posterior information of SWCC parameters. Then, 95% confidence intervals for the model parameters of the SWCC were derived. The results show that the Bayesian updating method is feasible for the monitoring of data of large-scale landslide model experiments. The establishment of an artificial neural network (ANN) surrogate model in the Bayesian updating process can greatly improve the efficiency of Bayesian model updating.
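The Bayesian updating step can be sketched with a plain Metropolis sampler (the study uses MCMC with an ANN surrogate; here synthetic water content data are generated from assumed van Genuchten parameters, and all numbers are illustrative):

```python
import numpy as np

rng = np.random.default_rng(3)

def vg_theta(psi, a, n, theta_r=0.05, theta_s=0.45):
    """van Genuchten SWCC: volumetric water content vs suction psi (kPa)."""
    m = 1.0 - 1.0 / n
    return theta_r + (theta_s - theta_r) / (1.0 + (a * psi) ** n) ** m

# Synthetic "monitored" water contents (truth: a=0.08, n=1.6) plus noise.
psi_obs = np.array([1.0, 5.0, 10.0, 50.0, 100.0])
theta_obs = vg_theta(psi_obs, 0.08, 1.6) + rng.normal(0.0, 0.005, 5)

def log_post(a, n, noise=0.005):
    if not (0.001 < a < 1.0 and 1.05 < n < 4.0):  # flat prior bounds
        return -np.inf
    r = theta_obs - vg_theta(psi_obs, a, n)
    return float(-0.5 * np.sum((r / noise) ** 2))

# Metropolis chain over the SWCC parameters (a, n).
x = np.array([0.05, 1.4])
lp = log_post(*x)
chain = []
for _ in range(20_000):
    prop = x + rng.normal(0.0, [0.005, 0.05])
    lp_prop = log_post(*prop)
    if np.log(rng.uniform()) < lp_prop - lp:
        x, lp = prop, lp_prop
    chain.append(x.copy())
chain = np.array(chain[5_000:])

# 95% credible intervals from the posterior samples.
ci_a = np.quantile(chain[:, 0], [0.025, 0.975])
ci_n = np.quantile(chain[:, 1], [0.025, 0.975])
```

In the actual framework each likelihood evaluation would require an FEM seepage solve, which is exactly why an ANN surrogate of the forward model pays off inside the MCMC loop.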


Author(s):  
William A. Lane ◽  
Curtis Storlie ◽  
Christopher Montgomery ◽  
Emily M. Ryan

As the effects of climate change continue to rise with increasing carbon dioxide emission rates, it is imperative that we develop efficient methods for carbon capture. This paper outlines the framework used to break down a large, complex carbon capture system into smaller unit problems for model validation and uncertainty quantification. We use this framework to investigate the uncertainty and sensitivity of the hydrodynamics of a bubbling fluidized bed. Using the open-source computational fluid dynamics code MFIX, we simulate a bubbling fluidized bed with an immersed horizontal tube bank. Mesh resolution and statistical steady-state studies are conducted to identify the optimal operating conditions. The preliminary results show good agreement with experimental data from the literature. Employing statistical sampling and analysis techniques, we designed a set of simulations to quantify the sensitivity of the model to model parameters that are difficult to measure, including coefficients of restitution, friction angles, packed bed void fraction, and drag models. Initial sensitivity analysis results indicate that no parameters may be omitted. Further uncertainty quantification analysis is underway to investigate and quantify the effects of model parameters on the simulation results.
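The sampling-and-analysis step can be sketched with Latin hypercube sampling and standardized regression coefficients as a simple sensitivity measure. The response function below is a hypothetical cheap stand-in for an MFIX simulation, and the parameter ranges are invented:

```python
import numpy as np

rng = np.random.default_rng(5)

def latin_hypercube(n, d, rng):
    """Stratified uniform samples in [0,1]^d: one point per stratum per dim."""
    strata = rng.permuted(np.tile(np.arange(n), (d, 1)), axis=1).T
    return (strata + rng.uniform(size=(n, d))) / n

n = 500
u = latin_hypercube(n, 3, rng)

# Scale to illustrative physical ranges.
e_rest = 0.80 + 0.19 * u[:, 0]   # coefficient of restitution
phi = 25.0 + 15.0 * u[:, 1]      # friction angle, degrees
eps0 = 0.38 + 0.06 * u[:, 2]     # packed bed void fraction

# Hypothetical stand-in for the simulated response (e.g. mean bed height).
y = 0.5 * e_rest - 0.01 * phi + 2.0 * eps0 + rng.normal(0.0, 0.01, n)

# Standardized regression coefficients as a simple sensitivity measure.
X = np.column_stack([e_rest, phi, eps0])
Xs = (X - X.mean(0)) / X.std(0)
ys = (y - y.mean()) / y.std()
src, *_ = np.linalg.lstsq(Xs, ys, rcond=None)
```

Each standardized coefficient measures how much of the response variation a parameter explains; when, as in the study, no coefficient is negligible, no parameter may be omitted from the uncertainty quantification.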


2019 ◽  
Author(s):  
Rohitash Chandra ◽  
Danial Azam ◽  
Arpit Kapoor ◽  
R. Dietmar Müller

Abstract. The complex and computationally expensive features of forward landscape and sedimentary basin evolution models pose a major challenge for the development of efficient inference and optimization methods. Bayesian inference provides a methodology for estimation and uncertainty quantification of free model parameters. In our previous work, parallel tempering Bayeslands was developed as a framework for parameter estimation and uncertainty quantification for the landscape and basin evolution modelling software Badlands. Parallel tempering Bayeslands features high-performance computing with dozens of processing cores running in parallel to enhance computational efficiency. Although parallel computing is used, the procedure remains computationally challenging since thousands of samples need to be drawn and evaluated. In large-scale landscape and basin evolution problems, a single model evaluation can take from several minutes to hours, and in certain cases, even days. Surrogate-assisted optimization has been successfully applied to a number of engineering problems, which motivates its use in optimization and inference methods suited for complex models in geology and geophysics. Surrogates can speed up parallel tempering Bayeslands by developing computationally inexpensive surrogates to mimic expensive models. In this paper, we present an application of surrogate-assisted parallel tempering in which the surrogate mimics a landscape evolution model, including erosion, sediment transport and deposition, by estimating the likelihood function given by the model. We employ a machine learning model as a surrogate that learns from the samples generated by the parallel tempering algorithm and the likelihood from the model. The entire framework is developed in a parallel computing infrastructure to take advantage of parallelization.
The results show that the proposed methodology is effective in lowering the overall computational cost significantly while retaining the quality of solutions.
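The surrogate idea can be sketched in a few lines: fit a cheap model to (parameter, log-likelihood) pairs already evaluated, then use it to screen proposals. Here a quadratic polynomial stands in for the machine learning surrogate and a toy function stands in for Badlands:

```python
import numpy as np

rng = np.random.default_rng(11)

def expensive_loglik(theta):
    """Toy stand-in for a Badlands run scored against observed data."""
    return -0.5 * ((theta - 1.3) / 0.4) ** 2

# "History" of samples already evaluated with the expensive model.
theta_hist = rng.uniform(0.0, 3.0, 40)
ll_hist = expensive_loglik(theta_hist)

# Cheap quadratic surrogate of the log-likelihood surface.
surrogate = np.poly1d(np.polyfit(theta_hist, ll_hist, deg=2))

# Use the surrogate to rank candidate proposals; only the most
# promising candidates would be passed to the expensive model.
candidates = rng.uniform(0.0, 3.0, 200)
best = candidates[np.argmax(surrogate(candidates))]
```

In the actual framework the surrogate is trained on the fly from the parallel tempering samples, so the fraction of proposals that require a full model evaluation shrinks as sampling progresses.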


2020 ◽  
Vol 13 (7) ◽  
pp. 2959-2979
Author(s):  
Rohitash Chandra ◽  
Danial Azam ◽  
Arpit Kapoor ◽  
R. Dietmar Müller

Abstract. The complex and computationally expensive nature of landscape evolution models poses significant challenges to the inference and optimization of unknown model parameters. Bayesian inference provides a methodology for estimation and uncertainty quantification of unknown model parameters. In our previous work, we developed parallel tempering Bayeslands as a framework for parameter estimation and uncertainty quantification for the Badlands landscape evolution model. Parallel tempering Bayeslands features high-performance computing that can feature dozens of processing cores running in parallel to enhance computational efficiency. Nevertheless, the procedure remains computationally challenging since thousands of samples need to be drawn and evaluated. In large-scale landscape evolution problems, a single model evaluation can take from several minutes to hours and in some instances, even days or weeks. Surrogate-assisted optimization has been used for several computationally expensive engineering problems which motivate its use in optimization and inference of complex geoscientific models. The use of surrogate models can speed up parallel tempering Bayeslands by developing computationally inexpensive models to mimic expensive ones. In this paper, we apply surrogate-assisted parallel tempering where the surrogate mimics a landscape evolution model by estimating the likelihood function from the model. We employ a neural-network-based surrogate model that learns from the history of samples generated. The entire framework is developed in a parallel computing infrastructure to take advantage of parallelism. The results show that the proposed methodology is effective in lowering the computational cost significantly while retaining the quality of model predictions.
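Parallel tempering itself, the sampler that the surrogate accelerates, can be sketched as follows. The bimodal toy posterior stands in for a multimodal landscape-model likelihood, and the temperature ladder and proposal scales are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(13)

def log_post(theta):
    """Toy bimodal posterior standing in for a multimodal likelihood."""
    return np.logaddexp(-0.5 * ((theta + 2.0) / 0.3) ** 2,
                        -0.5 * ((theta - 2.0) / 0.3) ** 2)

temps = [1.0, 2.0, 4.0, 8.0, 16.0]   # temperature ladder
thetas = [0.0] * len(temps)
lps = [log_post(t) for t in thetas]
cold = []

for _ in range(20_000):
    # Metropolis move within each tempered chain.
    for i, T in enumerate(temps):
        prop = thetas[i] + rng.normal(0.0, 0.5 * T ** 0.5)
        lp = log_post(prop)
        if np.log(rng.uniform()) < (lp - lps[i]) / T:
            thetas[i], lps[i] = prop, lp
    # Propose a swap between a random adjacent pair of replicas.
    j = int(rng.integers(0, len(temps) - 1))
    accept = (1.0 / temps[j] - 1.0 / temps[j + 1]) * (lps[j + 1] - lps[j])
    if np.log(rng.uniform()) < accept:
        thetas[j], thetas[j + 1] = thetas[j + 1], thetas[j]
        lps[j], lps[j + 1] = lps[j + 1], lps[j]
    cold.append(thetas[0])   # retain only the target-temperature chain

cold = np.array(cold[5_000:])
```

The hot chains cross between modes easily and pass their states down the ladder through swaps, which is what lets the cold chain explore all modes; running the chains on separate cores is what makes the framework naturally parallel.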

