DATA ASSIMILATION APPLIED TO PRESSURISED WATER REACTORS

2021
Vol 247
pp. 09020
Author(s):
J.R. Dixon
B.A. Lindley
T. Taylor
G.T. Parks

Best estimate plus uncertainty is the leading methodology used to validate existing safety margins. It remains a challenge to develop and license these approaches, in part due to the high dimensionality of system codes. Uncertainty quantification is an active area of research into appropriate methods for propagating uncertainties, offering greater scientific reasoning, dimensionality reduction and reduced reliance on expert judgement. Inverse uncertainty quantification is required to infer a best estimate back on the input parameters and to reduce their uncertainties, but it is challenging to capture the full covariance and sensitivity matrices. Bayesian inverse strategies remain attractive due to their predictive modelling and uncertainty-reduction capabilities, leading to dramatic model improvements and validation of experiments. This paper uses state-of-the-art data assimilation techniques to obtain a best estimate of parameters critical to plant safety. Data assimilation can combine computational, benchmark and experimental measurements, propagate sparse covariance and sensitivity matrices, treat non-linear applications and accommodate discrepancies. The methodology is further demonstrated through application to hot zero power tests in a pressurised water reactor (PWR), performed using the BEAVRS benchmark with Latin hypercube sampling of reactor parameters to determine responses. WIMS 11 (dv23) and PANTHER (V.5:6:4) are used as the coupled neutronics and thermal-hydraulics codes; both are used extensively to model PWRs. Results demonstrate updated best estimate parameters and reduced uncertainties, with comparisons between posterior distributions generated using the maximum entropy principle and cost-functional minimisation techniques, as illustrated in recent conferences. Future work will improve the Bayesian inverse framework through the introduction of higher-order sensitivities.
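
As a hedged illustration of the kind of update such a data-assimilation step performs (a generic linearised Bayesian adjustment in generalised-least-squares form, not the paper's WIMS/PANTHER workflow; all names and numbers below are hypothetical):

```python
# Minimal sketch of a linearised Bayesian data-assimilation (parameter adjustment)
# step; illustrative only, not the paper's implementation.
import numpy as np

def assimilate(x_prior, C_x, S, y_calc, y_meas, C_y):
    """Return best-estimate parameters and reduced covariance.

    x_prior : prior parameter means (n,)
    C_x     : prior parameter covariance (n, n)
    S       : sensitivity matrix dy/dx evaluated at x_prior (m, n)
    y_calc  : responses computed at x_prior (m,)
    y_meas  : measured or benchmark responses (m,)
    C_y     : measurement covariance (m, m)
    """
    # Kalman-type gain built from the prior covariance and the sensitivities
    K = C_x @ S.T @ np.linalg.inv(S @ C_x @ S.T + C_y)
    x_post = x_prior + K @ (y_meas - y_calc)   # updated best estimate
    C_post = C_x - K @ S @ C_x                 # reduced posterior covariance
    return x_post, C_post

# Toy usage: two uncertain parameters, one measured response
x_post, C_post = assimilate(
    x_prior=np.array([1.0, 0.5]),
    C_x=np.diag([0.04, 0.01]),
    S=np.array([[2.0, -1.0]]),
    y_calc=np.array([1.50]),
    y_meas=np.array([1.42]),
    C_y=np.array([[0.01]]),
)
```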

Author(s):  
Godlove Wanki ◽  
Stephen Ekwaro-Osire ◽  
João Paulo Dias ◽  
Americo Cunha

The advent of state-of-the-art additive manufacturing (AM) processes has facilitated the manufacturing of complex orthopedic metallic implants, such as femoral stems with porous portions based on lattice structures. These struts often have rough rather than smooth surface textures, and the irregularities may influence mechanical properties. To make robust predictions about the behavior of this kind of system, the effect of parameter variability on the stem stiffness must be considered in the modeling and design of porous femoral stems. Also, to improve the credibility of computational models used for hip implant analysis, which involves numerous uncertainties, there is a need for a rigorous uncertainty quantification (UQ) framework for proper model assessment following a credible-modeling standard. This work proposes a UQ framework, based on the maximum entropy principle, for analyzing a femoral stem implant model with sparsely characterized input parameters, thereby clarifying how uncertainties impact the key properties of a porous femoral stem. In this study, uncertainties in the strut thickness, pore size, Young's modulus, and external forcing are considered. The UQ framework is validated using experimental results available in the literature, following the guidelines set in an ASME standard.
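
A minimal sketch of maximum-entropy modelling of sparsely characterized inputs followed by Monte Carlo propagation, assuming a crude placeholder surrogate in place of the paper's finite-element stem model (all bounds, moments and the surrogate itself are illustrative, not taken from the paper):

```python
# Maximum-entropy input modelling + Monte Carlo propagation, illustrative only.
import numpy as np

rng = np.random.default_rng(0)
N = 10_000

# Sparse information -> maximum-entropy distributions:
#   known bounds only                 -> uniform
#   known mean and variance (real R)  -> Gaussian
strut_t = rng.uniform(0.20, 0.40, N)      # strut thickness [mm], bounds only (assumed)
pore    = rng.uniform(0.50, 0.80, N)      # pore size [mm], bounds only (assumed)
E       = rng.normal(110e3, 5e3, N)       # Young's modulus [MPa], mean & variance (assumed)

def stem_stiffness(t, p, E):
    """Placeholder surrogate: a relative stiffness index that rises with strut
    thickness and modulus and falls with pore size (stands in for the FE model)."""
    porosity = p / (p + t)
    return E * (1.0 - porosity) ** 2

k = stem_stiffness(strut_t, pore, E)
print(f"stiffness index mean = {k.mean():.3e}, std = {k.std():.3e}")
print(f"95% interval         = {np.percentile(k, [2.5, 97.5])}")
```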


Acta Numerica
2018
Vol 27
pp. 353-450
Author(s):
J. Tinsley Oden

The use of computational models and simulations to predict events that take place in our physical universe, or to predict the behaviour of engineered systems, has significantly advanced the pace of scientific discovery and the creation of new technologies for the benefit of humankind over recent decades, at least up to a point. That ‘point’ in recent history occurred around the time that the scientific community began to realize that true predictive science must deal with many formidable obstacles, including the determination of the reliability of the models in the presence of many uncertainties. To develop meaningful predictions one needs relevant data, itself possessing uncertainty due to experimental noise; in addition, one must determine model parameters, and concomitantly, there is the overriding need to select and validate models given the data and the goals of the simulation. This article provides a broad overview of predictive computational science within the framework of what is often called the science of uncertainty quantification. The exposition is divided into three major parts. In Part 1, philosophical and statistical foundations of predictive science are developed within a Bayesian framework. There the case is made that the Bayesian framework provides, perhaps, a unique setting for handling all of the uncertainties encountered in scientific prediction. In Part 2, general frameworks and procedures for the calculation and validation of mathematical models of physical realities are given, all in a Bayesian setting. But beyond Bayes, an introduction is given to information theory, the maximum entropy principle, model sensitivity analysis and sampling methods such as MCMC. In Part 3, the central problem of predictive computational science is addressed: the selection, adaptive control and validation of mathematical and computational models of complex systems. The Occam Plausibility Algorithm, OPAL, is introduced as a framework for model selection, calibration and validation. Applications to complex models of tumour growth are discussed.
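
As a toy illustration of the MCMC sampling surveyed in Part 2 (a generic random-walk Metropolis calibration, not the article's OPAL framework; the model, data and tuning values are synthetic):

```python
# Random-walk Metropolis sketch for Bayesian calibration of one model parameter.
import numpy as np

rng = np.random.default_rng(1)

# Toy "physical" model and synthetic noisy observations
def model(theta, x):
    return theta * x**2

x_obs = np.linspace(0.0, 1.0, 20)
sigma = 0.05
y_obs = model(2.0, x_obs) + rng.normal(0.0, sigma, x_obs.size)

def log_posterior(theta):
    # Gaussian likelihood + flat prior on [0, 10]
    if not 0.0 <= theta <= 10.0:
        return -np.inf
    resid = y_obs - model(theta, x_obs)
    return -0.5 * np.sum((resid / sigma) ** 2)

samples, theta = [], 1.0
lp = log_posterior(theta)
for _ in range(20_000):
    prop = theta + rng.normal(0.0, 0.1)         # random-walk proposal
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:    # Metropolis accept/reject
        theta, lp = prop, lp_prop
    samples.append(theta)

post = np.array(samples[5_000:])                # discard burn-in
print(f"posterior mean = {post.mean():.3f} +/- {post.std():.3f}")
```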


1990
Vol 27 (2)
pp. 303-313
Author(s):
Claudine Robert

The maximum entropy principle is used to model uncertainty by a maximum entropy distribution, subject to some appropriate linear constraints. We give an entropy concentration theorem (whose proof is based on large deviation techniques) which provides a mathematical justification of this statistical modelling principle. Then we indicate how it can be used in artificial intelligence, and how relevant prior knowledge is provided by some classical descriptive statistical methods. It appears furthermore that the maximum entropy principle yields a natural link between descriptive methods and some statistical structures.
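
A small numerical sketch of the principle under a single linear constraint, using Jaynes' classic loaded-die example (chosen purely for illustration; it is not taken from the paper): the maximum-entropy distribution takes the exponential form p_i ∝ exp(λ i), with λ fixed by the constraint on the mean.

```python
# Maximum-entropy distribution on {1,...,6} subject to the linear constraint E[X] = 4.5.
import numpy as np

faces = np.arange(1, 7)
target_mean = 4.5                      # the single linear (moment) constraint

def mean_for(lam):
    w = np.exp(lam * faces)            # exponential-family form p_i ∝ exp(lam * i)
    p = w / w.sum()
    return p @ faces

# Solve for lam by bisection (mean_for is increasing in lam)
lo, hi = -5.0, 5.0
for _ in range(60):
    mid = 0.5 * (lo + hi)
    if mean_for(mid) < target_mean:
        lo = mid
    else:
        hi = mid

lam = 0.5 * (lo + hi)
p = np.exp(lam * faces)
p /= p.sum()
entropy = -(p * np.log(p)).sum()
print(np.round(p, 4), f"entropy = {entropy:.4f}")
```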


Author(s):  
Kai Yao
Jinwu Gao
Wei Dai

Entropy is a measure of the uncertainty associated with a variable whose value cannot be exactly predicted. In uncertainty theory, it has been quantified so far by logarithmic entropy. However, logarithmic entropy sometimes fails to measure the uncertainty. This paper will propose another type of entropy, named sine entropy, as a supplement, and explore its properties. After that, the maximum entropy principle will be introduced, and arc-cosine distributed variables will be proved to have the maximum sine entropy for a given expected value and variance.
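
As a numerical illustration, assuming sine entropy is defined as the integral of sin(πΦ(x)) over the support, where Φ is the uncertainty distribution (by analogy with logarithmic entropy, which integrates −Φ lnΦ − (1−Φ) ln(1−Φ)); the example uncertain variable and its bounds are hypothetical:

```python
# Numerical sketch of sine entropy for a linear uncertain variable L(a, b),
# assuming the definition S[xi] = integral of sin(pi * Phi(x)) dx over [a, b].
import numpy as np

def sine_entropy_linear(a, b, n=200_001):
    x = np.linspace(a, b, n)
    Phi = (x - a) / (b - a)              # linear uncertainty distribution on [a, b]
    dx = (b - a) / (n - 1)
    return np.sum(np.sin(np.pi * Phi)) * dx   # simple quadrature of sin(pi * Phi)

a, b = 0.0, 2.0
print(sine_entropy_linear(a, b))         # about 2*(b - a)/pi ≈ 1.2732
```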

