Research on application method of uncertainty quantification technology in equipment test identification

2021, Vol. 336, pp. 02026
Author(s): Jiajia Wang, Hao Chen, Jing Ma, Tong Zhang

This paper introduces the concepts of equipment test qualification and uncertainty quantification, together with an analysis framework and process for quantifying uncertainty in equipment tests. It analyzes data uncertainty, model uncertainty, and environmental uncertainty, and studies the corresponding uncertainty quantification theory for each, providing a technical reference for applying uncertainty quantification technology in the field of test identification.
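The abstract gives no equations, so the following is only a minimal, hypothetical sketch of how the three uncertainty sources it names (data, model, environmental) could be propagated jointly through a scalar test metric by Monte Carlo sampling; all variable names, distributions, and values are illustrative assumptions, not the authors' method.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical uncertainty sources acting on an equipment test metric.
data_error = rng.normal(0.0, 0.05, n)      # data uncertainty: measurement noise
model_bias = rng.uniform(-0.02, 0.02, n)   # model uncertainty: unknown simulation bias
env_factor = rng.normal(1.0, 0.03, n)      # environmental uncertainty: test conditions

nominal = 1.0                              # nominal performance predicted by the model
metric = (nominal + model_bias + data_error) * env_factor

print(f"mean = {metric.mean():.4f}, std = {metric.std():.4f}")
print("95% interval:", np.percentile(metric, [2.5, 97.5]))
```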

2021, Vol. 11 (14), pp. 6499
Author(s): Matthias Frankl, Mathieu Hursin, Dimitri Rochman, Alexander Vasiliev, Hakim Ferroukhi

Presently, a criticality safety evaluation methodology for the final geological disposal of Swiss spent nuclear fuel is under development at the Paul Scherrer Institute in collaboration with the Swiss National Technical Competence Centre in the field of deep geological disposal of radioactive waste. The method pursues, in essence, a best estimate plus uncertainty approach and includes burnup credit. Burnup credit is applied by means of a computational scheme called BUCSS-R (Burnup Credit System for the Swiss Reactors–Repository case), which is complemented by the quantification of uncertainties from various sources. BUCSS-R consists of depletion, decay, and criticality calculations with CASMO5, SERPENT2, and MCNP6, respectively, determining the keff eigenvalues of the disposal canister loaded with the Swiss spent nuclear fuel assemblies. However, the depletion calculation in the first step and the criticality calculation in the third step, in particular, are subject to uncertainties in the nuclear data input. In previous studies, the effects of these nuclear data-related uncertainties on the obtained keff values, stemming from each of the two steps, were quantified independently. Both contributions to the overall uncertainty in the calculated keff values were therefore treated as fully correlated, leading to an overly conservative estimate of the total uncertainty. This study presents a consistent approach that eliminates the need to assume unrealistically strong correlations in the keff results. The nuclear data uncertainty quantification for both the depletion and criticality calculations is now performed at once, using a single set of perturbation factors for uncertainty propagation through the corresponding calculation steps of the evaluation method. The present results reveal the overestimation of nuclear data-related uncertainties by the previous approach, in particular for spent nuclear fuel with high burnup, and underline the importance of consistent nuclear data uncertainty quantification methods. However, only canister loadings with UO2 fuel assemblies are considered, so the study offers no insight into potentially different trends in nuclear data-related uncertainties for mixed-oxide fuel assemblies.
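The actual CASMO5/SERPENT2/MCNP6 chain is obviously not reproducible here; the toy sketch below only illustrates the contrast the abstract describes: propagating one and the same set of nuclear-data perturbation factors consistently through both calculation steps, versus quantifying each step independently and adding the standard deviations as if fully correlated. The sensitivity functions and all numbers are hypothetical.

```python
import numpy as np

rng = np.random.default_rng(42)
n_samples = 500

# One set of nuclear-data perturbation factors drives both steps (10 toy cross sections).
perturbations = rng.normal(1.0, 0.02, size=(n_samples, 10))

def depletion_effect(p):
    # toy sensitivity of the end-of-life nuclide inventory to the nuclear data
    return 0.004 * (p - 1.0) @ np.linspace(1.0, 0.1, 10)

def criticality_effect(p):
    # toy sensitivity of the canister keff to the same nuclear data
    return 0.006 * (p - 1.0) @ np.linspace(0.2, 1.0, 10)

# Consistent approach: a single perturbation set propagated through both steps at once.
dk_consistent = np.array([depletion_effect(p) + criticality_effect(p) for p in perturbations])

# Previous approach: quantify each step independently, then add the standard
# deviations as if the two contributions were fully correlated.
sigma_dep = np.std([depletion_effect(p) for p in perturbations])
sigma_crit = np.std([criticality_effect(p) for p in perturbations])

print(f"consistent sigma(keff)      : {dk_consistent.std():.5f}")
print(f"fully correlated combination: {sigma_dep + sigma_crit:.5f}")
```

The fully correlated combination is, by construction, an upper bound on the consistently propagated uncertainty, which is the overestimation the study quantifies.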


1997, Vol. 16 (4-5), pp. 449-460
Author(s): Ralph L. Kodell, David W. Gaylor

The uncertainties associated with extrapolating model-based cancer risks from high to low doses and animal-based cancer risks to humans are examined. It is argued that low-dose linear extrapolation based on statistical confidence limits calculated from animal data is designed to account for data uncertainty, model-selection uncertainty, and model-fitting instability. The intent is to err on the side of safety, that is, overstating rather than understating the true risk. The tendency toward conservatism in predicting human cancer risks from animal data based on linear extrapolation is confirmed by a real-data analysis of the various sources of uncertainty involved in extrapolating from animals to humans. Along with the tendency toward conservatism, a high degree of overall uncertainty in the interspecies extrapolation process is demonstrated. It is concluded that human cancer risk estimates based on animal data may underestimate the true risk by a factor of 10 or may overestimate that risk by a factor of 1,000.
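A small worked sketch of the low-dose linear extrapolation arithmetic discussed above; the benchmark dose, benchmark response, and exposure values are hypothetical, and the factor-of-10 / factor-of-1,000 band simply restates the article's conclusion.

```python
# Minimal sketch of low-dose linear extrapolation from an animal bioassay;
# all numbers are hypothetical and chosen only to illustrate the arithmetic.

bmdl = 5.0           # lower confidence limit on the benchmark dose (mg/kg-day)
bmr = 0.10           # benchmark response (extra risk) at that dose
slope = bmr / bmdl   # upper-bound slope factor, per mg/kg-day

human_dose = 0.001   # environmental exposure of interest (mg/kg-day)
risk_estimate = slope * human_dose

# Per the article's conclusion, such an estimate may understate the true human
# risk by up to a factor of 10 or overstate it by up to a factor of 1,000.
plausible_low = risk_estimate / 1000
plausible_high = risk_estimate * 10
print(f"linear low-dose risk estimate: {risk_estimate:.2e}")
print(f"plausible true-risk range    : {plausible_low:.2e} to {plausible_high:.2e}")
```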


Author(s): Yanjun Zhang, Tingting Xia, Mian Li

Abstract Various types of uncertainty, such as parameter uncertainty, model uncertainty, and metamodeling uncertainty, may lead to low robustness. Parameter uncertainty can be either epistemic or aleatory in physical systems, and the two kinds have been widely represented by intervals and probability distributions, respectively. Model uncertainty is formally defined as the difference between the true value of the real-world process and the code output of the simulation model at the same value of the inputs. Additionally, metamodeling uncertainty is introduced by the use of metamodels. To reduce the effects of uncertainties, robust optimization (RO) algorithms have been developed to obtain solutions that are not only optimal but also less sensitive to uncertainties. Based on how parameter uncertainty is modeled, there are two categories of RO approaches: interval-based and probability-based. In real-world engineering problems, interval and probabilistic parameter uncertainties are likely to exist simultaneously in a single problem. However, few works have considered mixed interval and probabilistic parameter uncertainties together with other types of uncertainty. In this work, a general RO framework is proposed to deal with mixed interval and probabilistic parameter uncertainties, model uncertainty, and metamodeling uncertainty simultaneously in design optimization problems, using intervals-of-statistics approaches. Considering multiple types of uncertainty improves the robustness of optimal designs and reduces the risk of inappropriate decision-making, low robustness, and low reliability in engineering design. Two test examples are utilized to demonstrate the applicability and effectiveness of the proposed RO approach.
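The paper's intervals-of-statistics formulation is not reproduced here; the sketch below only illustrates the general shape of a robust objective under mixed uncertainty: sweep an interval-valued (epistemic) parameter for the worst case while sampling a probabilistic (aleatory) parameter for a mean-plus-k-sigma measure. The function f, the bounds, and the distributions are hypothetical stand-ins.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(1)

# Hypothetical performance model f(x; a, xi), with
#   a  : interval-valued parameter, a in [0.8, 1.2] (epistemic)
#   xi : probabilistic parameter, xi ~ N(0, 0.1^2)  (aleatory)
def f(x, a, xi):
    return (x - 2.0 * a) ** 2 + 0.5 * xi * x

xi_samples = rng.normal(0.0, 0.1, 2000)   # common random numbers keep the objective smooth

def robust_objective(x, k=3.0):
    worst = -np.inf
    for a in np.linspace(0.8, 1.2, 11):               # grid sweep over the interval parameter
        vals = f(x, a, xi_samples)
        worst = max(worst, vals.mean() + k * vals.std())  # mean + k*sigma at this a
    return worst                                      # worst case over the interval

res = minimize_scalar(robust_objective, bounds=(0.0, 5.0), method="bounded")
print(f"robust optimum x* ~ {res.x:.3f}, worst-case objective {res.fun:.3f}")
```

Model and metamodeling uncertainty terms, which the paper also treats, could be folded into the same mean-plus-k-sigma measure as additional variance contributions.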


Author(s): Zhen Hu, Sankaran Mahadevan, Xiaoping Du

Limited data on stochastic load processes and system random variables result in uncertainty in the results of time-dependent reliability analysis. An uncertainty quantification (UQ) framework is developed in this paper for time-dependent reliability analysis in the presence of data uncertainty. The Bayesian approach is employed to model the epistemic uncertainty sources in the random variables and stochastic processes. A straightforward formulation of UQ in time-dependent reliability analysis results in a double-loop implementation procedure, which is computationally expensive. This paper proposes an efficient method for the UQ of time-dependent reliability analysis by integrating the fast integration method and the surrogate model method with time-dependent reliability analysis. A surrogate model is first built for the time-instantaneous conditional reliability index as a function of the variables with imprecise parameters. For different realizations of the epistemic uncertainty, the associated time-instantaneous most probable points (MPPs) are then identified using the fast integration method based on the conditional reliability index surrogate, without evaluating the original limit-state function. With the obtained time-instantaneous MPPs, the uncertainty in the time-dependent reliability analysis is quantified. The effectiveness of the proposed method is demonstrated using a mathematical example and an engineering application example.
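A minimal sketch of the workflow described above, with a toy closed-form reliability index standing in for the limit state: build a cheap surrogate of the time-instantaneous conditional reliability index beta(t | mu), then, for posterior samples of the imprecise parameter mu, approximate the time-dependent failure probability from the weakest instant. A grid interpolator replaces the paper's surrogate and fast integration steps, and the weakest-instant mapping is only a crude approximation; all functions and numbers are hypothetical.

```python
import numpy as np
from scipy.stats import norm
from scipy.interpolate import RegularGridInterpolator

rng = np.random.default_rng(7)

# Toy time-instantaneous conditional reliability index beta(t | mu), where mu is
# an imprecise distribution parameter (epistemic, represented by posterior samples).
def beta_true(t, mu):
    return 3.0 + 0.5 * mu - 0.8 * np.sin(0.5 * t) - 0.05 * t

# Step 1: cheap surrogate of beta over a coarse (t, mu) grid.
t_grid = np.linspace(0.0, 10.0, 21)
mu_grid = np.linspace(-1.0, 1.0, 11)
B = beta_true(t_grid[:, None], mu_grid[None, :])
beta_surrogate = RegularGridInterpolator((t_grid, mu_grid), B)

# Step 2: for each epistemic realization, approximate the time-dependent failure
# probability from the minimum reliability index over the time horizon.
mu_posterior = np.clip(rng.normal(0.0, 0.3, 1000), -1.0, 1.0)  # clipped to the grid range
t_fine = np.linspace(0.0, 10.0, 200)
pf = []
for mu in mu_posterior:
    pts = np.column_stack([t_fine, np.full_like(t_fine, mu)])
    beta_min = beta_surrogate(pts).min()        # weakest instant over [0, 10]
    pf.append(norm.cdf(-beta_min))              # crude weakest-instant approximation

pf = np.array(pf)
print(f"epistemic spread of p_f: mean={pf.mean():.2e}, 95th pct={np.percentile(pf, 95):.2e}")
```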


2013, Vol. 10 (87), pp. 20130554
Author(s): J. Grau-Moya, E. Hez, G. Pezzulo, D. A. Braun

Decision-makers have been shown to rely on probabilistic models for perception and action. However, these models can be incorrect or partially wrong, in which case the decision-maker has to cope with model uncertainty. Model uncertainty has recently also been shown to be an important determinant of sensorimotor behaviour in humans that can lead to risk-sensitive deviations from Bayes-optimal behaviour towards worst-case or best-case outcomes. Here, we investigate the effect of model uncertainty on cooperation in sensorimotor interactions similar to the stag-hunt game, where players develop models of the other player and decide between a pay-off-dominant cooperative solution and a risk-dominant, non-cooperative solution. In simulations, we show that players who allow for optimistic deviations from their opponent model are much more likely to converge to cooperative outcomes. We also implemented this agent model in a virtual reality environment and let human subjects play against a virtual player. In this game, subjects' pay-offs were experienced as forces opposing their movements. During the experiment, we manipulated the risk sensitivity of the computer player and observed human responses. We found not only that humans adaptively changed their level of cooperation depending on the risk sensitivity of the computer player but also that their initial play exhibited characteristic risk-sensitive biases. Our results suggest that model uncertainty is an important determinant of cooperation in two-player sensorimotor interactions.
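The virtual-reality task and the full opponent model are not reproduced here; the sketch below only illustrates how a risk-sensitivity parameter (exponential utility) shifts the choice between the pay-off-dominant stag and the risk-dominant hare under a fixed belief about the opponent. The payoff values and the belief are hypothetical.

```python
import numpy as np

# Stag-hunt payoffs (hypothetical values): rows = my action, columns = opponent action.
#                  opp: stag  hare
payoff = np.array([[4.0,      0.0],   # me: stag (pay-off dominant if both cooperate)
                   [3.0,      3.0]])  # me: hare (risk dominant, safe)

def risk_sensitive_value(payoff_matrix, p_coop, theta):
    """Exponential-utility value of each action under belief p_coop that the
    opponent cooperates; theta > 0 is optimistic, theta < 0 pessimistic,
    and theta -> 0 recovers the risk-neutral expectation."""
    probs = np.array([p_coop, 1.0 - p_coop])
    if abs(theta) < 1e-8:
        return payoff_matrix @ probs
    return np.log(probs @ np.exp(theta * payoff_matrix.T)) / theta

for theta in (-1.0, 0.0, 1.0):
    values = risk_sensitive_value(payoff, p_coop=0.5, theta=theta)
    choice = ["stag", "hare"][int(np.argmax(values))]
    print(f"theta={theta:+.1f}: V(stag)={values[0]:.2f}, V(hare)={values[1]:.2f} -> play {choice}")
```

With these numbers, the risk-neutral and pessimistic agents choose hare, while the optimistic agent chooses stag, mirroring the finding that optimistic deviations from the opponent model favour cooperative outcomes.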


SPE Journal, 2011, Vol. 16 (02), pp. 429-439
Author(s): Heng Li, Pallav Sarma, Dongxiao Zhang

Summary Reservoir modeling and simulation are subject to significant uncertainty, which usually arises from heterogeneity of the geological formation and deficiency of measured data. Uncertainty quantification thus plays an important role in reservoir simulation. In order to perform accurate uncertainty analysis, a large number of simulations is often required. However, this is usually prohibitive because even a single run of a practical large-scale simulation model may be quite time consuming. Therefore, efficient approaches for uncertainty quantification are a necessity. The experimental-design (ED) method is applied widely in the petroleum industry for assessing uncertainties in reservoir production and economic appraisal. However, a key disadvantage of this approach is that it does not consistently take into account the full probability-density functions (PDFs) of the input random parameters: the full PDFs are used only during post-processing, not for sampling and design, and there is an inherent assumption that the distributions of these parameters are uniform during sampling, which is rarely the case in reality. In this paper, we propose an approach that handles arbitrary input probability distributions using the probabilistic-collocation method (PCM). Orthogonal polynomials for arbitrary distributions are first constructed numerically, and then PCM is used for uncertainty propagation. As a result, PCM can be applied efficiently for any arbitrary numerical or analytical distribution of the input parameters. It can be shown that PCM provides optimal convergence rates for linear models, whereas no such guarantees are provided by ED. The approach is also applicable to discrete distributions. PCM and ED are compared on a few synthetic and realistic reservoir models. Different types of PDFs are considered for a number of reservoir parameters. Results indicate that, while the computational effort is greatly reduced compared to Monte Carlo (MC) simulation, PCM is able to accurately quantify the uncertainty of various reservoir performance parameters. Results also reveal that PCM is more robust, more accurate, and more efficient than ED for uncertainty analysis.
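A compact sketch of the core idea, numerically building polynomials orthonormal with respect to an arbitrary input distribution and then projecting the model response onto them. For brevity it uses Monte Carlo inner products and spectral projection rather than the Gauss-type collocation points described in the paper; the lognormal input and the response model are hypothetical stand-ins for a reservoir parameter and output.

```python
import numpy as np

rng = np.random.default_rng(3)

# Arbitrary (non-uniform, non-Gaussian) input distribution, represented by samples;
# hypothetical stand-in for a reservoir parameter such as a permeability multiplier.
xi = rng.lognormal(mean=0.0, sigma=0.5, size=20_000)

# Step 1: numerically construct polynomials orthonormal w.r.t. this distribution
# via Gram-Schmidt on the monomials, with Monte Carlo inner products.
degree = 3
basis = []                       # coefficient vectors for np.polyval (highest power first)
for d in range(degree + 1):
    p = np.zeros(degree + 1)
    p[degree - d] = 1.0          # monomial x**d
    for q in basis:              # remove projections onto lower-order basis polynomials
        proj = np.mean(np.polyval(p, xi) * np.polyval(q, xi))
        p = p - proj * q
    p = p / np.sqrt(np.mean(np.polyval(p, xi) ** 2))   # normalize
    basis.append(p)

# Step 2: non-intrusive propagation through a cheap stand-in response.
def model(x):                    # hypothetical response, e.g. cumulative production
    return 100.0 / (1.0 + x) + 5.0 * x

Y = model(xi)
coeffs = [np.mean(Y * np.polyval(q, xi)) for q in basis]   # spectral projection

mean_pce = coeffs[0]                         # basis[0] is the constant polynomial 1
var_pce = sum(c ** 2 for c in coeffs[1:])    # variance from the higher-order modes
print(f"PCE   : mean={mean_pce:.3f}, var={var_pce:.3f}")
print(f"MC ref: mean={Y.mean():.3f}, var={Y.var():.3f}")
```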


Author(s): Yanjun Zhang, Mian Li

Uncertainty is inevitable in engineering design. The existence of uncertainty may change the optimality and/or the feasibility of the obtained optimal solutions. In simulation-based engineering design, uncertainty has various sources, such as parameter uncertainty, model uncertainty, and other random errors. To deal with uncertainty, robust optimization (RO) algorithms are developed to find solutions that are not only optimal but also robust with respect to uncertainty. Parameter uncertainty has been addressed by various RO approaches, while model uncertainty has been ignored in the majority of existing RO algorithms under the hypothesis that the simulation model represents the real physical system perfectly. In the authors' earlier work, an RO framework was proposed to consider both parameter and model uncertainties using the Bayesian approach with Gaussian processes (GP), where the metamodeling uncertainty introduced by GP modeling was ignored by assuming that the constructed GP model is accurate enough given sufficient training samples. However, an unlimited number of samples is impossible in real applications due to prohibitive time and/or computational cost. In this work, a new RO framework is proposed to deal with both parameter and model uncertainties using GP models built from only limited samples. The compound effect of parameter, model, and metamodeling uncertainties is derived in the form of a compound mean and variance to formulate the proposed RO approach. The proposed approach reduces the risk that the obtained robust optimal designs, which account for parameter and model uncertainties, become non-optimal and/or infeasible because of an insufficient number of samples for GP modeling. Two test examples with different degrees of complexity are utilized to demonstrate the applicability and effectiveness of the proposed approach.
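The paper's compound mean and variance derivation is not reproduced here; the sketch below only illustrates the structure of such an objective: a GP metamodel fitted to limited samples, with parameter-induced variance, GP predictive (metamodeling) variance, and an assumed model-discrepancy variance combined into a mean-plus-k-sigma robustness measure. The response function, kernel, and all numbers are hypothetical.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(5)

# Limited simulation samples of a hypothetical response y(x) (the "code output").
X_train = np.linspace(0.0, 5.0, 8).reshape(-1, 1)
y_train = np.sin(X_train).ravel() + 0.1 * X_train.ravel() ** 2

gp = GaussianProcessRegressor(kernel=ConstantKernel(1.0) * RBF(1.0), normalize_y=True)
gp.fit(X_train, y_train)

# Assumed (hypothetical) additional uncertainty sources:
sigma_param = 0.15      # std of an additive parameter perturbation on x
var_model = 0.05 ** 2   # variance of the model discrepancy (real system vs. code)

def compound_robust_objective(x, k=2.0, n_mc=500):
    """Compound mean/variance combining parameter, model, and metamodeling
    (GP predictive) uncertainty, then a mean + k*sigma robustness measure."""
    x_samples = (x + rng.normal(0.0, sigma_param, n_mc)).reshape(-1, 1)
    mu, std = gp.predict(x_samples, return_std=True)
    mean_c = mu.mean()
    var_c = mu.var() + np.mean(std ** 2) + var_model   # parameter + metamodel + model terms
    return mean_c + k * np.sqrt(var_c)

xs = np.linspace(0.5, 4.5, 41)
vals = [compound_robust_objective(x) for x in xs]
print(f"robust optimum near x = {xs[int(np.argmin(vals))]:.2f}")
```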

