Stochastic and epistemic uncertainty propagation in LCA

2013 · Vol 18 (7) · pp. 1393-1403
Author(s):  
Julie Clavreul ◽  
Dominique Guyonnet ◽  
Davide Tonini ◽  
Thomas H. Christensen


Author(s):
Alessandra Cuneo ◽  
Alberto Traverso ◽  
Shahrokh Shahpar

In engineering design, uncertainty is inevitable and can cause significant deviations in the performance of a system. Uncertainty in input parameters can be categorized into two groups: aleatory and epistemic. The work presented here focuses on aleatory uncertainty, which causes natural, unpredictable and uncontrollable variations in the performance of the system under study. Such uncertainty can be quantified using statistical methods, but the main obstacle is often the computational cost, because the representative model is typically highly non-linear and complex. A robust tool is therefore needed that can perform the uncertainty propagation with as few model evaluations as possible. In the last few years, different methodologies for uncertainty propagation and quantification have been proposed. The focus of this study is to evaluate four different methods and demonstrate the strengths and weaknesses of each approach. The first method is Monte Carlo simulation, a sampling method that can give high accuracy but requires relatively large computational effort. The second is Polynomial Chaos, an approximation method in which the probabilistic parameters of the response function are modelled with orthogonal polynomials. The third is the Mid-range Approximation Method, which assembles multiple meta-models into one model to perform optimization under uncertainty. The fourth applies the first two methods not to the model directly but to a response surface representing the model of the simulation, in order to decrease the computational cost. All these methods were applied to a set of analytical test functions and engineering test cases. Relevant aspects of engineering design and analysis, such as a high number of stochastic variables and optimized design problems with and without stochastic design parameters, were assessed. Polynomial Chaos emerged as the most promising methodology and was then applied to a turbomachinery test case based on a thermal analysis of a high-pressure turbine disk.
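To make the contrast between the first two methods concrete, here is a minimal sketch (my own toy setup, not the paper's test cases): a nonlinear scalar model with one standard-normal input is propagated first by plain Monte Carlo and then by a non-intrusive Polynomial Chaos expansion built from probabilists' Hermite polynomials fitted by least-squares point collocation. The model f and all numbers are illustrative assumptions.

```python
import math
import numpy as np

def f(x):
    return np.exp(0.3 * x) + 0.5 * x**2   # toy nonlinear "simulation"

rng = np.random.default_rng(0)

# --- Monte Carlo: accurate, but every sample is a full model run ---
x_mc = rng.standard_normal(100_000)
y_mc = f(x_mc)
print("MC     mean/std:", y_mc.mean(), y_mc.std())

# --- Polynomial Chaos: few model runs, then statistics come cheap ---
order = 4
x_col = rng.standard_normal(20)            # only 20 model evaluations

def hermite_basis(x, p):
    # probabilists' Hermite polynomials He_0..He_p, orthogonal under N(0,1)
    He = [np.ones_like(x), x]
    for n in range(1, p):
        He.append(x * He[n] - n * He[n - 1])
    return np.column_stack(He[: p + 1])

A = hermite_basis(x_col, order)
coeffs, *_ = np.linalg.lstsq(A, f(x_col), rcond=None)

# E[He_n^2] = n! for probabilists' Hermite, so mean and variance follow
norms = np.array([math.factorial(n) for n in range(order + 1)])
mean_pce = coeffs[0]
std_pce = np.sqrt(np.sum(coeffs[1:] ** 2 * norms[1:]))
print("PCE(4) mean/std:", mean_pce, std_pce)
```

The key trade-off the abstract describes is visible here: Monte Carlo spends one model evaluation per sample, while the chaos expansion spends a handful of evaluations on fitting coefficients, after which moments are read off the coefficients directly.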


2011 · Vol 133 (2)
Author(s):  
Kais Zaman ◽  
Mark McDonald ◽  
Sankaran Mahadevan

This paper develops and illustrates a probabilistic approach for uncertainty representation and propagation in system analysis when the information on the uncertain input variables and/or their distribution parameters is available as either probability distributions or simply intervals (single or multiple). A unique aggregation technique is used to combine multiple interval data and to compute rigorous bounds on the system response cumulative distribution function. The uncertainty described by interval data is represented through a flexible family of probability distributions; converting interval data to a probabilistic format enables the use of computationally efficient methods for probabilistic uncertainty propagation. Two implementations of the proposed approach are explored, based on (1) sampling and (2) optimization. The sampling-based strategy is more expensive and tends to underestimate the output bounds; the optimization-based methodology improves on both counts. The proposed methods are used to develop new solutions to the challenge problems posed by the Sandia epistemic uncertainty workshop (Oberkampf et al., 2004, “Challenge Problems: Uncertainty in System Response Given Uncertain Parameters,” Reliab. Eng. Syst. Saf., 85, pp. 11–19), and the results are compared with earlier solutions.
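A minimal sketch of the sampling-versus-optimization comparison, under assumptions of my own choosing (not the paper's aggregation technique): the mean and standard deviation of a normal input are known only as intervals, and we bound the response CDF at a fixed point first by sampling parameter vectors from the box, then by optimizing over it.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import minimize

mu_box, sig_box = (1.0, 2.0), (0.5, 1.5)   # interval data on parameters

def cdf_at(y, mu, sig):
    # response taken as g(X) = X for clarity, so its CDF is the input CDF
    return norm.cdf(y, loc=mu, scale=sig)

y = 2.0

# (1) sampling-based: random parameter vectors inside the box
rng = np.random.default_rng(1)
mus = rng.uniform(*mu_box, 2000)
sigs = rng.uniform(*sig_box, 2000)
vals = cdf_at(y, mus, sigs)
print("sampling bounds :", vals.min(), vals.max())   # tends to be too narrow

# (2) optimization-based: search the box for the true extremes
box = [mu_box, sig_box]
x0 = [np.mean(mu_box), np.mean(sig_box)]
p_lo = minimize(lambda p: cdf_at(y, *p), x0, bounds=box).fun
p_hi = -minimize(lambda p: -cdf_at(y, *p), x0, bounds=box).fun
print("optimized bounds:", p_lo, p_hi)
```

Random sampling rarely lands exactly on the corners of the parameter box, which is why it underestimates the bounds; the optimizer drives the parameters to the extremes directly.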


2017
Author(s):  
Matan Avital ◽  
Michael Davis ◽  
Ory Dor ◽  
Ronnie Kamai

Abstract. We present a full PSHA sensitivity analysis for two sites in southern Israel: one in the near field of a major fault system and one farther away. The PSHA is conducted for alternative source representations, using alternative model parameters for the main seismic sources, such as slip rate and Mmax, among others. The analysis also considers the effect of the Ground-Motion Prediction Equation (GMPE) on the hazard results. In this way, both types of epistemic uncertainty, modelling uncertainty and parametric uncertainty, are addressed. We quantify the uncertainty propagation by testing its influence on the final calculated hazard, so that the controlling knowledge gaps are identified and can be treated in future studies. We find that current practice in Israel, as represented by the most recent version of the building code, grossly underestimates the hazard, due to a combination of factors including the source definitions and the GMPE used for analysis.
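A toy logic-tree sketch of the kind of epistemic bookkeeping the abstract describes (all sources, rates, medians and sigmas below are invented illustrations, not the study's models): each branch pairs a source parameterization with a GMPE, branch weights express epistemic belief, and the spread between branch hazard curves exposes which assumptions control the result.

```python
import numpy as np
from scipy.stats import norm

pga = np.logspace(-2, 0.3, 50)             # ground-motion levels (g)

def hazard_curve(rate, median, beta):
    # annual exceedance rate: source activity rate times the lognormal
    # probability that the ground motion exceeds each level
    return rate * (1 - norm.cdf(np.log(pga), np.log(median), beta))

branches = [   # (weight, activity rate, median PGA, GMPE sigma) -- toy values
    (0.4, 0.02, 0.15, 0.6),   # low slip-rate source model, GMPE A
    (0.3, 0.05, 0.20, 0.6),   # high slip-rate source model, GMPE A
    (0.3, 0.02, 0.15, 0.8),   # low slip-rate source model, GMPE B
]
weights = np.array([b[0] for b in branches])
curves = np.array([hazard_curve(*b[1:]) for b in branches])

mean_curve = weights @ curves                     # weighted-mean hazard
lo, hi = curves.min(axis=0), curves.max(axis=0)   # epistemic envelope
i475 = np.argmin(np.abs(mean_curve - 1 / 475))    # ~10% in 50 years
print(f"design PGA (475-yr): ~{pga[i475]:.2f} g; "
      f"branch-to-branch spread there: {lo[i475]:.1e}..{hi[i475]:.1e}")
```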


Author(s):  
NICOLA PEDRONI ◽  
ENRICO ZIO

Risk analysis models describing aleatory (i.e., random) events contain parameters (e.g., probabilities, failure rates, …) that are epistemically uncertain, i.e., known with poor precision. Whereas aleatory uncertainty is always described by probability distributions, epistemic uncertainty may be represented in different ways (e.g., probabilistic or possibilistic), depending on the information and data available. The work presented in this paper addresses the issue of accounting for (in)dependence relationships between epistemically uncertain parameters. When a probabilistic representation of epistemic uncertainty is considered, uncertainty propagation is carried out by a two-dimensional (or double) Monte Carlo (MC) simulation approach; when possibility distributions are used instead, two approaches are undertaken: the hybrid MC and Fuzzy Interval Analysis (FIA) method, and the MC-based Dempster-Shafer (DS) approach employing Independent Random Sets (IRSs). The objectives are: (i) to study the effects of (in)dependence between the epistemically uncertain parameters of the aleatory probability distributions, when a probabilistic/possibilistic representation of epistemic uncertainty is adopted; and (ii) to study the effect of the probabilistic/possibilistic representation of epistemic uncertainty, when the state of dependence between the epistemic parameters is fixed. The Dependency Bound Convolution (DBC) approach is then undertaken within a hierarchical setting of hybrid (probabilistic and possibilistic) uncertainty propagation, in order to account for all kinds of (possibly unknown) dependences between the random variables. The analyses are carried out on two toy examples, built in such a way as to allow a fair quantitative comparison between the methods and an evaluation of their rationale and appropriateness in relation to risk analysis.
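A minimal double (two-dimensional) Monte Carlo sketch, under a toy setup of my own (not the paper's examples): two epistemically uncertain failure rates of a series system are sampled in the outer loop either independently or as fully dependent (comonotone), and the inner loop propagates the aleatory uncertainty. Comparing the outer-loop spread for the two cases illustrates objective (i), the effect of the state of (in)dependence between epistemic parameters.

```python
import numpy as np

rng = np.random.default_rng(2)
t_mission = 100.0
N_epi, N_ale = 500, 2000

def run(dependent):
    u1 = rng.uniform(size=N_epi)
    u2 = u1 if dependent else rng.uniform(size=N_epi)   # comonotone vs independent
    l1 = 1e-4 + u1 * 9e-4          # epistemic: lambda1 in [1e-4, 1e-3]
    l2 = 1e-4 + u2 * 9e-4          # epistemic: lambda2 in [1e-4, 1e-3]
    p_fail = np.empty(N_epi)
    for i in range(N_epi):                               # outer loop: epistemic
        T1 = rng.exponential(1 / l1[i], N_ale)           # inner loop: aleatory
        T2 = rng.exponential(1 / l2[i], N_ale)
        p_fail[i] = np.mean(np.minimum(T1, T2) < t_mission)   # series system
    return p_fail

for dep in (False, True):
    p = run(dep)
    label = "dependent  " if dep else "independent"
    print(label, "P_fail 5th..95th percentile:",
          np.percentile(p, 5), "..", np.percentile(p, 95))
```

The dependent case pushes both rates to their extremes together, so the epistemic distribution of the failure probability comes out wider than in the independent case.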


2021 · Vol 896 (1) · pp. 012035
Author(s):  
M Bougofa ◽  
A Bouafia ◽  
A Baziz ◽  
S Aberkane ◽  
R Kharzi ◽  
...  

Abstract Probabilistic modeling is widely used in industrial practice, particularly for assessing the safety, risk, and reliability of complex systems. Conventional risk analysis methodologies generally have a limited ability to deal with dependence, failure behavior, and epistemic uncertainty such as parameter uncertainty. This work proposes a risk-based reliability assessment approach using a dynamic evidential network (DEN). The proposed model integrates Dempster-Shafer theory (DST), which describes parameter uncertainty, with a dynamic Bayesian network (DBN), which represents dependencies and multi-state system reliability. The approach propagates uncertainty through conditional belief mass tables (CBMTs). Because the results are obtained as intervals, the risk can be analyzed with interval theory; ignoring this uncertainty may lead to biased results, so the epistemic uncertainty should be adequately characterized before performing the risk analysis. A case study of a level control system highlights the methodology's ability to capture dynamic changes in the process, uncertainty modeling, and sensitivity analysis that can support decision making.
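A small, self-contained Dempster-Shafer sketch (not the paper's DEN model; components, intervals, and masses are invented): each component's failure probability is described by focal intervals with basic belief masses, and propagating them through a two-component parallel system as independent random sets yields belief/plausibility bounds instead of a single point value, which is exactly the interval-valued output the abstract refers to.

```python
from itertools import product

# focal elements: (interval of failure probability, basic belief mass)
comp_A = [((0.02, 0.05), 0.7), ((0.01, 0.10), 0.3)]
comp_B = [((0.03, 0.06), 0.6), ((0.01, 0.15), 0.4)]

def system_interval(pa, pb):
    # parallel system fails only if both components fail (independence assumed)
    return (pa[0] * pb[0], pa[1] * pb[1])

# Cartesian combination of the two independent random sets
focal_sys = [(system_interval(ia, ib), ma * mb)
             for (ia, ma), (ib, mb) in product(comp_A, comp_B)]

threshold = 5e-4   # "unacceptable" system failure probability
bel = sum(m for (lo, hi), m in focal_sys if lo > threshold)  # surely above
pl  = sum(m for (lo, hi), m in focal_sys if hi > threshold)  # possibly above
print(f"Belief..Plausibility that P_sys > {threshold}: {bel:.2f}..{pl:.2f}")
```

The gap between belief and plausibility is the epistemic part of the answer; it shrinks only when better data narrows the focal intervals.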


Author(s):  
Jeremy Rohmer

Abstract The treatment of uncertainty using extra-probabilistic approaches, like intervals or p-boxes, allows a clear separation between epistemic uncertainty and randomness in the results of risk assessments. This can take the form of an interval of failure probabilities, the interval width W being an indicator of “what is unknown.” In some situations, W is too large to be informative. To overcome this problem, we propose to reverse the usual chain of treatment: starting from a target value of W that is acceptable to support decision-making, we quantify the reduction in the input p-boxes necessary to achieve it. We assess the feasibility of this procedure using two case studies (risk of dike failure, and risk of rupture of a frame structure subjected to lateral loads). By making the link with the estimation of excursion sets (i.e., the set of points where a function takes values below some prescribed threshold), we alleviate the computational burden of the procedure by combining Gaussian process (GP) metamodels with sequential design of computer experiments. The test cases show that the estimates can be obtained with only a few tens of calls to the computationally intensive algorithm for mixed aleatory/epistemic uncertainty propagation.
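A schematic sketch of the reversed chain of treatment on a deliberately simple limit state (numbers and model are mine, not the case studies', and no GP metamodel is needed at this scale): the mean resistance is known only inside an interval, the width W of the failure-probability interval follows from it, and we solve for the fraction of the input interval that must be retained to bring W below a target.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import brentq

mu_R0, half0 = 5.0, 1.0    # epistemic interval on mean resistance: mu_R0 +/- half0
mu_S, sd = 3.0, 0.8        # aleatory load S ~ Normal(mu_S, sd); limit state g = R - S

def pf(mu_R):
    # failure probability P(g < 0) with resistance fixed at its epistemic value
    return norm.cdf(0.0, loc=mu_R - mu_S, scale=sd)

def width(alpha):
    # output interval width after shrinking the input interval by factor alpha
    lo, hi = mu_R0 - alpha * half0, mu_R0 + alpha * half0
    return pf(lo) - pf(hi)     # pf is decreasing in mu_R

W_target = 0.005
alpha_star = brentq(lambda a: width(a) - W_target, 1e-6, 1.0)
print(f"retain {alpha_star:.2%} of the input interval to reach W <= {W_target}")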


2018 · Vol 337 · pp. 67-86
Author(s):  
Marco Gribaudo ◽  
Riccardo Pinciroli ◽  
Kishor Trivedi

Author(s):  
Xiaochao Qian ◽  
Wei Li ◽  
Ming Yang

Model calibration is the procedure that adjusts the unknown parameters in order to fit the model to experimental data and improve its predictive capability. The procedure is difficult to implement, however, in the presence of aleatory uncertainty. In this paper, a new method of model calibration based on uncertainty propagation is investigated. The calibration process is formulated as an optimization problem, and a two-stage nested uncertainty propagation method is proposed to solve it: Monte Carlo simulation is applied in the inner loop to propagate the aleatory uncertainty, while an optimization method is applied in the outer loop to propagate the epistemic uncertainty. The optimization objective function is the consistency between the result of the inner loop and the experimental data; accordingly, consistency measures for univariate and multivariate outputs are proposed as objective functions. Finally, the thermal challenge problem is used to validate the reasonableness and effectiveness of the proposed method.
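A compact sketch of the two-stage nested scheme on a toy problem of my own (the paper's thermal challenge problem and its specific consistency measures are not reproduced): the model output is theta plus a known aleatory noise term, the inner loop is Monte Carlo, and the outer loop minimizes an area-metric-style distance between the model and experimental output CDFs.

```python
import numpy as np
from scipy.optimize import minimize_scalar

rng = np.random.default_rng(3)
sigma = 0.5                                      # known aleatory spread
y_exp = 2.0 + sigma * rng.standard_normal(30)    # "experimental" data
eps = sigma * rng.standard_normal(5000)          # common random numbers

def consistency(theta):
    # inner loop: Monte Carlo propagation of the aleatory part for this theta
    y_mod = theta + eps
    # area-metric-style measure: integral of |F_model - F_exp| on a grid
    grid = np.linspace(min(y_mod.min(), y_exp.min()),
                       max(y_mod.max(), y_exp.max()), 400)
    F_mod = np.searchsorted(np.sort(y_mod), grid) / y_mod.size
    F_exp = np.searchsorted(np.sort(y_exp), grid) / y_exp.size
    return np.trapz(np.abs(F_mod - F_exp), grid)

# outer loop: optimize the epistemic parameter against the consistency measure
res = minimize_scalar(consistency, bounds=(0.0, 5.0), method="bounded")
print(f"calibrated theta ~ {res.x:.3f} (data generated with theta = 2.0)")
```

Reusing one fixed noise sample (common random numbers) keeps the objective smooth, so a scalar bounded optimizer suffices for the outer loop in this sketch.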

