Evidential Model Validation under Epistemic Uncertainty

2018 ◽  
Vol 2018 ◽  
pp. 1-11 ◽  
Author(s):  
Wei Deng ◽  
Xi Lu ◽  
Yong Deng

This paper proposes evidence-theory-based methods to both quantify epistemic uncertainty and validate computational models. Three types of epistemic uncertainty concerning input model data are considered: sparse points, intervals, and probability distributions with uncertain parameters. Through the proposed methods, the given data are described as corresponding probability distributions for uncertainty propagation through the computational model and, thus, for model validation. The proposed evidential model validation method is inspired by Bayesian hypothesis testing and the Bayes factor: it compares the model predictions with the observed experimental data so as to assess the predictive capability of the model and support the decision on model acceptance. Building on the Bayes factor idea, the frame of discernment of Dempster-Shafer evidence theory is constructed and the basic probability assignment (BPA) is determined. Because the proposed validation method is evidence based, the robustness of the result can be guaranteed, and the BPA favors the hypothesis about the model test that is best supported by the evidence. The validity of the proposed methods is illustrated through a numerical example.
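As a concrete illustration of the Bayes-factor-driven BPA construction described above, the following Python sketch builds a BPA on the frame {valid, invalid} from observed data and a model prediction. The Gaussian likelihoods, the inflated-spread alternative hypothesis, and the mapping from the posterior to belief masses are illustrative assumptions, not the paper's exact formulation.

```python
# Minimal sketch of a Bayes-factor-inspired BPA construction (hypothetical
# choices throughout; the paper's exact likelihood models are not reproduced).
import numpy as np
from scipy import stats

def bpa_from_bayes_factor(y_obs, y_pred, sigma, prior_valid=0.5):
    """Build a BPA on the frame {valid, invalid} from a Bayes factor.

    H1 (model valid): observations centered on the model prediction.
    H0 (model invalid): observations from a diffuse alternative (assumed
    here as a Gaussian with inflated spread -- an illustrative choice).
    """
    like_h1 = np.prod(stats.norm.pdf(y_obs, loc=y_pred, scale=sigma))
    like_h0 = np.prod(stats.norm.pdf(y_obs, loc=y_pred, scale=10.0 * sigma))
    bayes_factor = like_h1 / like_h0          # evidence in favor of H1

    # Map the Bayes factor to belief masses; the residual mass is left on
    # the whole frame {valid, invalid} to express remaining ignorance.
    posterior_valid = (bayes_factor * prior_valid) / (
        bayes_factor * prior_valid + (1.0 - prior_valid))
    confidence = abs(2.0 * posterior_valid - 1.0)  # 0 = ignorance, 1 = certainty
    bpa = {
        ("valid",): confidence * posterior_valid,
        ("invalid",): confidence * (1.0 - posterior_valid),
    }
    bpa[("valid", "invalid")] = 1.0 - sum(bpa.values())
    return bpa

print(bpa_from_bayes_factor(np.array([1.02, 0.98, 1.05]), 1.0, 0.05))
```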

Author(s):  
Sofiia Alpert

The solution of practical and ecological problems using hyperspectral satellite images usually includes a classification procedure, which is one of the most difficult and important steps. Several image classification methods based on the theory of evidence are considered and analyzed in this work. Evidence theory can model uncertainty and process imprecise and incomplete information. The following combination rules are considered: the "mixing" (averaging) rule, convolutive x-averaging (c-averaging), and Smets' combination rule. It is shown that these methods can process data from multiple sources or spectral bands that provide different assessments of the same hypotheses. The purpose of aggregating information is to simplify the data, whether they come from multiple sources or from different spectral bands. Smets' rule is the unnormalized version of Dempster's rule applied in Smets' Transferable Belief Model; it also processes imprecise and incomplete data, and it entails a slightly different formulation of Dempster-Shafer theory. The mixing (averaging) rule is the averaging operation used for probability distributions; it combines basic probability assignments from different sources (spectral bands) with weights assigned according to the reliability of the sources. The convolutive x-averaging (c-averaging) rule is a generalization of the average for scalar numbers; it is commutative but not associative, and it can combine any number of basic probability assignments. Examples in which these combination rules are applied are also considered. Mixing, convolutive x-averaging, and Smets' combination rule can be applied to the analysis of hyperspectral satellite images, to remote prospecting for minerals and oil, and to various environmental and thematic problems.
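The mixing and Smets' rules described above admit compact implementations. The sketch below is a minimal illustration rather than a reproduction of the paper's experiments: each BPA is a dict mapping focal elements (frozensets) to masses, and the land-cover frame and reliability weights are hypothetical.

```python
# Illustrative sketch of two of the combination rules discussed above.
from itertools import product

def mixing_rule(bpas, weights):
    """Weighted averaging of BPAs from several sources/spectral bands."""
    total = sum(weights)
    combined = {}
    for bpa, w in zip(bpas, weights):
        for focal, mass in bpa.items():
            combined[focal] = combined.get(focal, 0.0) + w * mass / total
    return combined

def smets_rule(bpa1, bpa2):
    """Smets' conjunctive rule: unnormalized Dempster combination.

    Conflicting mass is kept on the empty set instead of being
    redistributed, as in the Transferable Belief Model."""
    combined = {}
    for (f1, m1), (f2, m2) in product(bpa1.items(), bpa2.items()):
        inter = f1 & f2            # may be the empty set (conflict)
        combined[inter] = combined.get(inter, 0.0) + m1 * m2
    return combined

# Two sources over a hypothetical frame {water, soil, vegetation}:
A, B, C = frozenset({"water"}), frozenset({"soil"}), frozenset({"vegetation"})
m1 = {A: 0.6, B: 0.3, A | B | C: 0.1}
m2 = {A: 0.4, C: 0.4, A | B | C: 0.2}
print(mixing_rule([m1, m2], weights=[0.7, 0.3]))
print(smets_rule(m1, m2))   # mass on frozenset() measures conflict
```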


2011 ◽  
Vol 133 (2) ◽  
Author(s):  
Kais Zaman ◽  
Mark McDonald ◽  
Sankaran Mahadevan

This paper develops and illustrates a probabilistic approach for uncertainty representation and propagation in system analysis, when the information on the uncertain input variables and/or their distribution parameters may be available as either probability distributions or simply intervals (single or multiple). A unique aggregation technique is used to combine multiple interval data and to compute rigorous bounds on the system response cumulative distribution function. The uncertainty described by interval data is represented through a flexible family of probability distributions. Conversion of interval data to a probabilistic format enables the use of computationally efficient methods for probabilistic uncertainty propagation. Two methods are explored for the implementation of the proposed approach, based on (1) sampling and (2) optimization. The sampling-based strategy is more expensive and tends to underestimate the output bounds. The optimization-based methodology improves both aspects. The proposed methods are used to develop new solutions to challenge problems posed by the Sandia epistemic uncertainty workshop (Oberkampf et al., 2004, “Challenge Problems: Uncertainty in System Response Given Uncertain Parameters,” Reliab. Eng. Syst. Saf., 85, pp. 11–19). Results for the challenge problems are compared with earlier solutions.
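The sampling-based strategy can be illustrated with a small Python sketch: the interval-valued distribution parameter is swept over a finite grid, and the pointwise envelope of the resulting response CDFs bounds the response CDF. The response function, the input distribution, and the interval are illustrative assumptions; as noted above, a finite sweep tends to underestimate the true bounds.

```python
# Sketch of the sampling-based strategy for bounding a response CDF when a
# distribution parameter is known only as an interval (all choices illustrative).
import numpy as np

rng = np.random.default_rng(0)

def g(x):                      # illustrative system response
    return x ** 2 + 2.0 * x

mu_interval = (1.0, 2.0)       # epistemic interval on the input mean
y_grid = np.linspace(0.0, 15.0, 60)
cdf_lo = np.ones_like(y_grid)
cdf_hi = np.zeros_like(y_grid)

for mu in np.linspace(*mu_interval, 20):     # sweep the epistemic interval
    x = rng.normal(mu, 0.5, size=5000)       # aleatory sampling
    y = g(x)
    cdf = np.array([(y <= t).mean() for t in y_grid])
    cdf_lo = np.minimum(cdf_lo, cdf)         # lower envelope
    cdf_hi = np.maximum(cdf_hi, cdf)         # upper envelope

print(cdf_lo[::10], cdf_hi[::10])            # bounds at a few points
```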


Author(s):  
Bin Zhou ◽  
Bin Zi ◽  
Yishang Zeng ◽  
Weidong Zhu

Abstract An evidence-theory-based interval perturbation method (ETIPM) and an evidence-theory-based subinterval perturbation method (ETSPM) are presented for the kinematic uncertainty analysis of a dual cranes system (DCS) with epistemic uncertainty. A multiple evidence variable (MEV) model that consists of evidence variables with focal elements (FEs) and basic probability assignments (BPAs) is constructed. Based on evidence theory, an evidence-based kinematic equilibrium equation with the MEV model is equivalently transformed into several interval equations. In the ETIPM, the bounds of the luffing angular vector (LAV) with respect to every joint FE are calculated by integrating the first-order Taylor series expansion and the interval algorithm. The bounds of the expectation and variance of the LAV and the corresponding BPAs are calculated by the evidence-based uncertainty quantification method. In the ETSPM, the subinterval perturbation method is introduced to decompose the original FE into several small subintervals. By comparing results yielded by the ETIPM and ETSPM with those of the evidence-theory-based Monte Carlo method, numerical examples show that the ETSPM achieves higher accuracy than the ETIPM at a higher computational cost, and that the accuracy of both methods improves significantly as the number of FEs and subintervals increases.
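A generic sketch of the first-order interval perturbation step is given below; the dual-crane kinematic equations are not reproduced, and the response function is an illustrative stand-in. Splitting each focal element into subintervals and taking the union of the per-subinterval bounds, as in the ETSPM, tightens the estimate.

```python
# Sketch of a first-order interval perturbation bound over a focal element
# (generic scalar response; the DCS kinematics are not reproduced here).
import numpy as np

def interval_perturbation(f, x_lo, x_hi, eps=1e-6):
    """Bound f over the box [x_lo, x_hi] via first-order Taylor expansion
    about the interval midpoint plus interval arithmetic on the gradient."""
    x_lo, x_hi = np.asarray(x_lo, float), np.asarray(x_hi, float)
    x_c = 0.5 * (x_lo + x_hi)          # interval midpoint
    dx = 0.5 * (x_hi - x_lo)           # interval radius
    grad = np.array([(f(x_c + eps * e) - f(x_c - eps * e)) / (2 * eps)
                     for e in np.eye(len(x_c))])   # central differences
    spread = np.abs(grad) @ dx         # worst-case first-order deviation
    y_c = f(x_c)
    return y_c - spread, y_c + spread

# Illustrative focal element [0, 0.2] x [1, 1.2] and response function:
lo, hi = interval_perturbation(lambda x: np.sin(x[0]) + x[1] ** 2,
                               [0.0, 1.0], [0.2, 1.2])
print(lo, hi)
```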


Author(s):  
NICOLA PEDRONI ◽  
ENRICO ZIO

Risk analysis models describing aleatory (i.e., random) events contain parameters (e.g., probabilities, failure rates, …) that are epistemically-uncertain, i.e., known with poor precision. Whereas aleatory uncertainty is always described by probability distributions, epistemic uncertainty may be represented in different ways (e.g., probabilistic or possibilistic), depending on the information and data available. The work presented in this paper addresses the issue of accounting for (in)dependence relationships between epistemically-uncertain parameters. When a probabilistic representation of epistemic uncertainty is considered, uncertainty propagation is carried out by a two-dimensional (or double) Monte Carlo (MC) simulation approach; instead, when possibility distributions are used, two approaches are undertaken: the hybrid MC and Fuzzy Interval Analysis (FIA) method and the MC-based Dempster-Shafer (DS) approach employing Independent Random Sets (IRSs). The objectives are: i) studying the effects of (in)dependence between the epistemically-uncertain parameters of the aleatory probability distributions (when a probabilistic/possibilistic representation of epistemic uncertainty is adopted) and ii) studying the effect of the probabilistic/possibilistic representation of epistemic uncertainty (when the state of dependence between the epistemic parameters is defined). The Dependency Bound Convolution (DBC) approach is then undertaken within a hierarchical setting of hybrid (probabilistic and possibilistic) uncertainty propagation, in order to account for all kinds of (possibly unknown) dependences between the random variables. The analyses are carried out with reference to two toy examples, built in such a way as to allow a fair quantitative comparison between the methods and an evaluation of their rationale and appropriateness in relation to risk analysis.
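The two-dimensional (double) MC scheme mentioned above can be sketched in a few lines: an outer loop samples the epistemically-uncertain parameter and an inner loop samples the aleatory variable conditional on it. The distributions and the failure threshold below are illustrative assumptions.

```python
# Minimal sketch of two-dimensional (double) Monte Carlo propagation.
import numpy as np

rng = np.random.default_rng(1)
threshold = 3.0                          # illustrative failure threshold

p_failure = []
for _ in range(200):                     # outer (epistemic) loop
    lam = rng.uniform(0.8, 1.2)          # uncertain rate parameter
    x = rng.exponential(1.0 / lam, size=2000)   # inner (aleatory) loop
    p_failure.append((x > threshold).mean())

# The spread of the outer-loop estimates reflects epistemic uncertainty
# about the failure probability, rather than a single point value.
print(min(p_failure), max(p_failure))
```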


2021 ◽  
Author(s):  
Hong Feng Long ◽  
Zhen Ming Peng ◽  
Yong Deng

Abstract Applying geometry to the analysis and interpretation of the basic probability assignment (BPA) is a distinctive research direction in evidence theory. Although a geometric representation of the BPA has been proposed, visualization methods for the BPA still lack sufficient research. In this paper, we propose a new BPA visualization method based on the vector representation of the BPA that directly illustrates the image of the BPA. The basic point and the uncertain vectors are first obtained from the given BPA, and these components are then connected to construct the image of the BPA. Through the image of the BPA, we can effectively analyze the interaction of focal elements in the BPA and directly observe its latent characteristics. Meanwhile, the geometric meaning of the parameters in the vector representation of the BPA can be explained. Finally, the advantages and applications of the method are studied and discussed.
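The paper's specific construction of the basic point and uncertain vectors is not reproduced here; the sketch below only shows the standard vector representation of a BPA, indexed by the subsets of the frame, which is the starting point for such geometric treatments.

```python
# Standard vector embedding of a BPA: one coordinate per subset of the frame.
from itertools import combinations

def powerset(frame):
    s = sorted(frame)
    return [frozenset(c) for r in range(len(s) + 1)
            for c in combinations(s, r)]

def bpa_to_vector(bpa, frame):
    """Embed a BPA as a point in a 2^|frame|-dimensional space."""
    return [bpa.get(subset, 0.0) for subset in powerset(frame)]

frame = {"a", "b"}
m = {frozenset({"a"}): 0.5, frozenset({"a", "b"}): 0.5}
print(bpa_to_vector(m, frame))   # coordinate order: {}, {a}, {b}, {a, b}
```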


2012 ◽  
Vol 134 (10) ◽  
Author(s):  
Christian Gogu ◽  
Youchun Qiu ◽  
Stéphane Segonds ◽  
Christian Bes

Evidence theory is one of the approaches designed specifically for dealing with epistemic uncertainty. This type of uncertainty modeling is often useful at preliminary design stages, where the uncertainty related to lack of knowledge is the highest. While multiple approaches for propagating epistemic uncertainty through one-dimensional functions have been proposed, propagation through functions whose multidimensional output needs to be considered at once has received less attention. Such propagation is particularly important when the multiple function outputs are not independent, which frequently occurs in real world problems. The present paper proposes an approach for calculating belief and plausibility measures by uncertainty propagation through functions with multidimensional, nonindependent output by formulating the problem as one-dimensional optimization problems in spite of the multidimensionality of the output. A general formulation is first presented, followed by two special cases where the multidimensional function is convex and where it is linear over each focal element. An analytical example first illustrates the importance of considering all the function outputs at once when these are not independent. An application example on the preliminary design of a propeller aircraft then illustrates the proposed algorithm for a convex function. An approximate solution, found to be almost identical to the exact solution, is also obtained for this problem by linearizing the previous convex function over each focal element.
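The reduction of each focal-element test to scalar optimizations can be sketched as follows. For a region A of the output space defined by componentwise thresholds, a focal element contributes its mass to the belief of A if the image of the input box lies entirely in A, and to the plausibility if the image merely intersects A; both tests reduce to optimizing one scalar constraint function over the box. The bivariate function, thresholds, and focal elements below are illustrative, and local optimizers are assumed adequate (in the spirit of the convex case treated in the paper).

```python
# Hedged sketch: belief/plausibility of a region of a 2-D output via
# scalar optimization over each focal element (illustrative setup).
import numpy as np
from scipy.optimize import minimize

def g(x):                                   # 2-D, non-independent outputs
    return np.array([x[0] + x[1], x[0] * x[1]])

c = np.array([2.5, 1.5])                    # region A: g_i(x) <= c_i
h = lambda x: np.max(g(x) - c)              # scalar constraint function

def bel_pl(focal_elements):
    bel = pl = 0.0
    for box, mass in focal_elements:
        x0 = np.mean(box, axis=1)           # start at the box midpoint
        h_min = minimize(h, x0, bounds=box).fun
        h_max = -minimize(lambda x: -h(x), x0, bounds=box).fun
        if h_max <= 0.0:                    # image entirely inside A
            bel += mass
        if h_min <= 0.0:                    # image intersects A
            pl += mass
    return bel, pl

focal_elements = [([(0.0, 1.0), (0.0, 1.0)], 0.6),
                  ([(0.5, 2.0), (0.5, 2.0)], 0.4)]
print(bel_pl(focal_elements))               # (Bel(A), Pl(A))
```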


2016 ◽  
Vol 2016 ◽  
pp. 1-11 ◽  
Author(s):  
Wen Jiang ◽  
Jun Zhan ◽  
Deyun Zhou ◽  
Xin Li

Dempster-Shafer evidence theory (D-S theory) has been widely used in many information fusion systems since it was proposed by Dempster and extended by Shafer. However, how to determine the basic probability assignment (BPA), which is the main and first step in D-S theory, is still an open issue, especially when the given environment is an open world, meaning that the frame of discernment is incomplete. In this paper, a method to determine a generalized basic probability assignment in an open world is proposed. A frame of discernment for an open world is established first, and then triangular fuzzy number models for identifying targets in the proposed frame of discernment are established. A pessimistic strategy based on the degree of differentiation between model and sample is defined to yield the BPAs for known targets. If the sum of all the BPAs of the known targets exceeds one, they are normalized and the BPA of the unknown target is assigned to 0; otherwise, the BPA of the unknown target is equal to 1 minus the sum of all the known targets' BPAs. Iris classification examples illustrate the effectiveness of the proposed method.
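The open-world BPA assignment described above can be sketched directly: triangular membership functions stand in for the class models, and leftover mass is routed to the unknown target. The (min, mode, max) class parameters below are illustrative, not the statistics fitted to the Iris data in the paper.

```python
# Sketch of the generalized (open-world) BPA assignment described above.
def triangular(x, a, b, c):
    """Triangular membership with support [a, c] and peak at b."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def generalized_bpa(x, class_models):
    """Open-world BPA: leftover mass goes to the 'unknown' target."""
    bpa = {cls: triangular(x, *abc) for cls, abc in class_models.items()}
    total = sum(bpa.values())
    if total > 1.0:                      # normalize; unknown gets 0
        bpa = {cls: v / total for cls, v in bpa.items()}
        bpa["unknown"] = 0.0
    else:                                # open-world remainder
        bpa["unknown"] = 1.0 - total
    return bpa

models = {"setosa": (4.3, 5.0, 5.8),     # illustrative (min, mode, max)
          "versicolor": (4.9, 5.9, 7.0),
          "virginica": (4.9, 6.5, 7.9)}
print(generalized_bpa(5.5, models))
```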


2016 ◽  
Vol 2016 ◽  
pp. 1-5 ◽  
Author(s):  
Chaoyang Xie ◽  
Guijie Li

Quantification of Margins and Uncertainties (QMU) is a decision-support methodology for complex technical decisions centering on performance thresholds and associated margins for engineering systems. Uncertainty propagation is a key element of the QMU process for structural reliability analysis in the presence of both aleatory and epistemic uncertainty. In order to reduce the computational cost of the Monte Carlo method, a mixed uncertainty propagation approach is proposed in this paper by integrating a Kriging surrogate model within the framework of evidence theory for QMU analysis. The approach is demonstrated on a numerical example that shows the effectiveness of the mixed uncertainty propagation method.
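A minimal sketch of the idea follows: an inexpensive Kriging (Gaussian-process) surrogate is trained on a small design of experiments and then used in place of the costly model when propagating focal elements. The performance function, training design, and focal elements are illustrative assumptions.

```python
# Sketch of mixed propagation: a Kriging surrogate replaces the expensive
# model inside focal-element propagation (all choices illustrative).
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor

def expensive_model(x):              # stand-in for a costly simulation
    return np.sin(3.0 * x) + 0.5 * x

X_train = np.linspace(0.0, 2.0, 12).reshape(-1, 1)   # small design
gp = GaussianProcessRegressor().fit(X_train,
                                    expensive_model(X_train).ravel())

# Propagate each focal element through the cheap surrogate instead.
for (lo, hi), mass in [((0.2, 0.6), 0.5), ((0.5, 1.4), 0.3),
                       ((1.0, 2.0), 0.2)]:
    xs = np.linspace(lo, hi, 200).reshape(-1, 1)
    ys = gp.predict(xs)
    print(f"FE [{lo}, {hi}] (m={mass}): response in "
          f"[{ys.min():.3f}, {ys.max():.3f}]")
```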


2014 ◽  
Vol 687-691 ◽  
pp. 1564-1567
Author(s):  
Liang Zhao ◽  
Zhan Ping Yang

This paper develops a model validation method for the case in which full-scale tests of the system model are infeasible. A Bayesian network with uncertain conditional probability parameters is used to represent the relations between the large computational model and its smaller modules. Interval probability theory is adopted to extrapolate the posterior probability of the variable of interest in the uncertain Bayesian network. An interval-valued Bayes factor is obtained as the metric for model validation.
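When the likelihoods propagated through the uncertain Bayesian network are interval-valued, the Bayes factor itself becomes an interval, as the hedged sketch below illustrates; the likelihood intervals and the decision logic are illustrative, not the paper's numbers.

```python
# Sketch of an interval-valued Bayes factor from interval likelihoods.
def interval_bayes_factor(like_h1, like_h0):
    """B = P(D|H1)/P(D|H0) with interval-valued likelihoods."""
    (l1_lo, l1_hi), (l0_lo, l0_hi) = like_h1, like_h0
    return (l1_lo / l0_hi, l1_hi / l0_lo)

b_lo, b_hi = interval_bayes_factor((0.30, 0.45), (0.05, 0.12))
# Acceptance decision: if even b_lo exceeds the evidence threshold, the
# model is accepted; if b_hi falls below it, the model is rejected;
# otherwise the interval is inconclusive and more data are needed.
print(b_lo, b_hi)
```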

