Multiscale Variability and Uncertainty Quantification Based on a Generalized Multiscale Markov Model

Author(s):  
Yan Wang

Variability is the inherent randomness in systems, whereas uncertainty is due to lack of knowledge. In this paper, a generalized multiscale Markov (GMM) model is proposed to quantify variability and uncertainty simultaneously in multiscale system analysis. The GMM model is based on a new imprecise probability theory that has the form of a generalized interval, a Kaucher or modal extension of classical set-based intervals for representing uncertainty. The properties of the new definitions of independence and Bayesian inference are studied. Based on a new Bayes' rule with generalized intervals, three cross-scale validation approaches that incorporate variability and uncertainty propagation are also developed.
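The Kaucher extension mentioned in the abstract admits "improper" intervals whose endpoints are reversed, which restores algebraic properties that set-based intervals lack. The following is a minimal sketch of that idea, not the paper's calculus: the class name `GInterval` and its methods are illustrative, showing only addition, negation, and the dual (endpoint-swap) operator, under which every interval gains an additive inverse.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class GInterval:
    """Generalized (Kaucher) interval [lo, hi]; lo > hi is allowed (improper)."""
    lo: float
    hi: float

    def __add__(self, other):
        # Endpoint-wise addition, same as classical interval arithmetic.
        return GInterval(self.lo + other.lo, self.hi + other.hi)

    def __neg__(self):
        return GInterval(-self.hi, -self.lo)

    def dual(self):
        # Swap endpoints: maps a proper interval to an improper one and back.
        return GInterval(self.hi, self.lo)

x = GInterval(2.0, 5.0)
# Unlike set-based intervals, Kaucher arithmetic has additive inverses:
# x + (-dual(x)) collapses to the degenerate interval [0, 0].
zero = x + (-x.dual())
```

In set-based interval arithmetic, `x - x` for `x = [2, 5]` widens to `[-3, 3]`; the dual operator is what eliminates that overestimation, which is one reason generalized intervals are attractive for imprecise-probability calculi.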

2011 ◽  
Vol 133 (3) ◽  
Author(s):  
Yan Wang

Variability is the inherent randomness in systems, whereas incertitude is due to lack of knowledge. In this paper, a generalized hidden Markov model (GHMM) is proposed to quantify aleatory and epistemic uncertainties simultaneously in multiscale system analysis. The GHMM is based on a new imprecise probability theory that has the form of generalized interval. The new interval probability resembles the precise probability and has a similar calculus structure. The proposed GHMM allows us to quantify cross-scale dependency and information loss between scales. Based on a generalized interval Bayes’ rule, three cross-scale information assimilation approaches that incorporate uncertainty propagation are also developed.
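To make the idea of an interval-valued Bayes update concrete, here is a simplified robust-Bayes sketch, not Wang's generalized-interval Bayes' rule (which places the dual operator inside the calculus). The function `interval_posterior` is hypothetical: it bounds the posterior by evaluating the precise Bayes' rule at every combination of interval endpoints, which is valid here because the posterior is monotone in each of the three quantities.

```python
from itertools import product

def interval_posterior(prior, like_pos, like_neg):
    """Bounds on P(H | e) when P(H), P(e|H), and P(e|not-H) are intervals.

    Brute-force vertex enumeration: evaluate precise Bayes' rule at each
    combination of endpoints and take the min and max of the results.
    """
    vals = []
    for p, lp, ln in product(prior, like_pos, like_neg):
        num = lp * p
        den = lp * p + ln * (1.0 - p)
        vals.append(num / den)
    return min(vals), max(vals)

# P(H) in [0.4, 0.6], P(e|H) in [0.8, 0.9], P(e|not-H) in [0.1, 0.2]
lo, hi = interval_posterior((0.4, 0.6), (0.8, 0.9), (0.1, 0.2))
# The posterior interval [lo, hi] is roughly [0.727, 0.931].
```

The width of the output interval is how such a calculus separates epistemic uncertainty (interval width) from aleatory variability (the probabilities themselves).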


2020 ◽  
Vol 2020 ◽  
pp. 1-18
Author(s):  
Juan Zhang ◽  
Junping Yin ◽  
Ruili Wang

Since 2000, research on uncertainty quantification (UQ) has been applied successfully in many fields and has been highly valued and strongly supported by academia and industry. This review first discusses the sources and types of uncertainty and gives an overall discussion of the goal, practical significance, and basic framework of UQ research. Then, the core ideas and typical methods of several important UQ processes are introduced, including sensitivity analysis, uncertainty propagation, model calibration, Bayesian inference, experimental design, surrogate models, and model uncertainty analysis.
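Of the processes listed, forward uncertainty propagation is the simplest to sketch. The following toy example (the function names `propagate` and `sample_inputs` are illustrative, and the model is an arbitrary linear toy) pushes random input draws through a model by plain Monte Carlo and summarizes the output distribution.

```python
import random
import statistics

def propagate(model, sample_inputs, n=10_000, seed=0):
    """Forward uncertainty propagation by plain Monte Carlo:
    draw inputs, push each draw through the model, summarize the output."""
    rng = random.Random(seed)
    ys = [model(*sample_inputs(rng)) for _ in range(n)]
    return statistics.mean(ys), statistics.stdev(ys)

# Toy model y = x1 + 2*x2 with independent normal inputs.
mean, std = propagate(
    lambda x1, x2: x1 + 2.0 * x2,
    lambda rng: (rng.gauss(1.0, 0.1), rng.gauss(0.0, 0.05)),
)
# Analytically: E[y] = 1.0 and sd[y] = sqrt(0.1**2 + (2*0.05)**2) ~= 0.141,
# so the Monte Carlo estimates should land near those values.
```

Real UQ pipelines replace the lambda with an expensive simulator, which is exactly why the surrogate-model methods the review covers become necessary.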


2017 ◽  
Author(s):  
Alexander Etz ◽  
Joachim Vandekerckhove

We introduce the fundamental tenets of Bayesian inference, which derive from two basic laws of probability theory. We cover the interpretation of probabilities, discrete and continuous versions of Bayes' rule, parameter estimation, and model comparison. Using seven worked examples, we illustrate these principles and set up some of the technical background for the rest of this special issue of Psychonomic Bulletin & Review. Supplemental material is available via https://osf.io/wskex/.
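The discrete version of Bayes' rule that the tutorial covers can be written in a few lines. This is a generic sketch (the helper `bayes` and the diagnostic-test numbers are illustrative, not taken from the article's seven worked examples): multiply prior by likelihood per hypothesis, then normalize by the marginal probability of the evidence.

```python
def bayes(prior, likelihoods, evidence):
    """Discrete Bayes' rule: posterior over hypotheses given one observation.
    prior: {h: P(h)}; likelihoods: {h: {e: P(e|h)}}."""
    unnorm = {h: prior[h] * likelihoods[h][evidence] for h in prior}
    z = sum(unnorm.values())  # marginal probability P(evidence)
    return {h: v / z for h, v in unnorm.items()}

# Classic diagnostic setup: rare condition, imperfect test.
post = bayes(
    prior={"ill": 0.01, "healthy": 0.99},
    likelihoods={"ill": {"+": 0.95, "-": 0.05},
                 "healthy": {"+": 0.10, "-": 0.90}},
    evidence="+",
)
# post["ill"] ~= 0.0876: a positive test still leaves illness unlikely,
# because the low prior dominates the likelihood ratio.
```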


Author(s):  
Djamalddine Boumezerane

Abstract In this study, we use possibility distributions as a basis for parameter uncertainty quantification in one-dimensional consolidation problems. A possibility distribution is the one-point coverage function of a random set and is viewed as containing both partial ignorance and uncertainty. Vagueness and scarcity of the information needed to characterize the coefficient of consolidation in clay can be handled using possibility distributions. Possibility distributions can be constructed from existing data or based on transformations of probability distributions. An attempt is made to set out a systematic approach for estimating uncertainty propagation during the consolidation process. The measure of uncertainty is based on Klir's definition (1995). We make comparisons with results obtained from other approaches (probabilistic…) and discuss the importance of using possibility distributions in this type of problem.
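One standard way to obtain a possibility distribution from a probability distribution, as the abstract mentions, is the Dubois–Prade transformation. The sketch below (the function name is illustrative, and this is offered as one common transformation, not necessarily the one used in the study) yields the least specific possibility distribution consistent with a discrete probability distribution.

```python
def probability_to_possibility(p):
    """Dubois-Prade transformation of a discrete probability distribution
    into a possibility distribution: sort outcomes by decreasing
    probability, then pi_i = sum of p_j over all j >= i in that order."""
    items = sorted(p.items(), key=lambda kv: kv[1], reverse=True)
    poss, tail = {}, sum(p.values())
    for outcome, prob in items:
        poss[outcome] = tail
        tail -= prob
    return poss

pi = probability_to_possibility({"a": 0.5, "b": 0.3, "c": 0.2})
# -> a: 1.0, b: 0.5, c: 0.2 (up to floating-point rounding); the most
# probable outcome is always fully possible (possibility 1).
```

The resulting possibility measure dominates the original probability measure on every event, which is what lets a possibility distribution encode partial ignorance on top of randomness.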


Mathematics ◽  
2020 ◽  
Vol 8 (7) ◽  
pp. 1079 ◽  
Author(s):  
Jie Wei ◽  
Yufeng Nie ◽  
Wenxian Xie

Pearl’s conditioning method is one of the basic algorithms of Bayesian inference, and the loop cutset is crucial for the implementation of conditioning. There are many numerical algorithms for solving the loop cutset problem, but theoretical research on the characteristics of the loop cutset is lacking. In this paper, theoretical insights into the size and node probability of the loop cutset are obtained based on graph theory and probability theory. It is proven that when the loop cutset in a p-complete graph has a size of p − 2, the upper bound of the size can be determined by the number of nodes. Furthermore, the probability that a node belongs to the loop cutset is proven to be positively correlated with its degree. Numerical simulations show that applying these theoretical results can facilitate the prediction and verification of the loop cutset problem. This work is helpful in evaluating the performance of Bayesian networks.
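A loop cutset is a set of nodes whose instantiation breaks every loop, leaving a singly connected network on which Pearl's message passing is exact. The sketch below is a simplified illustration on an undirected skeleton (Pearl's definition involves the directed loops of the DAG, and the greedy heuristic here is generic, not the paper's method); it also mirrors the paper's finding that high-degree nodes are the likeliest cutset members, since the heuristic always cuts the highest-degree node.

```python
def has_cycle(nodes, edges):
    """Detect an undirected cycle with union-find (path halving)."""
    parent = {v: v for v in nodes}
    def find(v):
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v
    for u, v in edges:
        ru, rv = find(u), find(v)
        if ru == rv:
            return True  # edge joins two already-connected nodes: a loop
        parent[ru] = rv
    return False

def greedy_loop_cutset(nodes, edges):
    """Greedy heuristic: while a loop remains, cut the highest-degree node."""
    nodes, edges, cutset = set(nodes), list(edges), []
    while has_cycle(nodes, edges):
        deg = {v: 0 for v in nodes}
        for u, v in edges:
            deg[u] += 1
            deg[v] += 1
        top = max(nodes, key=lambda v: deg[v])
        cutset.append(top)
        nodes.remove(top)
        edges = [(u, v) for u, v in edges if top not in (u, v)]
    return cutset

# Two triangles sharing node "b": cutting "b" alone breaks both loops.
cut = greedy_loop_cutset("abcde", [("a", "b"), ("b", "c"), ("a", "c"),
                                   ("b", "d"), ("d", "e"), ("b", "e")])
# -> ["b"], the unique highest-degree node.
```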


Author(s):  
George A. Hazelrigg ◽  
Georgia-Ann Klutke

Abstract The purpose of this paper is not to present new results; rather, it is to show that the current approach to model validation is not consistent with the accepted mathematics of probability theory. Specifically, we argue that the Sandia V&V Challenge Problem is ill-posed in that the answers sought do not, mathematically, exist. We apply our arguments to show the types of mistakes present in the papers presented in the Journal of Verification, Validation and Uncertainty Quantification, Volume 1, along with the challenge problem. Further, we argue that, when the problem is properly posed, both the applicable methodology and the solution techniques are easily drawn from the well-developed mathematics of probability and decision theory. The unfortunate aspect of the challenge problem as currently stated is that it leads to incorrect and inappropriate mathematical approaches that should be avoided and corrected in the current literature.

