Quantification of Margins and Uncertainties Approach for Structure Analysis Based on Evidence Theory

2016, Vol 2016, pp. 1-5
Author(s): Chaoyang Xie, Guijie Li

Quantification of Margins and Uncertainties (QMU) is a decision-support methodology for complex technical decisions centering on performance thresholds and associated margins for engineering systems. Uncertainty propagation is a key element of the QMU process for structural reliability analysis in the presence of both aleatory and epistemic uncertainty. In order to reduce the computational cost of the Monte Carlo method, this paper proposes a mixed uncertainty propagation approach that integrates a Kriging surrogate model within the framework of evidence theory for QMU analysis. A numerical example demonstrates the effectiveness of the mixed uncertainty propagation method.
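
To make the mixed propagation concrete, the minimal sketch below (not the authors' code) propagates one aleatory normal variable by Monte Carlo and one epistemic evidence variable by an interval sweep over each focal element, using scikit-learn's Gaussian process as a stand-in Kriging surrogate; the performance function, evidence structure and failure threshold are all hypothetical.

```python
# Minimal sketch (hypothetical g, evidence structure and threshold): mixed
# aleatory/epistemic propagation through a Kriging surrogate for a QMU-style
# failure-probability interval.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(0)

def g(x, y):                           # hypothetical "true" performance function
    return 3.0 - x**2 - 0.5 * y

# Kriging surrogate trained on a small design of experiments
X_train = rng.uniform([-3.0, 0.0], [3.0, 2.0], size=(40, 2))
kriging = GaussianProcessRegressor(kernel=RBF(length_scale=1.0))
kriging.fit(X_train, g(X_train[:, 0], X_train[:, 1]))

# Evidence structure of the epistemic variable y: (interval, basic probability assignment)
focal_elements = [((0.0, 0.8), 0.3), ((0.5, 1.5), 0.5), ((1.0, 2.0), 0.2)]
threshold = 0.0                        # failure when g < threshold

x_mc = rng.normal(0.0, 1.0, size=2000)          # aleatory Monte Carlo samples
p_lower = p_upper = 0.0
for (lo, hi), bpa in focal_elements:
    # interval sweep over the focal element, evaluating the surrogate instead of g
    pofs = []
    for y in np.linspace(lo, hi, 11):
        z = kriging.predict(np.column_stack([x_mc, np.full_like(x_mc, y)]))
        pofs.append((z < threshold).mean())
    p_lower += bpa * min(pofs)                  # belief-type (lower) contribution
    p_upper += bpa * max(pofs)                  # plausibility-type (upper) contribution

print(f"Failure probability bounded in [{p_lower:.3f}, {p_upper:.3f}]")
```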

Author(s): Alessandra Cuneo, Alberto Traverso, Shahrokh Shahpar

In engineering design, uncertainty is inevitable and can cause a significant deviation in the performance of a system. Uncertainty in input parameters can be categorized into two groups: aleatory and epistemic uncertainty. The work presented here is focused on aleatory uncertainty, which causes natural, unpredictable and uncontrollable variations in the performance of the system under study. Such uncertainty can be quantified using statistical methods, but the main obstacle is often the computational cost, because the representative model is typically highly non-linear and complex. It is therefore necessary to have a robust tool that can perform the uncertainty propagation with as few evaluations as possible. In the last few years, different methodologies for uncertainty propagation and quantification have been proposed. The focus of this study is to evaluate four different methods and demonstrate the strengths and weaknesses of each approach. The first method considered is Monte Carlo simulation, a sampling method that can give high accuracy but needs a relatively large computational effort. The second method is Polynomial Chaos, an approximation method in which the probabilistic parameters of the response function are modelled with orthogonal polynomials. The third method considered is the Mid-range Approximation Method, which assembles multiple meta-models into one model to perform optimization under uncertainty. The fourth method applies the first two methods not to the model directly but to a response surface representing the simulation model, in order to decrease the computational cost. All these methods have been applied to a set of analytical test functions and engineering test cases. Relevant aspects of engineering design and analysis, such as a high number of stochastic variables and optimised design problems with and without stochastic design parameters, were assessed. Polynomial Chaos emerges as the most promising methodology and was then applied to a turbomachinery test case based on a thermal analysis of a high-pressure turbine disk.
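
As a simplified illustration of the trade-off between the first two approaches, the sketch below compares plain Monte Carlo sampling with a one-dimensional polynomial chaos expansion fitted by least squares; the response function, sample sizes and polynomial degree are arbitrary choices for illustration and are unrelated to the paper's test cases.

```python
# Toy comparison (hypothetical response function): Monte Carlo vs. a 1-D
# polynomial chaos expansion in probabilists' Hermite polynomials He_k.
import numpy as np
from math import factorial
from numpy.polynomial.hermite_e import hermevander

rng = np.random.default_rng(1)
f = lambda xi: np.exp(0.3 * xi) + 0.1 * xi**2      # hypothetical response of xi ~ N(0, 1)

# Monte Carlo reference: accurate but needs many model evaluations
xi_mc = rng.standard_normal(100_000)
y_mc = f(xi_mc)
print(f"MC : mean={y_mc.mean():.4f}  std={y_mc.std():.4f}  (100000 evaluations)")

# Polynomial chaos: least-squares fit of Hermite coefficients on few evaluations
degree, n_train = 4, 30
xi_tr = rng.standard_normal(n_train)
A = hermevander(xi_tr, degree)                      # basis matrix [He_0 ... He_4]
coeff, *_ = np.linalg.lstsq(A, f(xi_tr), rcond=None)

# For He_k, E[He_k] = 0 (k >= 1) and E[He_k^2] = k!, so the statistics follow directly
norms = np.array([factorial(k) for k in range(degree + 1)], dtype=float)
pce_mean = coeff[0]
pce_std = np.sqrt(np.sum(coeff[1:] ** 2 * norms[1:]))
print(f"PCE: mean={pce_mean:.4f}  std={pce_std:.4f}  ({n_train} evaluations)")
```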


Author(s): Zhe Zhang, Chao Jiang, G. Gary Wang, Xu Han

Evidence theory has a strong ability to deal with epistemic uncertainty, and on this basis the uncertain parameters that arise in many complex engineering problems with limited information can be conveniently treated. However, the heavy computational cost caused by its discrete nature severely limits the practicability of evidence theory and has become a main difficulty in structural reliability analysis using it. This paper aims to develop an efficient method to evaluate the reliability of structures with evidence variables, and hence to improve the applicability of evidence theory to engineering problems. A non-probabilistic reliability index approach is introduced to obtain a design point on the limit-state surface. An assistant area is then constructed through the obtained design point, based on which a small number of focal elements can be picked out for extreme analysis instead of using all the elements. The vertex method is used for the extreme analysis to obtain the minimum and maximum values of the limit-state function over a focal element. A reliability interval composed of the belief measure and the plausibility measure is finally obtained for the structure. Two numerical examples are investigated to demonstrate the effectiveness of the proposed method.
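
The extreme-analysis step can be illustrated with a minimal sketch that enumerates all joint focal elements and applies the vertex method to each (the proposed method instead screens elements using the assistant area built around the design point); the limit-state function and the evidence structures below are hypothetical, and the vertex method is valid here because the function is monotonic over the boxes considered.

```python
# Minimal sketch of evidence-based reliability bounds via the vertex method,
# with full enumeration of focal elements; g and the evidence structures are
# hypothetical examples.
import numpy as np
from itertools import product

def g(x1, x2):                          # hypothetical limit-state function, failure if g < 0
    return x1 ** 2 + 2.0 * x2 - 6.0

# Evidence variables: list of (interval, BPA) focal elements per variable
ev_x1 = [((1.0, 2.0), 0.4), ((2.0, 3.0), 0.6)]
ev_x2 = [((0.5, 1.5), 0.5), ((1.5, 2.5), 0.5)]

belief = plausibility = 0.0
for (i1, m1), (i2, m2) in product(ev_x1, ev_x2):
    joint_bpa = m1 * m2                 # independence assumption for the joint BPA
    # vertex method: extremes of a (locally monotonic) g lie at corners of the joint box
    corners = [g(a, b) for a in i1 for b in i2]
    g_min, g_max = min(corners), max(corners)
    if g_max < 0:                       # element lies entirely in the failure region
        belief += joint_bpa
    if g_min < 0:                       # element intersects the failure region
        plausibility += joint_bpa

print(f"Bel(failure)={belief:.3f}  Pl(failure)={plausibility:.3f}")
```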


2008, Vol 130 (9)
Author(s): Xiaoping Du

Two types of uncertainty exist in engineering. Aleatory uncertainty comes from inherent variations, while epistemic uncertainty derives from ignorance or incomplete information. The former is usually modeled by probability theory and has been widely researched. The latter can be modeled by probability theory or non-probabilistic theories and is much more difficult to deal with. In this work, the effects of both types of uncertainty are quantified with belief and plausibility measures (lower and upper probabilities) in the context of evidence theory. Input parameters with aleatory uncertainty are modeled with probability distributions from probability theory. Input parameters with epistemic uncertainty are modeled with basic probability assignments from evidence theory. A computational method is developed to compute belief and plausibility measures for black-box performance functions. The proposed method involves nested probabilistic analysis and interval analysis. To handle black-box functions, we employ the first-order reliability method for the probabilistic analysis and nonlinear optimization for the interval analysis. Two example problems are presented to demonstrate the proposed method.
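
A stripped-down version of this nested (double-loop) computation is sketched below: an inner first-order reliability method (FORM) search finds the most probable point for the aleatory variables, and a bounded scalar optimization plays the role of the interval analysis over each focal element of the epistemic parameter. The limit state, evidence structure and the use of scipy's general-purpose optimizers are illustrative assumptions, not the paper's implementation.

```python
# Minimal double-loop sketch: FORM (most-probable-point search) for the aleatory
# variables nested inside an interval optimization over an epistemic parameter d.
import numpy as np
from scipy.optimize import minimize, minimize_scalar
from scipy.stats import norm

def g(u, d):
    # hypothetical limit state in standard-normal space (u1, u2); failure when g < 0
    return d - u[0] ** 2 - 0.5 * u[1]

def form_pof(d):
    """FORM estimate of P(g < 0): beta = min ||u|| subject to g(u, d) = 0."""
    res = minimize(lambda u: np.dot(u, u), x0=[0.1, 0.1],
                   constraints={"type": "eq", "fun": lambda u: g(u, d)})
    beta = np.sqrt(res.fun)
    return norm.cdf(-beta)

# Evidence structure of the epistemic parameter d: (interval, BPA)
focal_elements = [((2.0, 3.0), 0.6), ((2.5, 4.0), 0.4)]

p_lo = p_hi = 0.0
for (lo, hi), bpa in focal_elements:
    # interval analysis: extreme failure probabilities over the focal element
    # (g is monotonic in d, so these extremes give belief/plausibility-type bounds)
    pof_min = minimize_scalar(lambda d: form_pof(d), bounds=(lo, hi), method="bounded").fun
    pof_max = -minimize_scalar(lambda d: -form_pof(d), bounds=(lo, hi), method="bounded").fun
    p_lo += bpa * pof_min
    p_hi += bpa * pof_max

print(f"Failure probability bounded in [{p_lo:.4f}, {p_hi:.4f}]")
```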


2021, Vol 9 (4), pp. 34-40
Author(s): Duc Ky Bui, Ngoc Quynh Nguyen, Duc Thang Duong, Ngoc Thiem Le, Quang Tuan Ho, ...

Evaluating the measurement uncertainty of a physical quantity is a mandatory requirement for laboratories accredited under ISO/IEC 17025 in order to assess the reliability of measured results. In this work, the uncertainty of ionizing radiation measurements such as air kerma and the personal dose equivalent Hp(d) was evaluated based on the GUM method and the Monte Carlo method. An uncertainty propagation software tool has been developed to make the evaluation of measurement uncertainty more convenient.
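
The two evaluation routes mentioned, the GUM law of propagation of uncertainty and the Monte Carlo method of JCGM 101, can be contrasted on a deliberately simple, hypothetical dose model; the model form, values and uncertainties below are illustrative only and are not taken from the developed software.

```python
# Toy sketch: GUM law-of-propagation vs. Monte Carlo evaluation of measurement
# uncertainty for a hypothetical dose model H = N * M * k, where N is a
# calibration factor, M the instrument reading and k a correction factor.
import numpy as np

rng = np.random.default_rng(2)

# best estimates and standard uncertainties of the input quantities (illustrative)
N, u_N = 1.02, 0.015
M, u_M = 2.45, 0.020          # mSv
k, u_k = 0.998, 0.005

# GUM: first-order propagation for a pure product model
H = N * M * k
u_H_gum = H * np.sqrt((u_N / N) ** 2 + (u_M / M) ** 2 + (u_k / k) ** 2)

# Monte Carlo: sample the inputs, propagate through the model directly
n = 1_000_000
H_mc = rng.normal(N, u_N, n) * rng.normal(M, u_M, n) * rng.normal(k, u_k, n)
u_H_mc = H_mc.std()
ci_lo, ci_hi = np.percentile(H_mc, [2.5, 97.5])   # 95 % coverage interval

print(f"GUM : H = {H:.3f} mSv, u = {u_H_gum:.4f} mSv")
print(f"MC  : H = {H_mc.mean():.3f} mSv, u = {u_H_mc:.4f} mSv, "
      f"95% interval = [{ci_lo:.3f}, {ci_hi:.3f}] mSv")
```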


Acta Numerica, 2018, Vol 27, pp. 113-206
Author(s): Nawaf Bou-Rabee, J. M. Sanz-Serna

This paper surveys in detail the relations between numerical integration and the Hamiltonian (or hybrid) Monte Carlo method (HMC). Since the computational cost of HMC mainly lies in the numerical integrations, these should be performed as efficiently as possible. However, HMC requires methods that have the geometric properties of being volume-preserving and reversible, and this limits the number of integrators that may be used. On the other hand, these geometric properties have important quantitative implications for the integration error, which in turn have an impact on the acceptance rate of the proposal. While at present the velocity Verlet algorithm is the method of choice for good reasons, we argue that Verlet can be improved upon. We also discuss in detail the behaviour of HMC as the dimensionality of the target distribution increases.
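
As a point of reference for this discussion, the following minimal sketch implements one HMC transition with the velocity Verlet (leapfrog) integrator, whose volume preservation and reversibility are exactly the geometric properties that make the Metropolis accept/reject step valid; the standard-normal target, step size and trajectory length are arbitrary illustrative choices.

```python
# Minimal sketch of HMC with the velocity Verlet (leapfrog) integrator;
# the target here is a standard Gaussian purely for illustration.
import numpy as np

rng = np.random.default_rng(3)

def U(q):                               # potential: negative log-density of N(0, I)
    return 0.5 * np.dot(q, q)

def grad_U(q):
    return q

def leapfrog(q, p, step, n_steps):
    """Volume-preserving, reversible velocity Verlet integration of Hamilton's equations."""
    p = p - 0.5 * step * grad_U(q)      # half step in momentum
    for _ in range(n_steps - 1):
        q = q + step * p                # full step in position
        p = p - step * grad_U(q)        # full step in momentum
    q = q + step * p
    p = p - 0.5 * step * grad_U(q)      # final half step in momentum
    return q, p

def hmc_step(q, step=0.2, n_steps=20):
    p = rng.standard_normal(q.shape)    # resample momentum
    q_new, p_new = leapfrog(q, p, step, n_steps)
    # Metropolis accept/reject on the energy (Hamiltonian) error of the integrator
    dH = U(q_new) + 0.5 * np.dot(p_new, p_new) - U(q) - 0.5 * np.dot(p, p)
    return q_new if rng.random() < np.exp(-dH) else q

samples = [np.zeros(5)]
for _ in range(2000):
    samples.append(hmc_step(samples[-1]))
print("sample mean ≈", np.round(np.mean(samples[100:], axis=0), 3))
```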


2017, Vol 14 (02), pp. 1750012
Author(s): Longxiang Xie, Jian Liu, Jinan Zhang, Xianfeng Man

Evidence theory has a strong capacity to deal with epistemic uncertainty; in view of the overestimation inherent in interval analysis, the responses of structural-acoustic problems with epistemic uncertainty may otherwise be treated inadequately. In this paper, a numerical method is proposed for structural-acoustic system response analysis under epistemic uncertainties based on evidence theory. To improve the calculation accuracy and reduce the computational cost, the interval analysis technique and the radial point interpolation method are adopted to obtain the approximate frequency response characteristics for each focal element, and the corresponding formulations of the structural-acoustic system for interval response analysis are derived. Numerical examples are presented to illustrate the efficiency of the proposed method.
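
To make the evidence-based response-bound idea concrete without a coupled structural-acoustic finite element model, the sketch below uses a single-degree-of-freedom oscillator as a drastically simplified stand-in and bounds its frequency-response magnitude over each focal element of an uncertain stiffness by a direct interval sweep (rather than the radial point interpolation approximation used in the paper); all numerical values are invented for illustration.

```python
# Drastically simplified sketch of evidence-based frequency-response bounds:
# an SDOF oscillator stands in for the structural-acoustic model, and the
# epistemic stiffness k is described by focal elements with BPAs.
import numpy as np

m, c = 1.0, 0.8                       # mass [kg] and damping [N s/m], assumed known
omega = 2 * np.pi * 5.0               # excitation frequency of interest [rad/s]

def frf_magnitude(k):                 # |H(omega)| of the SDOF system
    return 1.0 / np.sqrt((k - m * omega ** 2) ** 2 + (c * omega) ** 2)

# evidence structure of the stiffness k: (interval [N/m], BPA)
focal_elements = [((900.0, 1000.0), 0.5), ((950.0, 1100.0), 0.3), ((1050.0, 1200.0), 0.2)]

for (lo, hi), bpa in focal_elements:
    k_grid = np.linspace(lo, hi, 101)               # interval sweep over the focal element
    resp = frf_magnitude(k_grid)
    print(f"k in [{lo:6.0f}, {hi:6.0f}] (BPA {bpa:.1f}): "
          f"|H| in [{resp.min():.4e}, {resp.max():.4e}] m/N")
```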


2018, Vol 35 (7), pp. 2480-2501
Author(s): Hesheng Tang, Dawei Li, Lixin Deng, Songtao Xue

Purpose
This paper aims to develop a comprehensive uncertainty quantification method using evidence theory for Park–Ang damage index-based performance design in which epistemic uncertainties are considered. Various sources of uncertainty emanating from the database of cyclic test results of RC members provided by the Pacific Earthquake Engineering Research Center are taken into account.

Design/methodology/approach
An uncertainty quantification methodology based on evidence theory is presented for the whole process of performance-based seismic design (PBSD), while considering uncertainty in the Park–Ang damage model. To alleviate the burden of the high computational cost of propagating uncertainty, a differential evolution interval optimization strategy is used to efficiently find the propagated belief structure throughout the whole design process.

Findings
The results demonstrate that the uncertainty rooted in the Park–Ang damage model has a significant influence on PBSD design and evaluation. It is worth noting that the epistemic uncertainty present in the Park–Ang damage model needs to be considered to avoid underestimating the true uncertainty.

Originality/value
This paper presents an evidence theory-based uncertainty quantification framework for the whole process of PBSD.
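
The interval-optimization step can be illustrated with a small sketch in which scipy's differential evolution bounds a simplified Park–Ang damage index, D = δ_m/δ_u + β·E_h/(F_y·δ_u), over one focal element of the epistemic parameters β and δ_u; the demand values and intervals are invented and do not come from the PEER column database used in the paper.

```python
# Sketch of the interval-optimization step: differential evolution bounds a
# simplified Park-Ang damage index over one focal element of the epistemic
# parameters (beta, delta_u); numbers are illustrative only.
import numpy as np
from scipy.optimize import differential_evolution

# deterministic demand quantities (assumed known for this sketch)
delta_m = 0.060        # maximum deformation [m]
E_h = 45.0             # hysteretic energy [kN m]
F_y = 250.0            # yield strength [kN]

def damage_index(x):
    beta, delta_u = x
    return delta_m / delta_u + beta * E_h / (F_y * delta_u)

# one focal element: intervals for (beta, delta_u)
bounds = [(0.05, 0.15), (0.07, 0.10)]

lo = differential_evolution(damage_index, bounds, seed=0).fun
hi = -differential_evolution(lambda x: -damage_index(x), bounds, seed=0).fun
print(f"Park-Ang damage index over the focal element: [{lo:.3f}, {hi:.3f}]")
```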

