A new evidence-theory-based method for response analysis of acoustic system with epistemic uncertainty by using Jacobi expansion

2017 ◽  
Vol 322 ◽  
pp. 419-440 ◽  
Author(s):  
Shengwen Yin ◽  
Dejie Yu ◽  
Hui Yin ◽  
Baizhan Xia
2017 ◽  
Vol 14 (02) ◽  
pp. 1750012 ◽  
Author(s):  
Longxiang Xie ◽  
Jian Liu ◽  
Jinan Zhang ◽  
Xianfeng Man

Evidence theory has a strong capacity to deal with epistemic uncertainty. Because interval analysis suffers from overestimation, the response of a structural-acoustic problem with epistemic uncertainty cannot be treated adequately by interval analysis alone. In this paper, a numerical method based on evidence theory is proposed for response analysis of a structural-acoustic system under epistemic uncertainty. To improve calculation accuracy and reduce computational cost, the interval analysis technique and the radial point interpolation method are adopted to obtain the approximate frequency-response characteristics for each focal element, and the corresponding formulations of the structural-acoustic system for interval response analysis are derived. Numerical examples are introduced to illustrate the efficiency of the proposed method.
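The core bookkeeping behind this kind of analysis can be sketched briefly: once an interval response has been obtained for each focal element (by whatever approximation the paper uses), the belief and plausibility that the response stays below a limit follow by summing basic probability assignments. A minimal Python sketch, with all interval and BPA values invented for illustration (this is not the authors' radial-point-interpolation scheme, only the evidence-theory aggregation step):

```python
def bel_pl(focal_elements, limit):
    """Belief/plausibility that the system response <= limit.

    focal_elements: list of ((lo, hi), bpa), where (lo, hi) is the interval
    response computed over that focal element and bpa its basic probability
    assignment (BPAs sum to 1)."""
    # Bel: the element's entire response interval satisfies the limit.
    bel = sum(m for (lo, hi), m in focal_elements if hi <= limit)
    # Pl: the element's response interval at least intersects the limit set.
    pl = sum(m for (lo, hi), m in focal_elements if lo <= limit)
    return bel, pl

# Illustrative focal elements for a scalar frequency response
elements = [((0.8, 1.1), 0.5), ((1.0, 1.4), 0.3), ((1.3, 1.6), 0.2)]
print(bel_pl(elements, 1.2))  # -> (0.5, 0.8): Bel <= true probability <= Pl
```

The [Bel, Pl] pair brackets the unknown probability, which is exactly what makes the result useful when only epistemic (interval-plus-BPA) information is available.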


2019 ◽  
Vol 9 (7) ◽  
pp. 1457 ◽  
Author(s):  
Zhiliang Huang ◽  
Jiaqi Xu ◽  
Tongguang Yang ◽  
Fangyi Li ◽  
Shuguang Deng

The conventional approach to engineering robustness optimization under uncertainty is generally based on a probabilistic model. However, a probabilistic model faces obstacles when handling problems with epistemic uncertainty. This paper presents an evidence-theory-based robustness optimization (EBRO) model and a corresponding algorithm, which provide a potential computational tool for engineering problems with multi-source uncertainty. An EBRO model with the twin objectives of performance and robustness is formulated by introducing a performance threshold. After multiple target belief measures (Bel) are specified, the original model is transformed into a series of sub-problems, which are solved by the proposed iterative strategy, driving the robustness analysis and the deterministic optimization alternately. The proposed method is applied to three micro-electromechanical system (MEMS) problems: a micro-force sensor, an image sensor, and a capacitive accelerometer. In these applications, both finite element simulation models and surrogate models are used. Numerical results show that the proposed method has good engineering practicality owing to its comprehensive performance in terms of efficiency, accuracy, and convergence.
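The shape of such a belief-constrained design problem can be illustrated with a toy sketch. Everything here is invented for illustration: a scalar design variable `x`, a hypothetical performance function `f(x, u) = (x - 1)**2 + u`, interval focal elements for an evidence variable `u`, and a crude grid search standing in for the paper's iterative strategy:

```python
def bel_of_threshold(x, f, focal_elements, threshold):
    """Belief that f(x, u) <= threshold: sum the BPA of every focal element
    whose worst-case value still satisfies the threshold. Endpoint checks
    assume f is monotone in u over each element."""
    bel = 0.0
    for (lo, hi), m in focal_elements:
        if max(f(x, lo), f(x, hi)) <= threshold:
            bel += m
    return bel

def ebro_grid(candidates, f, focal_elements, threshold, target_bel):
    """Among candidates meeting the target belief measure, return the one
    with the best nominal (u = 0) performance; None if none qualifies."""
    feasible = [x for x in candidates
                if bel_of_threshold(x, f, focal_elements, threshold) >= target_bel]
    return min(feasible, key=lambda x: f(x, 0.0)) if feasible else None

f = lambda x, u: (x - 1.0) ** 2 + u          # hypothetical performance function
elements = [((-0.1, 0.1), 0.6), ((-0.3, 0.3), 0.4)]  # invented focal elements for u
best = ebro_grid([0.5, 1.0, 1.5], f, elements, threshold=0.2, target_bel=0.6)
print(best)  # -> 1.0: only x = 1.0 reaches Bel >= 0.6 for f <= 0.2
```

A real EBRO solve replaces the grid with the alternating robustness-analysis/deterministic-optimization iteration described in the abstract; the sketch only shows how the belief measure enters as a constraint.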


Author(s):  
Zhe Zhang ◽  
Chao Jiang ◽  
G. Gary Wang ◽  
Xu Han

Evidence theory has a strong ability to deal with epistemic uncertainty; based on it, the uncertain parameters that arise in many complex engineering problems with limited information can be treated conveniently. However, the heavy computational cost caused by its discrete property severely limits the practicability of evidence theory and has become a main difficulty in structural reliability analysis using evidence theory. This paper aims to develop an efficient method to evaluate the reliability of structures with evidence variables and hence improve the applicability of evidence theory to engineering problems. A non-probabilistic reliability index approach is introduced to obtain a design point on the limit-state surface. An assistant area is then constructed through the obtained design point, from which a small number of focal elements can be picked out for extreme analysis instead of using all the elements. The vertex method is used for extreme analysis to obtain the minimum and maximum values of the limit-state function over a focal element. A reliability interval composed of the belief measure and the plausibility measure is finally obtained for the structure. Two numerical examples are investigated to demonstrate the effectiveness of the proposed method.
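The vertex method and the resulting reliability interval are simple enough to sketch directly. The sketch below evaluates the limit-state function at all corners of each focal element (valid when the function is monotone in each variable over the element) and accumulates [Bel, Pl] for the safe event g > 0; the limit-state function, focal elements, and BPAs are invented for illustration, and the paper's design-point screening step is omitted:

```python
from itertools import product

def vertex_extremes(g, box):
    """Vertex method: evaluate g at every corner of a hyper-rectangular focal
    element (box = list of (lo, hi) intervals, one per variable)."""
    values = [g(*corner) for corner in product(*box)]
    return min(values), max(values)

def reliability_interval(g, focal_elements):
    """[Bel, Pl] of the safe event g > 0: a focal element counts toward Bel
    when it is entirely safe (g_min > 0) and toward Pl when it is at least
    partly safe (g_max > 0)."""
    bel = pl = 0.0
    for box, m in focal_elements:
        gmin, gmax = vertex_extremes(g, box)
        if gmin > 0:
            bel += m
        if gmax > 0:
            pl += m
    return bel, pl

g = lambda u, v: u + v - 1.0                 # illustrative limit-state function
elements = [([(0.6, 0.8), (0.5, 0.7)], 0.7),  # entirely safe
            ([(0.2, 0.6), (0.3, 0.5)], 0.3)]  # partly safe
print(reliability_interval(g, elements))  # approx (0.7, 1.0)
```

The corner enumeration costs 2^n evaluations per focal element, which is precisely why the paper screens focal elements through the design point instead of processing all of them.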


2008 ◽  
Vol 130 (9) ◽  
Author(s):  
Xiaoping Du

Two types of uncertainty exist in engineering. Aleatory uncertainty comes from inherent variation, while epistemic uncertainty derives from ignorance or incomplete information. The former is usually modeled by probability theory and has been widely researched. The latter can be modeled by probability theory or by non-probabilistic theories and is much more difficult to deal with. In this work, the effects of both types of uncertainty are quantified with belief and plausibility measures (lower and upper probabilities) in the context of evidence theory. Input parameters with aleatory uncertainty are modeled with probability distributions using probability theory. Input parameters with epistemic uncertainty are modeled with basic probability assignments using evidence theory. A computational method is developed to compute belief and plausibility measures for black-box performance functions. The proposed method involves nested probabilistic analysis and interval analysis. To handle black-box functions, we employ the first-order reliability method for probabilistic analysis and nonlinear optimization for interval analysis. Two example problems are presented to demonstrate the proposed method.
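The nested structure of such a mixed aleatory/epistemic analysis can be sketched with deliberately crude stand-ins: plain Monte Carlo sampling in place of the first-order reliability method for the outer probabilistic loop, and endpoint evaluation in place of nonlinear optimization for the inner interval loop (valid only for a g monotone in the epistemic variable). The performance function, distributions, and focal elements are invented for illustration:

```python
import random

def bel_pl_failure(g, sample_aleatory, focal_elements, n=20000, seed=0):
    """Estimate [Bel, Pl] of the failure event g < 0.

    Outer loop: sample the aleatory input x. Inner loop: for each epistemic
    focal element (lo, hi) with BPA m, the element contributes to Bel when
    g fails for every point of the element, and to Pl when it fails for
    at least one point."""
    rng = random.Random(seed)
    bel = pl = 0.0
    for _ in range(n):
        x = sample_aleatory(rng)
        for (lo, hi), m in focal_elements:
            gmin = min(g(x, lo), g(x, hi))  # endpoint check: monotone-g assumption
            gmax = max(g(x, lo), g(x, hi))
            if gmax < 0:
                bel += m   # fails everywhere on the element
            if gmin < 0:
                pl += m    # fails somewhere on the element
    return bel / n, pl / n

g = lambda x, u: x + u                           # illustrative performance function
sample = lambda rng: rng.gauss(0.0, 1.0)         # aleatory x ~ N(0, 1)
elements = [((-0.5, 0.5), 1.0)]                  # single epistemic focal element
print(bel_pl_failure(g, sample, elements))       # approx (0.31, 0.69)
```

For this example the exact answers are Bel = P(x < -0.5) ≈ 0.309 and Pl = P(x < 0.5) ≈ 0.691, so the [Bel, Pl] gap directly reflects how much the epistemic interval widens the failure-probability estimate.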


2011 ◽  
Vol 20 (04) ◽  
pp. 691-720 ◽  
Author(s):  
JIANBING MA ◽  
WEIRU LIU ◽  
DIDIER DUBOIS ◽  
HENRI PRADE

Belief revision characterizes the process of revising an agent's beliefs when receiving new evidence. In the field of artificial intelligence, revision strategies have been extensively studied in the context of logic-based formalisms and probability kinematics. However, so far there is not much literature on this topic in evidence theory. In contrast, combination rules proposed so far in the theory of evidence, especially Dempster rule, are symmetric. They rely on a basic assumption, that is, pieces of evidence being combined are considered to be on a par, i.e. play the same role. When one source of evidence is less reliable than another, it is possible to discount it and then a symmetric combination operation is still used. In the case of revision, the idea is to let prior knowledge of an agent be altered by some input information. The change problem is thus intrinsically asymmetric. Assuming the input information is reliable, it should be retained whilst the prior information should be changed minimally to that effect. To deal with this issue, this paper defines the notion of revision for the theory of evidence in such a way as to bring together probabilistic and logical views. Several revision rules previously proposed are reviewed and we advocate one of them as better corresponding to the idea of revision. It is extended to cope with inconsistency between prior and input information. It reduces to Dempster rule of combination, just like revision in the sense of Alchourrón, Gärdenfors, and Makinson (AGM) reduces to expansion, when the input is strongly consistent with the prior belief function. Properties of this revision rule are also investigated and it is shown to generalize Jeffrey's rule of updating, Dempster rule of conditioning and a form of AGM revision.
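For readers less familiar with the symmetric baseline this revision work departs from, Dempster's rule of combination itself is compact. A standard sketch (the frame elements and mass values are invented for illustration): masses of intersecting focal sets are multiplied and summed, mass falling on the empty set is treated as conflict, and the remainder is renormalized.

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination for two mass functions over the same
    frame. Focal sets are frozensets mapped to masses summing to 1."""
    combined, conflict = {}, 0.0
    for A, a in m1.items():
        for B, b in m2.items():
            C = A & B
            if C:
                combined[C] = combined.get(C, 0.0) + a * b
            else:
                conflict += a * b        # mass assigned to the empty set
    if conflict >= 1.0:
        raise ValueError("total conflict: combination undefined")
    # Renormalize by the non-conflicting mass
    return {C: v / (1.0 - conflict) for C, v in combined.items()}

# Illustrative frame {rain, sun} and two sources of evidence
m1 = {frozenset({"rain"}): 0.6, frozenset({"rain", "sun"}): 0.4}
m2 = {frozenset({"rain"}): 0.5, frozenset({"sun"}): 0.3,
      frozenset({"rain", "sun"}): 0.2}
print(dempster_combine(m1, m2))  # rain ~0.756, sun ~0.146, {rain, sun} ~0.098
```

Note that the rule is symmetric in `m1` and `m2`, which is exactly the property the paper's asymmetric revision operation relaxes: under revision, the prior and the input play different roles.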

