A Novel Belief Entropy for Measuring Uncertainty in Dempster-Shafer Evidence Theory Framework Based on Plausibility Transformation and Weighted Hartley Entropy

Entropy ◽  
2019 ◽  
Vol 21 (2) ◽  
pp. 163 ◽  
Author(s):  
Qian Pan ◽  
Deyun Zhou ◽  
Yongchuan Tang ◽  
Xiaoyang Li ◽  
Jichuan Huang

Dempster-Shafer evidence theory (DST) has shown great advantages in tackling uncertainty in a wide variety of applications. However, how to quantify the information-based uncertainty of a basic probability assignment (BPA) with belief entropy in the DST framework is still an open issue. The main work of this study is to define a new belief entropy for measuring the uncertainty of a BPA. The proposed belief entropy has two components. The first component is based on the summation of the probability mass function (PMF) of the single events contained in each focal element, obtained using the plausibility transformation. The second component is the same as the weighted Hartley entropy. The two components effectively measure the discord uncertainty and the non-specificity uncertainty in the DST framework, respectively. The proposed belief entropy is proved to satisfy the majority of the desired properties for an uncertainty measure in the DST framework. In addition, when the BPA is a probability distribution, the proposed measure degenerates to Shannon entropy. The feasibility and superiority of the new belief entropy are verified by the results of numerical experiments.
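
The abstract does not reproduce the formulas, but the two named ingredients are standard: the plausibility transformation of a BPA (normalizing the singleton plausibilities) and the weighted Hartley entropy Σ_A m(A) log2|A|. The Python sketch below combines them in the way the abstract suggests; the exact discord term is an illustrative assumption here, not the authors' published formula.

```python
from math import log2

def plausibility_transform(bpa, frame):
    """Plausibility transformation: normalize the singleton plausibilities."""
    pl = {x: sum(m for A, m in bpa.items() if x in A) for x in frame}
    total = sum(pl.values())
    return {x: v / total for x, v in pl.items()}

def hybrid_belief_entropy(bpa, frame):
    """Illustrative two-component entropy (an assumption, not the paper's exact
    formula): a discord term built from the plausibility-transformed singleton
    probabilities summed over each focal element, plus the weighted Hartley
    (non-specificity) term sum m(A) * log2(|A|)."""
    p = plausibility_transform(bpa, frame)
    discord = -sum(m * log2(sum(p[x] for x in A)) for A, m in bpa.items() if m > 0)
    nonspec = sum(m * log2(len(A)) for A, m in bpa.items() if m > 0 and len(A) > 1)
    return discord + nonspec

# Example BPA on the frame {a, b, c}. For a Bayesian BPA (singletons only),
# the discord term reduces to Shannon entropy and the Hartley term vanishes.
frame = {"a", "b", "c"}
bpa = {frozenset("a"): 0.5, frozenset("ab"): 0.3, frozenset("abc"): 0.2}
print(hybrid_belief_entropy(bpa, frame))
```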

Entropy ◽  
2020 ◽  
Vol 22 (4) ◽  
pp. 487 ◽  
Author(s):  
Miao Qin ◽  
Yongchuan Tang ◽  
Junhao Wen

Dempster–Shafer evidence theory (DS theory) has some superiorities in uncertain information processing for a large variety of applications. However, the problem of how to quantify the uncertainty of a basic probability assignment (BPA) in the DS theory framework remains unresolved. The goal of this paper is to define a new belief entropy with desirable properties for measuring the uncertainty of a BPA. The new entropy can be helpful for uncertainty management in practical applications such as decision making. The proposed uncertainty measure has two components. The first component is an improved version of the Dubois–Prade entropy, which aims to capture the non-specificity portion of uncertainty while taking into account the number of elements in the frame of discernment (FOD). The second component is adopted from the Nguyen entropy, which captures the conflict in a BPA. We prove that the proposed entropy satisfies several desired properties proposed in the literature. In addition, the proposed entropy reduces to Shannon entropy if the BPA is a probability distribution. Numerical examples are presented to show the efficiency and superiority of the proposed measure, as well as an application in decision making.
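
Both base quantities named in the abstract are classical: the Dubois–Prade non-specificity Σ_A m(A) log2|A| and the Nguyen entropy −Σ_A m(A) log2 m(A). The sketch below composes these classical forms; the paper's refinement of the non-specificity term with the FOD cardinality is not reproduced and is replaced here by the plain Dubois–Prade term as a simplifying assumption.

```python
from math import log2

def dubois_prade(bpa):
    """Classical non-specificity: sum of m(A) * log2(|A|)."""
    return sum(m * log2(len(A)) for A, m in bpa.items() if m > 0)

def nguyen(bpa):
    """Nguyen entropy (conflict part): -sum of m(A) * log2(m(A))."""
    return -sum(m * log2(m) for m in bpa.values() if m > 0)

def two_component_entropy(bpa):
    # The paper refines the non-specificity term with the FOD size; this sketch
    # keeps the classical Dubois-Prade term instead (a simplifying assumption).
    return dubois_prade(bpa) + nguyen(bpa)

bayesian = {frozenset("a"): 0.2, frozenset("b"): 0.3, frozenset("c"): 0.5}
general  = {frozenset("ab"): 0.6, frozenset("abc"): 0.4}
print(two_component_entropy(bayesian))  # equals the Shannon entropy of (0.2, 0.3, 0.5)
print(two_component_entropy(general))
```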


Mathematics ◽  
2020 ◽  
Vol 8 (12) ◽  
pp. 2137
Author(s):  
Dingyi Gan ◽  
Bin Yang ◽  
Yongchuan Tang

The Dempster–Shafer evidence theory has been widely applied in the field of information fusion. However, when the collected evidence is highly conflicting, the Dempster combination rule (DCR) often fails to produce intuitive results. To address this problem, the base belief function was proposed to modify the basic probability assignment (BPA) in an exhaustive frame of discernment (FOD). In a non-exhaustive FOD, however, the mass assigned to the empty set is nonzero, so the base belief function is no longer applicable. In this paper, considering the influence of the size of the FOD and the mass assigned to the empty set, a new belief function named the extended base belief function (EBBF) is proposed. This method can modify the BPA in a non-exhaustive FOD and obtain intuitive fusion results by taking the characteristics of the non-exhaustive FOD into account. In addition, the EBBF degenerates into the base belief function in an exhaustive FOD. Moreover, calculating the belief entropy of the modified BPA shows that its value is higher than before modification. Belief entropy measures the uncertainty of information and can therefore reveal the conflict more intuitively; the increase in belief entropy is a consequence of the conflict. This paper also designs an improved conflict data management method based on the EBBF to verify the rationality and effectiveness of the proposed method.
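
The exact definition of the EBBF is given in the paper, not in this abstract. The sketch below only illustrates the surrounding workflow: Deng entropy is used as the belief entropy, and a hypothetical uniform base-belief adjustment (averaging the BPA with a uniform mass over the nonempty subsets) stands in for the EBBF; both the adjustment formula and the omission of the empty-set handling are assumptions made for illustration only.

```python
from itertools import combinations
from math import log2

def deng_entropy(bpa):
    """Deng entropy: -sum m(A) * log2( m(A) / (2^|A| - 1) )."""
    return -sum(m * log2(m / (2 ** len(A) - 1)) for A, m in bpa.items() if m > 0)

def nonempty_subsets(frame):
    s = list(frame)
    return [frozenset(c) for r in range(1, len(s) + 1) for c in combinations(s, r)]

def uniform_base_adjustment(bpa, frame):
    """Hypothetical stand-in for a base-belief-style modification: average the
    original BPA with a uniform base belief over all nonempty subsets."""
    subsets = nonempty_subsets(frame)
    base = 1.0 / len(subsets)
    return {A: (bpa.get(A, 0.0) + base) / 2.0 for A in subsets}

frame = {"a", "b", "c"}
conflicting = {frozenset("a"): 0.99, frozenset("b"): 0.01}
modified = uniform_base_adjustment(conflicting, frame)
print(deng_entropy(conflicting), "<", deng_entropy(modified))  # entropy rises after modification
```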


2020 ◽  
Vol 2020 ◽  
pp. 1-11 ◽  
Author(s):  
Yong Chen ◽  
Yongchuan Tang ◽  
Yan Lei

Uncertainty in data fusion applications has received great attention. Due to its effectiveness and flexibility in handling uncertainty, Dempster–Shafer evidence theory is widely used in numerous fields of data fusion. However, Dempster–Shafer evidence theory cannot be used directly for fusing conflicting sensor data, since counterintuitive results may be obtained. To handle this issue, a new data fusion method based on weighted belief entropy and the negation of the basic probability assignment (BPA) is proposed. First, the negation of the BPA is applied to represent the information from a novel viewpoint. Then, by measuring the uncertainty of the evidence, the weighted belief entropy is adopted to indicate the relative importance of each piece of evidence. Finally, the resulting weight of each body of evidence is applied to adjust the mass function before fusion with the Dempster combination rule. The validity of the proposed method is demonstrated with an experiment on artificial data and an application to fault diagnosis.
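
The final fusion step uses the standard Dempster combination rule, sketched below together with a commonly cited negation of a BPA that redistributes 1 − m(A) uniformly over the remaining focal elements; the negation formula is stated here as an assumption, and the paper's weighted belief entropy and weight-averaging steps are not reproduced.

```python
from math import isclose

def dempster_combine(m1, m2):
    """Dempster's rule: conjunctive combination followed by normalization
    by 1 - K, where K is the total conflicting mass."""
    combined, conflict = {}, 0.0
    for A, a in m1.items():
        for B, b in m2.items():
            C = A & B
            if C:
                combined[C] = combined.get(C, 0.0) + a * b
            else:
                conflict += a * b
    if isclose(conflict, 1.0):
        raise ValueError("total conflict: Dempster's rule is undefined")
    return {A: v / (1.0 - conflict) for A, v in combined.items()}

def negation(bpa):
    """Assumed negation of a BPA: m_neg(A) = (1 - m(A)) / (n - 1),
    with n the number of focal elements (illustrative formula)."""
    n = len(bpa)
    return {A: (1.0 - m) / (n - 1) for A, m in bpa.items()}

m1 = {frozenset("a"): 0.9, frozenset("b"): 0.1}
m2 = {frozenset("a"): 0.8, frozenset("ab"): 0.2}
print(dempster_combine(m1, m2))
print(negation(m1))
```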


Entropy ◽  
2018 ◽  
Vol 20 (11) ◽  
pp. 842 ◽  
Author(s):  
Lipeng Pan ◽  
Yong Deng

How to measure the uncertainty of the basic probability assignment (BPA) function is an open issue in Dempster–Shafer (D–S) theory. The main work of this paper is to propose a new belief entropy, which is mainly used to measure the uncertainty of a BPA. The proposed belief entropy is based on Deng entropy and the probability interval consisting of the lower and upper probabilities. In addition, under certain conditions, it can be transformed into Shannon entropy. Numerical examples are used to illustrate the efficiency of the new belief entropy in measuring uncertainty.
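
The building blocks named in the abstract are the lower and upper probabilities, i.e. the belief and plausibility functions, together with Deng entropy. The sketch below computes these quantities for a BPA; how the paper combines the probability interval with Deng entropy into the new measure is not reproduced here.

```python
from math import log2

def belief(bpa, A):
    """Lower probability: Bel(A) = sum of m(B) over all B that are subsets of A."""
    return sum(m for B, m in bpa.items() if B <= A)

def plausibility(bpa, A):
    """Upper probability: Pl(A) = sum of m(B) over all B intersecting A."""
    return sum(m for B, m in bpa.items() if B & A)

def deng_entropy(bpa):
    return -sum(m * log2(m / (2 ** len(B) - 1)) for B, m in bpa.items() if m > 0)

bpa = {frozenset("a"): 0.4, frozenset("ab"): 0.4, frozenset("abc"): 0.2}
for x in ("a", "b", "c"):
    A = frozenset(x)
    print(x, "probability interval:", (belief(bpa, A), plausibility(bpa, A)))
print("Deng entropy:", deng_entropy(bpa))
```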


1996 ◽  
Vol 26 (2) ◽  
pp. 213-224 ◽  
Author(s):  
Karl-Heinz Waldmann

Recursions are derived for a class of compound distributions having a claim frequency distribution of the well-known (a, b)-type. The probability mass function on which the recursions are usually based is replaced by the distribution function in order to obtain increasing iterates. A monotone transformation is suggested to avoid an underflow in the initial stages of the iteration. The faster increase of the transformed iterates is diminished by the use of a scaling function. Further, an adaptive weighting depending on the initial value and the increase of the iterates is derived; it enables an arbitrarily large portfolio to be handled. Some numerical results are displayed, demonstrating the efficiency of the different methods. The computation of stop-loss premiums using these methods is indicated. Finally, related iteration schemes based on the cumulative distribution function are outlined.
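
The (a, b)-class recursion referred to here is the classical Panjer recursion for the aggregate-claims probability mass function; the paper's refinements (working with the distribution function, the monotone transformation, scaling, and adaptive weighting) are not reproduced. A minimal sketch for a Poisson claim frequency (a = 0, b = λ) and an integer-valued severity distribution is:

```python
from math import exp

def panjer_poisson(lam, severity, s_max):
    """Panjer recursion for the aggregate-claims pmf with Poisson(lam) claim counts
    (a = 0, b = lam in the (a, b)-class) and an integer severity pmf with
    severity[0] == 0. Returns g[0..s_max], the pmf of the aggregate loss S."""
    a, b = 0.0, lam
    g = [0.0] * (s_max + 1)
    g[0] = exp(-lam)                      # P(S = 0) = P(N = 0) when f(0) = 0
    for s in range(1, s_max + 1):
        g[s] = sum((a + b * j / s) * severity[j] * g[s - j]
                   for j in range(1, min(s, len(severity) - 1) + 1))
    return g

# Severity pmf on {1, 2, 3}: P(X=1)=0.5, P(X=2)=0.3, P(X=3)=0.2 (index 0 unused).
severity = [0.0, 0.5, 0.3, 0.2]
g = panjer_poisson(lam=2.0, severity=severity, s_max=20)
print(sum(g))          # close to 1 once s_max is large enough
print(g[:5])
```

Replacing the pmf iterates by the distribution function, as the paper does, yields increasing iterates and makes the underflow and scaling devices described above applicable.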


Author(s):  
Lipeng Pan ◽  
Yong Deng

Dempster-Shafer evidence theory can handle imprecise and unknown information, which has attracted wide attention. In many cases, the mass function needs to be translated into a probability distribution, which is useful for expanding the applications of D-S evidence theory. However, how to reasonably transform the mass function into a probability distribution is still an open issue. Hence, this paper proposes a new probability transformation method based on ordered weighted averaging (OWA) and entropy difference. The new method calculates weights by ordered weighted averaging and adopts the entropy difference as one of the measurement indicators; the transformation with the minimum entropy difference is then achieved by adjusting the parameter r of the weight function. Finally, some numerical examples are given to show that the new method is more reasonable and effective.
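
The OWA-based transform itself is defined in the paper. As a point of reference, the sketch below shows a standard baseline, the pignistic transformation, together with an entropy-difference indicator of the kind the abstract mentions, here taken (as an assumption) to be the difference between the Deng entropy of the BPA and the Shannon entropy of the transformed distribution.

```python
from math import log2

def pignistic(bpa):
    """Baseline probability transform: BetP(x) = sum over A containing x of m(A)/|A|."""
    frame = set().union(*bpa)
    return {x: sum(m / len(A) for A, m in bpa.items() if x in A) for x in frame}

def shannon(p):
    return -sum(v * log2(v) for v in p.values() if v > 0)

def deng_entropy(bpa):
    return -sum(m * log2(m / (2 ** len(A) - 1)) for A, m in bpa.items() if m > 0)

bpa = {frozenset("a"): 0.6, frozenset("bc"): 0.3, frozenset("abc"): 0.1}
p = pignistic(bpa)
# Entropy difference used here as an illustrative indicator of how much the
# transformation changes the measured uncertainty relative to the BPA.
print(p, deng_entropy(bpa) - shannon(p))
```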


Author(s):  
Zixi Han ◽  
Zixian Jiang ◽  
Sophie Ehrt ◽  
Mian Li

The design of a gas turbine compressor vane carrier (CVC) should meet mechanical integrity requirements on, among others, low-cycle fatigue (LCF). The number of cycles to LCF failure results from the cyclic mechanical and thermal strains imposed on the component by its operating conditions. Conventional LCF assessment is usually based on the assumption of standard operating cycles, supplemented by the consideration of predefined extreme operations and safety factors to compensate for a potential underestimation of the LCF damage caused by, among other things, non-standard operating cycles. However, real operating cycles can vary significantly from the standard ones considered in the conventional methods, and because of the included safety margins the conventional prediction of LCF life can differ substantially from real cases. This work presents a probabilistic method to estimate the distribution of the LCF life under varying operating conditions using operational fleet data. Finite element analysis (FEA) results indicate that the first ramp-up loading in each cycle and the turning time before hot-restart cycles are the two predominant contributors to the LCF damage. A surrogate model of LCF damage has been built with regard to these two features to reduce the computational cost of FEA. Miner's rule is applied to calculate the accumulated LCF damage on the component and then obtain the LCF life. The proposed LCF assessment approach has two distinctive features. First, a new data processing technique inspired by the cumulative sum (CUSUM) control chart is proposed to identify the first ramp-up period of each cycle from noisy operational data. Second, the probability mass function of the LCF life for a CVC is estimated by the sequential convolution of the single-cycle damage distribution obtained from operational data. The results from the proposed method show that the mean value of the LCF life at a critical location of the CVC is significantly larger than the result calculated from the deterministic assessment, and that the LCF lives of different gas turbines of the same class also differ considerably. Finally, to avoid the high computational cost of the sequential convolution, a quick approximation of the probability mass function of the LCF life is given. With its capability of dealing with varying operating conditions and noise in the operational data, the enhanced LCF assessment approach proposed in this work provides a probabilistic reference both for reliability analysis in CVC design and for predictive maintenance in after-sales service.
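
The second distinctive feature, estimating the probability mass function of the LCF life by sequentially convolving the single-cycle damage distribution and applying Miner's rule (failure in the first cycle where the accumulated damage reaches 1), can be sketched as follows. The damage grid and the example single-cycle distribution are illustrative choices, not values from the paper.

```python
import numpy as np

def lcf_life_pmf(cycle_damage_pmf, step, max_cycles):
    """Sequentially convolve the single-cycle damage pmf (on a grid of width step)
    and apply Miner's rule: failure occurs in the first cycle where the accumulated
    damage reaches 1. Returns P(life = n) for n = 0..max_cycles."""
    fail_index = int(np.ceil(1.0 / step))          # grid index where damage >= 1
    surviving = np.zeros(1); surviving[0] = 1.0    # accumulated damage starts at 0
    life_pmf = np.zeros(max_cycles + 1)
    for n in range(1, max_cycles + 1):
        surviving = np.convolve(surviving, cycle_damage_pmf)
        life_pmf[n] = surviving[fail_index:].sum() # mass that crossed D = 1 this cycle
        surviving = surviving[:fail_index]         # keep only the un-failed portion
    return life_pmf

# Illustrative single-cycle damage on a grid of 0.01: mostly 0.02, sometimes 0.05.
step = 0.01
cycle_damage = np.zeros(6)
cycle_damage[2], cycle_damage[5] = 0.8, 0.2
life = lcf_life_pmf(cycle_damage, step, max_cycles=60)
print(life.argmax(), life.sum())   # most likely life (in cycles); total probability ~1
```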


Author(s):  
Panpan Zhang

In this paper, several properties of plane-oriented recursive trees (PORTs), a class of trees exhibiting the preferential attachment phenomenon, are uncovered. Specifically, we investigate the degree profile of a PORT by determining the exact probability mass function of the degree of a node with a fixed label. We compute the expectation and the variance of the degree variable via a Pólya urn approach. In addition, we study a topological index, the Zagreb index, of this class of trees. We calculate the exact first two moments of the Zagreb index of PORTs by using recurrence methods. Lastly, we determine the limiting degree distribution in PORTs that grow in continuous time, where the embedding is done in a Poissonization framework. We show that it is exponential after proper scaling.
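
The exact pmf is derived analytically in the paper. For intuition, the degree of a node with a fixed label can also be estimated by direct simulation, using the fact that in a PORT a new node attaches to an existing node with probability proportional to its number of insertion gaps (out-degree plus 1, out of 2k − 1 gaps in a tree with k nodes). The simulation below is an illustrative companion, not the paper's derivation.

```python
import random
from collections import Counter

def simulate_port_degree(n_nodes, label, trials=20000, seed=0):
    """Estimate the pmf of the out-degree of the node with a fixed label in a
    plane-oriented recursive tree with n_nodes nodes. A new node attaches to
    node i with probability (outdeg(i) + 1) / (2k - 1) in a tree of k nodes."""
    rng = random.Random(seed)
    counts = Counter()
    for _ in range(trials):
        outdeg = [0] * (n_nodes + 1)              # 1-indexed node labels
        for k in range(1, n_nodes):               # the tree currently has k nodes
            weights = [outdeg[i] + 1 for i in range(1, k + 1)]
            parent = rng.choices(range(1, k + 1), weights=weights)[0]
            outdeg[parent] += 1                   # node k+1 attaches to parent
        counts[outdeg[label]] += 1
    return {d: c / trials for d, c in sorted(counts.items())}

# Empirical degree profile of the node labelled 2 in a 50-node PORT.
print(simulate_port_degree(n_nodes=50, label=2))
```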

