SIGNATURE-BASED INFORMATION MEASURES OF MULTI-STATE NETWORKS

2018 ◽  
Vol 33 (3) ◽  
pp. 438-459 ◽  
Author(s):  
S. Zarezadeh ◽  
M. Asadi ◽  
S. Eftekhar

The signature matrix of an n-component three-state network (system), which depends only on the network structure, is a useful tool for comparing the reliability and stochastic properties of networks. In this paper, we consider a three-state network with states up, partial performance, and down. We assume that the network remains in the up state for a random time T1 and then moves to the partial performance state until it fails at time T > T1. Signature-based expressions are presented for the conditional entropy of T given T1, the joint entropy, the Kullback-Leibler (K-L) information, and the mutual information of the lifetimes T and T1. It is shown that the K-L information and the mutual information between T1 and T depend only on the network structure (i.e., only on the signature matrix of the network). Some signature-based stochastic comparisons are also made between the K-L information of the state lifetimes in two different three-state networks. Upper and lower bounds for the K-L divergence and the mutual information between T1 and T are investigated. Finally, the results are extended to n-component multi-state networks. Several examples are examined graphically and numerically.
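As a concrete illustration of the information measures involved (the joint pmf below is hypothetical and not derived from any network signature), the joint entropy, conditional entropy, and mutual information of two discretized lifetimes T1 < T can be computed directly:

```python
import math

# Hypothetical joint pmf of the discretized state lifetimes (T1, T), T1 < T;
# the numbers are illustrative only.
joint = {
    (1, 2): 0.15, (1, 3): 0.20, (1, 4): 0.10,
    (2, 3): 0.20, (2, 4): 0.15,
    (3, 4): 0.20,
}

def entropy(pmf):
    """Shannon entropy in bits of a pmf given as {outcome: probability}."""
    return -sum(p * math.log2(p) for p in pmf.values() if p > 0)

# Marginal pmfs of T1 and T.
p_t1, p_t = {}, {}
for (t1, t), p in joint.items():
    p_t1[t1] = p_t1.get(t1, 0.0) + p
    p_t[t] = p_t.get(t, 0.0) + p

h_joint = entropy(joint)            # H(T1, T)
h_t1, h_t = entropy(p_t1), entropy(p_t)
h_t_given_t1 = h_joint - h_t1       # chain rule: H(T | T1) = H(T1, T) - H(T1)
mi = h_t1 + h_t - h_joint           # I(T1; T), always non-negative
```

The chain rule and the non-negativity of mutual information used here are the standard identities underlying the paper's signature-based expressions.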

2005 ◽  
Vol 17 (4) ◽  
pp. 741-778 ◽  
Author(s):  
Eric E. Thomson ◽  
William B. Kristan

Performance in sensory discrimination tasks is commonly quantified using either information theory or ideal observer analysis. These two quantitative frameworks are often assumed to be equivalent. For example, higher mutual information is said to correspond to improved performance of an ideal observer in a stimulus estimation task. To the contrary, drawing on and extending previous results, we show that five information-theoretic quantities (entropy, response-conditional entropy, specific information, equivocation, and mutual information) violate this assumption. More positively, we show how these information measures can be used to calculate upper and lower bounds on ideal observer performance, and vice versa. The results show that the mathematical resources of ideal observer analysis are preferable to information theory for evaluating performance in a stimulus discrimination task. We also discuss the applicability of information theory to questions that ideal observer analysis cannot address.
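The non-equivalence the authors describe can be reproduced in miniature: the two hypothetical channels below give an ideal observer identical accuracy yet carry different mutual information (a sketch of the general point, not the authors' analysis):

```python
import math

def h(probs):
    """Shannon entropy in bits, skipping zero entries."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

def mutual_info(p_s, p_r_given_s):
    """I(S; R) in bits for prior p_s and channel rows p_r_given_s[s][r]."""
    n_r = len(p_r_given_s[0])
    p_r = [sum(p_s[s] * p_r_given_s[s][r] for s in range(len(p_s)))
           for r in range(n_r)]
    h_r_given_s = sum(p_s[s] * h(p_r_given_s[s]) for s in range(len(p_s)))
    return h(p_r) - h_r_given_s

def map_accuracy(p_s, p_r_given_s):
    """Ideal-observer (MAP) accuracy: pick the a-posteriori most probable
    stimulus for each response."""
    n_r = len(p_r_given_s[0])
    return sum(max(p_s[s] * p_r_given_s[s][r] for s in range(len(p_s)))
               for r in range(n_r))

prior = [0.5, 0.5]
# Channel A: binary symmetric channel with 10% error.
chan_a = [[0.9, 0.1], [0.1, 0.9]]
# Channel B: reveals the stimulus 80% of the time, else an uninformative "?".
chan_b = [[0.8, 0.0, 0.2], [0.0, 0.8, 0.2]]

# Both ideal observers are 90% correct, yet the mutual informations differ
# (about 0.531 bits for A versus 0.800 bits for B).
```

Because MAP accuracy depends only on the maximum of each posterior while mutual information depends on the whole distribution, equal accuracy does not imply equal information, illustrating why the two frameworks only bound one another.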


Author(s):  
QINGHUA HU ◽  
DAREN YU

Yager's entropy was proposed to compute the information of a fuzzy indiscernibility relation. In this paper, we present a novel interpretation of Yager's entropy from the viewpoint of the discernibility power of a relation. Some basic definitions of Shannon's information theory are then generalized based on Yager's entropy. We introduce joint entropy, conditional entropy, mutual information, and relative entropy to compute the information changes produced by operations on fuzzy indiscernibility relations. Conditional entropy and relative conditional entropy are proposed to measure the information increment, which is interpreted as the significance of an attribute in the fuzzy rough set model. As an application, we redefine the independence of an attribute set, the reduct, and the relative reduct in the fuzzy rough set model based on Yager's entropy. Experimental results show that the proposed approach is suitable for fuzzy and numeric data reduction.
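For intuition, the conditional-entropy notion of attribute significance has a simple crisp (non-fuzzy) analogue; the sketch below uses a hypothetical decision table and ordinary Shannon entropy rather than Yager's entropy:

```python
import math
from collections import Counter, defaultdict

def h_cond(rows, attrs, decision):
    """H(decision | partition induced by attrs) for a crisp decision table."""
    blocks = defaultdict(list)
    for row in rows:
        blocks[tuple(row[a] for a in attrs)].append(row[decision])
    n = len(rows)
    total = 0.0
    for vals in blocks.values():
        w = len(vals) / n
        counts = Counter(vals)
        total += w * -sum((c / len(vals)) * math.log2(c / len(vals))
                          for c in counts.values())
    return total

# Hypothetical decision table: condition attributes "a", "b", decision "d".
table = [
    {"a": 0, "b": 0, "d": "no"},
    {"a": 0, "b": 1, "d": "no"},
    {"a": 1, "b": 0, "d": "yes"},
    {"a": 1, "b": 1, "d": "yes"},
]

# Significance of "a" relative to "b": the drop in conditional entropy when
# "a" is added to the attribute set (the information increment).
sig_a = h_cond(table, ["b"], "d") - h_cond(table, ["b", "a"], "d")
```

Here "a" fully determines the decision while "b" is uninformative, so adding "a" removes a full bit of uncertainty; the fuzzy version replaces the crisp partition with a fuzzy indiscernibility relation.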


2001 ◽  
Vol 42 (4) ◽  
pp. 515-531
Author(s):  
S. S. Dragomir ◽  
C. J. Goh

We apply the superadditivity and monotonicity properties associated with the discrete Jensen inequality to derive relationships between the entropy function of a probability vector and that of a renormalized arbitrary sub-vector. The results are extended to cover other entropy measures such as joint entropy, conditional entropy, and mutual information.
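The grouping identity behind such sub-vector relationships (a standard Shannon-entropy fact, not the paper's Jensen-based bounds) reads, for a probability vector p split into disjoint index sets A and B with masses P_A = Σ_{i∈A} p_i and P_B = 1 − P_A:

```latex
H(p) \;=\; H(P_A, P_B)
      \;+\; P_A \, H\!\left(\frac{(p_i)_{i \in A}}{P_A}\right)
      \;+\; P_B \, H\!\left(\frac{(p_j)_{j \in B}}{P_B}\right)
```

so the entropy of the full vector decomposes into the entropy of the coarse split plus the weighted entropies of the renormalized sub-vectors.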


Author(s):  
Shi Dong ◽  
Wengang Zhou

Influential node identification plays an important role in optimizing network structure, and many measures and identification methods have been proposed for this purpose. However, current network systems are increasingly complex, and the existing methods have difficulty dealing with such networks. In this paper, several basic measures are introduced and discussed, and we propose an improved influential node identification method (Hm-shell) that adopts a hybrid mechanism of information entropy and the weighted degree of edges to improve identification accuracy. Our proposed method is evaluated against nine algorithms on nine datasets. Theoretical analysis and experimental results on real datasets show that our method outperforms the other methods.
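The abstract does not spell out the Hm-shell algorithm, but an entropy-based ingredient of such hybrid scores can be sketched as follows; the toy graph, the scoring function, and the ranking rule are illustrative assumptions only:

```python
import math

# Toy undirected graph as an adjacency dict (hypothetical; the actual Hm-shell
# method also uses the weighted degree of edges and k-shell decomposition,
# whose details are not given in the abstract).
graph = {
    "a": {"b", "c", "d"},
    "b": {"a", "c"},
    "c": {"a", "b", "d"},
    "d": {"a", "c", "e"},
    "e": {"d"},
}

def entropy_score(g, v):
    """Entropy (bits) of the degree distribution over v's neighbours:
    nodes with many neighbours of evenly spread degree score higher."""
    degs = [len(g[u]) for u in g[v]]
    total = sum(degs)
    return -sum((d / total) * math.log2(d / total) for d in degs)

# Rank nodes by the entropy component of the score, most influential first.
ranking = sorted(graph, key=lambda v: entropy_score(graph, v), reverse=True)
```

A leaf such as "e" (one neighbour, probability mass 1) gets entropy zero, while hub nodes whose neighbourhoods are larger and more balanced rank higher, matching the intuition that entropy rewards diverse connectivity.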


2014 ◽  
Vol 51 (4) ◽  
pp. 999-1020 ◽  
Author(s):  
S. Ashrafi ◽  
M. Asadi

This paper is an investigation into the reliability and stochastic properties of three-state networks. We consider a single-step network consisting of n links and we assume that the links are subject to failure. We assume that the network can be in three states, up (K = 2), partial performance (K = 1), and down (K = 0). Using the concept of the two-dimensional signature, we study the residual lifetimes of the networks under different scenarios concerning the states and the number of failed links of the network. In the process of doing so, we define variants of the concept of the dynamic signature in a bivariate setting. Then, we obtain signature-based mixture representations of the reliability of the residual lifetimes of the network states under the condition that the network is in state K = 2 (or K = 1) and exactly k links in the network have failed. We prove preservation theorems showing that stochastic orderings and dependence between the elements of the dynamic signatures (which rely on the network structure) are preserved by the residual lifetimes of the states of the network (which rely on the network ageing). Various illustrative examples are also provided.
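In the one-dimensional (binary-state) analogue of such mixture representations, system reliability is a signature-weighted mixture of order-statistic survival functions; a minimal sketch, assuming i.i.d. standard-exponential component lifetimes and a 2-out-of-3 structure:

```python
from math import comb, exp

def order_stat_survival(i, n, t):
    """P(X_(i:n) > t) for n i.i.d. exp(1) component lifetimes: fewer than i
    components have failed by time t."""
    f = 1 - exp(-t)  # component lifetime cdf F(t)
    return sum(comb(n, k) * f**k * (1 - f)**(n - k) for k in range(i))

def system_survival(signature, t):
    """Signature mixture representation: R(t) = sum_i s_i * P(X_(i:n) > t)."""
    n = len(signature)
    return sum(s * order_stat_survival(i, n, t)
               for i, s in enumerate(signature, start=1))

# A 2-out-of-3 system fails at the second component failure, so its
# signature vector is (0, 1, 0); for exp(1) components the closed form is
# R(t) = 3 e^{-2t} - 2 e^{-3t}.
sig = (0.0, 1.0, 0.0)
```

The two-dimensional signature of the paper generalizes this by mixing over the joint distribution of the failure counts at which the network leaves states K = 2 and K = 1.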


Entropy ◽  
2018 ◽  
Vol 20 (7) ◽  
pp. 540 ◽  
Author(s):  
Subhashis Hazarika ◽  
Ayan Biswas ◽  
Soumya Dutta ◽  
Han-Wei Shen

Uncertainty of scalar values in an ensemble dataset is often represented by the collection of their corresponding isocontours. Various techniques such as contour-boxplots, contour variability plots, glyphs and probabilistic marching cubes have been proposed to analyze and visualize ensemble isocontours. All these techniques assume that a scalar value of interest is already known to the user; little work has been done on guiding users to select the scalar values for such uncertainty analysis. Moreover, analyzing and visualizing a large collection of ensemble isocontours for a selected scalar value has its own challenges, and interpreting the resulting visualizations is also a difficult task. In this work, we propose a new information-theoretic approach to addressing these issues. Using specific information measures that estimate the predictability and surprise of specific scalar values, we evaluate the overall uncertainty associated with all the scalar values in an ensemble system. This helps scientists understand the effects of uncertainty on different data features. To understand in finer detail the contribution of individual members towards the uncertainty of the ensemble isocontours of a selected scalar value, we propose a conditional-entropy-based algorithm to quantify the individual contributions. This can help simplify analysis and visualization for systems with many members by identifying the members contributing the most towards overall uncertainty. We demonstrate the efficacy of our method by applying it to real-world datasets from material sciences, weather forecasting and ocean simulation experiments.
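A crude sketch of the leave-one-out idea behind quantifying per-member contributions (the paper's conditional-entropy algorithm is not specified in the abstract; the binary crossing maps and the contribution score below are hypothetical):

```python
import math

def cell_entropy(crossings):
    """Entropy (bits) of a Bernoulli 'the isocontour passes through this
    cell' indicator, estimated over ensemble members."""
    p = sum(crossings) / len(crossings)
    if p in (0.0, 1.0):
        return 0.0
    return -(p * math.log2(p) + (1 - p) * math.log2(1 - p))

def field_entropy(ensemble):
    """Total crossing entropy over all grid cells, given one binary
    crossing map per member (same cell order in each)."""
    return sum(cell_entropy(col) for col in zip(*ensemble))

# Hypothetical 4-member ensemble over 5 cells: 1 = isocontour crosses cell.
ensemble = [
    [1, 1, 0, 0, 1],
    [1, 1, 0, 0, 0],
    [1, 0, 0, 1, 0],  # the outlier member
    [1, 1, 0, 0, 0],
]

total = field_entropy(ensemble)
# Leave-one-out drop in total entropy as a rough per-member contribution.
contributions = [total - field_entropy(ensemble[:i] + ensemble[i + 1:])
                 for i in range(len(ensemble))]
```

The outlier member, whose contour disagrees with the rest, produces the largest entropy drop when removed, which is the kind of signal the paper's algorithm exploits to simplify large ensembles.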

