Introduction to Information Theory

Author(s):  
M. Vidyasagar

This chapter provides an introduction to some elementary aspects of information theory, including entropy in its various forms. Entropy refers to the level of uncertainty associated with a random variable (or more precisely, the probability distribution of the random variable). When there are two or more random variables, it is worthwhile to study the conditional entropy of one random variable with respect to another. The last concept is relative entropy, also known as the Kullback–Leibler divergence, which measures the “disparity” between two probability distributions. The chapter first considers convex and concave functions before discussing the properties of the entropy function, conditional entropy, uniqueness of the entropy function, and the Kullback–Leibler divergence.
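The three quantities the chapter covers can be illustrated with a short computational sketch (an illustration under hypothetical distributions, not material from the chapter):

```python
import math

def entropy(p):
    """Shannon entropy H(p) in bits; zero-probability outcomes contribute nothing."""
    return -sum(pi * math.log2(pi) for pi in p if pi > 0)

def conditional_entropy(joint):
    """H(X|Y) = H(X,Y) - H(Y), for a joint distribution given as {(x, y): p}."""
    py = {}
    for (x, y), p in joint.items():
        py[y] = py.get(y, 0.0) + p
    return entropy(joint.values()) - entropy(py.values())

def kl_divergence(p, q):
    """Relative entropy D(p || q) in bits; assumes q[i] > 0 wherever p[i] > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

print(entropy([0.5, 0.5]))    # 1.0 — a fair coin has maximal uncertainty
print(entropy([1.0, 0.0]))    # a certain outcome: H = 0
# Independent fair bits: observing Y tells us nothing about X, so H(X|Y) = H(X).
print(conditional_entropy({(0, 0): 0.25, (0, 1): 0.25, (1, 0): 0.25, (1, 1): 0.25}))
print(kl_divergence([0.5, 0.5], [0.9, 0.1]))   # > 0: the distributions differ
```

Note that D(p‖q) = 0 exactly when p = q, which is what makes it usable as a "disparity" measure even though it is not a metric.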

1993 ◽  
Vol 7 (3) ◽  
pp. 413-420 ◽  
Author(s):  
Pietro Muliere ◽  
Giovanni Parmigiani ◽  
Nicholas G. Polson

Interest in the informational content of truncation motivates the study of the residual entropy function, that is, the entropy of a right truncated random variable as a function of the truncation point. In this note we show that, under mild regularity conditions, the residual entropy function characterizes the probability distribution. We also derive relationships among residual entropy, monotonicity of the failure rate, and stochastic dominance. Information theoretic measures of distances between distributions are also revisited from a similar perspective. In particular, we study the residual divergence between two positive random variables and investigate some of its monotonicity properties. The results are relevant to information theory, reliability theory, search problems, and experimental design.
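The object of study can be computed numerically. A sketch (illustrative, not the paper's derivation) for the uniform distribution, where the right-truncated variable [X | X ≤ t] is again uniform and its entropy has the closed form ln t:

```python
import math

def residual_entropy(pdf, cdf, t, n=100000, lower=0.0):
    """Differential entropy (nats) of the right-truncated variable [X | X <= t],
    computed by midpoint-rule quadrature of -∫ g·ln(g) with g = pdf/cdf(t)."""
    F = cdf(t)
    h = (t - lower) / n
    total = 0.0
    for i in range(n):
        x = lower + (i + 0.5) * h
        g = pdf(x) / F               # density of the truncated variable
        if g > 0:
            total -= g * math.log(g) * h
    return total

# For X ~ Uniform(0, 1), the right-truncated variable is Uniform(0, t),
# whose differential entropy is ln(t) — increasing in the truncation point t.
ent = residual_entropy(lambda x: 1.0, lambda t: t, 0.5)
print(ent, math.log(0.5))   # the two agree
```

Monotonicity of this function in t is the kind of property the note relates to failure-rate monotonicity and stochastic dominance.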


Author(s):  
Munteanu Bogdan Gheorghe

Based on the Weibull-G family of probability distributions, we propose a new family of probability distributions, which we name the Max Weibull-G power series distributions and which may be applied to certain reliability problems. The Max Weibull-G power series distribution is the distribution of the random variable max(X1, X2, ..., XN), where X1, X2, ... are independent Weibull-G distributed random variables and N is a positive integer-valued random variable whose distribution belongs to the power series family. The main characteristics and properties of this distribution are analyzed.
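The construction can be sketched by simulation. The following hypothetical example (not the authors' code) takes the baseline to be a standard Weibull distribution and the power series member to be the zero-truncated geometric distribution, and checks the Monte Carlo distribution of max(X1, ..., XN) against the closed form E[F(x)^N] obtained from the probability generating function of N:

```python
import random, math

def weibull_cdf(x, k=2.0, lam=1.0):
    return 1.0 - math.exp(-((x / lam) ** k))

def max_weibull_geometric_cdf(x, p=0.4, k=2.0, lam=1.0):
    """P(max(X1..XN) <= x) = E[F(x)^N] with N ~ Geometric(p) on {1,2,...};
    the probability generating function gives p*F / (1 - (1-p)*F)."""
    F = weibull_cdf(x, k, lam)
    return p * F / (1.0 - (1.0 - p) * F)

def simulate_max(p=0.4, k=2.0, lam=1.0, rng=random):
    n = 1
    while rng.random() > p:          # draw the zero-truncated geometric N
        n += 1
    # Inverse-transform sampling of n independent Weibull variables.
    return max(lam * (-math.log(1.0 - rng.random())) ** (1.0 / k) for _ in range(n))

random.seed(0)
x0 = 1.2
emp = sum(simulate_max() <= x0 for _ in range(100000)) / 100000
print(emp, max_weibull_geometric_cdf(x0))   # the two should agree closely
```

Any other power series distribution for N (Poisson, logarithmic, binomial) changes only the generating function in the closed form.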


Psihologija ◽  
2007 ◽  
Vol 40 (1) ◽  
pp. 5-35
Author(s):  
Aleksandar Kostic ◽  
Milena Bozic

In this study we investigate the constraints on the probability distribution of grammatical forms within morphological paradigms of the Serbian language, where a paradigm is specified as a coherent set of elements with defined criteria for inclusion. Thus, for example, all Serbian feminine nouns that end with the suffix "a" in their nominative singular form belong to the third declension, the declension being a paradigm. The notion of a paradigm can be extended to other criteria as well; hence, we can also treat noun cases, irrespective of grammatical number and gender, or noun gender, irrespective of case and grammatical number, as paradigms. We took the relative entropy as a measure of the homogeneity of the probability distribution within a paradigm. The analysis was performed on 116 typical morphological paradigms of Serbian, and for each paradigm the relative entropy was calculated. The obtained results indicate that for most paradigms the relative entropy values fall within the range 0.75–0.9. The nonhomogeneous distribution of relative entropy values allows for estimating the relative entropy of the morphological system as a whole. This value is 0.69 and can tentatively be taken as an index of the stability of the morphological system.
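The homogeneity measure used here, entropy normalized by its maximum possible value for the paradigm size, can be sketched as follows (the frequency counts are hypothetical, not the study's data):

```python
import math

def relative_entropy(freqs):
    """Entropy of the empirical distribution of forms divided by its maximum,
    log2 of the number of forms: 1 means a perfectly uniform paradigm."""
    total = sum(freqs)
    p = [f / total for f in freqs if f > 0]
    h = -sum(pi * math.log2(pi) for pi in p)
    return h / math.log2(len(freqs))

# A hypothetical 6-form paradigm with skewed frequencies of use:
print(relative_entropy([40, 25, 15, 10, 6, 4]))   # strictly between 0 and 1
print(relative_entropy([1, 1, 1, 1, 1, 1]))       # 1.0 for a uniform paradigm
```

Values in the reported 0.75–0.9 band would thus indicate paradigms that are fairly, but not perfectly, homogeneous.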


2014 ◽  
Vol 15 (3) ◽  
pp. 195-203 ◽  
Author(s):  
Qing Xiao

Abstract This paper employs the generalized lambda distribution (GLD) to model random variables with various probability distributions in power systems. In the context of the probability weighted moment (PWM), an optimization-free method is developed to estimate the parameters of the GLD. By equating the first four PWMs of the GLD with those of the target random variable, a polynomial equation in one unknown is derived and solved for the parameters of the GLD. When employing the GLD to model correlated multivariate random variables, a method of accommodating the dependency is put forward. Finally, three examples are worked out to demonstrate the proposed method.
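The key ingredient, the probability weighted moment β_r = E[X·F(X)^r], can indeed be estimated from a sample without optimization. A small sketch (illustrative, not the paper's code) checks the standard unbiased order-statistics estimator against the known values β_r = 1/(r+2) for the Uniform(0,1) distribution:

```python
import random

def sample_pwm(data, r):
    """Unbiased order-statistics estimator of the PWM beta_r = E[X * F(X)^r]:
    b_r = n^-1 * sum_i [(i-1)(i-2)...(i-r) / ((n-1)(n-2)...(n-r))] * x_(i)."""
    x = sorted(data)
    n = len(x)
    total = 0.0
    for i in range(1, n + 1):
        w = 1.0
        for j in range(r):
            w *= (i - 1 - j) / (n - 1 - j)
        total += w * x[i - 1]
    return total / n

random.seed(1)
u = [random.random() for _ in range(200000)]
# For U ~ Uniform(0,1): beta_r = E[U^(r+1)] = 1/(r+2).
for r in range(4):
    print(r, sample_pwm(u, r))   # ≈ 1/2, 1/3, 1/4, 1/5
```

Matching four such sample PWMs to the GLD's theoretical PWMs then reduces, per the paper, to a one-unknown polynomial equation; the sketch covers only the estimation side.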


Author(s):  
Olesya Martyniuk ◽  
Stepan Popina ◽  
Serhii Martyniuk

Introduction. Mathematical modeling of economic processes is necessary for the unambiguous formulation and solution of a problem. In the economic sphere this is the most important aspect of the activity of any enterprise, for which economic-mathematical modeling is the tool that allows adequate decisions to be made. However, the economic indicators that serve as factors of a model are usually random variables. An economic-mathematical model is proposed for calculating the probability distribution function of the result of economic activity on the basis of the known dependence of this result on the factors influencing it and the probability densities of these factors. Methods. A formula was used to calculate the probability distribution function of a random variable that is a function of other independent random variables. A method is proposed for estimating the basic numerical characteristics of the investigated functions of random variables: the mathematical expectation, which in the probabilistic sense is the average value of the result of the functioning of the economic structure, as well as its variance. The upper bound of the variation of the effective feature is indicated. Results. The cases of linear and power functions of two independent variables are investigated. Different cases of the two-dimensional domain of possible values of the indicators, which are continuous random variables, are considered. The application of the results to production functions is considered, and examples of estimating the probability distribution function of a random variable are offered. Conclusions. The results allow one, in the probabilistic sense, to estimate the result of the economic structure's activity on the basis of the probability distributions of the values of the dependent variables. A prospect for further research is to apply indirect control over economic performance based on economic-mathematical modeling.
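The quantity being derived analytically in the paper can be approximated by Monte Carlo. The hypothetical example below evaluates the distribution function of a power (production-function-type) combination of two independent Uniform(0,1) factors and compares it with the closed form available for this particular case (an illustrative sketch, not the authors' method):

```python
import random

def cdf_mc(func, samplers, y, n=100000, rng=random):
    """Monte Carlo estimate of P(func(X1, ..., Xk) <= y) for independent Xi."""
    hits = sum(func(*(s(rng) for s in samplers)) <= y for _ in range(n))
    return hits / n

# Power function of two independent Uniform(0,1) factors, exponents a != b.
a, b = 2.0, 0.5
f = lambda x1, x2: x1 ** a * x2 ** b
uni = lambda rng: rng.random()

random.seed(2)
y = 0.3
est = cdf_mc(f, [uni, uni], y)
# Closed form for this case: -ln(Y) is a sum of two scaled exponentials,
# giving P(Y <= y) = (a*y^(1/a) - b*y^(1/b)) / (a - b).
exact = (a * y ** (1 / a) - b * y ** (1 / b)) / (a - b)
print(est, exact)   # the estimates should agree closely
```

Such a simulated check is a useful complement to the analytical distribution function when the dependence on the factors is more complicated.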


Author(s):  
Mohammad Shakil ◽  
Dr. Mohammad Ahsanullah ◽  
Dr. B. M. G. Kibria

For a non-negative continuous random variable, Chaudhry and Zubair (2002, p. 19) introduced a probability distribution with a completely monotonic probability density function based on the generalized gamma function, and called it the Macdonald probability function. In this paper, we establish various basic distributional properties of Chaudhry and Zubair's Macdonald probability distribution. Since the percentage points of a given distribution are important for statistical applications, we have also computed the percentage points for different values of the parameter involved. Based on these properties, we establish some new characterization results for Chaudhry and Zubair's Macdonald probability distribution in terms of left and right truncated moments, order statistics, and record values. Characterizations of certain other continuous probability distributions with completely monotonic probability density functions, such as the McKay, Pareto, and exponential distributions, are also discussed using the proposed characterization techniques.


Author(s):  
QINGHUA HU ◽  
DAREN YU

Yager's entropy was proposed to compute the information of a fuzzy indiscernibility relation. In this paper we present a novel interpretation of Yager's entropy from the point of view of the discernibility power of a relation. Some basic definitions in Shannon's information theory are then generalized based on Yager's entropy. We introduce joint entropy, conditional entropy, mutual information, and relative entropy to compute the information changes under fuzzy indiscernibility relation operations. Conditional entropy and relative conditional entropy are proposed to measure the information increment, which is interpreted as the significance of an attribute in the fuzzy rough set model. As an application, we redefine the independence of an attribute set, reduct, and relative reduct in the fuzzy rough set model based on Yager's entropy. Experimental results show that the proposed approach is suitable for fuzzy and numeric data reduction.


1987 ◽  
Vol 109 (3) ◽  
pp. 366-371 ◽  
Author(s):  
R. P. Sukhija ◽  
A. C. Rao

Information theory, based on the concept of probability, was developed for application in communication engineering. The theory utilizes the concept of "entropy," a measure of uncertainty. The errors between the output of a path-generating linkage and the desired path can be taken as random variables. The entropy function is formulated in terms of the design parameters, and minimization of the maximum entropy leads to the synthesis of a path-generating mechanism. Optimum allocation of tolerances on the link lengths is also carried out.


2011 ◽  
Vol 2011 ◽  
pp. 1-13 ◽  
Author(s):  
Linda Smail

Bayesian networks are graphical probabilistic models through which we can acquire, capitalize on, and exploit knowledge; over the last decade they have become an important tool for research and applications in artificial intelligence and many other fields. This paper presents Bayesian networks and discusses the inference problem in such models. It proposes a statement of the problem and a method to compute probability distributions, using D-separation to simplify the computation of probabilities in Bayesian networks. Given a Bayesian network over a family of random variables, this paper presents a result on the computation of the probability distribution of a subset of these variables, using a computation algorithm and D-separation properties separately. It also shows the uniqueness of the obtained result.
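A minimal sketch of both ingredients, computing a marginal from the factored joint and using d-separation to shortcut a conditional, on a hypothetical three-node chain A → B → C (illustrative numbers, not from the paper):

```python
import itertools

# A chain network A -> B -> C with binary variables, specified by
# illustrative (hypothetical) conditional probability tables.
pA = {0: 0.6, 1: 0.4}
pB_given_A = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}   # pB_given_A[a][b]
pC_given_B = {0: {0: 0.9, 1: 0.1}, 1: {0: 0.5, 1: 0.5}}

def marginal_C():
    """P(C) by summing the factored joint P(A)P(B|A)P(C|B) over A and B."""
    pc = {0: 0.0, 1: 0.0}
    for a, b, c in itertools.product([0, 1], repeat=3):
        pc[c] += pA[a] * pB_given_A[a][b] * pC_given_B[b][c]
    return pc

def p_C_given_A_B(a, b, c):
    """Brute-force P(C|A,B); d-separation predicts it equals P(C|B),
    because the observed B blocks the only path from A to C."""
    num = pA[a] * pB_given_A[a][b] * pC_given_B[b][c]
    den = sum(pA[a] * pB_given_A[a][b] * pC_given_B[b][cc] for cc in [0, 1])
    return num / den

print(marginal_C())
print(p_C_given_A_B(0, 1, 1), pC_given_B[1][1])   # equal: B blocks the path
```

On larger networks this is exactly the saving D-separation buys: whole branches of the summation drop out once a separating set is observed.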


Entropy ◽  
2021 ◽  
Vol 23 (7) ◽  
pp. 896
Author(s):  
Evaristo José Madarro-Capó ◽  
Carlos Miguel Legón-Pérez ◽  
Omar Rojas ◽  
Guillermo Sosa-Gómez

This paper presents a criterion, based on information theory, to measure the amount of average information that the output sequences of RC4 provide about its internal state. The test statistic used is the sum of the maximum-likelihood estimates of the entropies H(jt|zt), corresponding to the probability distributions P(jt|zt) of the sequences of random variables (jt)t∈T and (zt)t∈T, which are independent but not identically distributed, where zt are the known values of the outputs and jt is one of the unknown elements of the internal state of RC4. It is experimentally demonstrated that the test statistic allows for determining the most vulnerable RC4 outputs, and it is proposed as a vulnerability metric for each RC4 output sequence with respect to the iterative probabilistic attack.
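The entropy estimates in question can be formed with a plug-in (empirical-frequency) estimator of H(J|Z) from observed pairs. A toy sketch with hypothetical data, not actual RC4 output:

```python
import math
from collections import Counter

def conditional_entropy(pairs):
    """Plug-in estimate of H(J | Z) from observed (z, j) pairs:
    H(J|Z) = H(Z, J) - H(Z), each estimated from empirical frequencies."""
    def H(counts):
        n = sum(counts.values())
        return -sum(c / n * math.log2(c / n) for c in counts.values())
    return H(Counter(pairs)) - H(Counter(z for z, _ in pairs))

# When z fully determines j, the conditional entropy is 0 — the output leaks
# everything; when j is independent of z, it equals H(J) — nothing leaks.
print(conditional_entropy([(0, 0), (1, 1), (0, 0), (1, 1)]))   # 0.0
print(conditional_entropy([(0, 0), (0, 1), (1, 0), (1, 1)]))   # 1.0
```

Low estimated H(jt|zt) at a position t is what would flag that output as vulnerable in the sense of the paper's metric.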

