The information theoretic entropy function as a total expected participation index for communication network experiments

Psychometrika ◽  
1966 ◽  
Vol 31 (2) ◽  
pp. 249-254 ◽  
Author(s):  
Kenneth D. Mackenzie

2018 ◽  
Vol 140 (12) ◽  
pp. S16-S23
Author(s):  
Hanieh Agharazi ◽  
Wanchat Theeranaew ◽  
Richard M. Kolacinski ◽  
Kenneth A. Loparo

We propose an information-theoretic framework that models a complex system as a communication network: physical devices are organized into subsystems, and the subsystems communicate through an information channel governed by the dynamics of the system.
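The abstract gives no formulas; as a generic illustration of the channel view it describes, the sketch below estimates the mutual information between two recorded subsystem signals from a two-dimensional histogram. The function name, the toy signals, and the plug-in estimator are assumptions for illustration, not the paper's framework.

```python
import numpy as np

def mutual_information(x, y, bins=16):
    """Histogram-based plug-in estimate of I(X; Y) in bits.

    Treats two recorded subsystem signals as the input and output of an
    information channel and measures their statistical coupling.
    """
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()             # joint distribution
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y
    nz = pxy > 0                          # avoid log(0)
    return float((pxy[nz] * np.log2(pxy[nz] / (px @ py)[nz])).sum())

# Example: a noisy channel between two hypothetical subsystem signals.
rng = np.random.default_rng(0)
x = rng.normal(size=10_000)
y = x + 0.5 * rng.normal(size=10_000)     # y is a noisy copy of x
print(f"I(X; Y) ~ {mutual_information(x, y):.2f} bits")
```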


2011 ◽  
Vol 61 (5) ◽  
pp. 415-430 ◽  
Author(s):  
Madasu Hanmandlu ◽  
Anirban Das

Content-based image retrieval focuses on intuitive and efficient methods for retrieving images from databases based on the content of the images. This paper devises a new entropy function, termed an 'information-theoretic measure', that quantifies the information content of an image. Among the various query paradigms, 'query by example' (QBE) is adopted to set a query image for retrieval from a large image database. Colour and texture features are extracted using the new entropy function, and the dominant colour is treated as a visual feature for a particular set of images. Colour and texture thus constitute a two-dimensional feature vector for indexing the images; this low dimensionality speeds up the atomic query. Indices in a large database system help retrieve the images relevant to the query image without examining every image in the database. The entropy values of colour and texture, together with the dominant colour, are used to measure similarity. The utility of the proposed retrieval system based on these information-theoretic measures is demonstrated on a benchmark dataset.

Defence Science Journal, 2011, 61(5), pp. 415-430, DOI: http://dx.doi.org/10.14429/dsj.61.1177
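As a rough illustration of the indexing pipeline described above, the sketch below builds a two-dimensional (colour, texture) feature vector and ranks database images by distance to a query, QBE-style. The standard Shannon entropy is used as a stand-in for the paper's new entropy function, which is not reproduced here; the gradient-based texture proxy and all function names are likewise assumptions.

```python
import numpy as np

def shannon_entropy(values, bins=256):
    """Shannon entropy (bits) of a histogram over the given values.

    Stand-in for the paper's custom entropy function; the indexing
    pipeline is the same either way.
    """
    hist, _ = np.histogram(values, bins=bins)
    p = hist[hist > 0] / hist.sum()
    return float(-(p * np.log2(p)).sum())

def feature_vector(rgb_image):
    """Two-dimensional (colour, texture) index entry for one image."""
    gray = rgb_image.mean(axis=2)                  # crude luminance
    colour_entropy = shannon_entropy(rgb_image.ravel())
    # Texture proxy: entropy of local gradient magnitudes.
    gx, gy = np.gradient(gray)
    texture_entropy = shannon_entropy(np.hypot(gx, gy).ravel())
    return np.array([colour_entropy, texture_entropy])

def rank(query_fv, database_fvs):
    """Query by example: order database images by feature distance."""
    d = np.linalg.norm(database_fvs - query_fv, axis=1)
    return np.argsort(d)
```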


1993 ◽  
Vol 7 (3) ◽  
pp. 413-420 ◽  
Author(s):  
Pietro Muliere ◽  
Giovanni Parmigiani ◽  
Nicholas G. Polson

Interest in the informational content of truncation motivates the study of the residual entropy function, that is, the entropy of a right truncated random variable as a function of the truncation point. In this note we show that, under mild regularity conditions, the residual entropy function characterizes the probability distribution. We also derive relationships among residual entropy, monotonicity of the failure rate, and stochastic dominance. Information theoretic measures of distances between distributions are also revisited from a similar perspective. In particular, we study the residual divergence between two positive random variables and investigate some of its monotonicity properties. The results are relevant to information theory, reliability theory, search problems, and experimental design.
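For reference, the residual entropy of a right-truncated random variable can be written as below; the notation is assumed here since the abstract does not fix any ($f$ is the density, $F$ the distribution function, and $t$ the truncation point):

```latex
% Entropy of the conditional law of X given X <= t, as a function of t
H(f; t) \;=\; -\int_0^t \frac{f(x)}{F(t)} \,\log\!\frac{f(x)}{F(t)} \,dx
```

In this notation, the characterization result says that, under mild regularity conditions, knowing $H(f;t)$ for all truncation points $t$ determines the distribution $F$.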


Author(s):  
Ryan Ka Yau Lai ◽  
Youngah Do

This article explores a method of creating confidence bounds for information-theoretic measures in linguistics, such as entropy, Kullback-Leibler Divergence (KLD), and mutual information. We show that a useful measure of uncertainty can be derived from simple statistical principles, namely the asymptotic distribution of the maximum likelihood estimator (MLE) and the delta method. Three case studies from phonology and corpus linguistics are used to demonstrate how to apply it and examine its robustness against common violations of its assumptions in linguistics, such as insufficient sample size and non-independence of data points.
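A minimal sketch of the approach the abstract describes, for the entropy case, assuming a multinomial model with MLE $\hat p_i = n_i/n$: the delta method gives the asymptotic variance $(\sum_i p_i \log^2 p_i - H^2)/n$, from which a normal-approximation confidence interval follows. The function name and the toy counts are illustrative.

```python
import numpy as np
from scipy import stats

def entropy_ci(counts, level=0.95):
    """Plug-in entropy (nats) with a delta-method confidence interval.

    Uses the textbook asymptotics the abstract appeals to: the MLE
    p_hat = counts/n is asymptotically normal, and the delta method
    gives Var(H_hat) ~ (sum_i p_i * log(p_i)**2 - H**2) / n.
    """
    counts = np.asarray(counts, dtype=float)
    n = counts.sum()
    p = counts[counts > 0] / n
    h = -(p * np.log(p)).sum()                      # plug-in entropy
    var = ((p * np.log(p) ** 2).sum() - h ** 2) / n  # delta-method variance
    z = stats.norm.ppf(0.5 + level / 2)
    half = z * np.sqrt(max(var, 0.0))
    return h, (h - half, h + half)

# Example: entropy of a small categorical sample with its 95% CI.
h, (lo, hi) = entropy_ci([50, 30, 15, 5])
print(f"H = {h:.3f} nats, 95% CI = ({lo:.3f}, {hi:.3f})")
```

The same recipe (MLE asymptotics plus the delta method) extends to KLD and mutual information, since both are smooth functions of the estimated cell probabilities.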

