Approximating Information Measures for Fields

Entropy ◽  
2020 ◽  
Vol 22 (1) ◽  
pp. 79 ◽  
Author(s):  
Łukasz Dębowski

We supply corrected proofs of the invariance of completion and the chain rule for the Shannon information measures of arbitrary fields, as stated by Dębowski in 2009. Our corrected proofs rest on a number of auxiliary approximation results for Shannon information measures, which may be of independent interest. As also discussed briefly in this article, the generalized calculus of Shannon information measures for fields, including the invariance of completion and the chain rule, is useful in particular for studying the ergodic decomposition of stationary processes and its links with statistical modeling of natural language.
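
The chain rule in question is the familiar identity H(X, Y) = H(X) + H(Y | X), which the article extends to arbitrary fields. As a minimal sketch of the finite discrete special case (not the article's general measure-theoretic setting), the identity can be checked numerically:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a discrete distribution (zero entries skipped)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Illustrative joint distribution of two binary variables X, Y (rows: x, cols: y).
pxy = np.array([[0.30, 0.20],
                [0.10, 0.40]])

px = pxy.sum(axis=1)                      # marginal of X
h_xy = entropy(pxy.ravel())               # joint entropy H(X, Y)
h_x = entropy(px)                         # marginal entropy H(X)
# Conditional entropy H(Y | X) = sum_x p(x) * H(Y | X = x)
h_y_given_x = sum(px[i] * entropy(pxy[i] / px[i]) for i in range(len(px)))

# Chain rule: H(X, Y) = H(X) + H(Y | X)
assert np.isclose(h_xy, h_x + h_y_given_x)
print(h_xy, h_x + h_y_given_x)
```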

Author(s):  
GIUSEPPE BUSANELLO ◽  
GIULIANELLA COLETTI ◽  
BARBARA VANTAGGI

We deal with conditional decomposable information measures, directly defined as functions on a suitable set of conditional events satisfying a class of axioms. For these general measures we introduce a notion of independence and study its main properties in order to compare it with classical definitions in the literature. The particular case of the Wiener-Shannon information measure is taken into consideration, and the links between the proposed independence for information measures and independence for the underlying probability are analyzed.
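
In the classical Wiener-Shannon case the abstract refers to, the information of an event E is I(E) = -log P(E), and independence of the information measure mirrors probabilistic independence: I(E | F) = I(E) exactly when P(E ∩ F) = P(E) P(F). A small sketch with hypothetical event probabilities (the axiomatic framework of the paper is far more general):

```python
import math

def info(p):
    """Wiener-Shannon information (in bits) of an event with probability p."""
    return -math.log2(p)

# Hypothetical events E, F with P(E) = 0.5, P(F) = 0.25.
p_e, p_f = 0.5, 0.25

# Probabilistically independent case: P(E & F) = P(E) * P(F).
p_ef = p_e * p_f
# Conditional information I(E | F) = I(E & F) - I(F); for independent
# events it reduces to the unconditional information I(E).
i_e_given_f = info(p_ef) - info(p_f)
assert math.isclose(i_e_given_f, info(p_e))

# Dependent case: conditional and unconditional information no longer agree.
p_ef_dep = 0.2                                 # > P(E) * P(F) = 0.125
print(info(p_ef_dep) - info(p_f), info(p_e))   # ~0.32 bits vs 1.0 bit
```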


1989 ◽  
Vol 35 (2) ◽  
pp. 271-274 ◽  
Author(s):  
L H Bernstein ◽  
C J Leukhardt-Fairfield ◽  
W Pleban ◽  
R Rudolph

In this ongoing study, albumin and prealbumin (transthyretin) changes were compared in 40 patients managed with enteral and (or) parenteral support with attainment of caloric/protein goals. The concentration of prealbumin in serum changed rapidly and more accurately reflected the current nutritional status of these patients than did that of albumin. We determined concentrations of albumin and prealbumin that reflected significant improvement in nutritional status, using Rudolph's approach based on Shannon information measures. Reference values for albumin and prealbumin in the treatment populations were 25 g/L and 107 mg/L, respectively. A prealbumin concentration of 135 mg/L or greater reflected a return to stable status.


Entropy ◽  
2021 ◽  
Vol 23 (8) ◽  
pp. 1021 ◽
Author(s):  
James Fullwood ◽  
Arthur J. Parzygnat

We provide a stochastic extension of the Baez–Fritz–Leinster characterization of the Shannon information loss associated with a measure-preserving function. This recovers the conditional entropy and a closely related information-theoretic measure that we call conditional information loss. Although not functorial, these information measures are semi-functorial, a concept we introduce that is definable in any Markov category. We also introduce the notion of an entropic Bayes’ rule for information measures, and we provide a characterization of conditional entropy in terms of this rule.
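
For a deterministic measure-preserving map f between finite probability spaces, the Baez-Fritz-Leinster information loss is the entropy drop H(p) - H(f*p), which coincides with the conditional entropy H(X | f(X)); the stochastic extension described above generalizes this. A finite-case sketch of that identity, with an illustrative distribution and map:

```python
import numpy as np

def entropy(p):
    """Shannon entropy in bits of a discrete distribution (zero entries skipped)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Source distribution p on {0, 1, 2, 3} and a deterministic map f.
p = np.array([0.1, 0.2, 0.3, 0.4])
f = [0, 0, 1, 1]                  # f collapses {0,1} -> 0 and {2,3} -> 1

# Pushforward distribution q = f_* p on the codomain {0, 1}.
q = np.zeros(2)
for x, y in enumerate(f):
    q[y] += p[x]

# Information loss of f is H(p) - H(q); for a deterministic map this
# equals the conditional entropy H(X | f(X)).
loss = entropy(p) - entropy(q)
h_x_given_fx = sum(q[y] * entropy(p[[x for x in range(4) if f[x] == y]] / q[y])
                   for y in range(2))
assert np.isclose(loss, h_x_given_fx)
print(loss)
```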


Author(s):  
Stephen R Cole ◽  
Jessie K Edwards ◽  
Sander Greenland

Measures of information and surprise, such as the Shannon information value (S value), quantify the signal present in a stream of noisy data. We illustrate the use of such information measures in the context of interpreting P values as compatibility indices. S values help communicate the limited information supplied by conventional statistics and cast a critical light on cutoffs used to judge and construct those statistics. Misinterpretations of statistics may be reduced by interpreting P values and interval estimates using compatibility concepts and S values instead of “significance” and “confidence.”
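
The S value is simply the base-2 surprisal of the P value, S = -log2(p): the number of bits of information against the test hypothesis, comparable to the surprise of that many heads in a row from a fair coin. A short illustration:

```python
import math

def s_value(p):
    """Shannon information (surprisal) of a P value, in bits: S = -log2(p)."""
    return -math.log2(p)

# p = 0.05 carries about 4.3 bits against the test hypothesis --
# roughly as surprising as 4 heads in a row from a fair coin.
for p in (0.5, 0.05, 0.005):
    print(f"p = {p:<6} ->  S = {s_value(p):.2f} bits")
```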


Open Physics ◽  
2008 ◽  
Vol 6 (1) ◽  
Author(s):  
Piotr Garbaczewski

We carry out a systematic study of uncertainty measures that are generic to dynamical processes of varied origins, provided they induce suitable continuous probability distributions. The major technical tools are information-theoretic methods and inequalities satisfied by the Fisher and Shannon information measures. We focus on the compatibility of these inequalities with the prescribed (deterministic, random, or quantum) temporal behavior of the pertinent probability densities.
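
One classical inequality of this kind (offered here as an illustrative assumption, not necessarily the specific one studied in the article) is Stam's inequality in one dimension, N(X) J(X) ≥ 1, relating the entropy power N(X) = exp(2h(X))/(2πe) to the Fisher information J(X); Gaussian densities saturate it. A closed-form check:

```python
import numpy as np

def gaussian_entropy(sigma):
    """Differential (Shannon) entropy in nats of N(0, sigma^2)."""
    return 0.5 * np.log(2 * np.pi * np.e * sigma**2)

def gaussian_fisher(sigma):
    """Fisher information (with respect to location) of N(0, sigma^2)."""
    return 1.0 / sigma**2

# Stam's inequality in 1D: N(X) * J(X) >= 1, with equality for Gaussians.
for sigma in (0.5, 1.0, 3.0):
    h = gaussian_entropy(sigma)
    N = np.exp(2 * h) / (2 * np.pi * np.e)   # entropy power; equals sigma^2 here
    J = gaussian_fisher(sigma)
    print(f"sigma = {sigma}:  N * J = {N * J:.6f}")   # exactly 1 for Gaussians
```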


Author(s):  
Edward Bormashenko ◽  
Irina Legchenkova ◽  
Mark Frenkel ◽  
Nir Shvalb ◽  
Shraga Shoval

Informational (Shannon) measures of symmetry are introduced and analyzed for patterns built of 1D and 2D shapes. The informational measure of symmetry Hsym(G) characterizes the averaged uncertainty in the presence of symmetry elements from the group G in a given pattern, whereas the Shannon-like measure of symmetry Ωsym(G) quantifies the averaged uncertainty of the appearance of shapes possessing in total n elements of symmetry belonging to the group G in a given pattern. Hsym(G1) = Ωsym(G1) = 0 for patterns built of irregular, non-symmetric shapes. Both informational measures of symmetry are intensive parameters of the pattern: they do not depend on the number of shapes, their size, or the area of the pattern. They are also insensitive to the long-range order inherent in the pattern. Informational measures of symmetry of fractal patterns are addressed. Mixed patterns including curves and shapes are considered. The time evolution of the Shannon measures of symmetry is treated. Close-packed and dispersed 2D patterns are analyzed.
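
The abstract does not state the defining formula, so the following is a hypothetical sketch assuming Hsym is the Shannon entropy of the normalized frequencies with which symmetry elements of G occur among the shapes of the pattern; the function name and toy pattern are illustrative only. It does reproduce two stated properties: the measure vanishes for patterns of non-symmetric shapes, and it is intensive (unchanged when the pattern is tiled with more copies of the same shapes):

```python
import math
from collections import Counter

def h_sym(symmetry_elements):
    """Hypothetical sketch of H_sym: Shannon entropy (in nats) of the
    normalized frequencies of symmetry elements found in a pattern.
    `symmetry_elements` lists, per shape, the elements of G it possesses."""
    counts = Counter(e for shape in symmetry_elements for e in shape)
    total = sum(counts.values())
    if total == 0:
        return 0.0          # pattern built of irregular, non-symmetric shapes
    return -sum((c / total) * math.log(c / total) for c in counts.values())

# Toy pattern: two squares carry a C4 axis and two mirror types; one circle-free
# irregular shape carries no symmetry elements.
pattern = [("C4", "m1", "m2"), ("C4", "m1", "m2"), ()]
print(h_sym(pattern))       # entropy over the {C4, m1, m2} frequencies

# Intensivity: repeating the same pattern leaves the measure unchanged.
assert math.isclose(h_sym(pattern), h_sym(pattern * 10))
```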

