Further improvements of some bounds on entropy measures in information theory

1999 ◽  
pp. 599-611
Author(s):  
M. Matić ◽  
Charles E. M. Pearce ◽  
Josip Pečarić
Entropy ◽  
2021 ◽  
Vol 23 (12) ◽  
pp. 1618
Author(s):  
Rubem P. Mondaini ◽  
Simão C. de Albuquerque Neto

The Khinchin–Shannon generalized inequalities for entropy measures in information theory are a paradigm that can be used to test the synergy of the distributions of probabilities of occurrence in physical systems. The rich algebraic structure associated with the introduction of escort probabilities seems to be essential for deriving these inequalities for the two-parameter Sharma–Mittal set of entropy measures. We also emphasize the derivation of these inequalities for the special cases of the one-parameter Havrda–Charvat, Rényi and Landsberg–Vedral entropy measures.
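
The abstract names the Sharma–Mittal family and its one-parameter special cases without giving formulas. The following is a minimal numerical sketch under the usual conventions; the parameter names r and q and the escort helper are this sketch's own assumptions, not notation taken from the paper.

```python
import numpy as np

def sharma_mittal(p, r, q):
    """Two-parameter Sharma-Mittal entropy (nats) of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]                       # 0 * log 0 = 0 convention
    if np.isclose(q, 1.0):
        h = -np.sum(p * np.log(p))     # Shannon entropy, the q -> 1 limit
        if np.isclose(r, 1.0):
            return h
        return (np.exp((1.0 - r) * h) - 1.0) / (1.0 - r)
    s = np.sum(p ** q)                 # the sum that also defines the escort distribution
    if np.isclose(r, 1.0):
        return np.log(s) / (1.0 - q)   # Renyi entropy, the r -> 1 limit
    return (s ** ((1.0 - r) / (1.0 - q)) - 1.0) / (1.0 - r)

def escort(p, q):
    """Escort probabilities P_i = p_i^q / sum_j p_j^q used in the derivations."""
    p = np.asarray(p, dtype=float)
    return p ** q / np.sum(p ** q)

p, q = [0.5, 0.25, 0.125, 0.125], 0.7
print(sharma_mittal(p, r=q, q=q))        # Havrda-Charvat (Tsallis) case: r = q
print(sharma_mittal(p, r=1.0, q=q))      # Renyi case: r -> 1
print(sharma_mittal(p, r=2.0 - q, q=q))  # Landsberg-Vedral case: r = 2 - q
print(escort(p, q))
```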


2022 ◽  
Vol 24 (1) ◽  
pp. 105-118
Author(s):  
Mervat Mahdy ◽  
Dina S. Eltelbany ◽  
Hoda Mohammed ◽  
...  

Entropy measures the amount of uncertainty and dispersion in an unknown or random quantity. The concept, first introduced by Shannon (1948), is important for studies in many areas: in information theory, entropy measures the amount of information in each received message; in physics, it is the basic concept quantifying the disorder of a thermodynamical system; and so on. In this paper, we introduce an alternative measure of entropy, called 𝐻𝑁-entropy. Unlike Shannon entropy, this proposed measure of order α and β is more flexible. We then introduce the cumulative residual 𝐻𝑁-entropy, the cumulative 𝐻𝑁-entropy, and their weighted versions. Finally, we present a comparison between Shannon entropy and 𝐻𝑁-entropy, together with numerical results.
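
The abstract does not define the 𝐻𝑁-entropy itself, so the sketch below only illustrates the baseline quantities it builds on: discrete Shannon entropy and an empirical cumulative residual entropy estimated from the survival function of a sample. All function names here are this sketch's own.

```python
import numpy as np

def shannon_entropy(p):
    """Discrete Shannon (1948) entropy in nats."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def cumulative_residual_entropy(sample):
    """Empirical cumulative residual entropy, -integral S(x) log S(x) dx,
    with S the empirical survival function of the sample."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = len(x)
    s = 1.0 - np.arange(1, n) / n      # survival estimate on [x_i, x_(i+1))
    dx = np.diff(x)
    mask = s > 0
    return -np.sum(s[mask] * np.log(s[mask]) * dx[mask])

rng = np.random.default_rng(0)
print(shannon_entropy([0.5, 0.3, 0.2]))
print(cumulative_residual_entropy(rng.exponential(1.0, size=10_000)))  # ~1 for Exp(1)
```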


2001 ◽  
Vol 42 (3) ◽  
pp. 387-398 ◽  
Author(s):  
M. Matić ◽  
C. E. M. Pearce ◽  
J. Pečarić

Recently, Dragomir and Goh have produced some interesting new bounds relating to entropy measures in information theory. We establish several refinements of their results.


Entropy ◽  
2021 ◽  
Vol 23 (10) ◽  
pp. 1361
Author(s):  
Mariano Matilla-García ◽  
Manuel Ruiz Marín

Symbolic analysis has been developed and used successfully in very diverse fields. In recent literature, the contributions of symbolic analysis to the study of complex dynamics and network structures are easy to find, especially those based on symbolic entropy measures (such as permutation entropy) and the symbolic correlation integral (connected with Rényi and Tsallis entropies) [...]
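
Since permutation entropy is named explicitly, here is a minimal Bandt–Pompe-style sketch; the order and delay parameters are conventional defaults, not values drawn from this editorial.

```python
import numpy as np
from collections import Counter
from math import factorial, log

def permutation_entropy(x, order=3, delay=1):
    """Normalized Bandt-Pompe permutation entropy of a 1-D time series."""
    x = np.asarray(x, dtype=float)
    patterns = Counter()
    for i in range(len(x) - (order - 1) * delay):
        window = x[i : i + order * delay : delay]
        patterns[tuple(np.argsort(window))] += 1   # ordinal pattern of the window
    total = sum(patterns.values())
    h = -sum((c / total) * log(c / total) for c in patterns.values())
    return h / log(factorial(order))               # normalize to [0, 1]

rng = np.random.default_rng(1)
print(permutation_entropy(rng.normal(size=5000)))          # near 1 for white noise
print(permutation_entropy(np.sin(np.arange(5000) * 0.1)))  # low for regular dynamics
```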


2004 ◽  
Vol 76 (2) ◽  
pp. 425-428 ◽  
Author(s):  
José R.C. Piqueira

Using Shannon information theory is a common strategy for measuring any kind of variability in a signal or phenomenon. Some methods have been developed to adapt information-entropy measures to bird-song data, seeking to emphasize its versatility. This classical approach, based on the concept of the bit, produces interesting results. The original idea developed in this paper is to use quantum information theory and the quantum bit (q-bit) concept in order to provide a more complete view of the experimental results.
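
The paper's own quantum treatment is not reproduced here; the sketch below merely contrasts the classical bit-based entropy of a symbol repertoire with the von Neumann entropy of a q-bit state, to make the distinction the abstract draws concrete. The syllable counts are hypothetical.

```python
import numpy as np

def shannon_entropy_bits(counts):
    """Classical entropy (bits) of observed symbol counts, e.g. song syllables."""
    p = np.asarray(counts, dtype=float)
    p = p / p.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def von_neumann_entropy(rho):
    """Quantum entropy S(rho) = -Tr(rho log2 rho), from the eigenvalues of rho."""
    w = np.linalg.eigvalsh(rho)
    w = w[w > 1e-12]
    return -np.sum(w * np.log2(w))

print(shannon_entropy_bits([40, 30, 20, 10]))   # classical syllable repertoire

# A q-bit in an equal superposition is a *pure* state: zero entropy, even
# though measuring it yields maximally random classical outcomes.
plus = np.array([[0.5, 0.5], [0.5, 0.5]])
print(von_neumann_entropy(plus))                # 0.0
print(von_neumann_entropy(np.eye(2) / 2))       # 1.0 bit, maximally mixed
```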


2020 ◽  
Author(s):  
Constantinos Papadimitriou ◽  
Georgios Balasis ◽  
Adamantia-Zoe Boutsi ◽  
Omiros Giannakis ◽  
Anastasios Anastasiadis ◽  
...  

Recently, many novel concepts originating in dynamical systems or information theory have been developed, partly motivated by specific research questions linked to the geosciences, and have found a variety of applications. This continuously extending toolbox of nonlinear time series analysis highlights the importance of dynamical complexity for understanding the behavior of the complex solar wind–magnetosphere–ionosphere–thermosphere coupling system and its components. Here, we propose to apply such new approaches, mainly a series of entropy methods, to the time series of the Earth's magnetic field measured by the Swarm constellation. Swarm is an ESA mission launched on November 22, 2013, comprising three satellites in low polar Earth orbits. The mission delivers data that provide new insight into the Earth system by improving our understanding of the Earth's interior as well as the near-Earth electromagnetic environment. We show successful applications of methods originating in information theory to the quantitative study of complexity in the dynamical response of the topside ionosphere, at Swarm altitudes, focusing on the most intense magnetic storms of the present solar cycle.


2004 ◽  
Vol 60 (2) ◽  
pp. 144-163 ◽  
Author(s):  
Jodi Kearns ◽  
Brian O'Connor

This study explores the use of the information-theory entropy equation in representations of videos for children. The calculated rates of information in the videos are calibrated against the corresponding perceived rates of information elicited from the 12 seven- to ten-year-old girls who were shown the video documents. Entropy measures are calculated for several video elements: set time, set incidence, verbal time, verbal incidence, set constraint, nonverbal dependence, and character appearance. As hypothesized, the mechanically calculated entropy measure (CEM) was found to be sufficiently similar to the perceived entropy measure (PEM) made by the children that it can serve as a useful and predictive element of representations of children's videos. The relationships between the CEM and the PEM show that CEM could stand in for PEM in order to enrich representations of video documents for this age group.
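
The study's exact calculation is not given in the abstract; a plausible minimal reading, Shannon entropy of how a video's running time is distributed across the states of one element, looks like the sketch below. The duration figures are hypothetical.

```python
import numpy as np

def element_entropy(durations):
    """Shannon entropy (bits) of how running time is distributed across the
    states of one video element, e.g. seconds spent in each distinct set."""
    p = np.asarray(durations, dtype=float)
    p = p / p.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical 'set time' profiles: seconds spent in each of four sets.
print(element_entropy([300, 120, 60, 20]))    # a few long-held sets -> lower entropy
print(element_entropy([125, 125, 125, 125]))  # even distribution -> maximal 2 bits
```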


Author(s):  
Francisco Torrens ◽  
Gloria Castellano

Numerous definitions of complexity have been proposed, with little consensus. The definition here is related to Kolmogorov complexity and Shannon entropy measures. However, the price is the introduction of context dependence into the definition of complexity. Such context dependence is an inherent property of complexity. Scientists are uncomfortable with context dependence that smacks of subjectivity, which is why little agreement is found on the meaning of these terms. In an article published in Molecules, Lin presented a novel approach to assessing molecular diversity based on Shannon information theory. A set of compounds is viewed as a static collection of microstates that can register information about their environment. The method is characterized by a strong tendency to oversample remote areas of the feature space and to produce unbalanced designs. This chapter demonstrates the limitation with some simple examples and provides a rationale for the failure to produce consistent results.


2013 ◽  
Vol 3 (6) ◽  
pp. 20130030 ◽  
Author(s):  
Minus van Baalen

Evolution can be characterized as a process that shapes and maintains information across generations. It is also widely acknowledged that information may play a pivotal role in many other ecological processes. Most of the ecologically relevant information (and some important evolutionary information too) is of a very subjective and analogue kind: individuals use cues that may carry information useful only to them and not to others. This is a problem because most information theory has been developed for objective and discrete information. Can information theory be extended to incorporate multiple forms of information, each with its own (physical) carriers and dynamics? Here, I will not review all the possible roles that information can play, but rather ask what conditions an appropriate theory should satisfy. The most promising starting point is provided by entropy measures of conditional probabilities (using the so-called Kullback–Leibler divergence), allowing an assessment of how acquiring information can lead to an increase in fitness. It is irrelevant (to a certain extent) where the information comes from (genes, experience or culture), but it is important to realize that information is not merely subjective: its value should be evaluated in fitness terms, and it is here that evolutionary theory has enormous potential. A number of important stumbling blocks remain, however; namely, the identification of whose fitness it concerns and the role played by spatio-temporal dynamics (which is tightly linked to the nature of the physical carriers of the information and the processes that impact on it).
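
The Kullback–Leibler divergence is named explicitly, so a minimal sketch follows; the toy prior/posterior numbers and the framing of the cue update are this sketch's own illustration, not the author's model.

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in nats."""
    p, q = np.asarray(p, dtype=float), np.asarray(q, dtype=float)
    mask = p > 0
    return np.sum(p[mask] * np.log(p[mask] / q[mask]))

# Toy illustration: an organism holds a prior over two environmental states;
# a cue updates it to a posterior. The KL divergence from prior to posterior
# quantifies the information gained, which bounds the achievable log-fitness
# gain in Kelly-style bet-hedging arguments.
prior = np.array([0.5, 0.5])
posterior_given_cue = np.array([0.9, 0.1])
print(kl_divergence(posterior_given_cue, prior))  # information gained from the cue
```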

