𝐻𝑁-Entropy: A New Measure of Information and Its Properties

2022
Vol 24 (1)
pp. 105-118
Author(s):
Mervat Mahdy
Dina S. Eltelbany
Hoda Mohammed
...

Entropy measures the amount of uncertainty and dispersion of an unknown or random quantity. The concept was first introduced by Shannon (1948) and is important in many areas: in information theory, entropy measures the amount of information in each message received; in physics, it is the basic concept that measures the disorder of a thermodynamical system; and so on. In this paper, we introduce an alternative measure of entropy, called 𝐻𝑁-entropy. Unlike Shannon entropy, the proposed measure has orders α and β, which make it more flexible than Shannon's. We then introduce the cumulative residual 𝐻𝑁-entropy, the cumulative 𝐻𝑁-entropy, and their weighted versions. Finally, a comparison between Shannon entropy and 𝐻𝑁-entropy and numerical results are presented.
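For a concrete reference point, here is a minimal Python sketch of the classical Shannon entropy that the proposed 𝐻𝑁-entropy generalizes; the two-parameter (α, β) measure itself is defined in the paper and is not reproduced here.

```python
import math

def shannon_entropy(probs, base=2):
    """Shannon entropy H(p) = -sum(p_i * log(p_i)) of a discrete distribution.

    Zero-probability outcomes contribute nothing, by the convention 0*log(0) = 0.
    """
    if abs(sum(probs) - 1.0) > 1e-9:
        raise ValueError("probabilities must sum to 1")
    return -sum(p * math.log(p, base) for p in probs if p > 0)

# A fair coin carries 1 bit per outcome; a biased coin carries less.
print(shannon_entropy([0.5, 0.5]))   # 1.0
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```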

2017
Vol 54 (2)
pp. 379-393
Author(s):
A. Toomaj
S. M. Sunoj
J. Navarro

Abstract Recently, Rao et al. (2004) introduced an alternative measure of uncertainty known as the cumulative residual entropy (CRE). It is based on the survival (reliability) function F̅ instead of the probability density function f used in classical Shannon entropy. In reliability-based system design, the performance characteristics of coherent systems are of great importance. Accordingly, in this paper, we study the CRE for coherent and mixed systems when the component lifetimes are identically distributed. Bounds for the CRE of the system lifetime are obtained. We use these results to propose a measure of how close a system is to series and parallel systems of the same size. Our results suggest that the CRE can be viewed as an alternative entropy (dispersion) measure to classical Shannon entropy.
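As a rough illustration of the quantity being studied, the following Python sketch estimates the CRE of Rao et al. (2004), CRE(X) = -∫ F̄(x) log F̄(x) dx, from a nonnegative sample via the empirical survival function; the system-lifetime bounds derived in the paper are not reproduced.

```python
import numpy as np

def cumulative_residual_entropy(sample):
    """Empirical CRE: -∫ F̄(x) log F̄(x) dx, with the survival function F̄
    estimated from the empirical distribution of a nonnegative sample."""
    x = np.sort(np.asarray(sample, dtype=float))
    n = len(x)
    # After the i-th smallest observation, (n - i)/n sample points remain,
    # so F̄ is piecewise constant on the n-1 interior intervals.
    surv = (n - np.arange(1, n)) / n
    widths = np.diff(x)
    return -np.sum(surv * np.log(surv) * widths)

# The CRE of an exponential lifetime equals its mean; check by simulation.
rng = np.random.default_rng(0)
print(cumulative_residual_entropy(rng.exponential(scale=2.0, size=100_000)))  # ≈ 2.0
```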


2012
Vol 27 (28)
pp. 1250164
Author(s):  
J. MANUEL GARCÍA-ISLAS

In the three-dimensional spin foam model of quantum gravity with a cosmological constant, there exists a set of observables associated with spin network graphs. A set of probabilities is calculated from these observables, and hence the associated Shannon entropy can be defined. We present the Shannon entropy associated with these observables and find some interesting bounded inequalities. The problem relates measurements, entropy and information theory in a simple way which we explain.


2021
Author(s):  
Haengjin Choe

Abstract Since the publication of Shannon's article about information theory, there have been many attempts to apply information theory to the field of neuroscience. Meanwhile, the Weber–Fechner law of psychophysics states that the magnitude of a person's subjective sensation increases in proportion to the logarithm of the intensity of the external physical stimulus. It is not surprising to assign an amount of information to the response in the Weber–Fechner law. But no one has succeeded in applying information theory directly to that law: the direct links between information theory and the response in the Weber–Fechner law have not yet been found. The proposed theory unveils such a link, and differs subtly from fields such as neural coding that involve complicated calculations and models. Because our theory targets the Weber–Fechner law, which is a macroscopic phenomenon, it does not involve complicated calculations. Our theory is expected to mark a new era in sensory perception research, and it must be studied in parallel with microscopic-scale fields such as neural coding. This article ultimately aims to provide the fundamental concepts and their applications so that a new field of research on stimuli and responses can be created.
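To make the law concrete, a minimal Python sketch of the Weber–Fechner relation S = k·ln(I/I₀) follows; the constant k and the threshold handling are illustrative assumptions, and the paper's information-theoretic link is not reproduced here.

```python
import math

def sensation_magnitude(intensity, threshold, k=1.0):
    """Weber–Fechner law: perceived magnitude grows with the logarithm of
    physical stimulus intensity, S = k * ln(I / I0), where I0 is the
    detection threshold and k a modality-dependent constant."""
    if intensity < threshold:
        return 0.0  # below threshold, no sensation is registered
    return k * math.log(intensity / threshold)

# Tenfold increases in intensity produce equal increments in sensation.
for i in (1, 10, 100, 1000):
    print(i, sensation_magnitude(i, threshold=1.0))
```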


1987
Vol 19 (3)
pp. 385-394
Author(s):  
J R Roy

In the use of information theory for the development of forecasting models, two alternative approaches can be used, based either on Shannon entropy or on Kullback information gain. In this paper, a new approach is presented, which combines the usually superior statistical inference powers of the Kullback procedure with the advantages of the availability of calibrated ‘elasticity’ parameters in the Shannon approach. Situations are discussed where the combined approach is preferable to either of the two existing procedures, and the principles are illustrated with the help of a small numerical example.
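The combined calibration procedure is the paper's own contribution and is not reproduced here; the following Python sketch only shows the two building blocks it draws on, Shannon entropy and the Kullback information gain, together with the identity connecting them under a uniform prior.

```python
import math

def kullback_gain(p, q):
    """Kullback information gain (KL divergence) I(p : q) = sum(p_i * log(p_i / q_i)),
    the information gained when a prior distribution q is updated to p."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def shannon_entropy(p):
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

# With a uniform prior, minimizing Kullback gain is equivalent to
# maximizing Shannon entropy: I(p : uniform) = log(n) - H(p).
p = [0.5, 0.3, 0.2]
uniform = [1 / 3] * 3
print(kullback_gain(p, uniform))           # ≈ 0.0690
print(math.log(3) - shannon_entropy(p))    # same value
```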


2017
Vol 15 (05)
pp. 1750036
Author(s):
Feng-Ming Liu
Mei-Ling Jin

Research on information quantization is important in the field of information theory. Accordingly, based on quantum theory, this report quantifies information from the information-receiving side. First, several concepts are presented, such as the InfoBar, the Amount of Information, and the Power of Information, together with an algorithm for the Power of Information. Then, from the relationship between the InfoBar and the Amount of Information, a wave equation is derived for the received information, and the corresponding wave-function equation is defined as well. Finally, numerical simulation shows that the model results and the sample results essentially match, which supports the validity of the model.


2020
Author(s):
K. Hauke Kraemer
Norbert Marwan
Karoline Wiesner
Jürgen Kurths

Many dynamical processes in the Earth sciences are the product of many interacting components and often have limited predictability, not least because they can exhibit regime transitions (e.g. tipping points). To quantify complexity, entropy measures such as the Shannon entropy of the value distribution are widely used. Amongst other more sophisticated ideas, a number of entropy measures based on recurrence plots have been suggested. Because different structures of the recurrence plot, e.g. diagonal lines, are used for the estimation of probabilities, these entropy measures represent different aspects of the analyzed system and thus behave differently. In the past, this fact has led to difficulties in interpreting and understanding those measures. We review the definitions, motivation and interpretation of these entropy measures, compare their differences and discuss some of the pitfalls when using them.

Finally, we illustrate their potential in an application to paleoclimate time series. Using the presented entropy measures, changes and transitions in past climate dynamics can be identified and interpreted.
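As a hedged illustration of one of the reviewed measures, the sketch below computes the Shannon entropy of the diagonal line length distribution of a recurrence plot (the RQA measure usually denoted ENTR) for a scalar series; time-delay embedding, threshold selection and the other recurrence-based entropies discussed here are deliberately omitted.

```python
import numpy as np

def recurrence_matrix(x, eps):
    """Thresholded recurrence plot of a scalar series: R[i, j] = 1 iff |x_i - x_j| < eps."""
    x = np.asarray(x, dtype=float)
    return (np.abs(x[:, None] - x[None, :]) < eps).astype(int)

def diagonal_line_entropy(R, l_min=2):
    """Shannon entropy of the diagonal line length distribution (RQA's ENTR)."""
    n = R.shape[0]
    lengths = []
    for k in range(1, n):                  # each diagonal above the main one
        diag = np.diagonal(R, offset=k)
        run = 0
        for v in list(diag) + [0]:         # sentinel closes a trailing run
            if v:
                run += 1
            else:
                if run >= l_min:
                    lengths.append(run)
                run = 0
    if not lengths:
        return 0.0
    _, counts = np.unique(lengths, return_counts=True)
    p = counts / counts.sum()
    return float(-np.sum(p * np.log(p)))

# A regular (periodic) signal yields long, uniform diagonals and low ENTR.
x = np.sin(np.linspace(0, 20 * np.pi, 500))
print(diagonal_line_entropy(recurrence_matrix(x, eps=0.1)))
```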


Author(s):  
Aurelio Fernández Bariviera
María Belén Guercio
Lisana B. Martinez
Osvaldo A. Rosso

This paper analyses Libor interest rates for seven different maturities, referring to operations in British pounds, euros, Swiss francs and Japanese yen, during the period 2001–2015. The analysis is performed by means of two quantifiers derived from information theory: the permutation Shannon entropy and the permutation Fisher information measure. An anomalous behaviour in the Libor is detected in all currencies except euros during the years 2006–2012. The stochastic switch is more severe in one, two and three months maturities. Given the special mechanism of Libor setting, we conjecture that the behaviour could have been produced by the manipulation that was uncovered by financial authorities. We argue that our methodology is pertinent as a market overseeing instrument.
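For readers unfamiliar with the first quantifier, the following Python sketch computes the (normalized) permutation Shannon entropy of Bandt and Pompe; the permutation Fisher information measure and the specific Libor preprocessing used in the paper are not reproduced.

```python
import math
import random
from collections import Counter

def permutation_entropy(series, order=3, normalize=True):
    """Permutation (Bandt–Pompe) Shannon entropy: the entropy of the
    distribution of ordinal patterns of `order` consecutive values."""
    patterns = Counter(
        tuple(sorted(range(order), key=lambda k: series[i + k]))
        for i in range(len(series) - order + 1)
    )
    total = sum(patterns.values())
    h = -sum((c / total) * math.log(c / total) for c in patterns.values())
    return h / math.log(math.factorial(order)) if normalize else h

# A monotone series has one ordinal pattern (entropy 0); white noise uses
# all order! patterns almost uniformly (normalized entropy near 1).
print(permutation_entropy(list(range(100))))                          # 0.0
random.seed(1)
print(permutation_entropy([random.random() for _ in range(10_000)]))  # ≈ 1.0
```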


Entropy
2021
Vol 23 (12)
pp. 1618
Author(s):
Rubem P. Mondaini
Simão C. de Albuquerque Neto

The Khinchin–Shannon generalized inequalities for entropy measures in information theory are a paradigm that can be used to test the synergy of the distributions of probabilities of occurrence in physical systems. The rich algebraic structure associated with the introduction of escort probabilities seems to be essential for deriving these inequalities for the two-parameter Sharma–Mittal set of entropy measures. We also emphasize the derivation of these inequalities for the special cases of the one-parameter Havrda–Charvat, Rényi and Landsberg–Vedral entropy measures.
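For context, one commonly used parametrization of the two-parameter Sharma–Mittal family and of the escort probabilities is sketched below in LaTeX; parameter conventions vary across the literature, so this should be read as indicative rather than as the paper's exact notation.

```latex
% Escort probabilities of order \alpha:
P_i(\alpha) = \frac{p_i^{\alpha}}{\sum_{j=1}^{n} p_j^{\alpha}}

% Two-parameter Sharma--Mittal entropy (q, r > 0;\ q, r \neq 1):
S_{q,r}(p) = \frac{1}{1-r}\left[\Bigl(\sum_{i=1}^{n} p_i^{\,q}\Bigr)^{\frac{1-r}{1-q}} - 1\right]

% One-parameter limits recover the families named in the abstract:
%   r \to q:      Havrda--Charvat (Tsallis) entropy
%   r \to 1:      R\'enyi entropy
%   r \to 2 - q:  Landsberg--Vedral entropy
%   q, r \to 1:   Shannon entropy
```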


1999
pp. 599-611
Author(s):
M. Matić
Charles E. M. Pearce
Josip Pečarić
