Rényi entropy measure of noise-aided information transmission in a binary channel

2010 ◽  
Vol 81 (5) ◽  
Author(s):  
François Chapeau-Blondeau ◽  
David Rousseau ◽  
Agnès Delahaies

Entropy ◽  
2018 ◽  
Vol 20 (9) ◽  
pp. 657 ◽  
Author(s):  
Young-Sik Kim

Since entropy is a popular measure of randomness, many studies have addressed the estimation of entropies from given random samples. In this paper, we propose an estimation method for the Rényi entropy of order α. Because the Rényi entropy of order α is a generalized entropy measure that includes the Shannon entropy as a special case, the proposed estimator can detect any significant deviation in the output of an ergodic stationary random source. It is shown that the expected test value of the proposed scheme is equivalent to the Rényi entropy of order α. After deriving a general representation of the estimator's parameters, we discuss the particular orders α → 1, α = 1/2, and α = 2. Because the Rényi entropy of order 2 is the most widely used, we also present an iterative estimation method for applications with stringent resource restrictions.
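The estimator itself is not reproduced in the abstract; as background, here is a minimal plug-in sketch of the Rényi entropy of order α computed from a discrete sample (the function name, the use of empirical symbol frequencies, and the numerical example are illustrative assumptions, not the authors' scheme).

```python
import numpy as np

def renyi_entropy(samples, alpha, base=2):
    """Plug-in Rényi entropy of order alpha from a discrete sample.

    H_alpha = (1 / (1 - alpha)) * log(sum_i p_i**alpha); the Shannon
    entropy is recovered in the limit alpha -> 1.
    """
    _, counts = np.unique(samples, return_counts=True)
    p = counts / counts.sum()                    # empirical symbol probabilities
    if np.isclose(alpha, 1.0):                   # alpha -> 1: Shannon entropy
        return -np.sum(p * np.log(p)) / np.log(base)
    return np.log(np.sum(p ** alpha)) / ((1.0 - alpha) * np.log(base))

# Example: order-2 (collision) entropy of a biased binary source
rng = np.random.default_rng(0)
bits = rng.choice([0, 1], size=10_000, p=[0.6, 0.4])
print(renyi_entropy(bits, alpha=2.0))   # below 1 bit, reflecting the bias
```

For α = 2 this reduces to the collision entropy −log Σ p_i², the case the abstract singles out for resource-constrained applications.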


2021 ◽  
Vol 2 (4) ◽  
pp. 346-358
Author(s):  
Mina Nasiri ◽  
Hamed Nasiri ◽  
Saeid Nasiri ◽  
Maliheh Bitarafan ◽  
Babak Fazelabdolabadi

This article quantifies the information flow between major equities in the Oil & Gas Midstream and Marine Shipping industries on the basis of the effective transfer entropy methodology. In addition, the article provides the first analysis of investors' fear and market expectations in these sectors, based on the Rényi entropy approach. The period of study extends over five years in order to fully capture the pre- and post-COVID situations. The entropy results reveal a major change in the underlying pattern of information flow among equities in the Oil & Gas Midstream and Marine Shipping sectors in the aftermath of COVID-19. Under the new (post-COVID) paradigm, stocks in the Oil & Gas Midstream and Integrated Freight & Logistics industries have gained momentum, occupying six of the ten positions on the list of the most influential equities in the market in terms of information transmission. Disorder and randomness have decreased for over 89% of the studied equities after the virus outbreak. For the equities identified as having high information-transmission standing, the Rényi entropy results indicate that investors likely showed a higher level of future expectations and a lower level of fear regarding frequent market events within the post-COVID timeline.
DOI: 10.28991/HIJ-2021-02-04-07
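The abstract does not spell out the computation; the sketch below illustrates one common way an effective transfer entropy is obtained for a pair of discretized return series (the quantile binning, the single-step lag, and the number of shuffles are illustrative assumptions, not the authors' exact procedure).

```python
import numpy as np

def transfer_entropy(x, y, bins=3):
    """Shannon transfer entropy TE_{X->Y} with lag 1 from two 1-D series.

    TE = sum p(y_t+1, y_t, x_t) * log2[ p(y_t+1 | y_t, x_t) / p(y_t+1 | y_t) ]
    estimated from quantile-binned symbols.
    """
    xd = np.digitize(x, np.quantile(x, np.linspace(0, 1, bins + 1)[1:-1]))
    yd = np.digitize(y, np.quantile(y, np.linspace(0, 1, bins + 1)[1:-1]))
    y_next, y_now, x_now = yd[1:], yd[:-1], xd[:-1]
    te = 0.0
    for a in range(bins):
        for b in range(bins):
            for c in range(bins):
                mask = (y_next == a) & (y_now == b) & (x_now == c)
                p_abc = mask.mean()
                if p_abc == 0:
                    continue
                p_bc = ((y_now == b) & (x_now == c)).mean()
                p_ab = ((y_next == a) & (y_now == b)).mean()
                p_b = (y_now == b).mean()
                te += p_abc * np.log2((p_abc / p_bc) / (p_ab / p_b))
    return te

def effective_transfer_entropy(x, y, bins=3, shuffles=20, seed=0):
    """Effective TE: raw TE minus the mean TE of shuffled-source surrogates."""
    rng = np.random.default_rng(seed)
    surrogate = np.mean([transfer_entropy(rng.permutation(x), y, bins)
                         for _ in range(shuffles)])
    return transfer_entropy(x, y, bins) - surrogate
```

Shuffling the source series destroys the temporal coupling from X to Y while preserving its marginal distribution, so the surrogate mean estimates the small-sample bias that the raw transfer entropy would otherwise retain.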


2011 ◽  
Vol 375 (23) ◽  
pp. 2211-2219 ◽  
Author(s):  
François Chapeau-Blondeau ◽  
Agnès Delahaies ◽  
David Rousseau

Author(s):  
Mariza de Andrade ◽  
Xin Wang

In the past few years, several entropy-based tests have been proposed for testing either single-SNP association or gene-gene interaction. These tests are mainly based on Shannon entropy and have higher statistical power than standard χ2 tests. In this paper, we extend some of these tests using a more generalized entropy definition, the Rényi entropy, of which Shannon entropy is the special case of order 1. The order λ (>0) of the Rényi entropy weights the events (genotypes/haplotypes) according to their probabilities (frequencies). Higher λ places more emphasis on higher-probability events, while smaller λ (close to 0) tends to assign weights more equally. Thus, by properly choosing λ, one can potentially increase the power of the tests or improve the significance of the p-values. We conducted simulation as well as real-data analyses to assess the impact of the order λ and the performance of these generalized tests. The results showed that for the dominant model the order-2 test was more powerful, while for the multiplicative model the order-1 and order-2 tests had similar power. The analyses indicate that the choice of λ depends on the underlying genetic model and that Shannon entropy is not necessarily the most powerful entropy measure for constructing genetic association or interaction tests.
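The test statistics themselves are not given in the abstract; for reference, the standard definition below makes the weighting role of λ explicit (p_i denotes a genotype or haplotype frequency; this is textbook background, not the authors' test).

```latex
% Rényi entropy of order \lambda > 0 for genotype/haplotype frequencies p_i:
H_\lambda \;=\; \frac{1}{1-\lambda}\,\log \sum_{i} p_i^{\lambda},
\qquad
\lim_{\lambda \to 1} H_\lambda \;=\; -\sum_i p_i \log p_i \quad (\text{Shannon entropy}).
% Larger \lambda amplifies the contribution of high-frequency genotypes,
% while \lambda close to 0 weights all observed genotypes nearly equally.
```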


2015 ◽  
Vol 2015 ◽  
pp. 1-8 ◽  
Author(s):  
Young-Seok Choi

This paper presents a data-driven multiscale entropy measure to reveal the scale-dependent information quantity of electroencephalogram (EEG) recordings. This work is motivated by previous observations on the nonlinear and nonstationary nature of EEG over multiple time scales. Here, a new framework of entropy measures accounting for changing dynamics over multiple oscillatory scales is presented. First, to deal with nonstationarity over multiple scales, the EEG recording is decomposed by applying the empirical mode decomposition (EMD), which is known to be effective for extracting the constituent narrowband components without a predetermined basis. Subsequent calculation of the Rényi entropy of the probability distributions of the intrinsic mode functions extracted by EMD yields a data-driven multiscale Rényi entropy. To validate the performance of the proposed entropy measure, actual EEG recordings from rats (n = 9) experiencing 7 min of cardiac arrest followed by resuscitation were analyzed. Simulation and experimental results demonstrate that the multiscale Rényi entropy provides better discriminative capability of the injury levels and improved correlations with the neurological deficit evaluation at 72 hours after cardiac arrest, thus suggesting an effective diagnostic and prognostic tool.

Corrigendum to “Information-Theoretical Quantifier of Brain Rhythm Based on Data-Driven Multiscale Representation”
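
As background to the multiscale measure described in the abstract above, here is a minimal sketch that computes the Rényi entropy of each intrinsic mode function's amplitude distribution; the decomposition step (left to an external EMD implementation), the histogram-based probability estimate, and the bin count are illustrative assumptions, not the published algorithm.

```python
import numpy as np

def multiscale_renyi_entropy(imfs, alpha=2, bins=64):
    """Rényi entropy (order alpha != 1) of each IMF's amplitude distribution.

    `imfs` is a sequence of 1-D arrays, e.g. the intrinsic mode functions
    obtained by applying an EMD routine to one EEG channel (not shown here).
    The amplitude histogram per IMF is an illustrative choice of the
    probability distribution whose Rényi entropy is computed.
    """
    entropies = []
    for imf in imfs:
        hist, _ = np.histogram(imf, bins=bins)
        p = hist[hist > 0] / hist.sum()          # empirical amplitude distribution
        entropies.append(np.log(np.sum(p ** alpha)) / (1.0 - alpha))
    return np.array(entropies)                   # one entropy value per scale
```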


2020 ◽  
Vol 2020 (12) ◽  
Author(s):  
Jiaju Zhang ◽  
M.A. Rajabpour

Abstract We investigate the Rényi entropy of the excited states produced by the current and its derivatives in the two-dimensional free massless non-compact bosonic theory, which is a two-dimensional conformal field theory. We also study the subsystem Schatten distance between these states. The two-dimensional free massless non-compact bosonic theory is the continuum limit of finite periodic gapless harmonic chains with local interactions. We identify the excited states produced by the current and its derivatives in the massless bosonic theory with the single-particle excited states of the gapless harmonic chain. We calculate analytically the second Rényi entropy and the second Schatten distance in the massless bosonic theory. We then use the wave functions of the excited states to calculate the second Rényi entropy and the second Schatten distance in the gapless limit of the harmonic chain, and find perfect agreement with the analytical results in the massless bosonic theory. We verify that in the large-momentum limit the single-particle Rényi entropy takes a universal form. We also show that in the limit of large momenta and large momentum difference the subsystem Schatten distance takes a universal form, but that it is replaced by a corrected form when the momentum difference is small. Finally, we comment on the mutual Rényi entropy of two disjoint intervals in the excited states of the two-dimensional free non-compact bosonic theory.
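The universal large-momentum form is not written out in the abstract; for orientation, the qubit-like expression familiar from the quasiparticle picture is recalled below, with x denoting the ratio of the interval length ℓ to the total system size L (this notation is an assumption of this note, not a quotation from the paper).

```latex
% Excess Rényi entropy of a single-particle excited state over the ground state,
% in the large-momentum limit, for an interval occupying a fraction x = \ell/L:
\Delta S_A^{(n)} \;\simeq\; \frac{1}{1-n}\,\log\!\left[\,x^{n} + (1-x)^{n}\,\right],
\qquad
\Delta S_A^{(2)} \;\simeq\; -\log\!\left[\,x^{2} + (1-x)^{2}\,\right].
```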


Entropy ◽  
2020 ◽  
Vol 22 (5) ◽  
pp. 526
Author(s):  
Gautam Aishwarya ◽  
Mokshay Madiman

The analogues of Arimoto’s definition of conditional Rényi entropy and Rényi mutual information are explored for abstract alphabets. These quantities, although dependent on the reference measure, have some useful properties similar to those known in the discrete setting. In addition to laying out some such basic properties and the relations to Rényi divergences, the relationships between the families of mutual informations defined by Sibson, Augustin-Csiszár, and Lapidoth-Pfister, as well as the corresponding capacities, are explored.
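For reference, Arimoto's definitions in the discrete setting read as follows (stated as standard background; the paper's contribution is their extension to abstract alphabets with a reference measure).

```latex
% Arimoto's conditional Rényi entropy of order \alpha (discrete alphabets):
H_\alpha^{\mathrm{A}}(X \mid Y)
  \;=\; \frac{\alpha}{1-\alpha}\,
        \log \sum_{y} P_Y(y)
        \left( \sum_{x} P_{X \mid Y}(x \mid y)^{\alpha} \right)^{1/\alpha},
\qquad \alpha \in (0,1)\cup(1,\infty).

% The corresponding Arimoto mutual information:
I_\alpha^{\mathrm{A}}(X;Y) \;=\; H_\alpha(X) \;-\; H_\alpha^{\mathrm{A}}(X \mid Y).
```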


2010 ◽  
Author(s):  
S. Gabarda ◽  
G. Cristóbal ◽  
P. Rodríguez ◽  
C. Miravet ◽  
J. M. del Cura

2011 ◽  
Vol 2011 (12) ◽  
Author(s):  
Ling-Yan Hung ◽  
Robert C. Myers ◽  
Michael Smolkin ◽  
Alexandre Yale
