The Relationship Between Logical Entropy and Shannon Entropy

2021 ◽  
pp. 15-22
Author(s):  
David Ellerman


2021 ◽  
Author(s):  
E. Castedo Ellerman

What is the relationship between the variance and entropy of independent one-hot vectors? This document proves inequalities relating variance, collision entropy and Shannon entropy of sequences of independent one-hot vectors.
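As a rough numerical illustration of the quantities this preprint compares, the sketch below computes the total variance, collision entropy and Shannon entropy of a one-hot vector drawn from a categorical distribution. It is a minimal sketch of the quantities involved, not the preprint's proofs; the function name and the example distribution are assumptions.

```python
import numpy as np

def one_hot_stats(p):
    """Variance, collision entropy and Shannon entropy of a one-hot vector
    drawn from the categorical distribution p (illustrative sketch)."""
    p = np.asarray(p, dtype=float)
    p = p / p.sum()                        # normalize, just in case
    collision_prob = np.sum(p ** 2)        # P(two independent draws collide)
    total_variance = 1.0 - collision_prob  # sum_i p_i (1 - p_i)
    collision_entropy = -np.log2(collision_prob)             # Renyi order 2
    shannon_entropy = -np.sum(p[p > 0] * np.log2(p[p > 0]))
    return total_variance, collision_entropy, shannon_entropy

var, h2, h = one_hot_stats([0.5, 0.25, 0.25])
print(var, h2, h)   # 0.625, ~1.415, 1.5
```

Note that the total variance equals one minus the collision probability, so it is a monotone function of the collision entropy; the collision entropy in turn never exceeds the Shannon entropy.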


SAGE Open ◽  
2020 ◽  
Vol 10 (1) ◽  
pp. 215824401989904
Author(s):  
Wenyi Wang ◽  
Lihong Song ◽  
Teng Wang ◽  
Peng Gao ◽  
Jian Xiong

The purpose of this study is to investigate the relationship between the Shannon entropy procedure and the Jensen–Shannon divergence (JSD) that are used as item selection criteria in cognitive diagnostic computerized adaptive testing (CD-CAT). Because the JSD itself is defined in terms of the Shannon entropy, we apply the well-known relationship between the JSD and Shannon entropy to establish a relationship between the item selection criteria that are based on these two measures. To further clarify the relationship between these two item selection criteria, an alternative approach is also provided. Theoretical derivations and empirical examples show that the Shannon entropy procedure and the JSD in CD-CAT have a linear relation under cognitive diagnostic models. Consistent with our theoretical conclusions, simulation results show that the two item selection criteria behaved quite similarly in terms of attribute-level and pattern recovery rates under all conditions, and that they selected the same set of items for each examinee from an item bank with item parameters drawn from a uniform distribution U(0.1, 0.3) under post hoc simulations. We provide some suggestions for future studies and a discussion of the relationship between the modified posterior-weighted Kullback–Leibler index and the G-DINA (generalized deterministic inputs, noisy “and” gate) discrimination index.
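The linear relation described here can be seen in a small numerical sketch: when the JSD of an item's response distributions is weighted by the current posterior, the expected posterior Shannon entropy equals the current posterior entropy minus that JSD. The sketch below assumes a generic dichotomous item with a success probability per attribute pattern; the names `posterior` and `p_correct` and the toy numbers are illustrative assumptions, not the cognitive diagnostic models used in the paper.

```python
import numpy as np

def shannon(p):
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

def item_criteria(posterior, p_correct):
    """posterior : current posterior over attribute patterns (length K)
       p_correct : P(correct response | pattern) for one item (length K)
       Returns (expected posterior Shannon entropy, posterior-weighted JSD)."""
    posterior = np.asarray(posterior, float)
    p_correct = np.asarray(p_correct, float)
    resp = np.stack([p_correct, 1 - p_correct], axis=1)   # K x 2 response distributions
    marginal = posterior @ resp                           # marginal response distribution
    # JSD of the K response distributions, weighted by the posterior
    jsd = shannon(marginal) - np.sum(posterior * np.array([shannon(r) for r in resp]))
    # Expected Shannon entropy of the updated posterior, averaged over responses
    expected_h = 0.0
    for j in range(2):                                    # j = correct / incorrect
        joint = posterior * resp[:, j]
        expected_h += joint.sum() * shannon(joint / joint.sum())
    return expected_h, jsd

post = np.array([0.4, 0.3, 0.2, 0.1])
pc   = np.array([0.9, 0.6, 0.4, 0.1])
eh, jsd = item_criteria(post, pc)
print(eh, shannon(post) - jsd)   # the two values coincide: the linear relation
```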


2022 ◽  
Vol 2022 (1) ◽  
pp. 013403
Author(s):  
Liping Lian ◽  
Xu Mai ◽  
Weiguo Song ◽  
Jun Zhang ◽  
Kwok Kit Richard Yuen ◽  
...  

Merging pedestrian flows are often observed at public intersections and at locations where two or more channels merge. Because the flow is restricted at these junctions, pedestrian congestion or even crowd disasters (e.g. the 2015 Hajj crush) can easily occur. However, studies on merging behavior in large crowds remain rare. This paper investigates the merging characteristics of pedestrian flow in controlled laboratory experiments. The formation of lanes is observed, and the lane separation width varies with the density level. Shannon entropy is used to analyze the utilization of the passage. Space usage in the merging area is most efficient when the width of the two branches is half that of the main corridor. Furthermore, the branch and the main channel can mutually bottleneck each other in large crowds, and the flow rates for the upstream, downstream and branch sections are investigated. This study uses spatiotemporal diagrams to explore the clogging propagation of the merging flow as well as the relationship between velocity and position. The results can serve as references for the design of public infrastructure and for safety management.
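As a hedged illustration of how Shannon entropy can quantify passage utilization, the sketch below bins pedestrians' lateral positions across the corridor width and computes the entropy of the occupancy histogram; the bin count and function name are assumptions, not the authors' measurement protocol.

```python
import numpy as np

def passage_utilization_entropy(lateral_positions, width, n_bins=10):
    """Shannon entropy of pedestrian occupancy across the passage width.
    Higher entropy ~ more uniform use of the cross-section (illustrative
    sketch; binning and normalization are assumptions)."""
    counts, _ = np.histogram(lateral_positions, bins=n_bins, range=(0.0, width))
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))       # maximum log2(n_bins) for uniform use

# Positions clustered near one wall give lower entropy than spread-out positions
rng = np.random.default_rng(0)
clustered = rng.normal(0.5, 0.2, 500).clip(0, 3.0)
uniform   = rng.uniform(0, 3.0, 500)
print(passage_utilization_entropy(clustered, 3.0),
      passage_utilization_entropy(uniform, 3.0))
```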


Author(s):  
Boubaker Adel ◽  
Sahli Lamia

In this study, we evaluate the relationship between market efficiency and the probability of a crash; to this end, the evolution of daily informational efficiency is measured for the indie stock market index. Efficiency, the issue addressed by the weak-form efficient market hypothesis, is calculated using a new method based on Shannon entropy and symbolic time series analysis. A logit model is then applied to study the relationship between efficiency and the probability of a financial crash.
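A minimal sketch of the general procedure described here, under stated assumptions (binary up/down symbolization, word length 3, and a crash indicator supplied separately), could look as follows; these choices are illustrative and not the authors' exact specification.

```python
import numpy as np
from itertools import product

def symbolic_entropy(returns, word_len=3):
    """Normalized Shannon entropy of binary-symbolized return words.
    Values near 1 suggest a near-random (efficient) symbol sequence;
    the binarization and word length are illustrative assumptions."""
    symbols = (np.asarray(returns) > 0).astype(int)   # 1 = up day, 0 = down day
    words = [tuple(symbols[i:i + word_len]) for i in range(len(symbols) - word_len + 1)]
    counts = {w: 0 for w in product((0, 1), repeat=word_len)}
    for w in words:
        counts[w] += 1
    p = np.array(list(counts.values()), float) / len(words)
    p = p[p > 0]
    return -np.sum(p * np.log2(p)) / word_len          # normalized to [0, 1]

# A rolling series of these entropies could then enter a logit model, e.g. with
# statsmodels: sm.Logit(crash_indicator, sm.add_constant(entropy_series)).fit()
```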


2021 ◽  
Author(s):  
William B. Ashe ◽  
Sarah E. Innis ◽  
Julia N. Shanno ◽  
Camille J. Hochheimer ◽  
Ronald D. Williams ◽  
...  

Rationale: Breathing motion (respiratory kinematics) can be characterized by the interval and depth of each breath and by magnitude-synchrony relationships between locations. Such characteristics and their breath-by-breath variability might be useful indicators of respiratory health. Objectives: To enable breath-by-breath characterization of respiratory kinematics, we developed a method to detect breaths using motion sensor signals. Methods: In 34 volunteers who underwent maximal exercise testing, we used 8 motion sensors to record upper rib, lower rib and abdominal kinematics at 3 exercise stages (rest, lactate threshold and exhaustion). We recorded volumetric air flow signals using clinical exercise laboratory equipment and synchronized them with the kinematic signals. Using instantaneous phase landmarks from the analytic representation of the kinematic and flow signals, we identified individual breaths and derived respiratory rate signals at 1 Hz. To evaluate the fidelity of the kinematics-derived respiratory rate signals, we calculated their cross-correlation with the flow-derived respiratory rate signals. To identify coupling between kinematics and flow, we calculated the Shannon entropy of the relative frequency with which kinematic phase landmarks were distributed over the phase of the flow cycle. Measurements and Main Results: We found good agreement between the kinematics-derived and flow-derived respiratory rate signals, with cross-correlation coefficients as high as 0.94. In some individuals, kinematics and flow were significantly coupled (Shannon entropy < 2), but the relationship varied within individuals (by exercise stage) and between individuals. In other individuals, the phase landmarks from the kinematic signal were uniformly distributed over the phase of the air flow signals (Shannon entropy close to the theoretical maximum of 3.32). Conclusions: The Analysis of Respiratory Kinematics method can yield highly resolved respiratory rate signals by separating individual breaths. This method will facilitate characterization of clinically significant breathing motion patterns on a breath-by-breath basis. The relationship between respiratory kinematics and flow is more complex than expected, varying between and within individuals.
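A hedged sketch of the phase-coupling measure described in the Methods: instantaneous phase is taken from the analytic (Hilbert-transform) representation of each signal, kinematic phase landmarks are located at cycle onsets, and the Shannon entropy of their distribution over ten flow-phase bins has a theoretical maximum of log2(10) ≈ 3.32 bits, matching the abstract. The landmark definition and bin count are assumptions, not the authors' exact implementation.

```python
import numpy as np
from scipy.signal import hilbert

def phase_coupling_entropy(kinematic, flow, n_bins=10):
    """Shannon entropy of where kinematic phase landmarks fall within the
    flow cycle.  Low entropy -> kinematics and flow are phase-coupled;
    entropy near log2(n_bins) ~ 3.32 bits -> no coupling.
    (Illustrative sketch: landmark choice and binning are assumptions.)"""
    phase_kin = np.angle(hilbert(kinematic - np.mean(kinematic)))
    phase_flow = np.angle(hilbert(flow - np.mean(flow)))
    # Landmarks: samples where the kinematic phase wraps from +pi to -pi (cycle onset)
    landmarks = np.where(np.diff(phase_kin) < -np.pi)[0] + 1
    counts, _ = np.histogram(phase_flow[landmarks], bins=n_bins, range=(-np.pi, np.pi))
    p = counts / counts.sum()
    p = p[p > 0]
    return -np.sum(p * np.log2(p))
```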


4open ◽  
2022 ◽  
Vol 5 ◽  
pp. 1
Author(s):  
David Ellerman

We live in the information age. Claude Shannon, as the father of the information age, gave us a theory of communications that quantified an “amount of information,” but, as he pointed out, “no concept of information itself was defined.” Logical entropy provides that definition. Logical entropy is the natural measure of the notion of information based on distinctions, differences, distinguishability, and diversity. It is the (normalized) quantitative measure of the distinctions of a partition on a set – just as the Boole–Laplace logical probability is the normalized quantitative measure of the elements of a subset of a set. And partitions and subsets are mathematically dual concepts – so the logic of partitions is dual in that sense to the usual Boolean logic of subsets, and hence the name “logical entropy.” The logical entropy of a partition has a simple interpretation as the probability that a distinction or dit (elements in different blocks) is obtained in two independent draws from the underlying set. The Shannon entropy is shown to also be based on this notion of information-as-distinctions; it is the average minimum number of binary partitions (bits) that need to be joined to make all the same distinctions of the given partition. Hence all the concepts of simple, joint, conditional, and mutual logical entropy can be transformed into the corresponding concepts of Shannon entropy by a uniform non-linear dit-bit transform. And finally logical entropy linearizes naturally to the corresponding quantum concept. The quantum logical entropy of an observable applied to a state is the probability that two different eigenvalues are obtained in two independent projective measurements of that observable on that state.
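The two definitions and the dit-bit transform mentioned here can be made concrete in a few lines. The sketch below is illustrative (equiprobable points and the example partition are assumptions): it computes the logical entropy, the probability that two independent draws fall in different blocks, and the Shannon entropy of the same partition, where the dit-bit transform 1 − p → log2(1/p) applied term by term to the sum for logical entropy yields the sum for Shannon entropy.

```python
from collections import Counter
from math import log2

def partition_entropies(partition_labels):
    """Logical and Shannon entropy of a partition given as a block label per
    element.  Logical entropy: probability that two independent equiprobable
    draws fall in different blocks, sum_B p_B (1 - p_B).  The dit-bit transform
    1 - p  ->  log2(1/p) turns that sum into sum_B p_B log2(1/p_B)."""
    n = len(partition_labels)
    block_probs = [count / n for count in Counter(partition_labels).values()]
    logical = sum(p * (1 - p) for p in block_probs)   # = 1 - sum_B p_B^2
    shannon = sum(p * log2(1 / p) for p in block_probs)
    return logical, shannon

# Example: the partition {a,b | c | d} of a 4-element set
print(partition_entropies(["x", "x", "y", "z"]))   # (0.625, 1.5)
```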


2013 ◽  
Vol 07 (02) ◽  
pp. 121-145 ◽  
Author(s):  
DAVID ELLERMAN

The logical basis for information theory is the newly developed logic of partitions that is dual to the usual Boolean logic of subsets. The key concept is a "distinction" of a partition, an ordered pair of elements in distinct blocks of the partition. The logical concept of entropy based on partition logic is the normalized counting measure of the set of distinctions of a partition on a finite set — just as the usual logical notion of probability based on the Boolean logic of subsets is the normalized counting measure of the subsets (events). Thus logical entropy is a measure on the set of ordered pairs, and all the compound notions of entropy (joint entropy, conditional entropy, and mutual information) arise in the usual way from the measure (e.g. the inclusion-exclusion principle) — just like the corresponding notions of probability. The usual Shannon entropy of a partition is developed by replacing the normalized count of distinctions (dits) by the average number of binary partitions (bits) necessary to make all the distinctions of the partition.
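To connect this with the counting-measure description, a minimal sketch (same example partition as above, equiprobable elements assumed) counts the ordered pairs of elements lying in distinct blocks and normalizes by n², recovering the same value as the block-probability formula.

```python
from itertools import product

def logical_entropy_by_counting(partition_labels):
    """Normalized counting measure of the set of distinctions dit(pi):
    ordered pairs of elements in different blocks, divided by n^2."""
    n = len(partition_labels)
    dits = [(i, j) for i, j in product(range(n), repeat=2)
            if partition_labels[i] != partition_labels[j]]
    return len(dits) / n ** 2

print(logical_entropy_by_counting(["x", "x", "y", "z"]))   # 0.625, same as 1 - sum_B p_B^2
```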

