Shannon Entropy
Recently Published Documents


TOTAL DOCUMENTS: 940 (five years: 358)
H-INDEX: 38 (five years: 6)

2022 ◽ Vol 8 (2)
Author(s): Wendong Wang, Gaurav Gardi, Paolo Malgaretti, Vimal Kishore, Lyndon Koens, ...

A local measure based on the Shannon entropy establishes connections among information, structures, and interactions.
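The local measure itself is not spelled out in this teaser, but every entry on this page builds on the same base quantity. For reference, a minimal sketch of the standard Shannon entropy of a discrete distribution (the function name and the base-2 convention are our choices):

```python
import numpy as np

def shannon_entropy(p, base=2):
    """Shannon entropy H(p) = -sum_i p_i * log(p_i) of a discrete distribution."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]  # by convention, 0 * log(0) = 0
    return -np.sum(p * np.log(p)) / np.log(base)

# A uniform distribution over four outcomes carries exactly 2 bits:
print(shannon_entropy([0.25, 0.25, 0.25, 0.25]))  # 2.0
```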


2022
Author(s): Md Abdul Latif Sarker, Md Fazlul Kader, Md Mostafa Kamal Sarker, Moon Lee, Dong Han

Abstract In this article, we present a black-hole-aided deep-helix (bh-dh) channel model to improve the information bound and mitigate the multiple-helix directionality issue in deoxyribonucleic acid (DNA) communications. Recent observations of DNA do not match the Shannon bound because of this multiple-helix directionality issue; hence, we propose the bh-dh channel model. The proposed model follows the structure of DNA, enriches the earlier DNA observations, and achieves a composite-like information bound. To realize the proposed model, we first define a black-hole-aided Bernoulli process and then consider a symmetric bh-dh channel. Geometric and graphical analysis then shows the resemblance of the proposed channel to the layouts of DNA and of a galaxy. In our exploration, the proposed symmetric bh-dh channel geometrically traces a deep-pair ellipse when a deep-pair information bit or digit is distributed over the channel; graphically, the channel takes the shape of a circulant ring whose central hole resembles the central black hole of a galaxy. The coordinates of the inner ellipses denote a deep double helix, and the coordinates of the outer ellipses trace deep parallel strands. Finally, computer simulations verify that the proposed symmetric bh-dh channel significantly outperforms the traditional binary symmetric channel in terms of Shannon entropy and capacity bound.
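The bh-dh channel itself is not specified in enough detail here to reproduce, but the baseline it is compared against, the binary symmetric channel, has the classic closed-form capacity C = 1 - H_b(p). A minimal sketch of that baseline (function names are ours):

```python
import numpy as np

def binary_entropy(p):
    """Binary entropy H_b(p) in bits, with H_b(0) = H_b(1) = 0."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * np.log2(p) - (1.0 - p) * np.log2(1.0 - p)

def bsc_capacity(p):
    """Capacity of a binary symmetric channel with crossover probability p:
    C = 1 - H_b(p) bits per channel use."""
    return 1.0 - binary_entropy(p)

for p in (0.0, 0.1, 0.5):
    print(f"p = {p}: C = {bsc_capacity(p):.3f} bits/use")
# p = 0.5 gives C = 0: the output is then independent of the input.
```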


2022 ◽ Vol 12 (1) ◽ pp. 496
Author(s): João Sequeira, Jorge Louçã, António M. Mendes, Pedro G. Lind

We analyze empirical series of malaria incidence using the concepts of autocorrelation, the Hurst exponent, and Shannon entropy, with the aim of uncovering hidden variables in those series. From simulations of an agent model of malaria spreading, we first derive models of the malaria incidence, the Hurst exponent, and the entropy as functions of gametocytemia, which measures the infectious power of a mosquito toward a human host. Second, upon estimating the three observables (incidence, Hurst exponent, and entropy) from a set of empirical malaria series, we predict a value of the gametocytemia from each observable. Finally, we show that the independent predictions are largely consistent with one another, with only a few exceptions, which are discussed in further detail.
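The agent model and the empirical incidence data are not reproduced here, but the two series observables the authors estimate are standard. A minimal sketch of both, assuming a histogram-based entropy estimate and a rescaled-range (R/S) Hurst estimate (bin counts and window sizes are our choices):

```python
import numpy as np

def series_entropy(x, bins=20):
    """Shannon entropy (bits) of a series, estimated from a histogram."""
    counts, _ = np.histogram(x, bins=bins)
    p = counts[counts > 0] / counts.sum()
    return -np.sum(p * np.log2(p))

def hurst_rs(x, window_sizes=(8, 16, 32, 64, 128)):
    """Hurst exponent via rescaled-range (R/S) analysis:
    fit log(R/S) ~ H * log(n) over several window sizes n."""
    x = np.asarray(x, dtype=float)
    rs_means = []
    for n in window_sizes:
        rs = []
        for start in range(0, len(x) - n + 1, n):
            w = x[start:start + n]
            z = np.cumsum(w - w.mean())   # cumulative deviations from the mean
            r = z.max() - z.min()         # range of the cumulative deviations
            s = w.std()                   # standard deviation of the window
            if s > 0:
                rs.append(r / s)
        rs_means.append(np.mean(rs))
    H, _ = np.polyfit(np.log(window_sizes), np.log(rs_means), 1)
    return H

rng = np.random.default_rng(0)
series = rng.normal(size=1024)            # white noise: H should come out near 0.5
print(series_entropy(series), hurst_rs(series))
```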


2022 ◽ Vol 24 (1) ◽ pp. 105-118
Author(s): Mervat Mahdy, Dina S. Eltelbany, Hoda Mohammed, ...

Entropy measures the amount of uncertainty and dispersion of an unknown or random quantity. The concept, first introduced by Shannon (1948), is important in many areas: in information theory, entropy measures the amount of information in each received message; in physics, it is the basic quantity that measures the disorder of a thermodynamical system. In this paper, we introduce an alternative measure of entropy, called the HN-entropy. Unlike the Shannon entropy, this proposed measure, of order α and β, is more flexible. We then introduce the cumulative residual HN-entropy, the cumulative HN-entropy, and their weighted versions. Finally, we present a comparison between the Shannon entropy and the HN-entropy, together with numerical results.
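The HN-entropy of order α and β is defined in the paper and not reproduced here. For orientation, the Shannon special case of the cumulative residual entropy mentioned above replaces the density with the survival function S, giving CRE(X) = -∫ S(x) log S(x) dx. A minimal empirical sketch under that assumption (a plain plug-in estimator; names are ours):

```python
import numpy as np

def cumulative_residual_entropy(x):
    """Plug-in estimate of CRE(X) = -integral S(x) log S(x) dx,
    with S the empirical survival function of the sample."""
    x = np.sort(np.asarray(x, dtype=float))
    n = len(x)
    s = 1.0 - np.arange(1, n) / n   # empirical survival on [x_i, x_{i+1})
    dx = np.diff(x)
    mask = s > 0
    return -np.sum(s[mask] * np.log(s[mask]) * dx[mask])

# For an Exponential variable with mean 1, the exact CRE equals the mean, i.e. 1:
rng = np.random.default_rng(1)
print(cumulative_residual_entropy(rng.exponential(1.0, size=100_000)))  # ~1.0
```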


2022 ◽ Vol 7 (4) ◽ pp. 5328-5346
Author(s): Tareq Saeed, Muhammad Adil Khan, Hidayat Ullah

The principal aim of this research work is to establish refinements of the integral Jensen inequality. For the intended refinements, we mainly use the notion of convexity and the concept of majorization. Utilizing the main results, we derive some inequalities for power and quasi-arithmetic means. Moreover, we obtain several refinements of the Hölder inequality, as well as an improvement of the Hermite–Hadamard inequality, as consequences of the obtained results. Furthermore, we give several applications of the acquired results in information theory, consisting of bounds for the Shannon entropy, different divergences, the Bhattacharyya coefficient, triangular discrimination, and various distances.
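The refinements themselves require the paper, but the unrefined Jensen step behind the Shannon-entropy bounds is short enough to state. Applying Jensen's inequality to the concave logarithm gives the classic maximum-entropy bound, which refinements of the inequality then tighten:

```latex
% For a probability vector (p_1, \dots, p_n), concavity of \log and
% Jensen's inequality give
\[
  H(p) = \sum_{i=1}^{n} p_i \log \frac{1}{p_i}
       \le \log\!\left( \sum_{i=1}^{n} p_i \cdot \frac{1}{p_i} \right)
       = \log n,
\]
% with equality iff p is uniform. A refinement of Jensen's inequality
% inserts intermediate expressions between the two sides, yielding
% sharper upper bounds on the Shannon entropy.
```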


2022 ◽ Vol 21 (1) ◽ pp. 170-177
Author(s): Wei-hua MA, Tong WU, Zan ZHANG, Hang LI, Gong-ming SITU, ...

4open ◽ 2022 ◽ Vol 5 ◽ pp. 1
Author(s): David Ellerman

We live in the information age. Claude Shannon, as the father of the information age, gave us a theory of communications that quantified an “amount of information,” but, as he pointed out, “no concept of information itself was defined.” Logical entropy provides that definition. Logical entropy is the natural measure of the notion of information based on distinctions, differences, distinguishability, and diversity. It is the (normalized) quantitative measure of the distinctions of a partition on a set, just as the Boole–Laplace logical probability is the normalized quantitative measure of the elements of a subset of a set. Partitions and subsets are mathematically dual concepts, so the logic of partitions is dual in that sense to the usual Boolean logic of subsets; hence the name “logical entropy.” The logical entropy of a partition has a simple interpretation: it is the probability that a distinction, or “dit” (a pair of elements in different blocks), is obtained in two independent draws from the underlying set. The Shannon entropy is shown to also be based on this notion of information-as-distinctions; it is the average minimum number of binary partitions (bits) that need to be joined to make all the same distinctions as the given partition. Hence all the concepts of simple, joint, conditional, and mutual logical entropy can be transformed into the corresponding Shannon-entropy concepts by a uniform non-linear dit-bit transform. Finally, logical entropy linearizes naturally to the corresponding quantum concept: the quantum logical entropy of an observable applied to a state is the probability that two different eigenvalues are obtained in two independent projective measurements of that observable on that state.
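The abstract already states the two defining formulas in words; a minimal sketch making them concrete, with the two-draw interpretation checked by simulation (the example distribution and function names are ours):

```python
import numpy as np

def logical_entropy(p):
    """Logical entropy h(p) = 1 - sum_i p_i^2: the probability that two
    independent draws from the blocks land in different blocks (a 'dit')."""
    p = np.asarray(p, dtype=float)
    return 1.0 - np.sum(p ** 2)

def shannon_entropy(p):
    """Shannon entropy H(p) = sum_i p_i * log2(1/p_i), obtained from the
    logical formula by the dit-bit transform (1 - p_i) -> log2(1/p_i)."""
    p = np.asarray(p, dtype=float)
    p = p[p > 0]
    return np.sum(p * np.log2(1.0 / p))

# Verify the two-draw interpretation by simulation:
p = np.array([0.5, 0.25, 0.25])
rng = np.random.default_rng(0)
draws = rng.choice(len(p), size=(1_000_000, 2), p=p)
print(logical_entropy(p))                   # 0.625
print(np.mean(draws[:, 0] != draws[:, 1]))  # ~0.625
print(shannon_entropy(p))                   # 1.5
```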

