Information Theory
Recently Published Documents


TOTAL DOCUMENTS

5987
(FIVE YEARS 1022)

H-INDEX

115
(FIVE YEARS 11)

10.1142/12668 ◽  
2022 ◽  
Author(s):  
John Scales Avery
Keyword(s):  

2022 ◽  
Author(s):  
Kevin Song ◽  
Dmitrii E Makarov ◽  
Etienne Vouga

A key theoretical challenge posed by single-molecule studies is the inverse problem of deducing the underlying molecular dynamics from the time evolution of low-dimensional experimental observables. Toward this goal, a variety of low-dimensional models have been proposed as descriptions of single-molecule signals, including random walks with or without conformational memory and/or with static or dynamic disorder. Differentiating among these models presents a challenge, as many distinct physical scenarios lead to similar experimentally observable behaviors such as anomalous diffusion and nonexponential relaxation. Here we show that information-theory-based analysis of single-molecule time series, inspired by Shannon's work on the information content of printed English, can differentiate between Markov (memoryless) and non-Markov single-molecule signals and between static and dynamic disorder. In particular, non-Markov time series are more predictable and can thus be compressed and transmitted within shorter messages (i.e., they have a lower entropy rate) than appropriately constructed Markov approximations, and we demonstrate that in practice the LZMA compression algorithm reliably differentiates between these entropy rates across several simulated dynamical models.
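The compression-based idea can be sketched with Python's standard lzma module (this is an illustration of the principle, not the authors' pipeline; the toy generator below is an assumption chosen for clarity). A sequence with memory beyond first order is more predictable than an i.i.d. one, so it compresses to a shorter message:

```python
import lzma
import random

def compressed_size(symbols):
    """Compressed length (bytes) of a symbol sequence under LZMA."""
    return len(lzma.compress(bytes(symbols)))

random.seed(0)
n = 20000

# Memoryless (i.i.d. fair-coin) sequence: entropy rate of 1 bit/symbol.
iid = [random.randint(0, 1) for _ in range(n)]

# Toy sequence with memory: each symbol copies the one emitted two
# steps earlier with probability 0.9, so it is far more predictable.
mem = [0, 1]
for _ in range(n - 2):
    mem.append(mem[-2] if random.random() < 0.9 else 1 - mem[-2])

# The predictable (lower-entropy-rate) series compresses further.
print(compressed_size(iid), compressed_size(mem))
```

The gap in compressed length is a practical proxy for the gap in entropy rate that the abstract describes.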


Information ◽  
2022 ◽  
Vol 13 (1) ◽  
pp. 39
Author(s):  
Neri Merhav

In this work, we propose both an improvement and extensions of a reverse Jensen inequality due to Wunder et al. (2021). The new inequalities are fairly tight and reasonably easy to use in a wide variety of situations, as demonstrated in several application examples relevant to information theory. Moreover, the main ideas behind the derivations turn out to be applicable to deriving bounds on expectations of multivariate convex/concave functions, as well as of functions that are not necessarily convex or concave.
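For context (the precise bounds of Wunder et al. and of this paper are not reproduced here), the classical Jensen inequality gives a lower bound on the expectation of a convex function, and a "reverse" Jensen inequality complements it with an upper bound, e.g. the standard Edmundson–Madansky bound for a random variable supported on an interval:

```latex
% Jensen's inequality for a convex function f:
\[
  f\big(\mathbb{E}[X]\big) \;\le\; \mathbb{E}\big[f(X)\big].
\]
% A reverse (upper) bound of Edmundson--Madansky type,
% for f convex and X supported on [a, b]:
\[
  \mathbb{E}\big[f(X)\big] \;\le\;
  \frac{b - \mathbb{E}[X]}{b - a}\, f(a)
  \;+\;
  \frac{\mathbb{E}[X] - a}{b - a}\, f(b).
\]
```

Bounds of this reverse type are what the paper tightens and extends.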


2022 ◽  
Author(s):  
Lokman Cevik ◽  
Marilyn Vazquez Landrove ◽  
Mehmet Tahir Aslan ◽  
Vasilii Khammad ◽  
Francisco Jose Garagorry Guerra ◽  
...  
Keyword(s):  

2022 ◽  
Vol 0 (0) ◽  
Author(s):  
Eugene Y. S. Chua

Abstract Lakatos’s analysis of progress and degeneration in the Methodology of Scientific Research Programmes is well-known. Less known, however, are his thoughts on degeneration in Proofs and Refutations. I propose and motivate two new criteria for degeneration based on the discussion in Proofs and Refutations – superfluity and authoritarianism. I show how these criteria augment the account in Methodology of Scientific Research Programmes, providing a generalized Lakatosian account of progress and degeneration. I then apply this generalized account to a key transition point in the history of entropy – the transition to an information-theoretic interpretation of entropy – by assessing Jaynes’s 1957 paper on information theory and statistical mechanics.


2022 ◽  
Author(s):  
Miron Bartosz Kursa

Abstract The Kendall transformation converts an ordered feature into a vector of pairwise order relations between its individual values. In this way it preserves the ranking of observations and represents it in categorical form. Such a transformation allows methods that require strictly categorical input to be generalised, especially in the limit of a small number of observations, where discretisation becomes problematic. In particular, many information-theoretic approaches can be applied directly to Kendall-transformed continuous data without relying on differential entropy or any additional parameters. Moreover, by restricting information to that contained in the ranking, the Kendall transformation yields better robustness at the reasonable cost of dropping sophisticated interactions, which are unlikely to be correctly estimated anyhow. In bivariate analysis, the Kendall transformation can be related to popular non-parametric methods, showing the soundness of the approach. The paper also demonstrates its efficiency in multivariate problems and provides an example analysis of real-world data.
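A minimal sketch of the transformation described above (the symbol names '<', '>', '=' are an assumption for illustration; the paper uses its own encoding of the pairwise relations):

```python
import itertools

def kendall_transform(x):
    """Map a numeric vector to the categorical vector of pairwise
    order relations over all ordered pairs (i, j) with i != j."""
    return ['<' if a < b else '>' if a > b else '='
            for a, b in itertools.permutations(x, 2)]

# Only the ranking is preserved: any strictly increasing rescaling of
# the input yields the identical transformed vector.
x = [0.3, 1.7, 0.3, 5.0]
y = [v ** 3 for v in x]  # strictly increasing map
print(kendall_transform(x) == kendall_transform(y))  # True
```

A vector of length n maps to n(n-1) categorical values, which is why the transformation is most attractive when the number of observations is small.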


Sensors ◽  
2022 ◽  
Vol 22 (2) ◽  
pp. 443
Author(s):  
Ildeberto Santos-Ruiz ◽  
Francisco-Ronay López-Estrada ◽  
Vicenç Puig ◽  
Guillermo Valencia-Palomo ◽  
Héctor-Ricardo Hernández

This paper presents a method for optimal pressure sensor placement in water distribution networks using information theory. The criterion for selecting the network nodes at which to place the pressure sensors was that they provide the most useful information for locating leaks in the network. Considering that the node pressures measured by the sensors can be correlated (mutual information), a subset of sensor nodes was chosen so that the relevance of the information was maximized while its redundancy was simultaneously minimized. The nodes were selected using datasets of pressure changes caused by multiple leak scenarios, generated synthetically by simulation with the EPANET software. To select the optimal subset of nodes, the candidate nodes were ranked using a heuristic algorithm with quadratic computational cost, making it time-efficient compared with other sensor placement algorithms. The algorithm was implemented in MATLAB and tested on the Hanoi network. Exhaustive analysis verified that the selected nodes were the best combination for placing the sensors and detecting leaks.
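The relevance-versus-redundancy trade-off can be sketched as a greedy mRMR-style ranking (a common heuristic of this family; the paper's exact scoring is not reproduced, and the relevance/redundancy values below are toy numbers, not EPANET output):

```python
def greedy_mrmr(relevance, redundancy, k):
    """Rank k sensor nodes greedily: at each step pick the node that
    maximizes (relevance to leak localisation) minus (mean mutual
    information with the nodes already selected)."""
    selected = []
    remaining = set(range(len(relevance)))
    while len(selected) < k and remaining:
        def score(j):
            red = (sum(redundancy[j][s] for s in selected) / len(selected)
                   if selected else 0.0)
            return relevance[j] - red
        best = max(remaining, key=score)
        selected.append(best)
        remaining.remove(best)
    return selected

# Toy example: node 2 is most informative; nodes 0 and 1 are highly
# redundant with each other, so the second of them is skipped in
# favour of the less relevant but independent node 3.
relevance = [0.85, 0.8, 0.9, 0.5]
redundancy = [[0.0, 0.9, 0.1, 0.1],
              [0.9, 0.0, 0.1, 0.1],
              [0.1, 0.1, 0.0, 0.1],
              [0.1, 0.1, 0.1, 0.0]]
print(greedy_mrmr(relevance, redundancy, 3))  # → [2, 0, 3]
```

Each greedy step scans all remaining candidates against those already chosen, which is the quadratic cost the abstract mentions.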


Entropy ◽  
2022 ◽  
Vol 24 (1) ◽  
pp. 91
Author(s):  
Chris Rourk

A newly discovered physical mechanism involving incoherent electron tunneling in layers of the protein ferritin found in catecholaminergic neurons (catecholaminergic neuron electron transport, or CNET) is hypothesized to support communication between neurons. Recent tests further confirm that these ferritin layers can also perform a switching function (in addition to providing an electron tunneling mechanism) that could be associated with action selection in those neurons, consistent with earlier predictions based on CNET. Further testing is needed to confirm the hypothesis that CNET allows groups of neurons to communicate and to act as a switch that selects one neuron in the group to assist in reaching action potential. Nevertheless, this paper explains how that hypothesized behavior would be consistent with Integrated Information Theory (IIT), one of a number of consciousness theories (CTs). While the sheer number of CTs suggests that no one of them alone is sufficient to explain consciousness, this paper demonstrates that CNET can provide a physical substrate and action selection mechanism that is consistent with IIT and that can also be applied to other CTs, potentially combining them into a single explanation of consciousness.


2022 ◽  
Vol 24 (1) ◽  
pp. 105-118
Author(s):  
Mervat Mahdy ◽  
Dina S. Eltelbany ◽  
Hoda Mohammed ◽  
...  

Entropy measures the amount of uncertainty and dispersion of an unknown or random quantity. The concept was first introduced by Shannon (1948) and is important for studies in many areas: in information theory, entropy measures the amount of information in each message received; in physics, it is the basic concept quantifying the disorder of a thermodynamic system. In this paper, we introduce an alternative measure of entropy, called 𝐻𝑁-entropy; unlike Shannon entropy, this proposed measure of order α and β is more flexible. We then introduce the cumulative residual 𝐻𝑁-entropy, the cumulative 𝐻𝑁-entropy, and their weighted versions. Finally, a comparison between Shannon entropy and 𝐻𝑁-entropy, together with numerical results, is presented.
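The abstract does not give the 𝐻𝑁-entropy formula, so it is not reproduced here; the Shannon entropy that it generalizes can be sketched as:

```python
import math

def shannon_entropy(p, base=2):
    """H(p) = -sum_i p_i log(p_i) for a discrete distribution p."""
    assert abs(sum(p) - 1.0) < 1e-9, "probabilities must sum to 1"
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

print(shannon_entropy([0.5, 0.5]))   # fair coin: 1.0 bit
print(shannon_entropy([1.0]))        # no uncertainty: 0.0
print(shannon_entropy([0.25] * 4))   # uniform over 4 outcomes: 2.0 bits
```

Generalized entropies of order α (and here β) typically recover Shannon entropy as a limiting case while giving extra parameters to tune the weighting of rare versus common outcomes.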

