Shannon Information
Recently Published Documents


TOTAL DOCUMENTS

196
(FIVE YEARS 39)

H-INDEX

28
(FIVE YEARS 4)

Research ◽  
2021 ◽  
Vol 2021 ◽  
pp. 1-9
Author(s):  
Maria Solyanik-Gorgone ◽  
Jiachi Ye ◽  
Mario Miscuglio ◽  
Andrei Afanasev ◽  
Alan E. Willner ◽  
...  

While information is ubiquitously generated, shared, and analyzed in modern life, there is still controversy about how to assess the amount and quality of information inside a noisy optical channel. A number of theoretical approaches based on, e.g., conditional Shannon entropy and Fisher information have been developed, along with some experimental validations. Some of these approaches are limited to a certain alphabet, while others fall short when applied to structured optical beams such as Hermite-Gauss, Laguerre-Gauss, and other nontrivial modes. Here, we propose a new definition of classical Shannon information via the Wigner distribution function, while respecting the Heisenberg inequality. Following this definition, we calculate the amount of information in Gaussian, Hermite-Gaussian, and Laguerre-Gaussian laser modes in juxtaposition and validate it experimentally by reconstructing the Wigner distribution function from the intensity distribution of structured laser beams. We experimentally demonstrate a technique that allows the field structure of laser beams in singular optics to be inferred in order to assess the amount of contained information. Given its generality, this approach of defining information via beam complexity is applicable to laser modes of any topology that can be described by well-behaved functions. Classical Shannon information, defined in this way, is detached from any particular alphabet, i.e., communication scheme, and scales with the structural complexity of the system. Such a synergy between the Wigner distribution function, which encompasses information in both real and reciprocal space, and information as a measure of disorder can contribute to future coherent detection algorithms and remote sensing.
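As a minimal numerical sketch of the underlying idea (not the authors' exact procedure): for the fundamental Gaussian mode the Wigner function is non-negative and normalized, so a Shannon-type differential entropy −∬ W ln W dx dp can be evaluated directly on a phase-space grid. Higher-order Hermite-Gauss and Laguerre-Gauss modes have partially negative Wigner functions, which is precisely where a careful definition such as the one proposed above is needed.

```python
import numpy as np

def wigner_gaussian(x, p):
    """Wigner function of the fundamental Gaussian mode (hbar = 1, unit widths)."""
    return np.exp(-(x**2 + p**2)) / np.pi

def wigner_entropy(W, dx, dp):
    """Differential Shannon entropy -sum W ln W dx dp (requires W >= 0)."""
    mask = W > 0
    return -np.sum(W[mask] * np.log(W[mask])) * dx * dp

# phase-space grid wide enough that the Gaussian tails are negligible
x = np.linspace(-6, 6, 601)
p = np.linspace(-6, 6, 601)
dx, dp = x[1] - x[0], p[1] - p[0]
X, P = np.meshgrid(x, p)

W = wigner_gaussian(X, P)
S = wigner_entropy(W, dx, dp)
# analytic value for this mode: 1 + ln(pi) ~ 2.1447
print(f"S = {S:.4f}  (analytic: {1 + np.log(np.pi):.4f})")
```

The analytic check works because for W = (1/π)·exp(−x²−p²) the integral −∬ W ln W separates into ln π plus the phase-space variance term, giving 1 + ln π.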


Author(s):  
Maurizio Magarini ◽  
Pasquale Stano

In this Perspective article we focus on the opportunity to model Shannon and/or “semantic” information in the field that originates from the convergence of bottom-up synthetic biology (in particular, the construction of “synthetic cells”) and engineering approaches to molecular communication. In particular, we argue that the emerging technology of synthetic-cell fabrication will offer novel opportunities to study nano-scale communication and the manipulation of information in an unprecedented manner. More specifically, we discuss the possibility of inquiring into the transfer and manipulation of information in the chemical domain, and of interpreting such dynamics according to Shannon or to MacKay-Bateson (“semantic”) information.


Author(s):  
Edward Bormashenko ◽  
Irina Legchenkova ◽  
Mark Frenkel ◽  
Nir Shvalb ◽  
Shraga Shoval

Informational (Shannon) measures of symmetry are introduced and analyzed for patterns built of 1D and 2D shapes. The informational measure of symmetry Hsym(G) characterizes the averaged uncertainty in the presence of symmetry elements from the group G in a given pattern, whereas the Shannon-like measure of symmetry Ωsym(G) quantifies the averaged uncertainty of the appearance of shapes possessing in total n elements of symmetry belonging to the group G in a given pattern. Hsym(G1) = Ωsym(G1) = 0 for patterns built of irregular, non-symmetric shapes. Both informational measures of symmetry are intensive parameters of the pattern: they do not depend on the number of shapes, their size, or the area of the pattern, and they are insensitive to the long-range order inherent to the pattern. Informational measures of symmetry of fractal patterns are addressed, mixed patterns including curves and shapes are considered, the time evolution of the Shannon measures of symmetry is treated, and close-packed and dispersed 2D patterns are analyzed.
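A hypothetical sketch of how such a Shannon-type symmetry measure behaves, under the simplifying assumption that each shape in the pattern is tagged with the symmetry element it realizes and the measure is the Shannon entropy of the resulting frequencies (the paper's Hsym(G) and Ωsym(G) are defined more carefully over the group G):

```python
from collections import Counter
from math import log

def shannon_symmetry_measure(elements):
    """Shannon entropy over symmetry-element frequencies in a pattern.

    `elements` lists, per shape, the symmetry element that shape realizes
    (an illustrative simplification of the measures discussed above).
    """
    counts = Counter(elements)
    n = len(elements)
    return sum(-(c / n) * log(c / n) for c in counts.values())

# pattern of squares only: a single symmetry element, zero uncertainty
print(shannon_symmetry_measure(["C4"] * 10))               # → 0.0
# mixed pattern of squares and equilateral triangles: ln 2 ≈ 0.693
print(shannon_symmetry_measure(["C4"] * 5 + ["C3"] * 5))
```

Note that the measure is intensive in the sense described above: doubling the pattern (e.g., 50 squares and 50 triangles instead of 5 and 5) leaves the value unchanged, since only the frequencies enter.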


2021 ◽  
Vol 104 (2) ◽  
Author(s):  
Gabriella Dantas Franco ◽  
Flavia Maria Darcie Marquitti ◽  
Lucas D. Fernandes ◽  
Dan Braha ◽  
Marcus Aloizio Martinez de Aguiar

Entropy ◽  
2021 ◽  
Vol 23 (8) ◽  
pp. 1021
Author(s):  
James Fullwood ◽  
Arthur J. Parzygnat

We provide a stochastic extension of the Baez–Fritz–Leinster characterization of the Shannon information loss associated with a measure-preserving function. This recovers the conditional entropy and a closely related information-theoretic measure that we call conditional information loss. Although not functorial, these information measures are semi-functorial, a concept we introduce that is definable in any Markov category. We also introduce the notion of an entropic Bayes’ rule for information measures, and we provide a characterization of conditional entropy in terms of this rule.
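In the finite, discrete case the information loss of a deterministic map can be sketched directly: for a distribution p and a function f, the loss H(p) − H(f∗p) equals the conditional entropy H(X | f(X)). A minimal illustration, with distributions represented as dicts (the names here are illustrative, not taken from the paper):

```python
from collections import defaultdict
from math import log

def entropy(p):
    """Shannon entropy of a finite distribution given as {outcome: probability}."""
    return -sum(q * log(q) for q in p.values() if q > 0)

def pushforward(p, f):
    """Distribution of f(X) when X ~ p."""
    q = defaultdict(float)
    for x, px in p.items():
        q[f(x)] += px
    return dict(q)

def information_loss(p, f):
    """For deterministic f, H(p) - H(f_* p) = conditional entropy H(X | f(X))."""
    return entropy(p) - entropy(pushforward(p, f))

# uniform distribution on {0,1,2,3}; taking the parity collapses 4 outcomes to 2,
# losing ln 4 - ln 2 = ln 2 nats of information
p = {0: 0.25, 1: 0.25, 2: 0.25, 3: 0.25}
print(information_loss(p, lambda x: x % 2))  # ln 2 ≈ 0.693
```

An injective map loses nothing (the loss is zero), which is the degenerate case of the characterization: all the uncertainty about X is retained by f(X).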


Author(s):  
Olimpia Lombardi ◽  
Cristian López
