information theoretic
Recently Published Documents


TOTAL DOCUMENTS: 4532 (FIVE YEARS: 936)

H-INDEX: 100 (FIVE YEARS: 11)

Author(s): Carlos Baladrón, Andrei Khrennikov

Closed timelike curves (CTCs), non-intuitive theoretical solutions of the field equations of general relativity, can be modelled in quantum mechanics in a way, known as Deutsch-CTCs, that circumvents one of their most paradoxical implications, the so-called grandfather paradox. An outstanding theoretical result of this model is the demonstration that, in the presence of a Deutsch-CTC, a classical computer would be computationally equivalent to a quantum computer. In the present study, the possible implications of such a striking result for the foundations of quantum mechanics, and for the connections between classicality and quantumness, are explored. To this purpose, a model is considered in which fundamental particles interact in physical space by exchanging carriers of momentum and energy. Every particle is then supplemented with an information space in which a probabilistic classical Turing machine is stored. It is analysed whether, through the action of Darwinian evolution, both a classical algorithm coding the rules of quantum mechanics and an anticipation module might plausibly develop on the information space from initially random behaviour. The simulation of a CTC on the information space of the particle by means of the anticipation module would imply that fundamental particles, which possess no intrinsic quantum features from first principles in this information-theoretic Darwinian approach, could nevertheless generate emergent quantum behaviour in real time as a consequence of Darwinian evolution acting on information-theoretic physical systems.


Information, 2022, Vol 13 (1), pp. 39
Author(s): Neri Merhav

In this work, we propose both an improvement and extensions of a reverse Jensen inequality due to Wunder et al. (2021). The new inequalities are fairly tight and reasonably easy to use in a wide variety of situations, as demonstrated in several application examples relevant to information theory. Moreover, the main ideas behind the derivations turn out to be applicable to generating bounds on expectations of multivariate convex/concave functions, as well as of functions that are not necessarily convex or concave.
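For orientation, the classical Jensen inequality and the generic shape of a reverse bound can be written as follows. This is the textbook chord (Edmundson–Madansky) bound for a bounded random variable, shown only to illustrate what "reverse Jensen" means; it is not one of the tighter forms derived in the paper:

```latex
% Jensen: for convex f and a random variable X,
f\bigl(\mathbb{E}[X]\bigr) \;\le\; \mathbb{E}\bigl[f(X)\bigr].
% A classical reverse bound: if X \in [a,b] almost surely, convexity of f gives
\mathbb{E}\bigl[f(X)\bigr] \;\le\; \frac{b-\mathbb{E}[X]}{b-a}\,f(a)
  \;+\; \frac{\mathbb{E}[X]-a}{b-a}\,f(b).
```

A reverse Jensen inequality of this kind bounds the expectation of a convex function from above using limited information about X (here, only the mean and the support); the paper's contribution is sharper and more broadly applicable bounds of this flavor.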


2022, pp. 1-27
Author(s): Clifford Bohm, Douglas Kirkpatrick, Arend Hintze

Deep learning (primarily using backpropagation) and neuroevolution are the preeminent methods of optimizing artificial neural networks. However, they often create black boxes that are as hard to understand as the natural brains they seek to mimic. Previous work has identified an information-theoretic tool, referred to as R, which allows us to quantify and identify mental representations in artificial cognitive systems. The use of such measures has made previously opaque black boxes more transparent. Here we extend R to not only identify where complex computational systems store memory about their environment but also differentiate between different time points in the past. We show how this extended measure can identify the location of memory related to past experiences in neural networks optimized by deep learning as well as by a genetic algorithm.
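As an illustrative proxy for such a measure (not the authors' R itself), one can estimate the discrete mutual information between a past environment state and a node's current state: a peak at lag k locates memory of that time point. A minimal sketch with a hypothetical toy signal:

```python
import numpy as np

def mutual_information(x, y):
    """Plug-in estimate of discrete mutual information I(X;Y) in bits."""
    x, y = np.asarray(x), np.asarray(y)
    mi = 0.0
    for xv in np.unique(x):
        for yv in np.unique(y):
            pxy = np.mean((x == xv) & (y == yv))
            if pxy > 0:
                mi += pxy * np.log2(pxy / (np.mean(x == xv) * np.mean(y == yv)))
    return mi

rng = np.random.default_rng(0)
env = rng.integers(0, 2, 500)   # binary environment signal over time
state = np.roll(env, 2)         # a "node" that stores the environment from 2 steps ago

mi_lag0 = mutual_information(env[2:], state[2:])   # wrong lag: near zero
mi_lag2 = mutual_information(env[:-2], state[2:])  # correct lag: close to 1 bit
```

Scanning the lag in this way separates *where* memory is stored from *which* past time point it refers to, which is the distinction the extended measure is after.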


2022, Vol 0 (0)
Author(s): Eugene Y. S. Chua

Lakatos's analysis of progress and degeneration in the Methodology of Scientific Research Programmes is well known. Less known, however, are his thoughts on degeneration in Proofs and Refutations. I propose and motivate two new criteria for degeneration based on the discussion in Proofs and Refutations: superfluity and authoritarianism. I show how these criteria augment the account in the Methodology of Scientific Research Programmes, providing a generalized Lakatosian account of progress and degeneration. I then apply this generalized account to a key transition point in the history of entropy, the transition to an information-theoretic interpretation of entropy, by assessing Jaynes's 1957 paper on information theory and statistical mechanics.


2022, Vol 9
Author(s): Keerthi Adusumilli, Bradford Brown, Joey Harrison, Matthew Koehler, Jason Kutarnia, ...

The structure and dynamics of modern United States Federal Case Law are examined here. The analyses utilize large-scale network analysis tools, natural language processing techniques, and information theory to examine all the federal opinions in the Court Listener database, containing approximately 1.3 million judicial opinions and 11.4 million citations. The analyses focus on modern United States Federal Case Law, as cases in the Court Listener database range from approximately 1926 to 2020 and include most Federal jurisdictions. We examine the data set from a structural perspective using the citation network, overall and by time and space (jurisdiction). In addition to citation structure, we examine the dataset from a topical and information-theoretic perspective, again overall and by time and space.
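The structural side of such an analysis reduces to degree statistics on a directed citation graph. A minimal sketch with hypothetical case names (the real dataset holds roughly 1.3 million opinions and 11.4 million citation edges):

```python
from collections import defaultdict

# Hypothetical (citing_case, cited_case) pairs standing in for the edge list.
citations = [
    ("Case C", "Case A"), ("Case C", "Case B"),
    ("Case D", "Case A"), ("Case E", "Case A"),
    ("Case E", "Case C"),
]

in_degree = defaultdict(int)   # how often a case is cited (its influence)
out_degree = defaultdict(int)  # how many citations a case makes
for citing, cited in citations:
    out_degree[citing] += 1
    in_degree[cited] += 1

# The most-cited opinion is a candidate "landmark" case in the network.
most_cited = max(in_degree, key=in_degree.get)
```

In-degree is only the simplest structural signal; the same edge list supports richer measures (centrality, community structure) and, sliced by year and jurisdiction, the time-and-space views described above.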


2022, Vol 15
Author(s): Sergio Vicencio-Jimenez, Mario Villalobos, Pedro E. Maldonado, Rodrigo C. Vergara

Explaining the emergence of behavior and understanding from underlying neural mechanisms remains elusive. One renowned proposal is the Free Energy Principle (FEP), which uses an information-theoretic framework derived from thermodynamic considerations to describe how behavior and understanding emerge. The FEP starts from a whole-organism approach, based on mental states and phenomena, and maps them onto the neuronal substrate. An alternative approach, the Energy Homeostasis Principle (EHP), undertakes a similar explanatory effort but starts from single-neuron phenomena and builds up to whole-organism behavior and understanding. In this work, we further develop the EHP as a distinct but complementary vision to the FEP and try to explain how behavior and understanding would emerge from the local requirements of neurons. Based on the EHP and a strictly naturalist approach that sees living beings as physical and deterministic systems, we describe scenarios in which learning would emerge without the need for volition or goals. From these starting points, we lay out how we see the nervous system, particularly the notions of function, purpose, and goal-oriented behavior. We problematize these conceptions and offer an alternative, teleology-free framework in which behavior and, ultimately, understanding would still emerge. We reinterpret neural processing by explaining basic learning scenarios up to simple anticipatory behavior. Finally, we end the article with an evolutionary perspective on how this non-goal-oriented behavior appeared. We acknowledge that our proposal, in its current form, is still far from explaining the emergence of understanding. Nonetheless, we set the ground for an alternative neuron-based framework that may ultimately explain understanding.


Author(s): Moritz Wiese, Holger Boche

We study security functions which can serve to establish semantic security for the two central problems of information-theoretic security: the wiretap channel, and privacy amplification for secret key generation. The security functions are functional forms of mosaics of combinatorial designs, more precisely, of group divisible designs and balanced incomplete block designs. Every member of a mosaic is associated with a unique color, and each color corresponds to a unique message or key value. Every block index of the mosaic corresponds to a public seed shared between the two trusted communicating parties. The seed set should be as small as possible. We give explicit examples which have an optimal or nearly optimal trade-off of seed length versus color (i.e., message or key) rate. We also derive bounds for the security performance of security functions given by functional forms of mosaics of designs.
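For reference, one of the two design families mentioned, the balanced incomplete block design (BIBD), has a standard definition (the notation below is the textbook one, not necessarily the paper's):

```latex
% A (v, k, \lambda)-BIBD is a pair (X, \mathcal{B}) with a set X of |X| = v
% points and a family \mathcal{B} of k-element blocks B \subseteq X, such that
% every pair of distinct points lies in exactly \lambda blocks. Counting pairs
% then forces each point to lie in the same number r of blocks, with
r = \frac{\lambda\,(v-1)}{k-1}, \qquad b = |\mathcal{B}| = \frac{v\,r}{k}.
```

The high regularity of such designs, every color class covering the key space evenly, is what makes their mosaics usable as security functions with uniform guarantees across seeds.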


2022
Author(s): Swarnavo Sarkar, Jayan Rammohan

Living cells process information about their environment through the central dogma processes of transcription and translation, which drive the cellular response to stimuli. Here, we study the transfer of information from environmental input to transcript and protein expression levels. Evaluation of both experimental and analogous simulation data reveals that transcription and translation are not two simple information channels connected in series. Instead, we show that the central dogma reactions often create a time-integrating information channel, in which the translation channel receives and integrates multiple outputs from the transcription channel. This information-channel model of the central dogma provides new information-theoretic selection criteria for the central dogma rate constants. Using data for four well-studied species, we show that their central dogma rate constants achieve an information gain due to time integration while keeping the loss due to stochasticity in translation relatively low (&lt;0.5 bits).
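A highly simplified way to see why time integration gains information (an assumption-laden sketch, not the authors' model): treat each transcript reading as a binary symmetric channel with error probability p, and let "translation" integrate three independent readings by majority vote. The integrated channel then carries more bits about the input than a single reading:

```python
import math

def h2(p):
    """Binary entropy in bits."""
    return 0.0 if p in (0.0, 1.0) else -p * math.log2(p) - (1 - p) * math.log2(1 - p)

p = 0.2                            # per-reading error rate (illustrative assumption)
single = 1 - h2(p)                 # capacity of one transcript reading (BSC)
p_maj = p**3 + 3 * p**2 * (1 - p)  # error after majority vote over three readings
integrated = 1 - h2(p_maj)         # capacity after time integration
```

Majority voting lowers the effective error rate (0.104 versus 0.2 here), so the downstream channel sees a cleaner signal; the analogous effect in the cell is the protein level averaging over many transcription events.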

