Two-dimensional Kolmogorov complexity and an empirical validation of the Coding theorem method by compressibility

2015 ◽  
Vol 1 ◽  
pp. e23 ◽  
Author(s):  
Hector Zenil ◽  
Fernando Soler-Toscano ◽  
Jean-Paul Delahaye ◽  
Nicolas Gauvrit

We propose a measure based upon the fundamental theoretical concept in algorithmic information theory that provides a natural approach to the problem of evaluating n-dimensional complexity by using an n-dimensional deterministic Turing machine. The technique is interesting because it provides a natural algorithmic process for symmetry breaking, generating complex n-dimensional structures from perfectly symmetric and fully deterministic computational rules and producing a distribution of patterns as described by algorithmic probability. Algorithmic probability also elegantly connects the frequency of occurrence of a pattern with its algorithmic complexity, hence effectively providing estimations of the complexity of the generated patterns. Experiments to validate estimations of algorithmic complexity based on these concepts are presented, showing that the measure is stable in the face of some changes in computational formalism and that the results agree with those obtained using lossless compression algorithms where the two methods overlap in their range of applicability. We then use the output frequency of the set of 2-dimensional Turing machines to classify the algorithmic complexity of the space-time evolutions of Elementary Cellular Automata.
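As a rough illustration of the coding-theorem relation underlying this abstract, K(s) ≈ −log2 m(s), the Python sketch below compares a CTM-style estimate computed from a purely hypothetical output-frequency table with a crude compression-based upper bound. The frequencies are invented for illustration and do not come from the paper's Turing-machine enumeration.

```python
# Minimal sketch of the coding-theorem relation K(s) ~ -log2 m(s),
# using a hypothetical output-frequency table in place of an actual
# enumeration of small Turing machines.
import math
import zlib

# Hypothetical frequencies: how often each string might appear as the
# output of a large sample of small Turing machines (illustrative only).
output_frequency = {
    "0000": 0.12,                # simple, frequently produced -> low estimate
    "0101": 0.05,
    "0110100110010110": 0.0004,  # rarer -> higher estimate
}

def ctm_estimate(s: str) -> float:
    """Coding-theorem estimate: K(s) ~ -log2 m(s)."""
    return -math.log2(output_frequency[s])

def compression_estimate(s: str) -> int:
    """Crude upper bound on K(s) in bits via a lossless compressor."""
    return 8 * len(zlib.compress(s.encode()))

for s in output_frequency:
    print(s, round(ctm_estimate(s), 2), compression_estimate(s))
```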

Entropy ◽  
2018 ◽  
Vol 20 (7) ◽  
pp. 534 ◽  
Author(s):  
Hector Zenil ◽  
Narsis Kiani ◽  
Jesper Tegnér

We introduce a definition of algorithmic symmetry in the context of geometric and spatial complexity, able to capture mathematical aspects of different objects, using polyominoes and polyhedral graphs as case studies. We review, study and apply a method for approximating the algorithmic complexity (also known as Kolmogorov–Chaitin complexity) of graphs and networks based on the concept of Algorithmic Probability (AP). AP is a concept (and method) capable of recursively enumerating all properties of a computable (causal) nature beyond statistical regularities. We explore the connections of algorithmic complexity, both theoretical and numerical, with geometric properties, mainly symmetry and topology, from an (algorithmic) information-theoretic perspective. We show that approximations to algorithmic complexity by lossless compression and an Algorithmic-Probability-based method can characterize spatial, geometric, symmetric and topological properties of mathematical objects and graphs.
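A minimal sketch of the kind of comparison described here, using lossless compression of adjacency matrices as a stand-in for the Algorithmic-Probability-based method: a highly symmetric graph should compress much better than a random graph of the same size. The specific graphs and the choice of compressor are assumptions made purely for illustration.

```python
# Rough sketch: compression-based complexity proxy for the adjacency
# matrices of a highly symmetric graph versus a random graph.
import zlib
import random

def adjacency_string(edges, n):
    """Flatten an n x n adjacency matrix into a 0/1 string."""
    m = [[0] * n for _ in range(n)]
    for i, j in edges:
        m[i][j] = m[j][i] = 1
    return "".join(str(b) for row in m for b in row)

n = 16
# Complete graph: maximal symmetry, large automorphism group.
complete = [(i, j) for i in range(n) for j in range(i + 1, n)]
# Random graph: little symmetry expected.
random.seed(0)
rand_graph = [(i, j) for i in range(n) for j in range(i + 1, n)
              if random.random() < 0.5]

for name, edges in [("complete", complete), ("random", rand_graph)]:
    s = adjacency_string(edges, n)
    # Smaller compressed size -> lower estimated complexity.
    print(name, len(zlib.compress(s.encode())))
```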


Entropy ◽  
2020 ◽  
Vol 22 (6) ◽  
pp. 612 ◽  
Author(s):  
Hector Zenil

Several established and also novel techniques in the field of applications of algorithmic (Kolmogorov) complexity currently co-exist for the first time and are reviewed here, ranging from dominant ones such as statistical lossless compression to newer approaches that advance, complement and also pose new challenges, and which may exhibit their own limitations. Evidence suggesting that these different methods complement each other in different regimes is presented, and despite their many challenges, some of these methods can be better motivated by, and better grounded in, the principles of algorithmic information theory. It is explained how different approaches to algorithmic complexity can explore the relaxation of different necessary and sufficient conditions in their pursuit of numerical applicability, with some of these approaches entailing greater risks than others in exchange for greater relevance. We conclude with a discussion of possible directions that may or should be taken into consideration to advance the field and encourage methodological innovation, but more importantly, to contribute to scientific discovery. This paper also serves as a rebuttal of claims made in a previously published minireview by another author, and offers an alternative account.
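One way to see why such methods occupy different regimes: a statistically random-looking string can still be algorithmically simple. The sketch below contrasts single-symbol Shannon entropy with compressed length for the Thue-Morse sequence versus pseudo-random bits; it illustrates the general point only and is not a method taken from the review.

```python
# Toy illustration: the Thue-Morse sequence has roughly the same
# single-symbol entropy as a random bit string (~1 bit/symbol) yet is
# far more compressible, because it is generated by a very short program.
import math
import random
import zlib

def shannon_entropy(s: str) -> float:
    """Empirical single-symbol Shannon entropy in bits per symbol."""
    probs = [s.count(c) / len(s) for c in set(s)]
    return -sum(p * math.log2(p) for p in probs)

def thue_morse(n: int) -> str:
    """First n bits of the Thue-Morse sequence."""
    return "".join(str(bin(i).count("1") % 2) for i in range(n))

random.seed(1)
rand_bits = "".join(random.choice("01") for _ in range(4096))
tm_bits = thue_morse(4096)

for name, s in [("random", rand_bits), ("thue-morse", tm_bits)]:
    print(name, round(shannon_entropy(s), 3), len(zlib.compress(s.encode())))
```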


Author(s):  
Aritra Sarkar ◽  
Zaid Al-Ars ◽  
Koen Bertels

In this article we explore the limiting behavior of the universal prior distribution obtained when it is applied over a multi-level hierarchy of programs and the output data of a computational automata model. We were motivated to alleviate the effect of Solomonoff's assumption that all computable functions or hypotheses of the same length are equally likely, by weighting each program in turn by the algorithmic probability of its description-number encoding. In the limiting case, the set of all possible program strings of a fixed length converges to a distribution of self-replicating quines and quine relays, having the structure of a constructor. We discuss how experimental algorithmic information theory provides insights towards understanding the fundamental metrics proposed in this work, and reflect on the significance of these results for digital physics and the constructor theory of life.
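A toy, non-universal sketch of how a Solomonoff-style prior m(x) = Σ_{p : U(p)=x} 2^(−|p|) arises from enumerating length-weighted programs; the miniature interpreter below is an invented stand-in for the automata model used in the article and is not drawn from it.

```python
# Toy sketch of a Solomonoff-style universal prior: sum 2^-|p| over all
# programs p whose output is x on a fixed machine. The "machine" here is
# an invented miniature interpreter, not a universal one.
from collections import defaultdict
from itertools import product

def run(program: str, max_out: int = 16) -> str:
    """Interpret bit pairs: 00 append '0', 01 append '1',
    10 duplicate the output so far, 11 halt."""
    out = ""
    for i in range(0, len(program) - 1, 2):
        op = program[i:i + 2]
        if op == "00":
            out += "0"
        elif op == "01":
            out += "1"
        elif op == "10":
            out += out
        else:  # "11" halts
            break
        if len(out) > max_out:
            return ""  # treat as non-halting / discarded
    return out

prior = defaultdict(float)
for length in range(2, 12, 2):
    for bits in product("01", repeat=length):
        x = run("".join(bits))
        if x:
            prior[x] += 2.0 ** -length

# Strings with shorter descriptions accumulate more probability mass.
for x, p in sorted(prior.items(), key=lambda kv: -kv[1])[:5]:
    print(x, p)
```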


2020 ◽  
Vol 2 (1) ◽  
pp. 32-35
Author(s):  
Eric Holloway

Leonid Levin developed the first stochastic conservation-of-information law, describing it as "torturing an uninformed witness cannot give information about the crime." Levin's law unifies both the deterministic and stochastic cases of conservation of information. A proof of Levin's law from algorithmic information theory is given, as well as a discussion of its implications for evolutionary algorithms and fitness functions.
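For reference, one commonly cited schematic form of the conservation inequality, with algorithmic mutual information I(x:y) = K(x) + K(y) − K(x,y); this is a sketch of the general shape of the law, not necessarily the exact statement proved in the article.

```latex
% Schematic form only: deterministic processing by a computable f cannot
% increase mutual information beyond the information in f itself, and
% randomized processing A can do so only with exponentially small probability.
\[
  I(f(x):y) \le I(x:y) + K(f) + O(1)
  \qquad \text{(deterministic } f\text{)}
\]
\[
  \Pr\bigl[\, I(A(x):y) > I(x:y) + k \,\bigr] \le 2^{-k + O(\log k)}
  \qquad \text{(randomized } A\text{)}
\]
```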


Sci ◽  
2021 ◽  
Vol 3 (4) ◽  
pp. 35
Author(s):  
Peter Verheyen

How does the world around us work, and what is real? This question has preoccupied humanity since its beginnings. From the 16th century onwards, it has periodically been necessary to revise the prevailing worldview, but things became very strange at the beginning of the 20th century with the advent of relativity theory and quantum physics. The current focus is on the role of information, with a debate about whether this role is ontological or epistemological. A theory has recently been formulated in which spacetime and gravity emerge from microscopic quantum information, more specifically from quantum entanglement via entanglement entropy. Another recent theory describes the emergence of reality itself through first-person perspective experiences and algorithmic information theory. In quantum physics, perception and observation play a central role. Perception of and interaction with the environment require an exchange of information. Via biochemical projection, information is given an interpretation that is necessary to make life and consciousness possible. The world around us is not at all what it seems.


