Algorithmic Information Theory
Recently Published Documents

Total documents: 123 (five years: 32)
H-index: 12 (five years: 1)

Entropy ◽ 2021 ◽ Vol 23 (12) ◽ pp. 1654
Author(s): Tiasa Mondol ◽ Daniel G. Brown

We build an analysis of computational creativity based on algorithmic information theory and extend it to revisit computational aesthetics, thereby improving on existing formulations. We discuss Kolmogorov complexity, models and randomness deficiency (a measure of how far a model falls short of capturing the regularities in an artifact) and show that the notions of typicality and novelty of a creative artifact follow naturally from these definitions. Other formalizations of aesthetic measures include logical depth and sophistication, with which we can define, respectively, the value and the creator's artistry present in a creative work. We then look at related research that combines information theory and creativity and analyze it with the algorithmic tools that we develop throughout the paper. Finally, we assemble the ideas and their algorithmic counterparts to complete an algorithmic information theoretic recipe for computational creativity and aesthetics.
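Kolmogorov complexity is uncomputable, but the quantities in this abstract are often approximated with an off-the-shelf compressor: the compressed length upper-bounds K(x), and the gap between a string's raw length and its compressed length serves as a crude stand-in for randomness deficiency with respect to the uniform model. A minimal sketch of that proxy (not the authors' actual machinery):

```python
import os
import zlib

def complexity_proxy(data: bytes) -> int:
    """Upper-bound proxy (in bits) for Kolmogorov complexity K(x):
    the output length of a fixed general-purpose compressor."""
    return 8 * len(zlib.compress(data, 9))

def randomness_deficiency_proxy(data: bytes) -> int:
    """Proxy for randomness deficiency w.r.t. the uniform model over
    strings of this length: how far the description length of x falls
    below the model's log-size (here 8 * len(x) bits)."""
    return 8 * len(data) - complexity_proxy(data)

# A highly regular artifact shows a large deficiency (the uniform model
# misses its structure); incompressible noise shows almost none.
regular = b"ab" * 500
noise = os.urandom(1000)
print(randomness_deficiency_proxy(regular))  # large positive
print(randomness_deficiency_proxy(noise))
```

Typicality and novelty judgments in the paper's sense would then compare such deficiencies across candidate models rather than against the uniform model alone.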


Entropy ◽ 2021 ◽ Vol 23 (11) ◽ pp. 1524
Author(s): Daniel G. Brown ◽ Tiasa Mondol

We discuss how to assess computationally the aesthetic value of “small” objects, namely those that have short digital descriptions. Such small objects still matter: they include headlines, poems, song lyrics, short musical scripts and other culturally crucial items. Yet small objects are a confounding case for our recent work adapting ideas from algorithmic information theory (AIT) to the domain of computational creativity, since under AIT's traditional definitions they can be neither logically deep nor sophisticated. We show how restricting the class of models under analysis makes it possible to separate high-quality small objects from ordinary ones, and discuss the strengths and limitations of our adaptation.
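One concrete (and admittedly crude) way to "restrict the class of models" in the spirit of this abstract is conditional compression: supply a candidate model as a zlib preset dictionary and measure how many bits the small object costs given that model, a stand-in for K(x | model). The corpus and headline below are invented for illustration, not taken from the paper:

```python
import zlib

def description_length(x: bytes, model: bytes = b"") -> int:
    """Bits needed to describe x, optionally conditioned on a 'model'
    supplied as a zlib preset dictionary (a crude proxy for K(x | model))."""
    if model:
        c = zlib.compressobj(level=9, zdict=model)
    else:
        c = zlib.compressobj(level=9)
    return 8 * len(c.compress(x) + c.flush())

# A short object is too small to be logically deep on its own, but a
# restricted model capturing its idiom still shortens its description.
model = b"the quick brown fox jumps over the lazy dog "
headline = b"the lazy dog jumps over the quick brown fox"
print(description_length(headline))         # unconditional bits
print(description_length(headline, model))  # fewer bits given the model
```

The separation the authors describe would compare such conditional description lengths across a deliberately limited model class, rather than over all programs.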


Sci ◽ 2021 ◽ Vol 3 (4) ◽ pp. 35
Author(s): Peter Verheyen

How does the world around us work and what is real? This question has preoccupied humanity since its beginnings. From the 16th century onwards, it has periodically been necessary to revise the prevailing worldview—but things became very strange at the beginning of the 20th century with the advent of relativity theory and quantum physics. The current focus is on the role of information, with an ongoing debate about whether information is ontological or epistemological. A theory has recently been formulated in which spacetime and gravity emerge from microscopic quantum information—more specifically, from quantum entanglement via entanglement entropy. Another recent theory describes the emergence of reality itself through first-person perspective experiences and algorithmic information theory. In quantum physics, perception and observation play a central role. Perception of and interaction with the environment require an exchange of information. Via biochemical projection, information is given an interpretation that is necessary to make life and consciousness possible. The world around us is not at all what it seems.


Entropy ◽ 2021 ◽ Vol 23 (10) ◽ pp. 1288
Author(s): Sean Devine

According to Landauer’s principle, at least k_B T ln 2 joules are needed to erase a bit that stores information in a thermodynamic system at temperature T. However, the arguments for the principle rely on a regime where the equipartition principle holds. This paper, by exploring a simple model of a thermodynamic system using algorithmic information theory, shows that the energy cost of transferring a bit, or restoring the original state, is k_B T ln 2 joules for a reversible system. The principle is a direct consequence of the statistics required to allocate energy between stored energy states and thermal states, and applies outside the validity of the equipartition principle. As the thermodynamic entropy of a system coincides with the algorithmic entropy of a typical state specifying the momentum degrees of freedom, it can quantify the thermodynamic requirements in terms of the bit flows needed to maintain a system distant from the equilibrium set of states. The approach offers a simple conceptual understanding of entropy, while avoiding problems with the statistical mechanics approach to the second law of thermodynamics. Furthermore, the classical articulation of the principle can be used to derive low-temperature heat capacities, and is consistent with the quantum version of the principle.
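The bound stated in this abstract is straightforward to evaluate numerically. A small sketch, using the exact SI value of the Boltzmann constant (the function name is ours, not the paper's):

```python
from math import log

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact since the 2019 SI redefinition)

def landauer_bound(temperature: float, bits: int = 1) -> float:
    """Minimum energy in joules to erase (or, per the abstract, transfer)
    `bits` bits at absolute temperature `temperature` in kelvin."""
    return bits * K_B * temperature * log(2)

# At room temperature (300 K), erasing one bit costs at least ~2.87e-21 J,
# which is why the bound is far below the switching energies of real electronics.
print(landauer_bound(300.0))
```

The linearity in both T and the bit count is what lets the paper recast thermodynamic maintenance costs as bit flows.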



Entropy ◽ 2021 ◽ Vol 23 (7) ◽ pp. 835
Author(s): Felipe S. Abrahão ◽ Klaus Wehmuth ◽ Hector Zenil ◽ Artur Ziviani

In this article, we investigate limitations of importing methods based on algorithmic information theory from monoplex networks into multidimensional networks (such as multilayer networks) that have a large number of extra dimensions (i.e., aspects). In the worst-case scenario, it has been previously shown that node-aligned multidimensional networks with non-uniform multidimensional spaces can display exponentially larger algorithmic information (or lossless compressibility) distortions with respect to their isomorphic monoplex networks, so that these distortions grow at least linearly with the number of extra dimensions. In the present article, we demonstrate that node-unaligned multidimensional networks, either with uniform or non-uniform multidimensional spaces, can also display exponentially larger algorithmic information distortions with respect to their isomorphic monoplex networks. However, unlike the node-aligned non-uniform case studied in previous work, these distortions in the node-unaligned case grow at least exponentially with the number of extra dimensions. On the other hand, for node-aligned multidimensional networks with uniform multidimensional spaces, we demonstrate that any distortion can only grow up to a logarithmic order of the number of extra dimensions. Thus, these results establish that isomorphisms between finite multidimensional networks and finite monoplex networks do not preserve algorithmic information in general and highlight that the algorithmic information of the multidimensional space itself needs to be taken into account in multidimensional network complexity analysis.
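The isomorphisms discussed here map each composite vertex (node, a1, …, ad) of a node-aligned multidimensional network onto a single vertex of a monoplex graph; the article's point is that this relabeling does not, in general, preserve algorithmic information. A toy sketch of the mapping itself (function names are ours, and this illustrates only the relabeling, not the distortion bounds):

```python
from itertools import product

def composite_to_monoplex(nodes, aspects):
    """Bijection from composite vertices (node, a1, ..., ad) of a
    node-aligned multidimensional network onto integers 0..N-1,
    i.e. the vertex set of the isomorphic monoplex graph."""
    return {c: i for i, c in enumerate(product(nodes, *aspects))}

def flatten_edges(multi_edges, iso):
    """Project multidimensional edges (pairs of composite vertices)
    to edges of the monoplex graph under the bijection."""
    return [(iso[u], iso[v]) for u, v in multi_edges]

# Toy 2-aspect network: 3 nodes x 2 layers x 2 time steps -> 12 composite vertices.
iso = composite_to_monoplex(range(3), [range(2), range(2)])
edges = [((0, 0, 0), (1, 0, 0)), ((1, 1, 0), (2, 1, 1))]
print(flatten_edges(edges, iso))  # -> [(0, 4), (6, 11)]
```

The distortion results say that a shortest program for the monoplex copy can differ sharply in length from one for the original once the encoding of the multidimensional space itself is accounted for.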



