algorithmic information
Recently Published Documents


TOTAL DOCUMENTS

195
(FIVE YEARS 65)

H-INDEX

17
(FIVE YEARS 2)

Entropy ◽  
2021 ◽  
Vol 23 (12) ◽  
pp. 1654
Author(s):  
Tiasa Mondol ◽  
Daniel G. Brown

We build an analysis of computational creativity based on Algorithmic Information Theory and extend it to revisit computational aesthetics, thereby improving on existing efforts at its formulation. We discuss Kolmogorov complexity, models and randomness deficiency (a measure of how much a model falls short of capturing the regularities in an artifact) and show that the notions of typicality and novelty of a creative artifact follow naturally from such definitions. Other exciting formalizations of aesthetic measures include logical depth and sophistication, with which we can define, respectively, the value and the creator's artistry present in a creative work. We then look at some related research that combines information theory and creativity and analyze it with the algorithmic tools that we develop throughout the paper. Finally, we assemble the ideas and their algorithmic counterparts to complete an algorithmic information theoretic recipe for computational creativity and aesthetics.
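Kolmogorov complexity itself is uncomputable, but compressed length gives a computable upper bound on it, which is the standard practical proxy for the regularity the abstract describes. A minimal sketch (ours, not the paper's) using Python's zlib shows how a highly regular artifact admits a much shorter description than a pseudo-random one:

```python
import random
import zlib

def compressed_ratio(data: bytes) -> float:
    """Compressed length per input byte: a crude, computable upper bound
    on Kolmogorov complexity per byte (up to compressor overhead)."""
    return len(zlib.compress(data, 9)) / len(data)

# A highly regular artifact has a short description, so it compresses well...
regular = b"abab" * 250

# ...while pseudo-random bytes are near-incompressible (seeded for determinism).
rng = random.Random(0)
noisy = bytes(rng.randrange(256) for _ in range(1000))

print(compressed_ratio(regular))  # far below 1
print(compressed_ratio(noisy))    # near (or slightly above) 1
```

The gap between the two ratios is the compression-based analogue of randomness deficiency: the structured string has regularities a model (here, the compressor) can capture, while the random string does not.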


2021 ◽  
Author(s):  
Michael Cader Nelson

Every statistical estimate is equal to the sum of a nonrandom component, due to parameter values and bias, and a random component, due to sampling error. Estimation theory suggests that the two components are hopelessly confounded in the estimate. We would like to estimate the sign and magnitude of a statistic's random deviation from its parameter (its accuracy) in the same way we quantify a statistic's random variability around its parameter (its precision) by estimating the standard error. However, because the random component is an attribute of the sample data, it cannot be described with parametric or Fisher information. In information theory, on the other hand, every information type (entropy, complexity) is understood as describing the extent of randomness in manifest data. This suggests that integrating the two conceptions of information could allow us to describe the two components of a statistical estimate, if only we could identify a common link between the two paradigms. The matching statistic, m, is such a link. For paired, ranked vectors X and Y of length n, m is the total number of paired observations in X and Y with matching ranks: m = #{i : R(Xi) = R(Yi)}. That is, m is the number of fixed points between the vectors. m has a long history in statistics, having served as the test statistic of a little-known null hypothesis statistical test (NHST) for the correlation coefficient, dating to around the turn of the twentieth century, called the matching method. Subtracting m from n yields a metric with a long history in information theory, the Hamming distance, a classic metric of the conditional complexity K(Y|X). 
Thus, m simultaneously contains both the Fisher information in a bivariate sample about the latent correlation and the conditional complexity, or algorithmic information, about the manifest observations. This paper shows that the presence of these two conflicting information types in m manifests as a peculiar attribute of the statistic: m has an asymptotic efficiency less than or equal to zero relative to conventional correlation estimators computed on the same data. This means its Fisher information content decreases with increasing sample size, so that m's random component is disproportionately large. Furthermore, when m and Pearson's r are computed on the same sample, the two share a random component, and the value of m is indicative of the accuracy of r with respect to that component. After this utility of m is established by theoretical and empirical means (Monte Carlo simulations), additional matching statistics are constructed, including one composite statistic that is even more informative of the accuracy of r and another that is indicative of the accuracy of Cohen's d. Potential applications for computing accuracy-adjusted r are described, and implications are discussed.
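The matching statistic and its Hamming-distance complement are straightforward to compute. A minimal sketch (function names are ours, not the paper's; ties are assumed absent for simplicity):

```python
from typing import Sequence

def ranks(v: Sequence[float]) -> list:
    """Rank of each element, 1 = smallest (assumes no ties)."""
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0] * len(v)
    for rank, idx in enumerate(order, start=1):
        r[idx] = rank
    return r

def matching_statistic(x: Sequence[float], y: Sequence[float]) -> int:
    """m = number of indices i with R(x_i) == R(y_i) (rank fixed points)."""
    return sum(a == b for a, b in zip(ranks(x), ranks(y)))

def hamming_distance(x: Sequence[float], y: Sequence[float]) -> int:
    """n - m: the number of rank positions that disagree."""
    return len(x) - matching_statistic(x, y)

x = [0.1, 0.4, 0.9, 1.3]   # ranks: [1, 2, 3, 4]
y = [2.0, 3.0, 5.0, 4.0]   # ranks: [1, 2, 4, 3]
print(matching_statistic(x, y))  # 2
print(hamming_distance(x, y))    # 2
```

Here the first two pairs match in rank and the last two are swapped, so m = 2 and the Hamming distance n - m = 2.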


Entropy ◽  
2021 ◽  
Vol 23 (11) ◽  
pp. 1524
Author(s):  
Daniel G. Brown ◽  
Tiasa Mondol

We discuss how to assess computationally the aesthetic value of “small” objects, namely those that have short digital descriptions. Such small objects still matter: they include headlines, poems, song lyrics, short musical scripts and other culturally crucial items. Yet small objects are a confounding case for our recent work adapting ideas from algorithmic information theory (AIT) to the domain of computational creativity, as they can be neither logically deep nor sophisticated under the traditional definitions of AIT. We show how restricting the class of models under analysis still allows us to separate high-quality small objects from ordinary ones, and we discuss the strengths and limitations of our adaptation.


2021 ◽  
Vol ahead-of-print (ahead-of-print) ◽  
Author(s):  
Jutta Haider ◽  
Olof Sundin

Purpose
The article makes an empirical and conceptual contribution to understanding the temporalities of information literacies. It aims to identify different ways in which the anticipation of certain outcomes shapes strategies and tactics for engagement with algorithmic information intermediaries. The paper suggests that, given the dominance of predictive algorithms in society, information literacies need to be understood as sites of anticipation.

Design/methodology/approach
The article explores the ways in which the invisible algorithms of information intermediaries are conceptualised, made sense of and challenged by young people in their everyday lives. This is couched in a conceptual discussion of the role of anticipation in understanding expressions of information literacies in algorithmic cultures. The empirical material consists of semi-structured pair interviews with 61 17- to 19-year-olds, carried out in Sweden and Denmark. The analysis was carried out by means of a qualitative thematic analysis in three steps, along two sensitising concepts: agency and temporality.

Findings
The results are presented through three themes: anticipating personalisation, divergences and interventions. These highlight how articulating an anticipatory stance works towards connecting individual responsibilities, collective responsibilities and corporate interests, thus potentially facilitating an understanding of information as co-constituted by the socio-material conditions that enable it. This has clear implications for the framing of information literacies in relation to algorithmic systems.

Originality/value
The notion of algo-rhythm awareness constitutes a novel contribution to the field. By centring the role of anticipation in the emergence of information literacies, the article advances understanding of the temporalities of information.


Sci ◽  
2021 ◽  
Vol 3 (4) ◽  
pp. 35
Author(s):  
Peter Verheyen

How does the world around us work and what is real? This question has preoccupied humanity since its beginnings. From the 16th century onwards, it has periodically been necessary to revise the prevailing worldview, but things became very strange at the beginning of the 20th century with the advent of relativity theory and quantum physics. The current focus is on the role of information, with an ongoing debate about whether it is ontological or epistemological. A theory has recently been formulated in which spacetime and gravity emerge from microscopic quantum information, more specifically from quantum entanglement via entanglement entropy. A recent theory describes the emergence of reality itself through first-person perspective experiences and algorithmic information theory. In quantum physics, perception and observation play a central role. Perception of and interaction with the environment require an exchange of information. Via biochemical projection, information is given an interpretation that is necessary to make life and consciousness possible. The world around us is not at all what it seems.


2021 ◽  
Vol 13 (3) ◽  
pp. 1-15
Author(s):  
Neil Lutz

Algorithmic fractal dimensions quantify the algorithmic information density of individual points and may be defined in terms of Kolmogorov complexity. This work uses these dimensions to bound the classical Hausdorff and packing dimensions of intersections and Cartesian products of fractals in Euclidean spaces. This approach shows that two prominent, fundamental results about the dimension of Borel or analytic sets also hold for arbitrary sets.
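For reference, the two classical results in question, stated here in their standard Borel/analytic formulations (the paper extends them to arbitrary sets E, F in Euclidean space), are the product inequality for Hausdorff dimension and the almost-everywhere intersection formula:

```latex
% Product inequality: the Hausdorff dimension of a Cartesian product
% is at least the sum of the factors' Hausdorff dimensions.
\dim_H(E) + \dim_H(F) \le \dim_H(E \times F)

% Intersection formula: for E, F \subseteq \mathbb{R}^n and
% Lebesgue-almost every translation z \in \mathbb{R}^n,
\dim_H\bigl(E \cap (F + z)\bigr) \le \max\{0,\ \dim_H(E \times F) - n\}
```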


Entropy ◽  
2021 ◽  
Vol 23 (10) ◽  
pp. 1288
Author(s):  
Sean Devine

According to Landauer’s principle, at least k_B T ln 2 joules are needed to erase a bit that stores information in a thermodynamic system at temperature T. However, the arguments for the principle rely on a regime where the equipartition principle holds. This paper, by exploring a simple model of a thermodynamic system using algorithmic information theory, shows that the energy cost of transferring a bit, or restoring the original state, is k_B T ln 2 joules for a reversible system. The principle is a direct consequence of the statistics required to allocate energy between stored energy states and thermal states, and it applies outside the validity of the equipartition principle. As the thermodynamic entropy of a system coincides with the algorithmic entropy of a typical state specifying the momentum degrees of freedom, it can quantify the thermodynamic requirements in terms of the bit flows needed to maintain a system distant from the equilibrium set of states. The approach offers a simple conceptual understanding of entropy, while avoiding problems with the statistical-mechanics approach to the second law of thermodynamics. Furthermore, the classical articulation of the principle can be used to derive low-temperature heat capacities, and it is consistent with the quantum version of the principle.
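The Landauer bound is easy to evaluate numerically. A small sketch, assuming only the exact SI value of Boltzmann's constant:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant in J/K (exact SI 2019 value)

def landauer_bound(temperature_k: float) -> float:
    """Minimum energy in joules to erase one bit at the given temperature,
    per Landauer's principle: E >= k_B * T * ln 2."""
    return K_B * temperature_k * math.log(2)

# At room temperature (300 K) erasing one bit costs at least ~2.87e-21 J.
print(landauer_bound(300.0))
```

The linearity in T makes the regime dependence discussed above concrete: halving the temperature halves the minimum erasure cost, while the bound vanishes only at absolute zero.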


Author(s):  
Peter Verheyen

How does the world around us work and what is real? This question has preoccupied humanity since its beginnings. From the 16th century onwards, it has periodically been necessary to revise the prevailing worldview. But things became very strange at the beginning of the 20th century with the advent of relativity theory and quantum physics. The current focus is on the role of information, with an ongoing debate about whether it is ontological or epistemological. A theory has recently been formulated in which spacetime and gravity emerge from microscopic quantum information, more specifically from quantum entanglement via entanglement entropy. A recent theory describes the emergence of reality itself through first-person perspective experiences and algorithmic information theory. In quantum physics, perception and observation play a central role. Perception of, and interaction with, the environment require an exchange of information. Via biochemical projection, information is given an interpretation that is necessary to make life and consciousness possible. The world around us is not at all what it seems.


