Computational Creativity and Aesthetics with Algorithmic Information Theory

Entropy ◽  
2021 ◽  
Vol 23 (12) ◽  
pp. 1654
Author(s):  
Tiasa Mondol ◽  
Daniel G. Brown

We build an analysis of computational creativity based on Algorithmic Information Theory and extend it to revisit computational aesthetics, thereby improving on existing efforts at its formulation. We discuss Kolmogorov complexity, models and randomness deficiency (a measure of how much a model falls short of capturing the regularities in an artifact) and show that the notions of typicality and novelty of a creative artifact follow naturally from these definitions. Other formalizations of aesthetic measures include logical depth and sophistication, with which we can define, respectively, the value and the creator’s artistry present in a creative work. We then look at related research that combines information theory and creativity and analyze it with the algorithmic tools that we develop throughout the paper. Finally, we assemble these ideas and their algorithmic counterparts into an algorithmic information theoretic recipe for computational creativity and aesthetics.
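For reference, the AIT quantities invoked in this abstract have standard forms (stated here for finite-set models M containing the artifact x; the paper's exact conditioning and constants may differ):

\[ \delta(x \mid M) = \log |M| - K(x \mid M) \]
\[ \mathrm{soph}_s(x) = \min \{ K(M) : x \in M,\ K(M) + \log |M| \le K(x) + s \} \]
\[ \mathrm{depth}_s(x) = \min \{ \mathrm{time}(p) : U(p) = x,\ |p| \le K(x) + s \} \]

Small randomness deficiency \[\delta(x \mid M)\] makes x typical for the model M; sophistication measures the complexity of the simplest adequate model; logical depth measures the running time of near-shortest programs for x on a universal machine U.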

2010 ◽  
Vol 21 (03) ◽  
pp. 321-327 ◽  
Author(s):  
YEN-WU TI ◽  
CHING-LUEH CHANG ◽  
YUH-DAUH LYUU ◽  
ALEXANDER SHEN

A bit string is random (in the sense of algorithmic information theory) if it is incompressible, i.e., its Kolmogorov complexity is close to its length. Two random strings are independent if knowing one of them does not simplify the description of the other, i.e., the conditional complexity of each string (using the other as a condition) is close to its length. We may define independence of a k-tuple of strings in the same way. In this paper we address the following question: what is the maximal cardinality of a set of n-bit strings if any k elements of this set are independent (up to a certain constant)? Lower and upper bounds that match each other (with logarithmic precision) are provided.
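Written out (a hedged restatement of the abstract's definitions; the constant c is the tolerance parameter): an n-bit string x is c-incompressible if \[ K(x) \ge n - c \], and a k-tuple \[ (x_1, \ldots, x_k) \] of n-bit strings is c-independent if

\[ K(x_i \mid x_1, \ldots, x_{i-1}, x_{i+1}, \ldots, x_k) \ge n - c \quad \text{for every } i. \]

The question is then the maximal size of a set \[ S \subseteq \{0,1\}^n \] all of whose k-element subsets are c-independent.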


Author(s):  
TUOMAS ORPONEN

Recently, Lutz and Stull used methods from algorithmic information theory to prove two new Marstrand-type projection theorems, concerning subsets of Euclidean space which are not assumed to be Borel, or even analytic. One of the theorems states that if \[K \subset \mathbb{R}^n\] is any set with equal Hausdorff and packing dimensions, then

\[ \dim_{\mathrm{H}} \pi_e(K) = \min\{\dim_{\mathrm{H}} K,\, 1\} \]

for almost every \[e \in S^{n-1}\]. Here \[\pi_e\] stands for orthogonal projection to span(\[e\]). The primary purpose of this paper is to present proofs for Lutz and Stull's projection theorems which do not refer to information-theoretic concepts. Instead, they will rely on combinatorial-geometric arguments, such as discretised versions of Kaufman's "potential theoretic" method, the pigeonhole principle, and a lemma of Katz and Tao. A secondary purpose is to generalise Lutz and Stull's theorems: the versions in this paper apply to orthogonal projections to m-planes in \[\mathbb{R}^n\], for all \[0 < m < n\].
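For orientation, the m-plane generalisation announced here takes the expected Marstrand form (a hedged restatement, with \[\gamma_{n,m}\] the natural invariant measure on the Grassmannian \[G(n,m)\] of m-planes):

\[ \dim_{\mathrm{H}} \pi_V(K) = \min\{\dim_{\mathrm{H}} K,\, m\} \quad \text{for } \gamma_{n,m}\text{-almost every } V \in G(n,m), \]

whenever \[K \subset \mathbb{R}^n\] has equal Hausdorff and packing dimensions.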


Entropy ◽  
2021 ◽  
Vol 23 (11) ◽  
pp. 1524
Author(s):  
Daniel G. Brown ◽  
Tiasa Mondol

We discuss how to assess computationally the aesthetic value of “small” objects, namely those that have short digital descriptions. Such small objects still matter: they include headlines, poems, song lyrics, short musical scripts and other culturally crucial items. Yet small objects are a confounding case for our recent work adapting ideas from algorithmic information theory (AIT) to the domain of computational creativity, as they can be neither logically deep nor sophisticated under the traditional definitions of AIT. We show how restricting the class of models under analysis still lets us separate high-quality small objects from ordinary ones, and we discuss the strengths and limitations of our adaptation.
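As a toy illustration only (not the authors' method), the sketch below scores a short string against a restricted model class: its code length under a uniform character model versus a Laplace-smoothed character-bigram model fitted on a reference corpus. The corpus, smoothing constant and alphabet handling are all illustrative assumptions.

    import math
    from collections import Counter

    def uniform_codelength(s, alphabet):
        # Bits to describe s when every character is coded uniformly over `alphabet`.
        return len(s) * math.log2(len(alphabet))

    def bigram_codelength(s, corpus, alpha=1.0):
        # Bits to describe s under a Laplace-smoothed character-bigram model
        # fitted on `corpus` -- a stand-in for a restricted model class.
        chars = sorted(set(corpus) | set(s))
        pairs = Counter(zip(corpus, corpus[1:]))
        context = Counter(corpus[:-1])
        bits = math.log2(len(chars))  # first character coded uniformly
        for a, b in zip(s, s[1:]):
            p = (pairs[(a, b)] + alpha) / (context[a] + alpha * len(chars))
            bits += -math.log2(p)
        return bits

    corpus = "the quick brown fox jumps over the lazy dog " * 50
    alphabet = sorted(set(corpus))
    for s in ("the fox jumps over the dog", "qzx vkj rwp mlt bgh"):
        print(f"{s!r}: uniform {uniform_codelength(s, alphabet):.1f} bits, "
              f"bigram {bigram_codelength(s, corpus):.1f} bits")

The gap between the two code lengths plays the role of a randomness deficiency relative to the restricted class: structured text is much cheaper under the bigram model, while character-level gibberish is not.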


2020 ◽  
Vol 2 (1) ◽  
pp. 32-35
Author(s):  
Eric Holloway

Leonid Levin developed the first stochastic conservation of information law, describing it as "torturing an uninformed witness cannot give information about the crime." Levin's law unifies both the deterministic and stochastic cases of conservation of information. A proof of Levin's law from Algorithmic Information Theory is given, as well as a discussion of its implications for evolutionary algorithms and fitness functions.
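For orientation, Levin's law is usually stated through algorithmic mutual information (a hedged sketch; the paper's exact conditioning and error terms may differ):

\[ I(x : y) = K(x) + K(y) - K(x, y) + O(\log) \]
\[ I(f(x) : y) \le I(x : y) + K(f) + O(1) \quad \text{(deterministic processing)} \]
\[ \Pr_r\big[ I((x, r) : y) > I(x : y) + k \big] \le 2^{-k + O(1)} \quad \text{(randomized processing)} \]

In the witness metaphor, y is the crime, x is what the witness knows, and r is the randomness introduced by interrogation: neither computation on x nor injected randomness r can significantly increase the information about y.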

