Kolmogorov Complexity in Perspective Part I: Information Theory and Randomness

Author(s):
Marie Ferbus-Zanda ◽
Serge Grigorieff


2010 ◽
Vol 21 (03) ◽
pp. 321-327 ◽
Author(s):
Yen-Wu Ti ◽
Ching-Lueh Chang ◽
Yuh-Dauh Lyuu ◽
Alexander Shen

A bit string is random (in the sense of algorithmic information theory) if it is incompressible, i.e., its Kolmogorov complexity is close to its length. Two random strings are independent if knowing one of them does not simplify the description of the other, i.e., the conditional complexity of each string (using the other as a condition) is close to its length. Independence of a k-tuple of strings may be defined in the same way. In this paper we address the following question: what is the maximal cardinality of a set of n-bit strings if any k elements of this set are independent (up to a certain constant)? Lower and upper bounds that match each other (with logarithmic precision) are provided.
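For reference, the definitions the abstract relies on can be written out (a sketch of the standard notions, using C for plain Kolmogorov complexity and |x| for string length; the paper may work with a prefix-free variant and different constants):

    C(x) \ge |x| - c                                                  % x is c-incompressible (random)
    C(x \mid y) \ge |x| - c \quad\text{and}\quad C(y \mid x) \ge |y| - c   % x and y are c-independent
    C(x_i \mid x_1,\dots,x_{i-1},x_{i+1},\dots,x_k) \ge |x_i| - c \quad (1 \le i \le k)   % a c-independent k-tuple

The question above then asks how large a set of n-bit strings can be if every k-element subset of it satisfies the last condition.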


2015 ◽  
Vol 21 (2) ◽  
pp. 205-224 ◽  
Author(s):  
Leong Ting Lui ◽  
Germán Terrazas ◽  
Hector Zenil ◽  
Cameron Alexander ◽  
Natalio Krasnogor

In the past decades, many definitions of complexity have been proposed. Most are based either on Shannon's information theory or on Kolmogorov complexity; the two are often compared, but very few studies integrate them. In this article we introduce a new measure of complexity that builds on both theories. As a demonstration of the concept, the technique is applied to elementary cellular automata and to simulations of the self-organization of porphyrin molecules.
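To make the two ingredients concrete, here is a minimal Python sketch (our illustration, not the authors' measure): Shannon entropy estimated from the block distribution of an elementary cellular automaton run, and Kolmogorov complexity upper-bounded by a general-purpose compressor. All function names and parameters are ours.

    import zlib
    from collections import Counter
    from math import log2

    def eca_run(rule, width=64, steps=64):
        """Evolve an elementary cellular automaton from a single seed cell
        and return all rows concatenated as a 0/1 string."""
        table = [(rule >> i) & 1 for i in range(8)]  # Wolfram rule table
        row = [0] * width
        row[width // 2] = 1
        bits = []
        for _ in range(steps):
            bits.extend(row)
            row = [table[(row[(i - 1) % width] << 2) | (row[i] << 1) | row[(i + 1) % width]]
                   for i in range(width)]
        return ''.join(map(str, bits))

    def shannon_entropy(s, block=8):
        """Empirical Shannon entropy, in bits per block, over non-overlapping blocks."""
        blocks = [s[i:i + block] for i in range(0, len(s) - block + 1, block)]
        n = len(blocks)
        return -sum(c / n * log2(c / n) for c in Counter(blocks).values())

    def compressed_size(s):
        """zlib-compressed length: a crude upper bound on Kolmogorov complexity."""
        return len(zlib.compress(s.encode(), 9))

    for rule in (250, 110, 30):  # ordered, complex, chaotic
        s = eca_run(rule)
        print(f"rule {rule}: entropy={shannon_entropy(s):.2f} bits/block, "
              f"compressed={compressed_size(s)} bytes")

Ordered rules compress well and have low block entropy; chaotic rules score high on both, which is why a measure combining the two signals can separate regimes that either one alone conflates.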


2018 ◽  
Author(s):  
Kehinde Owoeye ◽  
Mirco Musolesi ◽  
Stephen Hailes

Understanding the movement patterns of animals across different spatio-temporal scales, conditions, habitats and contexts is becoming increasingly important for addressing a series of questions in animal behaviour studies, such as mapping migration routes, evaluating resource use, modelling epidemic spreading in a population and developing strategies for animal conservation, as well as understanding several emerging patterns related to feeding, growth and reproduction. In recent times, information theory has been successfully applied in several fields of science, in particular for understanding the dynamics of complex systems and characterizing adaptive social systems, such as the dynamics of entities as individuals and as parts of groups.

In this paper, we describe a series of non-parametric information-theoretic measures that can be used to derive new insights about animal behaviour, with a specific focus on movement patterns: Shannon entropy, mutual information, Kullback-Leibler divergence and Kolmogorov complexity. In particular, we believe that the metrics presented in this paper can be used to formulate new hypotheses that can potentially be verified through a set of different observations. We show how these measures can be used to characterize the movement patterns of several animals across different habitats and scales. Specifically, we show the effectiveness of Shannon entropy in characterizing the movement of sheep with Batten disease, mutual information in measuring association in pigeons, Kullback-Leibler divergence in studying the flights of turkey vultures, and Kolmogorov complexity in finding similarities in the movement patterns of animals across different scales and habitats. Finally, we discuss the limitations of these methods and outline the challenges in this research area.
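The four measures are standard enough to sketch on discretized movement data. The Python snippet below is our illustration, not the paper's code: it uses textbook plug-in estimators and assumes trajectories have already been quantized into symbol sequences (e.g., compass sectors of heading).

    import zlib
    from collections import Counter
    from math import log2

    def entropy(seq):
        """Shannon entropy (bits/symbol) of a discretized movement sequence."""
        n = len(seq)
        return -sum(c / n * log2(c / n) for c in Counter(seq).values())

    def mutual_information(xs, ys):
        """I(X;Y) = H(X) + H(Y) - H(X,Y) between two aligned symbol sequences."""
        return entropy(xs) + entropy(ys) - entropy(list(zip(xs, ys)))

    def kl_divergence(ps, qs, eps=1e-9):
        """Empirical D(P||Q) in bits, smoothed so unseen symbols do not blow up."""
        cp, cq = Counter(ps), Counter(qs)
        return sum((cp[s] / len(ps)) *
                   log2((cp[s] / len(ps) + eps) / (cq[s] / len(qs) + eps))
                   for s in cp)

    def complexity(seq):
        """Compression-based stand-in for Kolmogorov complexity (bytes)."""
        return len(zlib.compress(''.join(map(str, seq)).encode(), 9))

    # Toy usage: headings quantized into 4 compass sectors (0..3).
    a = [0, 0, 1, 2, 2, 3, 0, 1] * 50
    b = a[1:] + a[:1]  # a shifted copy: same marginals, strong association
    print(f"H(a)={entropy(a):.2f}  I(a;b)={mutual_information(a, b):.2f}  "
          f"D(a||b)={kl_divergence(a, b):.2f}  K~(a)={complexity(a)}")

Because b is a shifted copy of a, the KL divergence between their marginals is near zero while the mutual information is high, which is exactly the distinction the paper exploits (association between pigeons vs. distributional change in vulture flights).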


Author(s):  
Francisco Torrens ◽  
Gloria Castellano

Numerous definitions of complexity have been proposed, with little consensus. The definition used here is related to Kolmogorov complexity and Shannon entropy measures; the price, however, is that context dependence enters the definition of complexity. Such context dependence is an inherent property of complexity, yet scientists are uncomfortable with it because it smacks of subjectivity, which is why so little agreement is found on the meaning of the term. In an article published in Molecules, Lin presented a novel approach for assessing molecular diversity based on Shannon information theory: a set of compounds is viewed as a static collection of microstates that can register information about their environment. The method shows a strong tendency to oversample remote areas of the feature space and to produce unbalanced designs. This chapter demonstrates this limitation with some simple examples and provides a rationale for the method's failure to produce consistent results.
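The oversampling tendency is easy to reproduce with a toy stand-in. The sketch below is not Lin's procedure: it uses a generic greedy max-min diversity pick over a synthetic two-dimensional feature space, and all names, coordinates and parameters are illustrative.

    import random

    def dist(p, q):
        return ((p[0] - q[0]) ** 2 + (p[1] - q[1]) ** 2) ** 0.5

    def max_min_select(points, k):
        """Greedy diversity pick: start from the most remote point overall,
        then repeatedly add the point farthest from everything chosen so far."""
        chosen = [max(points, key=lambda p: sum(dist(p, q) for q in points))]
        while len(chosen) < k:
            chosen.append(max((p for p in points if p not in chosen),
                              key=lambda p: min(dist(p, c) for c in chosen)))
        return chosen

    random.seed(0)
    # 95 compounds in one dense region of feature space, 5 scattered outliers
    library = [(random.gauss(0, 1), random.gauss(0, 1)) for _ in range(95)]
    outliers = [(10, 0), (-10, 3), (0, -11), (7, 9), (-8, -9)]
    picked = max_min_select(library + outliers, 8)
    print(sum(p in outliers for p in picked), "of 8 picks are outliers (5% of the library)")

A purely diversity-driven objective rewards distance from everything already selected, so the 5% of compounds sitting in remote regions dominate the design while the dense, representative region is barely sampled: an unbalanced design of the kind the chapter criticizes.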


Entropy ◽  
2021 ◽  
Vol 23 (12) ◽  
pp. 1654 ◽
Author(s):  
Tiasa Mondol ◽  
Daniel G. Brown

We build an analysis, based on Algorithmic Information Theory, of computational creativity and extend it to revisit computational aesthetics, thereby improving on existing efforts at its formulation. We discuss Kolmogorov complexity, models and randomness deficiency (a measure of how much a model falls short of capturing the regularities in an artifact) and show that the notions of typicality and novelty of a creative artifact follow naturally from these definitions. Other formalizations of aesthetic measures include logical depth and sophistication, with which we can define, respectively, the value and the creator's artistry present in a creative work. We then look at related research that combines information theory and creativity, and analyze it with the algorithmic tools developed throughout the paper. Finally, we assemble these ideas and their algorithmic counterparts into an algorithmic information-theoretic recipe for computational creativity and aesthetics.
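For concreteness, the central quantities named above can be written out (a sketch of the standard finite-model definitions; the paper's exact normalizations may differ). For a finite set M containing the artifact x:

    \delta(x \mid M) = \log|M| - C(x \mid M)        % randomness deficiency: how far M falls short of explaining x
    C(x) \le C(M) + \log|M| + O(1)                  % two-part code; x is typical of M when \delta(x \mid M) is small
    \mathrm{soph}_c(x) = \min\{\, C(M) : x \in M,\ \delta(x \mid M) \le c \,\}   % sophistication: complexity of the simplest adequate model

Typicality of an artifact for a model is then small deficiency, matching the abstract's informal description, and sophistication isolates the meaningful (model) part of a description from its incidental randomness.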


Author(s):  
Abbas El Gamal ◽  
Young-Han Kim

Author(s):  
Mark Kelbert ◽  
Yuri Suhov
