On the formalisation of Kolmogorov complexity

Author(s): Elliot Catt, Michael Norrish

2020, pp. 1-28
Author(s): Nikita Moriakov

Abstract: A theorem of Brudno says that the Kolmogorov–Sinai entropy of an ergodic subshift over $\mathbb{N}$ equals the asymptotic Kolmogorov complexity of almost every word in the subshift. The purpose of this paper is to extend this result to subshifts over computable groups that admit computable regular symmetric Følner monotilings, which we introduce in this work. For every $d \in \mathbb{N}$, the groups $\mathbb{Z}^d$ and $\mathsf{UT}_{d+1}(\mathbb{Z})$ admit computable regular symmetric Følner monotilings for which the required computing algorithms are provided.
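For context, the classical statement being extended can be sketched as follows, in standard notation that is assumed here rather than quoted from the paper: $X$ is the subshift, $\sigma$ the shift map, $\mu$ an ergodic shift-invariant measure, $h_{\mu}(\sigma)$ the Kolmogorov–Sinai entropy, and $K$ plain Kolmogorov complexity.

% Brudno's classical theorem, sketched in standard (assumed) notation.
\[
  \lim_{n \to \infty} \frac{K(x_0 x_1 \cdots x_{n-1})}{n} = h_{\mu}(\sigma)
  \qquad \text{for } \mu\text{-almost every } x \in X.
\]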


2007, Vol. 72 (3), pp. 1003-1018
Author(s): John Chisholm, Jennifer Chubb, Valentina S. Harizanov, Denis R. Hirschfeldt, Carl G. Jockusch, ...

Abstract: We study the weak truth-table and truth-table degrees of the images of subsets of computable structures under isomorphisms between computable structures. In particular, we show that there is a low c.e. set that is not weak truth-table reducible to any initial segment of any scattered computable linear ordering. Countable subsets of $2^{\omega}$ and Kolmogorov complexity play a major role in the proof.
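For readers outside computability theory, the reducibilities named above can be sketched as follows; these are the standard definitions, assumed here rather than taken from the paper. $A \le_{\mathrm{wtt}} B$ means $A = \Phi^{B}$ for some Turing functional $\Phi$ whose use on input $n$ is bounded by a computable function of $n$; $A \le_{\mathrm{tt}} B$ additionally requires $\Phi^{Y}$ to be total for every oracle $Y$. Hence

\[
  A \le_{\mathrm{tt}} B \;\Longrightarrow\; A \le_{\mathrm{wtt}} B \;\Longrightarrow\; A \le_{\mathrm{T}} B,
\]

so weak truth-table reducibility sits between truth-table and Turing reducibility.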


Author(s): Alessandro Achille, Giovanni Paolini, Glen Mbeng, Stefano Soatto

Abstract: We introduce an asymmetric distance in the space of learning tasks, and a framework to compute their complexity. These concepts are foundational for the practice of transfer learning, whereby a parametric model is pre-trained for one task and then fine-tuned for another. The framework we develop is non-asymptotic, captures the finite nature of the training dataset, and allows distinguishing learning from memorization. It encompasses, as special cases, classical notions from Kolmogorov complexity and from Shannon and Fisher information. However, unlike some of those frameworks, it can be applied to large-scale models and real-world datasets. Our framework is the first to measure complexity in a way that accounts for the effect of the optimization scheme, which is critical in deep learning.
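The framework itself is mathematical, but the compression view of complexity that it generalizes is easy to illustrate. The Python sketch below is a hypothetical illustration, not the authors' method: it uses compressed length, the standard computable upper-bound proxy for Kolmogorov complexity $K(x)$, to compare the complexity rates of two byte strings.

import zlib

def compressed_length(data: bytes, level: int = 9) -> int:
    # zlib-compressed size in bytes: a computable upper-bound proxy
    # for the (uncomputable) Kolmogorov complexity K(data).
    return len(zlib.compress(data, level))

def complexity_rate(data: bytes) -> float:
    # Compressed bits per input byte: low for highly regular data,
    # close to 8 for incompressible (near-random) data.
    return 8 * compressed_length(data) / max(len(data), 1)

if __name__ == "__main__":
    import os
    random_bytes = os.urandom(1 << 16)  # incompressible input
    periodic = b"ab" * (1 << 15)        # highly regular input
    print(f"random  : {complexity_rate(random_bytes):.3f} bits/byte")
    print(f"periodic: {complexity_rate(periodic):.3f} bits/byte")

Swapping zlib for a stronger compressor only tightens the upper bound; what no compressor captures is the dependence on the optimization scheme, which is exactly the gap the paper's framework is meant to address.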

