A Tight Upper Bound on Kolmogorov Complexity and Uniformly Optimal Prediction

1998 ◽  
Vol 31 (3) ◽  
pp. 215-229 ◽  
Author(s):  
L. Staiger

Entropy ◽
2021 ◽  
Vol 23 (12) ◽  
pp. 1604
Author(s):  
Amirmohammad Farzaneh ◽  
Justin P. Coon ◽  
Mihai-Alin Badiu

Throughout the years, measuring the complexity of networks and graphs has been of great interest to scientists. Kolmogorov complexity is one of the most important tools for measuring the complexity of an object. We formalized a method to calculate an upper bound on the Kolmogorov complexity of graphs and networks. First, the simplest possible graphs, those with O(1) Kolmogorov complexity, were identified. These graphs were then used to develop a method for estimating the complexity of a given graph. The proposed method exploits the simple structures within a graph to capture its non-randomness, and it is able to capture features that place a network closer to the non-random end of the spectrum. The resulting algorithm takes a graph as input and outputs an upper bound on its Kolmogorov complexity. This could be applied, for example, to evaluating the performance of graph compression methods.
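As a point of reference (a generic illustration, not the authors' algorithm), any effective, losslessly decodable encoding of a graph gives an upper bound on its Kolmogorov complexity up to an additive constant. The Python sketch below bounds K(G) by the bit length of a zlib-compressed adjacency matrix; the function names and the choice of compressor are assumptions made for the example.

import zlib

# Illustrative upper bound on K(G): K(G) <= |any effective description of G| + O(1),
# so the length in bits of a losslessly compressed adjacency matrix is a valid bound.

def adjacency_bits(n, edges):
    """Pack the upper triangle of the adjacency matrix of an n-vertex graph into bytes."""
    adj = {frozenset(e) for e in edges}
    bits = [1 if frozenset((i, j)) in adj else 0
            for i in range(n) for j in range(i + 1, n)]
    bits += [0] * (-len(bits) % 8)  # pad to a whole number of bytes
    return bytes(int("".join(map(str, bits[k:k + 8])), 2)
                 for k in range(0, len(bits), 8))

def kolmogorov_upper_bound(n, edges):
    """Upper bound, in bits and up to an additive constant, on K of the graph."""
    return 8 * len(zlib.compress(adjacency_bits(n, edges), 9))

# A highly regular graph (a cycle on 100 vertices) compresses far below the
# ~n^2/2 bits of its raw upper-triangle encoding, reflecting its simple structure.
cycle = [(i, (i + 1) % 100) for i in range(100)]
print(kolmogorov_upper_bound(100, cycle))

The paper's method is more refined, exploiting identified O(1)-complexity substructures rather than a general-purpose compressor, but the contract is the same: a graph goes in, an upper bound in bits comes out.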


2014 ◽  
Vol 79 (2) ◽  
pp. 620-632 ◽  
Author(s):  
B. BAUWENS ◽  
A. SHEN

Péter Gács showed (Gács 1974) that for every n there exists a bit string x of length n whose plain complexity C(x) has almost maximal conditional complexity relative to x, i.e., $C(C(x)\mid x) \ge \log n - \log^{(2)} n - O(1)$, where $\log^{(2)} i = \log\log i$. Following Elena Kalinina (Kalinina 2011), we provide a simple game-based proof of this result; modifying her argument, we get a better (and tight) bound $\log n - O(1)$. We also show the same bound for prefix-free complexity. Robert Solovay showed (Solovay 1975) that infinitely many strings x have maximal plain complexity but not maximal prefix complexity (among the strings of the same length): for some c there exist infinitely many x such that $|x| - C(x) \le c$ and $|x| + K(|x|) - K(x) \ge \log^{(2)} |x| - c\log^{(3)} |x|$. In fact, the results of Solovay and Gács are closely related. Using the result above, we provide a short proof of Solovay’s result. We also generalize it by showing that for some c and for all n there are strings x of length n with $n - C(x) \le c$ and $n + K(n) - K(x) \ge K(K(n)\mid n) - 3K(K(K(n)\mid n)\mid n) - c$. We also prove a close upper bound of $K(K(n)\mid n) + O(1)$. Finally, we provide a direct game proof for Joseph Miller’s generalization (Miller 2006) of the same theorem of Solovay: if a co-enumerable set (a set with c.e. complement) contains a string of every length, then it contains infinitely many strings x such that $|x| + K(|x|) - K(x) \ge \log^{(2)} |x| - O(\log^{(3)} |x|)$.
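For orientation, a standard observation (recalled here for context, not a result of the paper) shows why the bound $\log n - O(1)$ is tight: $C(C(x)\mid x) \le C(C(x)) + O(1) \le \log C(x) + O(1) \le \log n + O(1)$, since $C(x) \le n + O(1)$ and any natural number m can be described by its binary expansion of $\lfloor \log m \rfloor + 1$ bits.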


2021 ◽  
Vol 13 (1) ◽  
pp. 1-20
Author(s):  
Andris Ambainis ◽  
Martins Kokainis ◽  
Krišjānis Prūsis ◽  
Jevgēnijs Vihrovs ◽  
Aleksejs Zajakins

We show that all known classical adversary lower bounds on randomized query complexity are equivalent for total functions and are equal to the fractional block sensitivity fbs(f). These include the Kolmogorov complexity bound of Laplante and Magniez and the earlier relational adversary bound of Aaronson. This equivalence also implies that, for total functions, the relational adversary is equivalent to a simpler lower bound, which we call the rank-1 relational adversary. For partial functions, we show unbounded separations between fbs(f) and other adversary bounds, as well as between the adversary bounds themselves. We also show that, for partial functions, fractional block sensitivity cannot give lower bounds larger than √n ⋅ bs(f), where n is the number of variables and bs(f) is the block sensitivity. Then, we exhibit a partial function f that matches this upper bound, fbs(f) = Ω(√n ⋅ bs(f)).
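For readers unfamiliar with the quantities compared above, the Python sketch below (illustrative only, not taken from the paper; the helper names are invented for the example) computes the block sensitivity bs(f) of a small total Boolean function by brute force. Fractional block sensitivity fbs(f) is the linear-programming relaxation in which sensitive blocks receive weights in [0, 1] subject to a total weight of at most 1 on each input position.

from itertools import combinations, product

def flip(x, block):
    # Flip the bits of x at the positions in block.
    return tuple(1 - b if i in block else b for i, b in enumerate(x))

def block_sensitivity(f, n):
    # bs(f) = max over inputs x of the largest number of pairwise disjoint
    # blocks of positions, each of whose flips changes f(x).
    best = 0
    for x in product((0, 1), repeat=n):
        sensitive = [frozenset(c)
                     for r in range(1, n + 1)
                     for c in combinations(range(n), r)
                     if f(flip(x, c)) != f(x)]

        def pack(blocks, used):
            # Exact search for a maximum family of pairwise disjoint blocks.
            m = 0
            for k, b in enumerate(blocks):
                if not (b & used):
                    m = max(m, 1 + pack(blocks[k + 1:], used | b))
            return m

        best = max(best, pack(sensitive, frozenset()))
    return best

# For OR on n bits, bs = n, witnessed by the all-zero input and singleton blocks.
print(block_sensitivity(lambda x: int(any(x)), 4))  # -> 4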

