On the existence of maximizing measures for irreducible countable Markov shifts: a dynamical proof

2013 ◽  
Vol 34 (4) ◽  
pp. 1103-1115 ◽  
Author(s):  
RODRIGO BISSACOT ◽  
RICARDO DOS SANTOS FREIRE

Abstract We prove that if $\Sigma_{\mathbf{A}}(\mathbb{N})$ is an irreducible Markov shift space over $\mathbb{N}$ and $f: \Sigma_{\mathbf{A}}(\mathbb{N}) \rightarrow \mathbb{R}$ is coercive with bounded variation, then there exists a maximizing probability measure for $f$, whose support lies on a Markov subshift over a finite alphabet. Furthermore, the support of any maximizing measure is contained in this same compact subshift. To the best of our knowledge, this is the first proof beyond the finitely primitive case in the general irreducible non-compact setting. It is also noteworthy that our technique works for the full shift over positive real sequences.
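For context, the maximizing quantity referred to here is the standard one from ergodic optimization; a minimal sketch, with the coercivity convention assumed rather than taken from the abstract:

$$\beta(f) \;=\; \sup\Big\{ \int f \, d\mu \;:\; \mu \ \text{a shift-invariant Borel probability measure on } \Sigma_{\mathbf{A}}(\mathbb{N}) \Big\},$$

and a measure attaining this supremum is called $f$-maximizing. Coercivity is typically understood as $\sup_{\{x \,:\, x_0 = a\}} f(x) \rightarrow -\infty$ as the symbol $a \rightarrow \infty$; heuristically, this is what penalizes orbits spending time on large symbols and confines maximizing measures to a finite-alphabet subshift.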

1997 ◽  
Vol 08 (03) ◽  
pp. 357-374 ◽  
Author(s):  
Kengo Matsumoto

We construct and study C*-algebras associated with subshifts in symbolic dynamics as a generalization of Cuntz–Krieger algebras for topological Markov shifts. We prove some universal properties for the C*-algebras and give a criterion for them to be simple and purely infinite. We also present an example of a C*-algebra coming from a subshift which is not conjugate to a Markov shift.
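For orientation (standard background, not part of this abstract): the Cuntz–Krieger algebra $\mathcal{O}_A$ of a topological Markov shift with $n \times n$ transition matrix $A$ over $\{0,1\}$ is generated by partial isometries $S_1, \dots, S_n$ subject to

$$\sum_{j=1}^{n} S_j S_j^{*} = 1, \qquad S_i^{*} S_i = \sum_{j=1}^{n} A(i,j)\, S_j S_j^{*} \quad (1 \le i \le n);$$

the subshift algebras studied here generalize this construction beyond the Markov case.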


2008 ◽  
Vol 22 (1/2, September) ◽  
pp. 131-164 ◽  
Author(s):  
Manuel Stadlbauer ◽  
Yuri Kifer ◽  
Manfred Denker

Author(s):  
Tom Leinster ◽  
Emily Roff

Abstract We define a one-parameter family of entropies, each assigning a real number to any probability measure on a compact metric space (or, more generally, a compact Hausdorff space with a notion of similarity between points). These generalize the Shannon and Rényi entropies of information theory. We prove that on any space X, there is a single probability measure maximizing all these entropies simultaneously. Moreover, all the entropies have the same maximum value: the maximum entropy of X. As X is scaled up, the maximum entropy grows, and its asymptotics determine geometric information about X, including the volume and dimension. And the large-scale limit of the maximizing measure itself provides an answer to the question: what is the canonical measure on a metric space? Primarily, we work not with entropy itself but its exponential, which in its finite form is already in use as a measure of biodiversity. Our main theorem was first proved in the finite case by Leinster and Meckes.
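In the finite, similarity-free special case the quantities involved reduce to classical ones; a minimal sketch under that assumption (the notation $H_q$, $D_q$ is ours, not the paper's): for a probability distribution $p = (p_1, \dots, p_n)$ and order $q \ge 0$, $q \ne 1$,

$$H_q(p) = \frac{1}{1-q}\,\log \sum_{i=1}^{n} p_i^{\,q}, \qquad D_q(p) = \exp H_q(p) = \Big( \sum_{i=1}^{n} p_i^{\,q} \Big)^{1/(1-q)},$$

with the Shannon entropy $H_1(p) = -\sum_i p_i \log p_i$ recovered in the limit $q \to 1$; the exponential $D_q$ is the Hill number used in ecology as a measure of biodiversity.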


2018 ◽  
Vol 39 (10) ◽  
pp. 2593-2618 ◽  
Author(s):  
OLIVER JENKINSON

Ergodic optimization is the study of problems relating to maximizing orbits and invariant measures, and maximum ergodic averages. An orbit of a dynamical system is called $f$-maximizing if the time average of the real-valued function $f$ along the orbit is larger than along all other orbits, and an invariant probability measure is called $f$-maximizing if it gives $f$ a larger space average than any other invariant probability measure. In this paper, we consider the main strands of ergodic optimization, beginning with an influential model problem, and the interpretation of ergodic optimization as the zero temperature limit of thermodynamic formalism. We describe typical properties of maximizing measures for various spaces of functions, the key tool of adding a coboundary so as to reveal properties of these measures, as well as certain classes of functions where the maximizing measure is known to be Sturmian.
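The coboundary tool mentioned here can be sketched concretely (standard formulation, notation assumed): for a continuous $f$ and a dynamical system $T$, write $\beta(f) = \sup_{\mu} \int f\, d\mu$ over $T$-invariant probability measures. Since $\int (\varphi \circ T - \varphi)\, d\mu = 0$ for every invariant $\mu$, the functions $f$ and

$$\tilde f = f + \varphi \circ T - \varphi$$

have the same maximizing measures. If $\varphi$ can be chosen so that $\tilde f \le \beta(f)$ everywhere (a so-called revelation, or Mañé lemma), then every maximizing measure is supported on the closed set $\{\tilde f = \beta(f)\}$, which is what reveals where such measures must live.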

