Thermodynamic Formalism in Neuronal Dynamics and Spike Train Statistics


Entropy ◽  
2020 ◽  
Vol 22 (11) ◽  
pp. 1330
Author(s):  
Rodrigo Cofré ◽  
Cesar Maldonado ◽  
Bruno Cessac

The Thermodynamic Formalism provides a rigorous mathematical framework for studying quantitative and qualitative aspects of dynamical systems. At its core, there is a variational principle that corresponds, in its simplest form, to the Maximum Entropy principle. It is used as a statistical inference procedure to represent, by specific probability measures (Gibbs measures), the collective behaviour of complex systems. This framework has found applications in different domains of science. In particular, it has been fruitful and influential in neurosciences. In this article, we review how the Thermodynamic Formalism can be exploited in the field of theoretical neuroscience, as a conceptual and operational tool, in order to link the dynamics of interacting neurons and the statistics of action potentials from either experimental data or mathematical models. We comment on perspectives and open problems in theoretical neuroscience that could be addressed within this formalism.
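To make the inference step concrete, here is a minimal, self-contained sketch (not the authors' code) of how the Maximum Entropy principle is typically operationalized on spike trains: a pairwise, Ising-like Gibbs model is fitted to binary spike words by matching empirical firing rates and pairwise correlations with gradient ascent. The network size, surrogate data, and learning rate below are illustrative assumptions.

# A minimal sketch (not the authors' code): fit a pairwise "Ising-like"
# Maximum Entropy / Gibbs model  p(s) proportional to exp(h.s + s'Js)
# to binary spike words by matching firing rates and pairwise correlations.
# Network size, surrogate data, and learning rate are illustrative assumptions.
import itertools
import numpy as np

rng = np.random.default_rng(0)

def all_words(n):
    # Enumerate all binary spike words of n neurons (exact method, small n only).
    return np.array(list(itertools.product([0, 1], repeat=n)), dtype=float)

def model_stats(h, J, words):
    # Exact firing rates and pairwise moments under the Gibbs distribution.
    energy = words @ h + np.einsum('ki,ij,kj->k', words, np.triu(J, 1), words)
    p = np.exp(energy - energy.max())
    p /= p.sum()
    return p @ words, (words * p[:, None]).T @ words

# Surrogate "recording": independent neurons with heterogeneous rates.
n = 5
rates = rng.uniform(0.1, 0.4, size=n)
data = (rng.random((5000, n)) < rates).astype(float)
emp_mean = data.mean(axis=0)
emp_corr = data.T @ data / len(data)

# Gradient ascent on the log-likelihood: for exponential families the gradient
# is simply the gap between empirical and model statistics.
words = all_words(n)
h, J = np.zeros(n), np.zeros((n, n))
for _ in range(4000):
    m, c = model_stats(h, J, words)
    h += 0.2 * (emp_mean - m)
    J += 0.2 * np.triu(emp_corr - c, 1)

m, c = model_stats(h, J, words)
print("largest rate mismatch:", np.abs(m - emp_mean).max())

For realistically sized populations the exact enumeration above is replaced by Monte Carlo estimates of the model statistics, but the fitting logic is the same.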


2010 ◽  
Vol 22 (10) ◽  
pp. 1147-1179 ◽  
Author(s):  
LUIS BARREIRA

This is a survey on recent developments concerning a thermodynamic formalism for almost additive sequences of functions. While the nonadditive thermodynamic formalism applies to much more general sequences, at the present stage of the theory there are no general results concerning, for example, a variational principle for the topological pressure or the existence of equilibrium or Gibbs measures (at least without further restrictive assumptions). On the other hand, in the case of almost additive sequences, it is possible to establish a variational principle and to discuss the existence and uniqueness of equilibrium and Gibbs measures, among several other results. After presenting in a self-contained manner the foundations of the theory, the survey includes the description of three applications of the almost additive thermodynamic formalism: a multifractal analysis of Lyapunov exponents for a class of nonconformal repellers; a conditional variational principle for limits of almost additive sequences; and the study of dimension spectra that consider simultaneously limits into the future and into the past.
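For orientation, the variational principle discussed in the survey is usually stated as follows (notation ours, which may differ slightly from the survey's): for an almost additive sequence $\Phi=(\varphi_n)$ over a map $f$,

\[
P(\Phi) \;=\; \sup_{\mu\in\mathcal{M}(f)} \Big( h_\mu(f) \;+\; \lim_{n\to\infty}\tfrac{1}{n}\int \varphi_n \, d\mu \Big),
\]

where almost additivity means there is a constant $C>0$ such that

\[
\varphi_n + \varphi_m\circ f^{\,n} - C \;\le\; \varphi_{n+m} \;\le\; \varphi_n + \varphi_m\circ f^{\,n} + C \qquad\text{for all } n,m .
\]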


1990 ◽  
Vol 27 (2) ◽  
pp. 303-313 ◽  
Author(s):  
Claudine Robert

The maximum entropy principle is used to model uncertainty by a maximum entropy distribution, subject to some appropriate linear constraints. We give an entropy concentration theorem (whose proof is based on large-deviation techniques), which provides a mathematical justification of this statistical modelling principle. We then indicate how it can be used in artificial intelligence, and how relevant prior knowledge is provided by some classical descriptive statistical methods. Furthermore, the maximum entropy principle yields a natural link between descriptive methods and certain statistical structures.
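For reference, a sketch of the underlying optimization in its standard form (not the paper's exact notation): maximizing entropy over a finite support subject to linear moment constraints yields an exponential-family (Gibbs) distribution,

\[
\max_{p}\; -\sum_i p_i\log p_i
\quad\text{s.t.}\quad \sum_i p_i = 1,\;\; \sum_i p_i f_k(x_i) = c_k,\; k=1,\dots,m,
\qquad\Rightarrow\qquad
p_i^{\ast} \;=\; \frac{1}{Z(\lambda)}\exp\!\Big(\sum_{k=1}^{m}\lambda_k f_k(x_i)\Big),
\]

where the multipliers $\lambda_k$ are fixed by the constraints and $Z(\lambda)$ is the normalizer. Roughly, the concentration theorem says that, among empirical distributions satisfying the same constraints, the overwhelming majority lie close to $p^{\ast}$.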


Author(s):  
KAI YAO ◽  
JINWU GAO ◽  
WEI DAI

Entropy is a measure of the uncertainty associated with a variable whose value cannot be exactly predicted. In uncertainty theory, it has so far been quantified by logarithmic entropy. However, logarithmic entropy sometimes fails to measure the uncertainty. This paper proposes another type of entropy, named sine entropy, as a supplement, and explores its properties. After that, the maximum entropy principle is introduced, and arc-cosine distributed variables are proved to have the maximum sine entropy among variables with a given expected value and variance.
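For context, a sketch under the standard definitions of uncertainty theory (the sine kernel below is our reading of the construction, not a quotation from the paper): the logarithmic entropy of an uncertain variable $\xi$ with uncertainty distribution $\Phi$ is

\[
H[\xi] \;=\; \int_{-\infty}^{+\infty} S\big(\Phi(x)\big)\,dx,
\qquad S(t) = -t\ln t - (1-t)\ln(1-t),
\]

and sine entropy replaces the kernel by a sine-shaped one, $S(t)=\sin(\pi t)$, which likewise vanishes at $t=0,1$ and peaks at $t=1/2$, so uncertainty is again maximal where $\Phi(x)=1/2$.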

