Souriau-Casimir Lie Groups Thermodynamics and Machine Learning

Author(s):  
Frédéric Barbaresco
Entropy ◽ 2018 ◽ Vol 20 (11) ◽ pp. 840

We introduce a poly-symplectic extension of Souriau Lie groups thermodynamics, based on the higher-order model of statistical physics introduced by Ingarden. This extended model could be used for small-data analytics and machine learning on Lie groups. Souriau's geometric theory of heat is well adapted to describe the probability density (maximum-entropy Gibbs density) of data living on groups or on homogeneous manifolds. For small-data analytics (rarefied gases, sparse statistical surveys, …), the maximum-entropy density should take higher-order moment constraints into account (the Gibbs density is not defined by the first moment alone; fluctuations require second- and higher-order moments), as introduced by Ingarden. We use a poly-symplectic model introduced by Christian Günther, replacing the symplectic form by a vector-valued form. The poly-symplectic approach generalizes the Noether theorem, the existence of moment mappings, the Lie algebra structure of the space of currents, (non-)equivariant cohomology, and the classification of G-homogeneous systems. The formalism is covariant, i.e., no special coordinates or coordinate systems on the parameter space are used to construct the Hamiltonian equations. We underline the contextures of these models and the process of building these generic structures. We also introduce a more synthetic Koszul definition of the Fisher metric, based on the Souriau model, which we name the Souriau-Fisher metric. This Lie groups thermodynamics is the bedrock for Lie group machine learning, providing a fully covariant maximum-entropy Gibbs density based on representation theory (the symplectic structure of coadjoint orbits for the Souriau non-equivariant model associated with a class of cohomology).
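The role of higher-order moment constraints in a maximum-entropy Gibbs density can be illustrated with a minimal numerical sketch on the real line (a hypothetical toy example, not the paper's Lie-group construction): given a first- and second-moment constraint, we solve for the natural parameters of the exponential-family density p(x) ∝ exp(t1·x + t2·x²), which is the maximum-entropy density under those two constraints.

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import root

def gibbs_moments(t1, t2):
    # First and second moments of the Gibbs density
    # p(x) = exp(t1*x + t2*x^2) / Z, with t2 < 0 for normalizability.
    f = lambda x: np.exp(t1 * x + t2 * x**2)
    Z, _ = quad(f, -np.inf, np.inf)
    m1, _ = quad(lambda x: x * f(x), -np.inf, np.inf)
    m2, _ = quad(lambda x: x**2 * f(x), -np.inf, np.inf)
    return m1 / Z, m2 / Z

def fit_maxent(mean, second_moment):
    # Solve for natural parameters (t1, t2) matching the two moment
    # constraints; parametrizing t2 = -exp(s) keeps the density integrable.
    def residual(params):
        t1, s = params
        m1, m2 = gibbs_moments(t1, -np.exp(s))
        return [m1 - mean, m2 - second_moment]
    t1, s = root(residual, x0=[0.0, 0.0]).x
    return t1, -np.exp(s)

# Mean 1 and second moment 2 (variance 1): the maximum-entropy density is
# the Gaussian N(1, 1), i.e. t1 = mu/sigma^2 = 1, t2 = -1/(2*sigma^2) = -0.5.
t1, t2 = fit_maxent(1.0, 2.0)
```

With only the first-moment constraint the density degenerates (no normalizable maximizer on the real line), which mirrors the abstract's point that fluctuations force second- and higher-order moments into the model.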


2020 ◽  
Vol 43 ◽  
Author(s):  
Myrthe Faber

Abstract: Gilead et al. state that abstraction supports mental travel, and that mental travel critically relies on abstraction. I propose an important addition to this theoretical framework, namely that mental travel might also support abstraction. Specifically, I argue that spontaneous mental travel (mind wandering), much like data augmentation in machine learning, provides the variability in mental content and context necessary for abstraction.
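The data-augmentation analogy can be made concrete with a minimal sketch (a hypothetical toy example, not from the commentary): augmentation exposes a learner to many label-preserving variants of the same underlying content, just as mind wandering is argued to vary mental content and context around a stable core.

```python
import numpy as np

def augment(sample: np.ndarray) -> list:
    # Label-preserving variants of one input, as in standard
    # image-augmentation pipelines: the content is "the same",
    # but its presentation varies.
    return [
        sample,
        np.fliplr(sample),   # horizontal flip
        np.flipud(sample),   # vertical flip
        np.rot90(sample),    # 90-degree rotation
    ]

digit = np.arange(9).reshape(3, 3)  # toy 3x3 "image"
variants = augment(digit)           # four views of the same content
```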


2020 ◽  
Author(s):  
Man-Wai Mak ◽  
Jen-Tzung Chien

2020 ◽  
Author(s):  
Mohammed J. Zaki ◽  
Wagner Meira, Jr

2020 ◽  
Author(s):  
Marc Peter Deisenroth ◽  
A. Aldo Faisal ◽  
Cheng Soon Ong

Author(s):  
José A. de Azcárraga ◽  
José M. Izquierdo

Author(s):  
Lorenza Saitta ◽  
Attilio Giordana ◽  
Antoine Cornuéjols

Author(s):  
Shai Shalev-Shwartz ◽  
Shai Ben-David
