On the Geodesic Distance in Shapes K-means Clustering

Entropy ◽  
2018 ◽  
Vol 20 (9) ◽  
pp. 647 ◽  
Author(s):  
Stefano Gattone ◽  
Angela De Sanctis ◽  
Stéphane Puechmorel ◽  
Florence Nicol

In this paper, the problem of clustering rotationally invariant shapes is studied and a solution using Information Geometry tools is provided. Landmarks of a complex shape are defined as probability densities in a statistical manifold. Then, in the setting of shape clustering through a K-means algorithm, the discriminative power of two different shape distances is evaluated. The first, derived from the Fisher–Rao metric, is related to the minimization of information in the Fisher sense, while the other is derived from the Wasserstein distance, which measures the minimal transportation cost. A modification of the K-means algorithm is also proposed which allows the variances to vary not only among the landmarks but also among the clusters.
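
As a rough illustration of this setting, the following sketch (not the authors' implementation; it assumes each landmark is modelled as a univariate Gaussian with parameters mean and sigma, and uses a naive Euclidean centroid update instead of the paper's intrinsic mean) runs K-means over shapes using either the closed-form Fisher–Rao or 2-Wasserstein distance between Gaussian landmarks.

import numpy as np

def fisher_rao(m1, s1, m2, s2):
    # Fisher-Rao distance between N(m1, s1^2) and N(m2, s2^2): the (mu, sigma)
    # half-plane with the Fisher metric is hyperbolic up to a factor sqrt(2).
    arg = 1.0 + ((m1 - m2) ** 2 / 2.0 + (s1 - s2) ** 2) / (2.0 * s1 * s2)
    return np.sqrt(2.0) * np.arccosh(arg)

def wasserstein2(m1, s1, m2, s2):
    # Closed-form 2-Wasserstein distance between univariate Gaussians.
    return np.sqrt((m1 - m2) ** 2 + (s1 - s2) ** 2)

def shape_distance(a, b, metric):
    # a, b: arrays of shape (n_landmarks, 2) holding (mean, sigma) per landmark.
    return sum(metric(m1, s1, m2, s2) for (m1, s1), (m2, s2) in zip(a, b))

def kmeans_shapes(shapes, k, metric, n_iter=50, seed=0):
    rng = np.random.default_rng(seed)
    centers = [shapes[i] for i in rng.choice(len(shapes), size=k, replace=False)]
    labels = [0] * len(shapes)
    for _ in range(n_iter):
        labels = [min(range(k), key=lambda j: shape_distance(s, centers[j], metric))
                  for s in shapes]
        for j in range(k):
            members = [s for s, lab in zip(shapes, labels) if lab == j]
            if members:
                # Naive centroid: landmark-wise average of (mean, sigma),
                # not the Riemannian mean used in the paper.
                centers[j] = np.mean(members, axis=0)
    return labels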

Entropy ◽  
2020 ◽  
Vol 22 (1) ◽  
pp. 94
Author(s):  
Kaixuan Du ◽  
Pin Wan ◽  
Yonghua Wang ◽  
Xiongzhi Ai ◽  
Huang Chen

Due to the scarcity of radio spectrum resources and the growing demand, the use of spectrum sensing technology to improve the utilization of spectrum resources has become a hot research topic. To this end, this paper proposes a spectrum sensing method that combines information geometry and deep learning. Firstly, the covariance matrix of the sensing signal is projected onto a statistical manifold, so that each sensing signal can be regarded as a point on the manifold. Then, the geodesic distance between signal points is used as their statistical feature. Finally, a deep neural network is used to classify the dataset composed of these geodesic distances. Simulation experiments show that the proposed spectrum sensing method based on deep neural networks and information geometry achieves better performance in terms of sensing precision.
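
A minimal sketch of the geodesic-distance feature step (the metric is an assumption on my part: the affine-invariant metric on symmetric positive-definite matrices, a common choice for covariance matrices; the abstract does not name the exact metric, and the deep-network classifier is omitted):

import numpy as np
from scipy.linalg import fractional_matrix_power, logm

def spd_geodesic_distance(A, B):
    # Affine-invariant geodesic distance between SPD matrices:
    # d(A, B) = || log(A^{-1/2} B A^{-1/2}) ||_F
    A_inv_sqrt = fractional_matrix_power(A, -0.5)
    return np.linalg.norm(logm(A_inv_sqrt @ B @ A_inv_sqrt), ord="fro")

# Toy usage: distance between a noise-only reference covariance and the sample
# covariance of a sensed segment; this scalar would be one entry of the feature
# vector fed to the classifier.
rng = np.random.default_rng(0)
samples = rng.standard_normal((8, 1000))     # 8 antennas, 1000 samples
feature = spd_geodesic_distance(np.eye(8), np.cov(samples))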


2020 ◽  
Vol 23 (2) ◽  
pp. 251-253
Author(s):  
F. D. Oikonomou ◽  
A. De Sanctis

This paper builds mainly on the work [1]. In that paper, the authors studied the problem of clustering different shapes using Information Geometry tools, including, among others, the Fisher information and the resulting distance. Here we use the same methods, but for the geodesics of the α-connection, for three different values of the parameter α.
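
For reference, the α-connections in question are the standard Amari family; in the usual notation, with \ell_\theta = \log p(x;\theta) the log-likelihood, their Christoffel symbols read

\Gamma^{(\alpha)}_{ij,k}(\theta) = \mathbb{E}_\theta\!\left[\left(\partial_i\partial_j \ell_\theta + \frac{1-\alpha}{2}\,\partial_i \ell_\theta\,\partial_j \ell_\theta\right)\partial_k \ell_\theta\right],

so that α = 0 gives the Levi-Civita connection of the Fisher metric, while α = 1 and α = -1 give the exponential and mixture connections. The abstract does not state which three values of α are used.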


Entropy ◽  
2019 ◽  
Vol 21 (4) ◽  
pp. 332 ◽  
Author(s):  
Hao Wu ◽  
Yongqiang Cheng ◽  
Hongqiang Wang

Information geometry is the study of the intrinsic geometric properties of manifolds of probability distributions and provides a deeper understanding of statistical inference. Based on this discipline, this letter reports on the influence of signal processing on the geometric structure of the statistical manifold in terms of estimation issues. This letter defines the intrinsic parameter submanifold, which reflects the essential geometric characteristics of the estimation issues. Moreover, the intrinsic parameter submanifold is proven to become tighter after signal processing. In addition, the necessary and sufficient condition for signal processing that leaves the geometric structure invariant, i.e., isometric signal processing, is given. Specifically, considering processing of linear form, a construction method for linear isometric signal processing is proposed, and its properties are presented in this letter.
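
The monotonicity behind the "tighter" statement can be summarized as follows (a standard information-geometric fact, quoted here as background rather than as the letter's own formulation): for processed data Y = T(X), the Fisher information matrices satisfy

G_Y(\theta) \preceq G_X(\theta),

with equality for all θ exactly when T is a sufficient statistic; isometric signal processing corresponds to this equality case, in which the induced Fisher metrics, and hence the geometry of the intrinsic parameter submanifold, coincide.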


2003 ◽  
Vol 15 (1) ◽  
pp. 161-172 ◽  
Author(s):  
Nassar H. Abdel-All ◽  
H.N. Abd-Ellah ◽  
H.M. Moustafa

Author(s):  
Ryo Inokuchi ◽  
Sadaaki Miyamoto

In this paper, we discuss fuzzy clustering algorithms for discrete data. The data space is represented as a statistical manifold of the multinomial distribution, where the Euclidean distance is not adequate. The geodesic distance on the multinomial manifold can be derived analytically, but it is difficult to use it directly as a metric. We propose fuzzy c-means algorithms using other metrics instead of the Euclidean distance: the Kullback–Leibler divergence and the Hellinger distance. These two metrics can be regarded as approximations of the geodesic distance.
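
For concreteness (standard formulas; conventions on the Hellinger normalization vary), the three quantities involved are, for multinomial parameter vectors p and q,

d_{\mathrm{geo}}(p,q) = 2\arccos\Big(\sum_i \sqrt{p_i q_i}\Big), \qquad
D_{\mathrm{KL}}(p\,\|\,q) = \sum_i p_i \log\frac{p_i}{q_i}, \qquad
d_{H}(p,q)^2 = \sum_i \big(\sqrt{p_i}-\sqrt{q_i}\big)^2,

and both the KL divergence and the Hellinger distance recover the Fisher–Rao metric (up to a constant factor) to second order for nearby p and q, which is the sense in which they approximate the geodesic distance.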


2019 ◽  
Vol 12 (4) ◽  
pp. 423-446
Author(s):  
Jonathan Zinsl

We prove the existence of nonnegative weak solutions to a class of second- and fourth-order nonautonomous nonlinear evolution equations with an explicitly time-dependent mobility function, posed on the whole space ℝ^d, for arbitrary d ≥ 1. Exploiting a very formal gradient flow structure, the cornerstone of our proof is a modified version of the classical minimizing movement scheme for gradient flows. The mobility function is required to satisfy – at each time point separately – the conditions by which one can define a modified Wasserstein distance on the space of probability densities with finite second moment. The explicit dependency on the time variable is assumed to be at least of Lipschitz regularity. We also sketch possible extensions of our result to the case of bounded spatial domains and more general mobility functions.
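
Schematically (with \mathcal{E} the driving functional and W_m the modified Wasserstein distance built from the mobility, both as assumed in the abstract; the exact time points at which the mobility and the functional are evaluated are a detail of the paper's construction), one step of the minimizing movement scheme with time step τ reads

u^{k+1}_\tau \in \operatorname*{arg\,min}_{u}\ \left\{ \frac{1}{2\tau}\, W_{m(t_k,\cdot)}\big(u, u^{k}_\tau\big)^2 + \mathcal{E}(t_k, u) \right\},

the time dependence being frozen within each step.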


2021 ◽  
Vol 3 (1) ◽  
pp. 12
Author(s):  
Ariel Caticha

The mathematical formalism of quantum mechanics is derived, or "reconstructed", from more basic considerations of probability theory and information geometry. The starting point is the recognition that probabilities are central to QM; the formalism of QM is then derived as a particular kind of flow on a finite-dimensional statistical manifold, a simplex. The cotangent bundle associated with the simplex has a natural symplectic structure and inherits its own natural metric structure from the information geometry of the underlying simplex. We seek flows that preserve (in the sense of vanishing Lie derivatives) both the symplectic structure (a Hamiltonian flow) and the metric structure (a Killing flow). The result is a formalism in which the Fubini–Study metric, the linearity of the Schrödinger equation, the emergence of complex numbers, Hilbert spaces, and the Born rule are derived rather than postulated.
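
Two of the structures mentioned can be written down explicitly (standard expressions, quoted here as background and up to normalization and sign conventions): the information metric on the simplex of probabilities p = (p_1, ..., p_n) and the canonical symplectic form on its cotangent bundle with fiber coordinates φ_i,

ds^2 \propto \sum_i \frac{dp_i\, dp_i}{p_i}, \qquad \omega = \sum_i d\varphi_i \wedge dp_i .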


2019 ◽  
Vol 31 (5) ◽  
pp. 827-848 ◽  
Author(s):  
Shun-ichi Amari ◽  
Ryo Karakida ◽  
Masafumi Oizumi ◽  
Marco Cuturi

We propose a new divergence on the manifold of probability distributions, building on the entropic regularization of optimal transportation problems. As Cuturi (2013) showed, regularizing the optimal transport problem with an entropic term is known to bring several computational benefits. However, because of that regularization, the resulting approximation of the optimal transport cost does not define a proper distance or divergence between probability distributions. We recently tried to introduce a family of divergences connecting the Wasserstein distance and the Kullback–Leibler divergence from an information geometry point of view (see Amari, Karakida, & Oizumi, 2018). However, that proposal was not able to retain key intuitive aspects of the Wasserstein geometry, such as translation invariance, which plays a key role when used in the more general problem of computing optimal transport barycenters. The divergence we propose in this work is able to retain such properties and admits an intuitive interpretation.
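
As a point of reference for the entropic regularization the abstract starts from, here is the standard Sinkhorn iteration of Cuturi (2013) in a minimal form; this is not the new divergence proposed in the paper, only the regularized transport cost it builds on.

import numpy as np

def sinkhorn_cost(a, b, C, eps=0.1, n_iter=500):
    # a, b: source and target probability vectors; C: pairwise cost matrix.
    K = np.exp(-C / eps)                  # Gibbs kernel
    u = np.ones_like(a)
    for _ in range(n_iter):
        v = b / (K.T @ u)                 # alternating scaling updates
        u = a / (K @ v)
    P = u[:, None] * K * v[None, :]       # entropic-regularized transport plan
    return float(np.sum(P * C)), P

# Toy usage on two histograms supported on {0, 1, ..., 4}.
x = np.arange(5.0)
C = (x[:, None] - x[None, :]) ** 2        # squared-distance cost
a = np.array([0.5, 0.2, 0.1, 0.1, 0.1])
b = np.array([0.1, 0.1, 0.1, 0.2, 0.5])
cost, plan = sinkhorn_cost(a, b, C, eps=0.5)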


Entropy ◽  
2020 ◽  
Vol 22 (5) ◽  
pp. 498 ◽  
Author(s):  
Frédéric Barbaresco ◽  
François Gay-Balmaz

In this paper, we describe and exploit a geometric framework for Gibbs probability densities and the associated concepts in statistical mechanics, which unifies several earlier works on the subject, including Souriau's symplectic model of statistical mechanics, its polysymplectic extension, the Koszul model, and approaches developed in quantum information geometry. We emphasize the role of equivariance with respect to Lie group actions and the role of several concepts from geometric mechanics, such as momentum maps, Casimir functions, coadjoint orbits, and Lie–Poisson brackets with cocycles, as unifying structures appearing in various applications of this framework to information geometry and machine learning. For instance, we discuss the expression of the Fisher metric in the presence of equivariance and we exploit the property of the entropy of the Souriau model as a Casimir function to apply a geometric model for energy-preserving entropy production. We illustrate this framework with several examples, including multivariate Gaussian probability densities, and the Bogoliubov–Kubo–Mori metric as a quantum version of the Fisher metric for quantum information on coadjoint orbits. We exploit this geometric setting and Lie group equivariance to present symplectic and multisymplectic variational Lie group integration schemes for some of the equations associated with the Souriau symplectic and polysymplectic models, such as the Lie–Poisson equation with cocycle.
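
A central object in the Souriau model mentioned above is the generalized Gibbs density on a symplectic manifold with momentum map J and "geometric temperature" β in the Lie algebra; in the usual formulation (quoted here as background, not as the paper's notation) it reads

\rho_\beta(\xi) = \frac{e^{-\langle J(\xi),\, \beta\rangle}}{\int e^{-\langle J(\xi'),\, \beta\rangle}\, d\lambda(\xi')}, \qquad
s(\beta) = -\int \rho_\beta \log \rho_\beta \, d\lambda,

whose entropy s plays the role of the Casimir function exploited in the paper.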


Entropy ◽  
2019 ◽  
Vol 21 (5) ◽  
pp. 496
Author(s):  
Francisca Leidmar Josué Vieira ◽  
Luiza Helena Félix de Andrade ◽  
Rui Facundo Vigelis ◽  
Charles Casimiro Cavalcante

Consider a probability measure μ and let P_μ denote the set of μ-equivalent strictly positive probability densities. To endow P_μ with the structure of a C∞-Banach manifold, we use the φ-connection by an open arc, where φ is a deformed exponential function which is zero up to a certain point and strictly increasing from then on. This deformed exponential function has as particular cases the q-deformed exponential and κ-exponential functions. Moreover, we find the tangent space of P_μ at a point p and, as a consequence, the tangent bundle of P_μ. We define a divergence using the q-exponential function and prove that this divergence is related to the q-divergence already known from the literature. We also show that the q-exponential and κ-exponential functions can be used to generalize the Rényi divergence.
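
For reference, the two particular cases named above are usually written (standard definitions; conventions on the domain and cutoff vary) as

\exp_q(x) = \big[\,1 + (1-q)\,x\,\big]_+^{\frac{1}{1-q}}, \qquad
\exp_\kappa(x) = \Big(\kappa x + \sqrt{1 + \kappa^2 x^2}\,\Big)^{\frac{1}{\kappa}},

both reducing to the ordinary exponential as q → 1 and κ → 0, respectively.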

