Gamma Mixtures
Recently Published Documents


TOTAL DOCUMENTS: 10 (past five years: 2)
H-INDEX: 3 (past five years: 1)

Sensors, 2021, Vol 22 (1), pp. 186
Author(s): Sami Bourouis, Yogesh Pawar, Nizar Bouguila

Finite Gamma mixture models have proved to be flexible and can take prior information into account to improve generalization capability, which makes them attractive for several machine learning and data mining applications. In this study, an efficient Gamma mixture model-based approach for proportional vector clustering is proposed. In particular, an entropy-based variational algorithm is developed to learn the model and optimize its complexity simultaneously. Moreover, a component-splitting principle is investigated here to handle the problem of model selection and to prevent over-fitting, which is an added advantage since it is carried out within the variational framework. The performance and merits of the proposed framework are evaluated on multiple challenging real-world applications, including dynamic texture clustering, object categorization, and human gesture recognition.
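The paper's entropy-based variational algorithm is not reproduced here, but the underlying idea of clustering with a finite Gamma mixture can be sketched with a plain EM loop. Everything below is an illustrative assumption, not the authors' method: the two-component synthetic data, the initial parameters, and the moment-matched M-step are all hypothetical simplifications.

```python
import numpy as np
from scipy.stats import gamma

rng = np.random.default_rng(0)
# Synthetic 1-D data drawn from two Gamma components (illustrative parameters).
x = np.concatenate([rng.gamma(2.0, 1.0, 300), rng.gamma(9.0, 0.5, 300)])

# Initial guesses for the weights, shapes, and scales of a 2-component mixture.
w = np.array([0.5, 0.5])
shape = np.array([1.0, 5.0])
scale = np.array([1.0, 1.0])

for _ in range(200):
    # E-step: responsibility of each component for each point.
    dens = np.stack([w[j] * gamma.pdf(x, shape[j], scale=scale[j])
                     for j in range(2)])
    r = dens / dens.sum(axis=0, keepdims=True)
    # M-step: moment-matched Gamma updates (mean = k*theta, var = k*theta^2).
    nk = r.sum(axis=1)
    w = nk / len(x)
    m = (r * x).sum(axis=1) / nk                        # weighted means
    v = (r * (x - m[:, None]) ** 2).sum(axis=1) / nk    # weighted variances
    scale = v / m
    shape = m ** 2 / v

print(w, shape, scale)
```

Unlike this fixed-size sketch, the variational scheme described in the abstract additionally infers the number of components (via component splitting) during the same optimization.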


Author(s): Frank Nielsen, Ke Sun

Information-theoretic measures such as the entropy, the cross-entropy, and the Kullback-Leibler divergence between two mixture models are core primitives in many signal processing tasks. Since the Kullback-Leibler divergence between mixtures provably does not admit a closed-form formula, in practice it is either estimated using costly Monte Carlo stochastic integration, approximated, or bounded using various techniques. We present a fast and generic method that algorithmically builds closed-form lower and upper bounds on the entropy, the cross-entropy, and the Kullback-Leibler divergence of mixtures. We illustrate the versatility of the method by reporting on our experiments approximating the Kullback-Leibler divergence between univariate exponential mixtures, Gaussian mixtures, Rayleigh mixtures, and Gamma mixtures.
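The "costly Monte Carlo stochastic integration" the abstract contrasts against can be sketched as follows for univariate Gamma mixtures. This is the plain sampling estimator, not the paper's bounding method, and the helper names (`mix_logpdf`, `mc_kl`) and mixture parameters are hypothetical.

```python
import numpy as np
from scipy.stats import gamma

def mix_logpdf(x, weights, shapes, scales):
    # Log-density of a Gamma mixture, computed with log-sum-exp for stability.
    comp = (np.log(weights)[:, None]
            + gamma.logpdf(x[None, :], np.array(shapes)[:, None],
                           scale=np.array(scales)[:, None]))
    hi = comp.max(axis=0)
    return hi + np.log(np.exp(comp - hi).sum(axis=0))

def mc_kl(wp, kp, tp, wq, kq, tq, n=100_000, seed=0):
    # KL(p || q) estimated by averaging log p(x) - log q(x) over samples x ~ p.
    rng = np.random.default_rng(seed)
    idx = rng.choice(len(wp), size=n, p=wp)   # pick a component per sample
    x = rng.gamma(np.array(kp)[idx], np.array(tp)[idx])
    return float(np.mean(mix_logpdf(x, wp, kp, tp) - mix_logpdf(x, wq, kq, tq)))

# Two illustrative univariate Gamma mixtures (weights, shapes, scales).
p = ([0.4, 0.6], [2.0, 7.5], [1.0, 0.5])
q = ([0.5, 0.5], [3.0, 6.0], [0.8, 0.7])
kl_pp = mc_kl(*p, *p)   # identical mixtures: estimate is exactly zero
kl_pq = mc_kl(*p, *q)   # distinct mixtures: positive, up to MC noise
print(kl_pq)
```

The closed-form bounds proposed in the paper avoid the sampling cost entirely, whereas this estimator's accuracy improves only at rate O(1/sqrt(n)).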

