A Nonparametric Bayesian Framework for Multivariate Beta Mixture Models

Author(s): Mahsa Amirkhani ◽ Narges Manouchehri ◽ Nizar Bouguila
2013 ◽ Vol 120 (4) ◽ pp. 817-851
Author(s): Joseph L. Austerweil ◽ Thomas L. Griffiths

2021 ◽ Vol 11 (13) ◽ pp. 5798
Author(s): Sami Bourouis ◽ Roobaea Alroobaea ◽ Saeed Rubaiee ◽ Murad Andejany ◽ Nizar Bouguila

This paper addresses the modeling, classification, and recognition of data vectors using infinite mixture models, which have been shown to be an effective alternative to finite mixtures for selecting the optimal number of clusters. In this work, we propose a novel approach for modeling localized features with an infinite mixture model based on multivariate generalized Normal distributions (inMGNM). The mixture is learned via a nonparametric MCMC-based Bayesian approach, which avoids the crucial problem of model over-fitting and allows uncertainty in the number of mixture components. Robust descriptors are derived by encoding features with the Fisher vector method, which captures higher-order statistics. These descriptors are then combined with a linear support vector machine classifier to achieve higher accuracy. The efficiency and merits of the proposed nonparametric Bayesian learning approach, compared with several other methods, are demonstrated via two challenging applications: texture classification and human activity categorization.
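The core appeal of the infinite-mixture formulation described above is that the number of components is inferred from the data rather than fixed in advance. As a minimal sketch of that idea (not the authors' inMGNM model: this stand-in uses Gaussian components and variational inference via scikit-learn's `BayesianGaussianMixture`, whereas the paper uses multivariate generalized Normal components learned by MCMC), a truncated Dirichlet-process mixture can be started with more components than needed and allowed to prune the extras:

```python
import numpy as np
from sklearn.mixture import BayesianGaussianMixture

rng = np.random.default_rng(0)
# Synthetic data: three well-separated 2-D clusters.
X = np.vstack([
    rng.normal(loc=c, scale=0.3, size=(100, 2))
    for c in ([0.0, 0.0], [4.0, 4.0], [8.0, 0.0])
])

# Truncated Dirichlet-process mixture: the truncation level (10) is an
# upper bound, and the stick-breaking prior drives unneeded components
# toward negligible weight.
dpgmm = BayesianGaussianMixture(
    n_components=10,
    weight_concentration_prior_type="dirichlet_process",
    covariance_type="full",
    max_iter=500,
    random_state=0,
).fit(X)

# Components retaining non-negligible weight approximate the inferred
# number of clusters; the 0.01 threshold here is an arbitrary cut-off.
active = int(np.sum(dpgmm.weights_ > 0.01))
print("active components:", active)
```

With well-separated data such as this, the number of active components typically settles near the true cluster count, illustrating how the nonparametric prior sidesteps explicit model-selection over the number of components.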

