Learning mixture models using a genetic version of the EM algorithm

2000 ◽  
Vol 21 (8) ◽  
pp. 759-769 ◽  
Author(s):  
Aleix M. Martı́nez ◽  
Jordi Vitrià

2020 ◽  
Vol 41 ◽  
pp. 101073 ◽  
Author(s):  
Wentao Xiang ◽  
Ahmad Karfoul ◽  
Chunfeng Yang ◽  
Huazhong Shu ◽  
Régine Le Bouquin Jeannès

2021 ◽  
Author(s):  
Faezeh Frouzesh

The use of mixture models in statistical analysis is increasing for datasets that exhibit heterogeneity and/or redundancy. Mixture models are likelihood-based, and maximum likelihood estimates of their parameters are obtained with the expectation-maximisation (EM) algorithm. Because the likelihood surface is multi-modal, the EM algorithm is highly dependent on its starting points: poorly chosen initial values may lead the optimisation to a local maximum rather than the global one. In this thesis, different methods of choosing initial points for the EM algorithm are evaluated, and two procedures that make intelligent choices of candidate starting points and evaluate their usefulness quickly are presented. Furthermore, several approaches to selecting the best-fitting model from a set of candidate models for a given dataset are investigated, and some lemmas and theorems are presented to illustrate the information criteria. This work introduces two novel heuristic methods for choosing the best starting points for the EM algorithm, named the Combined method and Hybrid PSO. The Combined method combines two clustering methods and, in comparison with other initialisation methods, leads to better starting points for the EM algorithm. Hybrid PSO couples Particle Swarm Optimisation (PSO), as a global optimisation approach, with the EM algorithm as a local search, overcoming the EM algorithm's sensitivity by making the result independent of the starting points. Finally, both methods are compared with other methods of choosing starting points for the EM algorithm.
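The multiple-restart idea underlying this line of work can be sketched as follows. This is a minimal illustration, not the thesis code: it runs plain EM for a two-component 1D Gaussian mixture from several random starting points and keeps the run with the highest log-likelihood. All function and variable names (`em_gmm_1d`, `multistart_em`) are illustrative.

```python
import numpy as np

def em_gmm_1d(x, mu0, n_iter=200, tol=1e-8):
    """EM for a two-component 1D Gaussian mixture, started from means mu0."""
    pi = 0.5
    mu = np.asarray(mu0, dtype=float).copy()
    var = np.array([x.var(), x.var()])  # broad initial variances
    ll_old = -np.inf
    for _ in range(n_iter):
        # E-step: responsibility of each component for each data point
        dens = np.exp(-0.5 * (x[:, None] - mu) ** 2 / var) / np.sqrt(2 * np.pi * var)
        weighted = dens * np.array([pi, 1.0 - pi])
        total = weighted.sum(axis=1)
        r = weighted / total[:, None]
        # M-step: re-estimate mixing weight, means, and variances
        nk = r.sum(axis=0)
        pi = nk[0] / len(x)
        mu = (r * x[:, None]).sum(axis=0) / nk
        var = (r * (x[:, None] - mu) ** 2).sum(axis=0) / nk
        ll = np.log(total).sum()        # log-likelihood is monotone under EM
        if ll - ll_old < tol:
            break
        ll_old = ll
    return (pi, mu, var), ll

def multistart_em(x, n_starts=10, seed=0):
    """Naive multiple-restart strategy: the run with the best log-likelihood wins."""
    rng = np.random.default_rng(seed)
    best_params, best_ll = None, -np.inf
    for _ in range(n_starts):
        mu0 = rng.choice(x, size=2, replace=False)  # two random data points as initial means
        params, ll = em_gmm_1d(x, mu0)
        if ll > best_ll:
            best_params, best_ll = params, ll
    return best_params, best_ll

rng = np.random.default_rng(42)
x = np.concatenate([rng.normal(-3, 1, 300), rng.normal(3, 1, 300)])
(pi, mu, var), ll = multistart_em(x)
```

The thesis replaces the purely random choice of `mu0` with smarter candidates (clustering-based starts, or PSO as the global searcher around the EM local search); the selection-by-likelihood skeleton stays the same.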


2021 ◽  
Author(s):  
Masahiro Kuroda

Mixture models have become increasingly popular due to their modeling flexibility and are applied to the clustering and classification of heterogeneous data. The EM algorithm is widely used for the maximum likelihood estimation of mixture models because it converges stably and is simple to implement. Despite these advantages, the EM algorithm's main drawbacks are that its convergence is local and slow. To avoid local convergence, multiple runs from several different initial values are usually used; the algorithm may then take a large number of iterations and a long computation time to find the maximum likelihood estimates. Speeding up the computation of the EM algorithm addresses these problems. We give algorithms that accelerate the convergence of the EM algorithm and apply them to mixture model estimation. Numerical experiments examine the performance of the acceleration algorithms in terms of the number of iterations and computation time.
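One classical way to accelerate a linearly convergent EM sequence is extrapolation from successive iterates. The sketch below is a minimal scalar illustration using Aitken's Δ² on the EM updates of a mixing weight (both component densities assumed known); it is not the specific acceleration algorithms of this paper, and all names (`em_step`, `pi_accel`) are illustrative.

```python
import numpy as np

def norm_pdf(x, mu, sd):
    return np.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * np.sqrt(2.0 * np.pi))

def em_step(pi, x):
    """One EM update of the mixing weight for a mixture of two known N(+-1, 1) densities."""
    w0 = pi * norm_pdf(x, -1.0, 1.0)
    w1 = (1.0 - pi) * norm_pdf(x, 1.0, 1.0)
    return (w0 / (w0 + w1)).mean()  # M-step: average responsibility of component 0

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(-1, 1, 400), rng.normal(1, 1, 600)])

# A few plain EM iterations from a deliberately poor start
seq = [0.9]
for _ in range(3):
    seq.append(em_step(seq[-1], x))

# Aitken delta-squared extrapolation from the last three iterates:
# for a linearly convergent sequence this jumps much closer to the limit
# than the last plain iterate.
p0, p1, p2 = seq[-3:]
pi_accel = p2 - (p2 - p1) ** 2 / ((p2 - p1) - (p1 - p0))
```

The overlapping components make plain EM converge slowly (a nearly constant linear rate), which is exactly the regime where such extrapolation pays off; vector versions of this idea apply the same principle to the full parameter vector of a mixture model.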


Biometrics ◽  
1988 ◽  
Vol 44 (2) ◽  
pp. 571 ◽  
Author(s):  
G. J. McLachlan ◽  
P. N. Jones
