Fast Computation of the EM Algorithm for Mixture Models

2021 ◽  
Author(s):  
Masahiro Kuroda

Mixture models have become increasingly popular due to their modeling flexibility and are widely applied to the clustering and classification of heterogeneous data. The EM algorithm is the standard tool for maximum likelihood estimation of mixture models because it is stable in convergence and simple to implement. Despite these advantages, the EM algorithm has two well-known drawbacks: it converges only to a local maximum, and its convergence is slow. To avoid local convergence, multiple runs from several different initial values are commonly used, but the algorithm may then require a large number of iterations and a long computation time to find the maximum likelihood estimates. Accelerating the computation of the EM algorithm addresses both problems. We present algorithms that accelerate the convergence of the EM algorithm and apply them to mixture model estimation. Numerical experiments examine the performance of the acceleration algorithms in terms of the number of iterations and computation time.

1995 ◽  
Vol 12 (5) ◽  
pp. 515-527 ◽  
Author(s):  
Jeanine J. Houwing-Duistermaat ◽  
Lodewijk A. Sandkuijl ◽  
Arthur A. B. Bergen ◽  
Hans C. van Houwelingen

Mathematics ◽  
2021 ◽  
Vol 9 (19) ◽  
pp. 2413
Author(s):  
Ruijie Guan ◽  
Xu Zhao ◽  
Weihu Cheng ◽  
Yaohua Rong

In this paper, a new generalized t (new Gt) distribution, based on a distribution construction approach, is proposed and shown to be suitable for fitting data with both high kurtosis and heavy tails. The main innovations of this article consist of four parts. First, the main characteristics and properties of the new distribution are outlined. Second, we derive explicit expressions for the moments of its order statistics and the corresponding variance–covariance matrix. Third, we focus on parameter estimation for the new Gt distribution and introduce several estimation methods: a modified method of moments (MMOM), maximum likelihood estimation (MLE) via the EM algorithm, a novel iterative algorithm for obtaining the MLE, and improved probability weighted moments (IPWM). Simulation studies indicate that IPWM estimation generally performs better than MLE via the EM algorithm and the MMOM, and that the newly proposed iterative algorithm outperforms the EM algorithm when the sample kurtosis is greater than 2.7. Finally, for the four parameters of the new Gt distribution, a profile maximum likelihood approach using the EM algorithm is developed to handle the estimation problem and obtain acceptable results.


2021 ◽  
Author(s):  
Faezeh Frouzesh

<p>The use of mixture models in statistical analysis is increasing for datasets with heterogeneity and/or redundancy in the data. They are likelihood-based models, and maximum likelihood estimates of their parameters are obtained with the expectation-maximization (EM) algorithm. Multi-modality of the likelihood surface means that the EM algorithm is highly dependent on its starting points: poorly chosen initial points may lead the optimization to a local maximum rather than the global maximum. In this thesis, different methods of choosing initial points for the EM algorithm are evaluated, and two procedures are presented that make intelligent choices of possible starting points and fast evaluations of their usefulness. Furthermore, several approaches to selecting the best-fitting model from a set of candidate models for a given dataset are investigated, and some lemmas and theorems are presented to illustrate the information criteria. This work introduces two novel heuristic methods for choosing the best starting points for the EM algorithm, named the Combined method and Hybrid PSO (Particle Swarm Optimisation). The Combined method combines two clustering methods and finds better starting points for the EM algorithm than the other initialisation methods compared. Hybrid PSO couples Particle Swarm Optimisation, as a global optimization approach, with the EM algorithm, as a local search, to overcome the EM algorithm's dependence on starting points. Finally, it is compared with different methods of choosing starting points for the EM algorithm.</p>


2002 ◽  
Vol 14 (6) ◽  
pp. 1261-1266 ◽  
Author(s):  
Akihiro Minagawa ◽  
Norio Tagawa ◽  
Toshiyuki Tanaka

The expectation-maximization (EM) algorithm with split-and-merge operations (SMEM algorithm) proposed by Ueda, Nakano, Ghahramani, and Hinton (2000) is a nonlocal search method, applicable to mixture models, for relaxing the local-optimum property of the EM algorithm. In this article, we point out that the SMEM algorithm's acceptance-rejection evaluation method may select a distribution with smaller likelihood, and we demonstrate that an increase in likelihood can be guaranteed only by directly comparing log likelihoods.
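The monotonicity guard the article argues for amounts to a simple rule: move from the current model to a candidate (e.g. one produced by a split-and-merge operation) only when its log likelihood is strictly larger, so the sequence of log likelihoods is non-decreasing. The sketch below is this generic guard only, not the SMEM authors' full procedure; `loglik` and the model objects are hypothetical placeholders.

```python
def monotone_model_search(loglik, models):
    """Accept a candidate model only if it strictly improves the
    log likelihood, guaranteeing monotone ascent over the search.

    loglik: any function scoring a model by its log likelihood.
    models: the current model followed by the candidates to try.
    """
    current = models[0]
    current_ll = loglik(current)
    for cand in models[1:]:
        cand_ll = loglik(cand)
        # Comparing in log space also avoids the underflow that
        # raw likelihoods of large datasets would suffer.
        if cand_ll > current_ll:
            current, current_ll = cand, cand_ll
    return current, current_ll
```

An acceptance rule that can take a candidate with smaller likelihood, by contrast, gives no such guarantee, which is precisely the issue the article raises.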

