Mixed-coded evolutionary algorithm for Gaussian mixture maximum likelihood clustering with model selection

Author(s):  
Hazem M. Abbas

2011 ◽
Vol 58-60 ◽  
pp. 1847-1853 ◽  
Author(s):  
Yan Zhang ◽  
Cun Bao Chen ◽  
Li Zhao

In this paper, the Gaussian Mixture Model (GMM) is applied to noise classification. On this basis, a modified GMM with an embedded Auto-Associative Neural Network (AANN) is proposed, integrating the merits of both models. The GMM and the AANN are trained jointly by Maximum Likelihood (ML); during training, the parameters of the GMM and the AANN are updated alternately. The AANN reshapes the distribution of the data and improves the similarity of feature data within the same noise type. Experiments show that the GMM with an embedded AANN improves noise classification accuracy over a baseline GMM.
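The ML training of the GMM component above is typically done with the EM algorithm. As a minimal illustrative sketch (standard EM for a two-component 1-D Gaussian mixture, not the paper's alternating GMM/AANN scheme; all function names and initialisation choices here are hypothetical):

```python
import math
import random

def em_gmm_1d(data, n_iter=50):
    """Illustrative EM for a two-component 1-D Gaussian mixture (ML fit)."""
    # Initialise means from the data range (a simple, hypothetical choice).
    mu = [min(data), max(data)]
    var = [1.0, 1.0]
    pi = [0.5, 0.5]
    for _ in range(n_iter):
        # E-step: responsibilities (posterior component probabilities).
        resp = []
        for x in data:
            p = [pi[k] / math.sqrt(2 * math.pi * var[k])
                 * math.exp(-(x - mu[k]) ** 2 / (2 * var[k])) for k in range(2)]
            s = sum(p)
            resp.append([pk / s for pk in p])
        # M-step: closed-form ML updates of weights, means, variances.
        for k in range(2):
            nk = sum(r[k] for r in resp)
            pi[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk
            var[k] = max(var[k], 1e-6)  # guard against variance collapse
    return pi, mu, var

random.seed(0)
# Synthetic two-cluster data standing in for noise features.
data = [random.gauss(-2.0, 0.5) for _ in range(200)] + \
       [random.gauss(3.0, 0.8) for _ in range(200)]
pi, mu, var = em_gmm_1d(data)
print(sorted(mu))  # the two fitted component means
```

In the paper's setting, the M-step updates would alternate with AANN weight updates under the same likelihood objective.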


2006 ◽  
Vol 18 (5) ◽  
pp. 1007-1065 ◽  
Author(s):  
Shun-ichi Amari ◽  
Hyeyoung Park ◽  
Tomoko Ozeki

The parameter spaces of hierarchical systems such as multilayer perceptrons include singularities due to the symmetry and degeneration of hidden units. A parameter space forms a geometrical manifold, called the neuromanifold in the case of neural networks. Such a model is identified with a statistical model, and a Riemannian metric is given by the Fisher information matrix. However, the matrix degenerates at singularities. Such a singular structure is ubiquitous not only in multilayer perceptrons but also in the gaussian mixture probability densities, ARMA time-series model, and many other cases. The standard statistical paradigm of the Cramér-Rao theorem does not hold, and the singularity gives rise to strange behaviors in parameter estimation, hypothesis testing, Bayesian inference, model selection, and in particular, the dynamics of learning from examples. Prevailing theories so far have not paid much attention to the problem caused by singularity, relying only on ordinary statistical theories developed for regular (nonsingular) models. Only recently have researchers remarked on the effects of singularity, and theories are now being developed. This article gives an overview of the phenomena caused by the singularities of statistical manifolds related to multilayer perceptrons and gaussian mixtures. We demonstrate our recent results on these problems. Simple toy models are also used to show explicit solutions. We explain that the maximum likelihood estimator is no longer subject to the gaussian distribution even asymptotically, because the Fisher information matrix degenerates, that the model selection criteria such as AIC, BIC, and MDL fail to hold in these models, that a smooth Bayesian prior becomes singular in such models, and that the trajectories of dynamics of learning are strongly affected by the singularity, causing plateaus or slow manifolds in the parameter space. 
The natural gradient method is shown to perform well because it takes the singular geometrical structure into account. The generalization error and the training error are studied in some examples.

