Robustification of an On-line EM Algorithm for Modelling Asset Prices Within an HMM

Author(s):  
Christina Erlwein-Sayer ◽  
Peter Ruckdeschel

2006 ◽  
Author(s):  
David Cournapeau ◽  
Tatsuya Kawahara ◽  
Kenji Mase ◽  
Tomoji Toriyama

2001 ◽  
Vol 32 (5) ◽  
pp. 12-20 ◽  
Author(s):  
Junichiro Yoshimoto ◽  
Shin Ishii ◽  
Masa-aki Sato

1994 ◽  
Vol 6 (2) ◽  
pp. 181-214 ◽  
Author(s):  
Michael I. Jordan ◽  
Robert A. Jacobs

We present a tree-structured architecture for supervised learning. The statistical model underlying the architecture is a hierarchical mixture model in which both the mixture coefficients and the mixture components are generalized linear models (GLIM's). Learning is treated as a maximum likelihood problem; in particular, we present an Expectation-Maximization (EM) algorithm for adjusting the parameters of the architecture. We also develop an on-line learning algorithm in which the parameters are updated incrementally. Comparative simulation results are presented in the robot dynamics domain.
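The EM scheme described in the abstract alternates between computing posterior responsibilities of the experts (E-step) and refitting each expert by weighted maximum likelihood (M-step). A minimal sketch of this idea, simplified to a two-expert mixture of linear regressions with constant mixing coefficients instead of a full gated hierarchy (the data, initial weights, and all variable names are hypothetical):

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data drawn from two linear regimes (hypothetical example).
n = 400
x = rng.uniform(-1, 1, n)
z = rng.random(n) < 0.5
y = np.where(z, 2.0 * x + 1.0, -1.5 * x - 0.5) + 0.1 * rng.standard_normal(n)
X = np.column_stack([x, np.ones(n)])          # design matrix with intercept

K = 2
W = np.array([[1.0, 0.0], [-1.0, 0.0]])      # initial expert weights (slope, intercept)
pi = np.full(K, 1.0 / K)                      # mixing coefficients
sigma2 = np.ones(K)                           # expert noise variances

for _ in range(50):
    # E-step: posterior responsibility of each expert for each data point.
    resid = y[None, :] - W @ X.T              # (K, n) residuals
    logp = (np.log(pi)[:, None]
            - 0.5 * np.log(2 * np.pi * sigma2)[:, None]
            - 0.5 * resid**2 / sigma2[:, None])
    logp -= logp.max(axis=0, keepdims=True)   # stabilise before exponentiating
    r = np.exp(logp)
    r /= r.sum(axis=0, keepdims=True)

    # M-step: weighted least squares per expert, then update pi and sigma2.
    for k in range(K):
        Xw = X * r[k][:, None]
        W[k] = np.linalg.solve(Xw.T @ X, Xw.T @ y)
        sigma2[k] = (r[k] * (y - X @ W[k]) ** 2).sum() / r[k].sum()
    pi = r.mean(axis=1)

print(sorted(W[:, 0]))   # recovered slopes of the two experts
```

The on-line variant mentioned in the abstract would replace the batch M-step with an incremental update after each observation; the batch form above is only meant to show the E/M alternation.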


2000 ◽  
Vol 12 (11) ◽  
pp. 2685-2717 ◽  
Author(s):  
Dirk Husmeier

Training probability-density estimating neural networks with the expectation-maximization (EM) algorithm aims to maximize the likelihood of the training set and therefore leads to overfitting for sparse data. In this article, a regularization method for mixture models with generalized linear kernel centers is proposed, which adopts the Bayesian evidence approach and optimizes the hyperparameters of the prior by type II maximum likelihood. This includes a marginalization over the parameters, which is done by Laplace approximation and requires the derivation of the Hessian of the log-likelihood function. The incorporation of this approach into the standard training scheme leads to a modified form of the EM algorithm, which includes a regularization term and adapts the hyperparameters on-line after each EM cycle. The article presents applications of this scheme to classification problems, the prediction of stochastic time series, and latent space models.
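The core ingredient of this scheme, type II maximum likelihood for the prior hyperparameters via the evidence approximation, can be illustrated on its simplest building block: a single linear-Gaussian model, where the Laplace approximation is exact and the hyperparameters are adapted with MacKay's evidence fixed-point updates. This is a simplified sketch, not the article's mixture-model scheme; the data and variable names are hypothetical:

```python
import numpy as np

rng = np.random.default_rng(1)

# Synthetic regression data (hypothetical): y = 0.5*x1 - 0.3*x2 + noise.
n, d = 200, 2
X = rng.standard_normal((n, d))
w_true = np.array([0.5, -0.3])
y = X @ w_true + 0.1 * rng.standard_normal(n)

alpha, beta = 1.0, 1.0               # prior precision, noise precision
eig = np.linalg.eigvalsh(X.T @ X)    # eigenvalues of the data Gram matrix

for _ in range(20):
    # Posterior over weights given the current hyperparameters; for a
    # linear-Gaussian model the "Laplace approximation" is exact.
    A = alpha * np.eye(d) + beta * (X.T @ X)   # posterior precision (Hessian)
    m = beta * np.linalg.solve(A, X.T @ y)     # posterior mean

    # Type II ML: re-estimate hyperparameters from the marginal likelihood.
    gamma = np.sum(beta * eig / (alpha + beta * eig))  # effective # parameters
    alpha = gamma / (m @ m)                            # prior precision update
    beta = (n - gamma) / np.sum((y - X @ m) ** 2)      # noise precision update

print(m, alpha, beta)
```

In the article's setting the same hyperparameter re-estimation is interleaved with the EM cycles, so the regularization strength adapts on-line instead of being fixed in advance.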

