Supervised hidden Markov model learning using the state distribution Oracle

Author(s): L.G. Moscovich, Jianhua Chen

Open Physics, 2017, Vol 15 (1), pp. 479-485
Author(s): Zhiguo Zhao, Yeqin Wang, Xiaoming Hu, Yukai Tao, Jinsheng Wang

Abstract: Aiming at the problem of early-warning credibility degradation as the heavy-vehicle load and its center of gravity change greatly, a heavy-vehicle rollover state identification method based on the Hidden Markov Model (HMM) is introduced in this paper to identify heavy-vehicle lateral conditions dynamically. In this method, the lateral acceleration and roll angle are taken as the observation values of the model base. The Viterbi algorithm is used to find the state sequence with the highest probability given the observed sequence, and the Markov prediction algorithm is adopted to calculate the state transition law and to predict the state of the vehicle over a certain future period. A hidden Markov model was trained using TruckSim and Matlab under combined double-lane-change and steering conditions, and the model was then applied to the online identification of heavy-vehicle rollover states. The identification results show that the model can accurately and efficiently identify the vehicle rollover state and has good applicability. This study provides a novel method and a general strategy for active-safety early warning and control of vehicles, and has reference significance for the application of Hidden Markov theory in collision, rear-end, and lane-departure warning systems.
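The Viterbi step this abstract relies on can be sketched as follows. The two-state model, the "low"/"high" lateral-acceleration bins, and all probability values below are illustrative placeholders, not the paper's TruckSim/Matlab-trained parameters.

```python
STATES = ["stable", "rollover_risk"]

# Start, transition, and emission probabilities (hypothetical values,
# not the paper's trained model).
start = {"stable": 0.9, "rollover_risk": 0.1}
trans = {
    "stable":        {"stable": 0.8, "rollover_risk": 0.2},
    "rollover_risk": {"stable": 0.3, "rollover_risk": 0.7},
}
emit = {  # observations: "low"/"high" lateral-acceleration bins
    "stable":        {"low": 0.7, "high": 0.3},
    "rollover_risk": {"low": 0.2, "high": 0.8},
}

def viterbi(obs):
    """Most probable hidden-state path for a sequence of observations."""
    delta = {s: start[s] * emit[s][obs[0]] for s in STATES}
    backptr = []
    for o in obs[1:]:
        prev, delta, ptr = delta, {}, {}
        for s in STATES:
            best = max(STATES, key=lambda p: prev[p] * trans[p][s])
            ptr[s] = best
            delta[s] = prev[best] * trans[best][s] * emit[s][o]
        backptr.append(ptr)
    # Backtrack from the best final state.
    path = [max(STATES, key=lambda s: delta[s])]
    for ptr in reversed(backptr):
        path.append(ptr[path[-1]])
    path.reverse()
    return path

print(viterbi(["low", "low", "high", "high"]))
# → ['stable', 'stable', 'rollover_risk', 'rollover_risk']
```

Sustained high lateral acceleration pulls the decoded path into the risk state, which is the qualitative behavior the online identification scheme depends on.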


2016, Vol 14 (38), pp. 63-72
Author(s): Robin Cabeza Ruiz

There are two approaches to text segmentation by language: the first assumes that language changes happen at the "border" between sentences (never within a sentence); the second assumes that language changes can happen anywhere in the text. This work presents segmentation methods for both settings. In the first proposal, the text is initially segmented into sentences and the language of each sentence is then identified; the second proposal adapts a hidden Markov model to the task. In both cases, the results obtained in experiments exceed the state of the art.


2013, Vol 7 (8), pp. 704-709
Author(s): Sabato Marco Siniscalchi, Jinyu Li, Chin-Hui Lee

d'CARTESIAN, 2015, Vol 4 (1), pp. 86
Author(s): Kezia Tumilaar, Yohanes Langi, Altien Rindengan

A Hidden Markov Model (HMM) is a stochastic model and essentially an extension of a Markov chain. In an HMM there are two types of states: observable states and hidden states. The purpose of this research is to understand the hidden Markov model and the solutions of its three basic problems: the evaluation problem, the decoding problem, and the learning problem. The result of the research is that a hidden Markov model can be defined as λ = (A, B, π), where A is the state-transition probability matrix, B is the observation probability matrix, and π is the initial state distribution. The evaluation problem, computing the probability of the observation sequence given the model, P(O|λ), can be solved by the Forward-Backward algorithm; the decoding problem, choosing the optimal hidden state sequence, can be solved by the Viterbi algorithm; and the learning problem, estimating the model parameters λ so as to maximize P(O|λ), can be solved by the Baum-Welch algorithm. A hidden Markov model with 3 states can describe the behavior in the case studies. Key words: Decoding Problem, Evaluation Problem, Hidden Markov Model, Learning Problem


2012, Vol 2012, pp. 1-13
Author(s): Małgorzata Wiktoria Korolkiewicz

We propose a dependent hidden Markov model of credit quality. We suppose that the "true" credit quality is not observed directly but only through noisy observations given by posted credit ratings. The model is formulated in discrete time with a Markov chain observed in martingale noise, where "noise" terms of the state and observation processes are possibly dependent. The model provides estimates for the state of the Markov chain governing the evolution of the credit rating process and the parameters of the model, where the latter are estimated using the EM algorithm. The dependent dynamics allow for the so-called "rating momentum" discussed in the credit literature and also provide a convenient test of independence between the state and observation dynamics.


Author(s): David O. Siegmund, Benjamin Yakir

In a hidden Markov model, one "estimates" the state of the hidden Markov chain at t by computing via the forwards-backwards algorithm the conditional distribution of the state vector given the observed data. The covariance matrix of this conditional distribution measures the information lost by failure to observe directly the state of the hidden process. In the case where changes of state occur slowly relative to the speed at which information about the underlying state accumulates in the observed data, we compute approximately these covariances in terms of functionals of Brownian motion that arise in change-point analysis. Applications in gene mapping, where these covariances play a role in standardizing the score statistic and in evaluating the loss of noncentrality due to incomplete information, are discussed. Numerical examples illustrate the range of validity and limitations of our results.
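A minimal sketch of the forwards-backwards computation the abstract refers to: smoothing yields the conditional distribution of the hidden state at each time given all observed data. The two-state parameters below are assumed illustrative values chosen to mimic the regime the paper studies (slow state changes, weakly informative observations); the spread of each posterior row reflects the information lost by not observing the chain directly.

```python
# Hypothetical slowly switching two-state chain with noisy observations.
A = [[0.95, 0.05],
     [0.05, 0.95]]      # state changes are rare
B = [[0.6, 0.4],
     [0.4, 0.6]]        # observations only weakly reveal the state
pi = [0.5, 0.5]

def posteriors(obs):
    """gamma[t][i] = P(state_t = i | all observations), by forwards-backwards."""
    n, T = len(pi), len(obs)
    # Forward pass: alpha[t][i] = P(obs[0..t], state_t = i).
    alpha = [[pi[i] * B[i][obs[0]] for i in range(n)]]
    for t in range(1, T):
        alpha.append([sum(alpha[-1][i] * A[i][j] for i in range(n)) * B[j][obs[t]]
                      for j in range(n)])
    # Backward pass: beta[t][i] = P(obs[t+1..] | state_t = i).
    beta = [[1.0] * n]
    for t in range(T - 2, -1, -1):
        beta.insert(0, [sum(A[i][j] * B[j][obs[t + 1]] * beta[0][j]
                            for j in range(n))
                        for i in range(n)])
    # Normalize alpha * beta at each time step.
    gamma = []
    for t in range(T):
        w = [alpha[t][i] * beta[t][i] for i in range(n)]
        z = sum(w)
        gamma.append([x / z for x in w])
    return gamma
```

With weak emissions the posterior rows stay far from 0/1, i.e. the conditional covariance of the state indicator remains large, which is exactly the quantity the paper approximates via Brownian-motion functionals.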


2017, Vol 18 (3), pp. 171-180
Author(s): Asmaa A.E. Osman, Reda A. El-Khoribi, Mahmoud E. Shoman, M.A. Wahby Shalaby
