Detection and identification of changes of hidden Markov chains: asymptotic theory

Author(s): Savas Dayanik, Kazutoshi Yamazaki

Abstract: This paper revisits a unified framework of sequential change-point detection and hypothesis testing modeled using hidden Markov chains and develops its asymptotic theory. Given a sequence of observations whose distributions depend on a hidden Markov chain, the objective is to quickly detect critical events, modeled by the first time the Markov chain leaves a specific set of states, and to accurately identify the class of states that the Markov chain enters. We propose computationally tractable sequential detection and identification strategies and obtain sufficient conditions for asymptotic optimality in two Bayesian formulations. Numerical examples are provided to confirm the asymptotic optimality.
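As a rough illustration of the kind of Bayesian stopping rule the abstract describes, the sketch below filters the posterior probability that a two-state hidden chain has left its "pre-change" state and stops once that probability crosses a threshold. Everything concrete here (the two-state chain, the Gaussian observation model, the parameter values, and the function name detect) is invented for the example and is not taken from the paper.

```python
import numpy as np
from scipy.stats import norm

# Hypothetical toy setup: hidden states {0, 1}, where 0 is "pre-change"
# and 1 is an absorbing "post-change" state; observations are Gaussian
# with a state-dependent mean.
P = np.array([[0.99, 0.01],    # transition matrix of the hidden chain
              [0.00, 1.00]])
means, sigma = np.array([0.0, 1.0]), 1.0

def detect(observations, threshold=0.95):
    """Stop at the first time the posterior probability that the hidden
    chain has left state 0 exceeds `threshold` (a generic Bayesian
    stopping rule, sketched for illustration only)."""
    pi = np.array([1.0, 0.0])                  # prior over hidden states
    for n, y in enumerate(observations, 1):
        pred = pi @ P                          # one-step prediction
        lik = norm.pdf(y, loc=means, scale=sigma)
        post = pred * lik
        pi = post / post.sum()                 # Bayes update
        if pi[1] >= threshold:                 # posterior of "post-change"
            return n, pi[1]
    return None, pi[1]

rng = np.random.default_rng(0)
ys = np.concatenate([rng.normal(0, 1, 50), rng.normal(1, 1, 50)])
print(detect(ys))
```

The threshold trades detection delay against the probability of a false alarm or misidentification, which is exactly the trade-off that asymptotic analyses of such rules quantify.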

1997, Vol. 18 (6), pp. 553-578
Author(s): Christian Francq, Michel Roussignol

2003, Vol. 31 (2), pp. 11-13
Author(s): Flávio P. Duarte, Edmundo de Souza e Silva, Don Towsley

2011, Vol. 23 (10), pp. 1659-1670
Author(s): Antonio Ciampi, Alina Dyachenko, Martin Cole, Jane McCusker

Abstract

Background: The study of mental disorders in the elderly presents substantial challenges due to population heterogeneity, the coexistence of different mental disorders, and diagnostic uncertainty. While reliable tools have been developed to collect relevant data, new approaches to study design and analysis are needed. We focus on a new analytic approach.

Methods: Our framework is based on latent class analysis and hidden Markov chains. From repeated measurements of a multivariate disease index, we extract the notion of the underlying state of a patient at a time point. The course of the disorder is then a sequence of transitions among states. States and transitions are not observable; however, the probability of being in a state at a time point, and the transition probabilities from one state to another over time, can be estimated.

Results: Data from 444 patients with and without a diagnosis of delirium and dementia were available from a previous study. The Delirium Index was measured at diagnosis, and at 2 and 6 months after diagnosis. Four latent classes were identified: fairly healthy, moderately ill, clearly sick, and very sick. Dementia and delirium could not be separated on the basis of these data alone. Indeed, as the probability of delirium increased, so did the probability of decline of mental functions. Eight most probable courses were identified, including good and poor stable courses and courses exhibiting various patterns of improvement.

Conclusion: Latent class analysis and hidden Markov chains offer a promising tool for studying mental disorders in the elderly. Their use may show its full potential as new data become available.
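For readers unfamiliar with the machinery, the following sketch shows how the "probability of being in a state at a time point" can be computed by a standard forward (filtering) pass through a fitted hidden Markov chain. The four latent classes mirror the labels reported above, but every number below (initial probabilities, transition matrix, and the emission matrix for a discretised Delirium Index) is made up for illustration; in the study such parameters are estimated from the data.

```python
import numpy as np

# Illustrative parameters only: 4 latent classes
# (fairly healthy, moderately ill, clearly sick, very sick)
# and a Delirium Index discretised into 3 categories.
pi0 = np.array([0.4, 0.3, 0.2, 0.1])          # initial class probabilities
A = np.array([[0.80, 0.15, 0.04, 0.01],       # latent-state transition matrix
              [0.10, 0.70, 0.15, 0.05],
              [0.05, 0.15, 0.70, 0.10],
              [0.01, 0.04, 0.15, 0.80]])
B = np.array([[0.70, 0.25, 0.05],             # P(index category | latent class)
              [0.30, 0.50, 0.20],
              [0.10, 0.50, 0.40],
              [0.05, 0.25, 0.70]])

def filtered_state_probs(obs):
    """P(latent state at visit t | observations up to t), one row per visit."""
    alpha = pi0 * B[:, obs[0]]
    alpha /= alpha.sum()
    out = [alpha]
    for o in obs[1:]:
        alpha = (alpha @ A) * B[:, o]         # predict, then weight by likelihood
        alpha /= alpha.sum()
        out.append(alpha)
    return np.array(out)

print(filtered_state_probs([0, 1, 2]))        # e.g. visits at 0, 2 and 6 months
```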


1985, Vol. 22 (1), pp. 138-147
Author(s): Wojciech Szpankowski

Some sufficient conditions for non-ergodicity are given for a Markov chain with denumerable state space. These conditions generalize Foster's results, in that unbounded Lyapunov functions are considered. Our criteria directly extend the conditions obtained in Kaplan (1979), in the sense that a class of Lyapunov functions is studied. Applications are presented through some examples; in particular, sufficient conditions for non-ergodicity of a multidimensional Markov chain are given.
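For orientation, one classical drift criterion of this type, stated informally here under the strong simplifying assumption of almost surely bounded jumps (it is restrictions of this kind that results such as the above work to relax): suppose there exist a function $V$ on the state space and constants $c$ and $d < \infty$ such that the sets $\{i : V(i) \le c\}$ and $\{i : V(i) > c\}$ are both nonempty,

$$\mathbb{E}\bigl[V(X_{n+1}) - V(X_n) \,\big|\, X_n = i\bigr] \;\ge\; 0 \qquad \text{whenever } V(i) > c,$$

and $|V(X_{n+1}) - V(X_n)| \le d$ almost surely. Then the chain cannot be ergodic: above level $c$ it drifts (weakly) away from the small part of the state space, so it cannot return to it quickly enough for positive recurrence.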


1993, Vol. 30 (3), pp. 518-528
Author(s): Frank Ball, Geoffrey F. Yeo

We consider lumpability for continuous-time Markov chains and provide a simple probabilistic proof of necessary and sufficient conditions for strong lumpability, valid in circumstances not covered by known theory. We also consider the following marginalisability problem. Let {X(t)} = {(X1(t), X2(t), …, Xm(t))} be a continuous-time Markov chain. Under what conditions are the marginal processes {X1(t)}, {X2(t)}, …, {Xm(t)} also continuous-time Markov chains? We show that this is related to lumpability and, if no two of the marginal processes can jump simultaneously, then they are continuous-time Markov chains if and only if they are mutually independent. Applications to ion channel modelling and birth–death processes are discussed briefly.
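As an illustration of the strong-lumpability condition referred to above: for a continuous-time chain with generator Q and a candidate partition of the states into blocks, a standard requirement is that all states within a block have the same total rate into each other block. The short check below verifies this numerically; the generator and the partition are invented for the example and are not drawn from the paper.

```python
import numpy as np

# Invented 4-state generator and a candidate lumping of the states.
Q = np.array([[-3.0,  1.0,  1.0,  1.0],
              [ 2.0, -4.0,  1.0,  1.0],
              [ 0.5,  0.5, -2.0,  1.0],
              [ 0.5,  0.5,  1.0, -2.0]])
partition = [[0, 1], [2, 3]]

def is_strongly_lumpable(Q, partition, tol=1e-9):
    """Check that, block by block, every state has the same total
    transition rate into every target block."""
    for block in partition:
        for target in partition:
            rates = Q[np.ix_(block, target)].sum(axis=1)
            if np.ptp(rates) > tol:           # rates must agree within a block
                return False
    return True

print(is_strongly_lumpable(Q, partition))
```

For the generator above, each state of {0, 1} has total rate 2 into {2, 3} and each state of {2, 3} has total rate 1 into {0, 1}, so the check returns True.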


1998, Vol. 30 (2), pp. 365-384
Author(s): Yiqiang Q. Zhao, Wei Li, W. John Braun

In this paper, we study Markov chains with infinite state spaces and block-structured transition matrices, whose states are partitioned into levels according to the block structure, together with various associated measures. Roughly speaking, these measures involve first passage times or expected numbers of visits to certain levels without hitting other levels, and they often play a key role in the study of a Markov chain. Necessary and/or sufficient conditions are obtained, in terms of these measures, for a Markov chain to be positive recurrent, recurrent, or transient. Results are obtained for general irreducible Markov chains as well as for those with transition matrices possessing some block structure. We also discuss the decomposition or factorization of the characteristic equations of these measures. In the scalar case, we locate the zeros of these characteristic functions and use them to characterize a Markov chain. Examples and various remarks are given to illustrate some of the results.
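To make the "expected numbers of visits to certain levels without hitting other levels" concrete, the sketch below computes such quantities for a small, truncated birth–death chain: after making level 0 absorbing, the fundamental matrix N = (I − T)⁻¹ of the remaining (taboo) transition matrix T gives, in entry (i, j), the expected number of visits to state j before level 0 is hit, starting from state i. The chain, the truncation, and the parameter values are invented for the illustration.

```python
import numpy as np

p, q = 0.4, 0.6                    # up / down probabilities (invented)
n = 6                              # truncate the state space at level n
P = np.zeros((n + 1, n + 1))
P[0, 0] = 1.0                      # make level 0 absorbing (taboo level)
for i in range(1, n + 1):
    P[i, i - 1] = q
    P[i, min(i + 1, n)] += p       # reflect the upward step at the truncation

T = P[1:, 1:]                      # transitions among states that avoid level 0
N = np.linalg.inv(np.eye(n) - T)   # fundamental matrix of expected visit counts
print(N[0])                        # expected visits to 1..n starting from state 1
```

In the infinite, block-structured setting of the paper these quantities are of course not obtained by brute-force matrix inversion; the finite truncation is only meant to show what the measures are.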

