Detecting and Exploiting Symmetry in Discrete-state Markov Models

Author(s):  
W. Douglas Obal II ◽  
Michael G. McQuinn ◽  
William H. Sanders
2021 ◽  
Author(s):  
G. Zifarelli ◽  
P. Zuccolini ◽  
S. Bertelli ◽  
M. Pusch

ABSTRACT The behavior of ion channels and transporters is often modeled using discrete-state continuous-time Markov models. Such models are helpful for the interpretation of experimental data and can guide the design of experiments by testing specific predictions. Here, we describe a computational tool that allows us to create Markov models of chosen complexity and to calculate predictions on a macroscopic scale as well as on a single-molecule scale. The program calculates steady-state properties (current, state probabilities, and cycle frequencies), deterministic macroscopic and stochastic time courses, gating currents, dwell-time histograms, and power spectra of channels and transporters. In addition, a visual simulation mode allows us to follow the time-dependent stochastic behavior of a single channel or transporter. After a basic introduction to the concept of Markov models, real-life examples are discussed, including a model of a simple K+ channel, a voltage-gated sodium channel, a 3-state ligand-gated channel, and an electrogenic uniporter. The article thus has a modular structure, progressing from basic to more advanced topics, and illustrates how the MarkovEditor program can help students explore Markov models at a basic level while also being suited for research scientists testing and developing models of the mechanisms of protein function.
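The steady-state computation the abstract mentions can be sketched for the simplest case: a hypothetical two-state channel (Closed ↔ Open) whose stationary probabilities solve πQ = 0 with Σπ = 1. The rates below are illustrative, not taken from the article.

```python
import numpy as np

# Hypothetical two-state channel: Closed <-> Open, with opening rate
# alpha and closing rate beta (both in 1/s); illustrative values only.
alpha, beta = 100.0, 300.0

# Generator matrix Q of the continuous-time Markov model:
# rows sum to zero, Q[i, j] is the rate from state i to state j.
Q = np.array([[-alpha, alpha],
              [beta, -beta]])

# Steady-state probabilities pi satisfy pi @ Q = 0 and sum(pi) = 1.
# Stack the normalisation constraint onto Q^T and solve by least squares.
A = np.vstack([Q.T, np.ones(2)])
b = np.array([0.0, 0.0, 1.0])
pi, *_ = np.linalg.lstsq(A, b, rcond=None)

print(pi)  # open probability pi[1] = alpha / (alpha + beta) = 0.25
```

The same least-squares construction scales directly to larger generator matrices, which is how multi-state models of the kind discussed in the article are typically handled numerically.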


2015 ◽  
Vol 9 ◽  
pp. 379-391 ◽  
Author(s):  
L. S. Kuravsky ◽  
P. A. Marmalyuk ◽  
G. A. Yuryev ◽  
P. N. Dumin

2015 ◽  
Vol 47 (2) ◽  
pp. 378-401 ◽  
Author(s):  
B. Eriksson ◽  
M. R. Pistorius

This paper is concerned with the solution of the optimal stopping problem associated with the value of American options driven by continuous-time Markov chains. The value function of an American option in this setting is characterised as the unique solution (in a distributional sense) of a system of variational inequalities. Furthermore, since the continuous and smooth fit principles are not applicable in this discrete state-space setting, a novel explicit characterisation is provided of the optimal stopping boundary in terms of the generator of the underlying Markov chain. Subsequently, an algorithm is presented for the valuation of American options under Markov chain models. By application to a suitably chosen sequence of Markov chains, the algorithm provides an approximate valuation of an American option under a class of Markov models that includes diffusion models, exponential Lévy models, and stochastic differential equations driven by Lévy processes. Numerical experiments for a range of different models suggest that the approximation algorithm is flexible and accurate. A proof of convergence is also provided.
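The core idea of valuing an American option on a finite-state Markov chain can be sketched with backward induction: at every step the value is the maximum of immediate exercise and discounted continuation, which is the discrete analogue of the variational inequality. The birth–death chain, grid, and rates below are hypothetical illustrations, not the paper's algorithm.

```python
import numpy as np

# Sketch: American put on a 5-state Markov chain price model.
S = np.array([80.0, 90.0, 100.0, 110.0, 120.0])  # price grid (states)
K, r, dt, n_steps = 100.0, 0.05, 0.01, 500       # strike, rate, step, steps

# One-step transition matrix of a simple birth-death chain
# (moves up or down one level; boundary mass stays put).
p_up, p_dn = 0.3, 0.3
P = np.zeros((5, 5))
for i in range(5):
    if i > 0:
        P[i, i - 1] = p_dn
    if i < 4:
        P[i, i + 1] = p_up
    P[i, i] = 1.0 - P[i].sum()

payoff = np.maximum(K - S, 0.0)  # immediate-exercise value
V = payoff.copy()                # terminal value at maturity
for _ in range(n_steps):         # backward induction
    V = np.maximum(payoff, np.exp(-r * dt) * (P @ V))

print(V)  # V dominates the payoff everywhere, as the inequality requires
```

States where `V` equals the payoff form the (discrete) exercise region; its edge is the stopping boundary that the paper characterises via the chain's generator.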


Automatica ◽  
2016 ◽  
Vol 65 ◽  
pp. 12-26 ◽  
Author(s):  
Dawei Shi ◽  
Robert J. Elliott ◽  
Tongwen Chen

2010 ◽  
Vol 8 (3) ◽  
pp. 376-379 ◽  
Author(s):  
Renato Cesar Sato ◽  
Désirée Moraes Zouain

ABSTRACT Markov chains support decision problems involving uncertainty over a continuous period of time. The greater availability of computing power allows these models to be used more often to represent clinical structures. Markov models consider the patients to be in a discrete state of health, with events representing the transition from one state to another. The possibility of modeling repetitive events, and the time dependence of the associated probabilities and utilities, permits a more accurate representation of the evaluated clinical structure. These models can be used for economic evaluation in health care, taking into account costs and clinical outcomes, and are especially suited to the evaluation of chronic diseases. This article provides a review of the use of such modeling within the clinical context and the advantages of being able to include time in this type of study.
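A minimal Markov cohort model of the kind described here can be sketched in a few lines: patients occupy discrete health states, a transition matrix advances the cohort one cycle at a time, and discounted costs and QALYs are accumulated per state. All transition probabilities, costs, and utilities below are hypothetical.

```python
import numpy as np

# Hypothetical three-state cohort model with one-year cycles.
states = ["Healthy", "Sick", "Dead"]
P = np.array([[0.85, 0.10, 0.05],   # from Healthy
              [0.00, 0.80, 0.20],   # from Sick
              [0.00, 0.00, 1.00]])  # Dead is absorbing
cost = np.array([500.0, 5000.0, 0.0])  # annual cost per state
utility = np.array([0.9, 0.5, 0.0])    # QALY weight per state
discount = 0.035                        # annual discount rate

cohort = np.array([1.0, 0.0, 0.0])  # everyone starts Healthy
total_cost = total_qaly = 0.0
for year in range(20):               # 20 one-year cycles
    d = (1 + discount) ** -year      # discount factor for this cycle
    total_cost += d * (cohort @ cost)
    total_qaly += d * (cohort @ utility)
    cohort = cohort @ P              # advance the cohort one cycle

print(round(total_cost, 2), round(total_qaly, 3))
```

Comparing `total_cost` and `total_qaly` across two such models (e.g. treatment vs. comparator) yields the incremental cost-effectiveness ratio used in the economic evaluations the abstract discusses.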


2006 ◽  
Vol 13 (3) ◽  
pp. 339-352 ◽  
Author(s):  
Ath. Kehagias ◽  
V. Fortin

Abstract. We present a new family of hidden Markov models and apply them to the segmentation of hydrological and environmental time series. The proposed hidden Markov models have a discrete state space, and their structure is inspired by the shifting-means models introduced by Chernoff and Zacks and by Salas and Boes. An estimation method inspired by the EM algorithm is proposed, and we show that it can accurately identify multiple change-points in a time series. We also show that the solution obtained using this algorithm can serve as a starting point for a Markov chain Monte Carlo (MCMC) Bayesian estimation method, thus reducing the computing time needed for the Markov chain to converge to a stationary distribution.
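The segmentation task itself can be illustrated with a discrete-state HMM whose emissions are Gaussians around shifting means, decoded with the Viterbi algorithm. This is only a sketch of the problem setting; the parameters are fixed by hand rather than estimated with the EM-style method of the paper.

```python
import numpy as np

# Synthetic series with one shift in mean at index 50.
rng = np.random.default_rng(0)
y = np.concatenate([rng.normal(0.0, 0.5, 50),
                    rng.normal(3.0, 0.5, 50)])

# Two-state HMM: Gaussian emissions around each regime mean,
# sticky transitions to discourage spurious switches.
means, sigma = np.array([0.0, 3.0]), 0.5
logA = np.log(np.array([[0.98, 0.02],
                        [0.02, 0.98]]))
loglik = -0.5 * ((y[:, None] - means) / sigma) ** 2  # up to a constant

# Viterbi dynamic programme: best log-score ending in each state.
score = loglik[0] + np.log([0.5, 0.5])
back = np.zeros((len(y), 2), dtype=int)
for t in range(1, len(y)):
    cand = score[:, None] + logA     # cand[i, j]: from state i to state j
    back[t] = cand.argmax(axis=0)
    score = cand.max(axis=0) + loglik[t]

# Trace back the most likely state sequence, then read off change-points.
path = np.zeros(len(y), dtype=int)
path[-1] = score.argmax()
for t in range(len(y) - 1, 0, -1):
    path[t - 1] = back[t, path[t]]

change_points = np.flatnonzero(np.diff(path)) + 1
print(change_points)  # expect a single change-point near index 50
```

In the paper's setting the means, variances, and transition probabilities would be estimated from the data, and the resulting segmentation would seed the MCMC Bayesian procedure.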

