Hidden-Markov program algebra with iteration

2014 ◽  
Vol 25 (2) ◽  
pp. 320-360 ◽  
Author(s):  
ANNABELLE MCIVER ◽  
LARISSA MEINICKE ◽  
CARROLL MORGAN

We use hidden Markov models to motivate a quantitative compositional semantics for noninterference-based security with iteration, including a refinement- or ‘implements’ relation that compares two programs with respect to their information leakage; and we propose a program algebra for source-level reasoning about such programs, in particular as a means of establishing that an ‘implementation’ program leaks no more than its ‘specification’ program. This joins two themes: we extend our earlier work, having iteration but only qualitative (Morgan 2009), by making it quantitative; and we extend our earlier quantitative work (McIver et al. 2010) by including iteration. We advocate stepwise refinement and source-level program algebra – both as conceptual reasoning tools and as targets for automated assistance. A selection of algebraic laws is given to support this view in the case of quantitative noninterference; and it is demonstrated on a simple iterated password-guessing attack.
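The iterated password-guessing attack mentioned at the end can be illustrated numerically. A minimal sketch, not taken from the paper: assuming a password drawn uniformly from a finite set and an attacker who never repeats a guess, the attacker's success probability (Bayes vulnerability) grows linearly with the number of loop iterations. The function name and the uniform-prior assumption are ours.

```python
from fractions import Fraction


def success_within(num_passwords: int, num_guesses: int) -> Fraction:
    """Probability that an attacker making distinct guesses against a
    password drawn uniformly from num_passwords candidates succeeds
    within num_guesses iterations of the guessing loop."""
    return Fraction(min(num_guesses, num_passwords), num_passwords)


# Each extra iteration leaks a little more: with 8 equally likely
# passwords, two guesses succeed with probability 2/8 = 1/4, and the
# loop is certain to succeed once every candidate has been tried.
```

This is the quantitative fact the program algebra is meant to establish at source level: an implementation of the loop must not leak more than this specification allows.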

Author(s):  
I. GALIANO ◽  
E. SANCHIS ◽  
F. CASACUBERTA ◽  
I. TORRES

The design of current acoustic-phonetic decoders for a specific language involves the selection of an adequate set of sublexical units, and a choice of the mathematical framework for modelling the corresponding units. In this work, the baseline chosen for continuous Spanish speech consists of 23 sublexical units that roughly correspond to the 24 Spanish phonemes. The selection of this baseline was based on phonetic criteria of the language and on some experiments with an available speech corpus. Two types of models were chosen for this work: conventional Hidden Markov Models and Inferred Stochastic Regular Grammars. With these two choices we could compare classical Hidden Markov modelling, where the structure of a unit-model is deductively supplied, with Grammatical Inference modelling, where the baseforms of model-units are automatically generated from training samples. The best speaker-independent phone recognition rate was 64% for the first type of modelling, and 66% for the second type.
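In the first type of modelling, a conventional HMM scores an observation sequence for each unit-model via the forward algorithm, and the decoder prefers the unit with the highest likelihood. A minimal plain-Python sketch of the forward recursion, on a toy model of our own rather than the paper's 23-unit system:

```python
def forward(A, B, pi, obs):
    """Forward algorithm: P(obs | HMM) for an HMM with transition
    matrix A[i][j], emission matrix B[i][symbol], and initial
    distribution pi, all given as plain nested lists."""
    n = len(pi)
    # alpha[i] = P(observations so far, current state = i)
    alpha = [pi[i] * B[i][obs[0]] for i in range(n)]
    for t in range(1, len(obs)):
        alpha = [B[j][obs[t]] * sum(alpha[i] * A[i][j] for i in range(n))
                 for j in range(n)]
    return sum(alpha)
```

For example, a single-state model that emits either of two symbols with probability 0.5 assigns probability 0.25 to any two-symbol sequence. Real decoders work in log-space to avoid underflow; this sketch omits that for clarity.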


Author(s):  
Kwan Yi ◽  
Jamshid Beheshti

In document representation for digitized text, feature selection refers to the selection of terms that represent a document and distinguish it from other documents. This study probes different feature selection methods for HMM learning models to explore how they affect model performance, evaluated in the context of a text categorization task.
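One of the simplest feature selection methods of the kind compared in such studies is document-frequency thresholding: rank terms by the number of documents they occur in and keep the top k as the representation vocabulary. A sketch with our own function name and toy data, not the study's actual method set:

```python
from collections import Counter


def select_by_df(docs, k):
    """Keep the k terms with the highest document frequency.

    docs is a list of tokenized documents; each term is counted at
    most once per document, so common terms rank by how many
    documents they appear in, not raw frequency."""
    df = Counter()
    for doc in docs:
        df.update(set(doc))  # set() so a term counts once per document
    return [term for term, _ in df.most_common(k)]
```

More discriminative criteria (information gain, chi-square) score terms against category labels rather than raw document counts, but follow the same select-then-train pattern.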


2014 ◽  
Vol 25 (2) ◽  
pp. 292-319 ◽  
Author(s):  
MICHELE BOREALE ◽  
FRANCESCA PAMPALONI ◽  
MICHELA PAOLINI

We study the asymptotic behaviour of (a) information leakage and (b) the adversary's error probability in information hiding systems modelled as noisy channels. Specifically, we assume the attacker can make a single guess after observing n independent executions of the system, throughout which the secret information is kept fixed. We show that the asymptotic behaviour of quantities (a) and (b) can be determined in a simple way from the channel matrix. Moreover, simple and tight bounds on them as functions of n show that the convergence is exponential. We also discuss feasible methods to evaluate the rate of convergence. Our results cover both the Bayesian case, where an a priori probability distribution on the secrets is assumed known to the attacker, and the maximum-likelihood case, where the attacker does not know such a distribution. In the Bayesian case, we identify the distributions that maximize leakage. We consider both the min-entropy setting studied by Smith and the additive form recently proposed by Braun et al., and show that the two forms agree asymptotically. Next, we extend these results to a more sophisticated eavesdropping scenario, where the attacker can perform a (noisy) observation at each state of the computation and the systems are modelled as hidden Markov models.
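For a single observation, Smith's min-entropy leakage can be read off the channel matrix as the log-ratio of posterior to prior vulnerability; for n independent observations the same formula applies to the n-fold product channel. A minimal sketch, assuming rows of C index secrets and columns index observations (our own convention, not fixed by the abstract):

```python
import math


def min_entropy_leakage(C, prior):
    """Min-entropy leakage log2(V_post / V_prior) of channel matrix
    C[x][y] = P(observation y | secret x) under the given prior."""
    # Prior vulnerability: best single guess with no observation.
    v_prior = max(prior)
    # Posterior vulnerability: for each observation, the attacker
    # guesses the secret maximizing the joint probability.
    v_post = sum(max(prior[x] * C[x][y] for x in range(len(prior)))
                 for y in range(len(C[0])))
    return math.log2(v_post / v_prior)
```

A noiseless channel over two equally likely secrets leaks exactly one bit, while a channel whose rows are identical leaks nothing; repeated independent observations drive leakage toward its maximum at the exponential rate the paper quantifies.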


2003 ◽  
Vol 24 (9-10) ◽  
pp. 1395-1407 ◽  
Author(s):  
Manuele Bicego ◽  
Vittorio Murino ◽  
Mário A.T. Figueiredo

2015 ◽  
Vol 135 (12) ◽  
pp. 1517-1523 ◽  
Author(s):  
Yicheng Jin ◽  
Takuto Sakuma ◽  
Shohei Kato ◽  
Tsutomu Kunitachi

Author(s):  
M. Vidyasagar

This book explores important aspects of Markov and hidden Markov processes and the applications of these ideas to various problems in computational biology. It starts from first principles, so that no previous knowledge of probability is necessary. However, the work is rigorous and mathematical, making it useful to engineers and mathematicians, even those not interested in biological applications. A range of exercises is provided, including drills to familiarize the reader with concepts and more advanced problems that require deep thinking about the theory. Biological applications are taken from post-genomic biology, especially genomics and proteomics. The topics examined include standard material such as the Perron–Frobenius theorem, transient and recurrent states, hitting probabilities and hitting times, maximum likelihood estimation, the Viterbi algorithm, and the Baum–Welch algorithm. The book contains discussions of extremely useful topics not usually seen at the basic level, such as ergodicity of Markov processes, Markov Chain Monte Carlo (MCMC), information theory, and large deviation theory for both i.i.d. and Markov processes. It also presents state-of-the-art realization theory for hidden Markov models. Among biological applications, it offers an in-depth look at the BLAST (Basic Local Alignment Search Tool) algorithm, including a comprehensive explanation of the underlying theory. Other applications such as profile hidden Markov models are also explored.
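Of the standard algorithms listed, the Viterbi algorithm is perhaps the most compact to state: dynamic programming over the most probable hidden-state path given an observation sequence. A plain-Python sketch on a toy model of our own, not an example from the book:

```python
def viterbi(A, B, pi, obs):
    """Most likely hidden-state path for obs, given transition matrix
    A[i][j], emission matrix B[i][symbol], and initial distribution pi."""
    n = len(pi)
    # delta[i] = probability of the best path so far ending in state i
    delta = [pi[i] * B[i][obs[0]] for i in range(n)]
    psi = []  # psi[t][j] = best predecessor of state j at step t
    for t in range(1, len(obs)):
        psi_t, new_delta = [], []
        for j in range(n):
            best_i = max(range(n), key=lambda i: delta[i] * A[i][j])
            psi_t.append(best_i)
            new_delta.append(delta[best_i] * A[best_i][j] * B[j][obs[t]])
        delta, _ = new_delta, psi.append(psi_t)
    # Backtrack from the best final state.
    path = [max(range(n), key=lambda j: delta[j])]
    for psi_t in reversed(psi):
        path.append(psi_t[path[-1]])
    return path[::-1]
```

With two states that deterministically emit symbols 0 and 1 respectively, the decoded path simply mirrors the observations; production implementations work in log-space to avoid underflow on long sequences.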

