Precision Annealing Monte Carlo Methods for Statistical Data Assimilation: Metropolis-Hastings Procedures

2019 ◽  
Author(s):  
Adrian S. Wong ◽  
Kangbo Hao ◽  
Zheng Fang ◽  
Henry D. I. Abarbanel

Abstract. Statistical Data Assimilation (SDA) is the transfer of information from field or laboratory observations to a user-selected model of the dynamical system producing those observations. The data are noisy and the model has errors; the information transfer addresses properties of the conditional probability distribution of the model states conditioned on the observations. The quantities of interest in SDA are the conditional expected values of functions of the model state, and these require the approximate evaluation of high-dimensional integrals. We introduce a conditional probability distribution and use the Laplace method with annealing to identify its maxima. The annealing method slowly increases the precision term of the model as it enters the Laplace method. In this paper, we extend the idea of precision annealing (PA) to Monte Carlo calculations of conditional expected values using Metropolis-Hastings methods.
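The idea can be illustrated with a minimal sketch: sample from a conditional distribution exp(-A(x)), where the action A balances a measurement term (precision Rm) against a model term whose precision Rf is slowly annealed upward, warm-starting each Metropolis-Hastings chain from the previous one. The scalar state, the values of y, g, Rm, and the annealing schedule below are all illustrative, not the paper's setup.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: one observation y with measurement precision Rm, and a
# model constraint x ~ g whose precision Rf is annealed upward.
y, Rm, g = 1.0, 1.0, 0.5

def action(x, Rf):
    # Negative log of the (unnormalized) conditional distribution.
    return 0.5 * Rm * (x - y) ** 2 + 0.5 * Rf * (x - g) ** 2

def metropolis_hastings(x0, Rf, n_steps=5000, step=0.5):
    x, a_x, samples = x0, action(x0, Rf), []
    for _ in range(n_steps):
        x_new = x + step * rng.standard_normal()
        a_new = action(x_new, Rf)
        if np.log(rng.random()) < a_x - a_new:  # accept with prob e^{-dA}
            x, a_x = x_new, a_new
        samples.append(x)
    return np.array(samples)

# Precision annealing: slowly increase Rf, warm-starting each chain
# from the mean of the previous chain's second half.
x_start = y
for Rf in [0.1, 1.0, 10.0, 100.0]:
    samples = metropolis_hastings(x_start, Rf)
    x_start = samples[2500:].mean()  # discard burn-in
    print(f"Rf={Rf:7.1f}  E[x] ~ {x_start:.3f}")
```

As Rf grows, the estimated expected value moves from near the observation y toward the precision-weighted compromise (Rm*y + Rf*g)/(Rm + Rf), which is the intended effect of the annealing: the model constraint is imposed gradually rather than all at once.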

2021 ◽  
Author(s):  
Faezeh Ghasemnezhad ◽  
Ommolbanin Bazrafshan ◽  
Mehdi Fazeli ◽  
Mohammad Parvinnia ◽  
Vijay Singh

Abstract. The Standardized Runoff Index (SRI), one of the best-known hydrological drought indices, may contain uncertainties arising from the choice of distribution function, time scale, and record length of the statistical data. In this study, the uncertainty in SRI estimates from monthly discharge records of 30- and 49-year length from the Minab dam watershed, southern Iran, was investigated. Four probability distribution functions (gamma, Weibull, lognormal, and normal) were fitted to the cumulative discharge data at 3-, 6-, 9-, 12-, 24-, and 48-month time scales, with their goodness of fit and normality evaluated by the Kolmogorov-Smirnov and normality tests, respectively. Using Monte Carlo sampling, 50,000 statistical data points were generated for each event and each time scale, and a 95% confidence interval was constructed. The width of the confidence interval was taken as the measure of uncertainty, and its sources were examined across the factors above. It was found that the maximum uncertainty was associated with the normal and lognormal distributions and the minimum uncertainty with the gamma and Weibull distributions. Further, increasing either the time scale or the record length decreased the uncertainty.
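The SRI pipeline the abstract describes can be sketched as follows: fit a candidate distribution to the aggregated discharge, pass its CDF through the inverse standard normal to obtain the index, then bootstrap the record to get a 95% interval whose width measures the uncertainty. The synthetic discharge record, the gamma-only choice, and the reduced sample count below are stand-ins for the paper's actual data and 50,000-sample runs.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)

# Synthetic monthly discharge record (stand-in for the Minab dam data).
discharge = rng.gamma(shape=2.0, scale=5.0, size=360)  # 30 years

def sri(values, dist=stats.gamma):
    """Fit a distribution, then map its CDF through the inverse
    standard normal to obtain the Standardized Runoff Index."""
    params = dist.fit(values, floc=0)
    cdf = np.clip(dist.cdf(values, *params), 1e-6, 1 - 1e-6)
    return stats.norm.ppf(cdf)

# Monte Carlo: resample the record, recompute the SRI of one target
# month, and use the 95% interval width as the uncertainty measure.
target = discharge[-1]
samples = []
for _ in range(500):  # 50,000 in the study; fewer here for speed
    boot = rng.choice(discharge, size=discharge.size, replace=True)
    params = stats.gamma.fit(boot, floc=0)
    cdf = np.clip(stats.gamma.cdf(target, *params), 1e-6, 1 - 1e-6)
    samples.append(stats.norm.ppf(cdf))

lo, hi = np.percentile(samples, [2.5, 97.5])
print(f"SRI 95% interval width: {hi - lo:.3f}")
```

Repeating the loop with normal or lognormal fits (`stats.norm`, `stats.lognorm`), longer records, or longer aggregation windows would reproduce the comparisons the study makes across distribution, record length, and time scale.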


Author(s):  
Yang Xiang

Graphical models such as Bayesian networks (BNs) (Pearl, 1988) and decomposable Markov networks (DMNs) (Xiang, Wong & Cercone, 1997) have been applied widely to probabilistic reasoning in intelligent systems. Figure 1 illustrates a BN and a DMN on a trivial uncertain domain: A virus can damage computer files, and so can a power glitch. A power glitch also causes a VCR to reset. The BN in (a) has four nodes, corresponding to four binary variables taking values from {true, false}. The graph structure encodes a set of dependence and independence assumptions (e.g., that f is directly dependent on v and p but is independent of r once the value of p is known). Each node is associated with a conditional probability distribution conditioned on its parent nodes (e.g., P(f | v, p)). The joint probability distribution is the product P(v, p, f, r) = P(f | v, p) P(r | p) P(v) P(p). The DMN in (b) has two groups of nodes that are maximally pairwise connected, called cliques. Each clique is associated with a probability distribution (e.g., clique {v, p, f} is assigned P(v, p, f)). The joint probability distribution is P(v, p, f, r) = P(v, p, f) P(r, p) / P(p), where P(p) can be derived from one of the clique distributions. The networks, for instance, can be used to reason about whether there are viruses in the computer system, after observations on f and r are made.
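The two factorizations can be checked against each other in a few lines. The probability values in the tables below are invented for illustration; the variable names and both product formulas come from the passage above.

```python
from itertools import product

# Toy conditional probability tables for the virus/power-glitch domain.
# Variables: v (virus), p (power glitch), f (file damaged), r (VCR reset).
P_v = {True: 0.01, False: 0.99}
P_p = {True: 0.05, False: 0.95}
P_f_given = {(True, True): 0.99, (True, False): 0.90,
             (False, True): 0.80, (False, False): 0.001}  # P(f=true | v, p)
P_r_given = {True: 0.95, False: 0.001}                    # P(r=true | p)

def joint_bn(v, p, f, r):
    """BN factorization: P(v, p, f, r) = P(f | v, p) P(r | p) P(v) P(p)."""
    pf = P_f_given[(v, p)] if f else 1 - P_f_given[(v, p)]
    pr = P_r_given[p] if r else 1 - P_r_given[p]
    return pf * pr * P_v[v] * P_p[p]

def joint_dmn(v, p, f, r):
    """DMN factorization: clique marginals P(v, p, f) and P(r, p),
    divided by the separator marginal P(p)."""
    pvpf = sum(joint_bn(v, p, f, rr) for rr in (True, False))
    prp = sum(joint_bn(vv, p, ff, r) for vv in (True, False)
                                     for ff in (True, False))
    return pvpf * prp / P_p[p]

total = sum(joint_bn(*vals) for vals in product([True, False], repeat=4))
print(f"Joint sums to {total:.6f}")  # a valid distribution sums to 1
```

The two joints agree exactly here because the BN's independence assumption (r independent of v and f given p) is precisely what lets the DMN split the joint into the {v, p, f} and {r, p} cliques sharing separator p.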


1977 ◽  
Vol 80 (1) ◽  
pp. 99-128 ◽  
Author(s):  
Hiroji Nakagawa ◽  
Iehisa Nezu

In this paper we intend to predict the magnitude of the contribution to the Reynolds stress of bursting events: ‘ejections’, ‘sweeps’, ‘inward interactions’ and ‘outward interactions’. We shall do this by making use of the conditional probability distribution of the Reynolds stress −uv, which can be derived by applying the cumulant-discard method to the Gram-Charlier probability distribution of the two variables u and v. The Reynolds-stress fluctuations in open-channel flows over smooth and rough beds are measured by dual-sensor hot-film anemometers, whose signals are conditionally sampled and sorted into the four quadrants of the u, v plane by using a high-speed digital data processing system. We shall verify that even the third-order conditional probability distribution of the Reynolds stress shows fairly good agreement with the experimental results and that the sequence of events in the bursting process, i.e. ejections, sweeps and interactions, is directly related to the turbulent energy budget in the form of turbulent diffusion. Also, we shall show that the roughness effect is marked in the area from the wall to the middle of the equilibrium region, and that sweeps appear to be more important than ejections as the roughness increases and as the distance from the wall decreases.
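The quadrant sorting described above can be sketched numerically: split instantaneous (u, v) samples into the four quadrants of the u, v plane and compute each event's fractional contribution to the mean Reynolds stress −uv. The synthetic Gaussian fluctuations below (with the negative u-v correlation typical of wall turbulence) are a stand-in for the hot-film measurements; the quadrant definitions follow the standard convention.

```python
import numpy as np

rng = np.random.default_rng(2)

# Synthetic velocity fluctuations with negative u-v correlation,
# so that the mean Reynolds stress -uv is positive.
cov = [[1.0, -0.4], [-0.4, 0.5]]
u, v = rng.multivariate_normal([0.0, 0.0], cov, size=100_000).T

# Quadrant analysis: sort samples of -uv into the four bursting events.
quadrants = {
    "outward interaction": (u > 0) & (v > 0),
    "ejection":            (u < 0) & (v > 0),
    "inward interaction":  (u < 0) & (v < 0),
    "sweep":               (u > 0) & (v < 0),
}

total = np.mean(-u * v)
fractions = {name: np.sum(-u[mask] * v[mask]) / np.sum(-u * v)
             for name, mask in quadrants.items()}

for name, frac in fractions.items():
    print(f"{name:20s}: {frac:+.3f} of total Reynolds stress")
print(f"mean(-uv) = {total:.4f}")
```

Ejections and sweeps contribute positively to −uv while the two interaction events contribute negatively, and the four fractions sum to one; comparing the ejection and sweep fractions as roughness and wall distance vary is exactly the comparison the paper carries out with measured signals.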


2020 ◽  
Vol 2 (1) ◽  
Author(s):  
Zheng Fang ◽  
Adrian S. Wong ◽  
Kangbo Hao ◽  
Alexander J. A. Ty ◽  
Henry D. I. Abarbanel
