REGIONALISATION OF POSTERIOR PROBABILITY DISTRIBUTION OF MODEL PARAMETERS: PREDICTION ON UNGAUGED BASIN

2008 ◽  
Vol 52 ◽  
pp. 103-108
Author(s):  
Satish BASTOLA ◽  
Hiroshi ISHIDAIRA ◽  
Kuniyoshi TAKEUCHI


Author(s):  
Luis D. Couto ◽  
Dong Zhang ◽  
Antti Aitio ◽  
Scott Moura ◽  
David Howey

Abstract This paper addresses the parameter estimation problem for lithium-ion battery pack models comprising cells in series. The estimated parameters can be exploited in fault diagnostics to estimate the number of cells exhibiting abnormal behaviour, e.g. large resistances or small capacities. In particular, we use a Bayesian approach to estimate the parameters of a two-cell arrangement modelled using equivalent circuits. Although our modelling framework has been extensively reported in the literature, to the best of the authors' knowledge its structural identifiability properties have not yet been reported. Moreover, most contributions in the literature tackle the estimation problem through point-wise estimates assuming Gaussian noise, using e.g. least-squares methods (maximum likelihood estimation) or Kalman filters (maximum a posteriori estimation). In contrast, we apply methods that are suitable for nonlinear and non-Gaussian estimation problems and estimate the full posterior probability distribution of the parameters. We study how the model structure, the available measurements and prior knowledge of the model parameters affect the posterior probability distribution recovered for the parameters. For two cells in series, a bimodal distribution is obtained whose modes are centered around the true values of the parameters for each cell. Bounds on the model parameters of a battery pack can therefore be derived.
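As a rough sketch of this kind of Bayesian pack-parameter estimation (our own toy construction, not the authors' model or code), the following samples the joint posterior over two series-cell capacities from noisy pack-voltage data with random-walk Metropolis; the OCV curve, capacities, noise level and priors are all assumed for illustration. Label switching between the two cells is what makes the joint posterior bimodal.

```python
# Toy sketch: posterior over two series-cell capacities from pack voltage.
# All values are illustrative assumptions, not from the paper.
import numpy as np

rng = np.random.default_rng(0)

def ocv(z):
    # assumed nonlinear open-circuit-voltage curve (illustrative)
    return 3.0 + 1.2 * z + 0.1 * np.sin(2 * np.pi * z)

Q_true = np.array([2.0, 3.0])              # cell capacities, Ah (assumed)
R_pack = 0.1                               # known total series resistance, ohm
I = np.full(150, 1.0)                      # 1 A discharge profile
dt = 0.01                                  # time step, h
charge = np.cumsum(I) * dt                 # Ah removed
z0, sigma = 0.9, 0.005
z = z0 - charge[:, None] / Q_true[None, :] # per-cell state of charge
V_meas = ocv(z).sum(axis=1) - I * R_pack + rng.normal(0, sigma, I.size)

def log_post(Q):
    if np.any(Q < 1.0) or np.any(Q > 5.0): # flat prior on [1, 5] Ah
        return -np.inf
    zq = z0 - charge[:, None] / Q[None, :]
    V = ocv(zq).sum(axis=1) - I * R_pack
    return -0.5 * np.sum(((V_meas - V) / sigma) ** 2)

# Random-walk Metropolis over (Q1, Q2)
Q = np.array([2.5, 2.5]); lp = log_post(Q)
chain = []
for _ in range(50_000):
    prop = Q + rng.normal(0, 0.05, size=2)
    lp_p = log_post(prop)
    if np.log(rng.uniform()) < lp_p - lp:
        Q, lp = prop, lp_p
    chain.append(Q.copy())
chain = np.array(chain[10_000:])
# Label switching makes the joint posterior bimodal: modes near (2, 3) and (3, 2)
print("mean of sorted samples:", np.sort(chain, axis=1).mean(axis=0))
```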


Author(s):  
Munir S Pathan ◽  
S M Pradhan ◽  
T Palani Selvam

Abstract In this study, a Bayesian probabilistic approach is applied to estimate the actual dose from the personnel monitoring dose records of occupational workers. To implement the Bayesian approach, the probability distribution of the uncertainty in the reported dose is derived as a function of the actual dose. Using this uncertainty distribution together with prior knowledge of the dose levels generally observed in a monitoring period, the posterior probability distribution of the actual dose is estimated. The posterior distributions of the monitoring periods in a year are convolved to arrive at the actual annual dose distribution. The estimated actual dose distributions deviate significantly from the reported annual doses, particularly for low annual doses.
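A minimal numerical sketch of the pipeline described above, with assumed (not the paper's) prior and uncertainty functions: per-period Bayes updates on a dose grid, followed by convolution of the period posteriors to obtain the annual dose distribution.

```python
# Sketch only: assumed exponential prior and dose-dependent Gaussian uncertainty.
import numpy as np

dose = np.arange(0.0, 5.0, 0.01)           # actual-dose grid, mSv
prior = np.exp(-dose / 0.2)                # assumed prior: low doses dominate
prior /= prior.sum()

def period_posterior(reported, a=0.05, b=0.1):
    # assumed uncertainty model: reported dose ~ N(actual, sigma(actual))
    sigma = np.sqrt(a ** 2 + (b * dose) ** 2)
    like = np.exp(-0.5 * ((reported - dose) / sigma) ** 2) / sigma
    post = like * prior
    return post / post.sum()

reports = [0.10, 0.05, 0.20, 0.10]         # hypothetical quarterly records, mSv
annual = period_posterior(reports[0])
for r in reports[1:]:
    # actual annual dose is the sum of independent period doses
    annual = np.convolve(annual, period_posterior(r))
annual /= annual.sum()

annual_axis = np.arange(annual.size) * 0.01
print("reported annual dose:", sum(reports), "mSv")
print("posterior mean actual dose: %.3f mSv" % (annual_axis * annual).sum())
```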


2020 ◽  
Vol 09 (04) ◽  
pp. 2050017
Author(s):  
Benjamin D. Donovan ◽  
Randall L. McEntaffer ◽  
Casey T. DeRoo ◽  
James H. Tutt ◽  
Fabien Grisé ◽  
...  

The soft X-ray grating spectrometer on board the Off-plane Grating Rocket Experiment (OGRE) hopes to achieve the highest-resolution soft X-ray spectrum of an astrophysical object when it is launched via suborbital rocket. Paramount to the success of the spectrometer is the performance of the [Formula: see text] reflection gratings populating its reflection grating assembly. To test current grating fabrication capabilities, a grating prototype for the payload was fabricated via electron-beam lithography at The Pennsylvania State University's Materials Research Institute and was subsequently tested for performance at the Max Planck Institute for Extraterrestrial Physics' PANTER X-ray Test Facility. Bayesian modeling of the resulting data via Markov chain Monte Carlo (MCMC) sampling indicated that the grating achieved the OGRE single-grating resolution requirement of [Formula: see text] at the 94% confidence level. The resulting [Formula: see text] posterior probability distribution suggests that this confidence level is likely a conservative estimate, though, since only a finite [Formula: see text] parameter space was sampled and the model could not constrain the upper bound of [Formula: see text] to less than infinity. Raytrace simulations of the tested system found that the observed data can be reproduced with a grating performing at [Formula: see text]. It is therefore postulated that the behavior of the obtained [Formula: see text] posterior probability distribution can be explained by a finite measurement limit of the system rather than a finite limit on [Formula: see text]. Implications of these results and improvements to the test setup are discussed.
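To illustrate why such a posterior cannot constrain the upper bound of the resolution, here is a toy grid-based version of the inference (all numbers are stand-ins, not OGRE values): once the modelled line width saturates at the focus-limited floor, arbitrarily large resolutions fit the data equally well.

```python
# Toy posterior over grating resolution R; all numbers are illustrative.
import numpy as np

w_floor = 1.00                    # focus-limited line width, assumed known
disp = 5000.0                     # scales 1/R into extra width (assumed)
w_meas, w_err = 1.01, 0.03        # measured width and uncertainty (assumed)

R = np.linspace(1e3, 1e6, 500_000)             # finite sampled parameter space
w_model = np.sqrt(w_floor**2 + (disp / R) ** 2)
log_like = -0.5 * ((w_meas - w_model) / w_err) ** 2
post = np.exp(log_like - log_like.max())       # flat prior on the sampled range
post /= post.sum()

R_req = 4500.0                                 # stand-in for the requirement
# The tail of post does not vanish as R grows: w_model -> w_floor, so the
# upper bound on R is unconstrained by the measurement.
print("P(R > R_req) =", post[R >= R_req].sum())
```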


2013 ◽  
Vol 807-809 ◽  
pp. 1570-1574 ◽  
Author(s):  
Hai Dong Yang ◽  
Dong Guo Shao ◽  
Bi Yu Liu

This paper considers pollution point-source identification for non-shore emissions, the main form of sudden water pollution incident. Firstly, source traceability for sudden water pollution accidents is cast as a Bayesian estimation problem; secondly, the posterior probability distribution of the source's parameters is derived; thirdly, the marginal posterior probability density is obtained using a new traceability method; finally, the proposed method is compared with Bayesian-MCMC in numerical experiments. The conclusions are as follows: the new traceability method reduces the number of iterations, improves the recognition accuracy, and noticeably reduces the overall average error; it is more stable and robust than Bayesian-MCMC and can effectively identify the source of sudden water pollution accidents. It therefore provides a new idea and method for tackling the difficult traceability problems posed by sudden water pollution accidents.
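The paper's new traceability method is not reproduced here, but the baseline Bayesian-MCMC formulation it is compared against can be sketched as follows: an instantaneous point source in a 1D advection-diffusion channel, with the source mass and location sampled by random-walk Metropolis. Flow parameters, station positions and noise level are all assumed for illustration.

```python
# Illustrative baseline sketch, not the paper's method or setup.
import numpy as np

rng = np.random.default_rng(1)
u, D = 0.5, 10.0                      # flow velocity (m/s), dispersion (m^2/s)

def conc(M, x0, x, t):
    # concentration at (x, t) from mass M released at x0 at t = 0
    return M / np.sqrt(4 * np.pi * D * t) * np.exp(-(x - x0 - u * t) ** 2 / (4 * D * t))

# Synthetic observations at two downstream stations (hypothetical)
M_true, x0_true = 100.0, 200.0
x_obs = np.array([1000.0, 1500.0])
t_obs = np.arange(600.0, 3600.0, 300.0)
X, T = np.meshgrid(x_obs, t_obs)
sigma = 0.02
data = conc(M_true, x0_true, X, T) + rng.normal(0, sigma, X.shape)

def log_post(theta):
    M, x0 = theta
    if not (0 < M < 1e4 and 0 < x0 < 900):   # flat priors (assumed)
        return -np.inf
    return -0.5 * np.sum((data - conc(M, x0, X, T)) ** 2) / sigma ** 2

theta = np.array([50.0, 400.0]); lp = log_post(theta)
chain = []
for _ in range(30_000):
    prop = theta + rng.normal(0, [2.0, 5.0])
    lp_p = log_post(prop)
    if np.log(rng.uniform()) < lp_p - lp:
        theta, lp = prop, lp_p
    chain.append(theta.copy())
chain = np.array(chain[5_000:])
print("posterior mean (M, x0):", chain.mean(axis=0))  # ~ (100, 200)
```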


2014 ◽  
Vol 10 (S306) ◽  
pp. 273-275
Author(s):  
Pedro T. P. Viana

Abstract Observational data on clusters of galaxies hold relevant information that can be used to determine the relative plausibility of different models for the large-scale evolution of the Universe, or to estimate the joint posterior probability distribution function of the parameters of each model. Within the next few years, several surveys of the sky will yield large galaxy cluster catalogues. In order to make use of the vast amount of information they will contain, their selection functions will have to be properly understood. We argue that this, as well as the estimation of the full joint posterior probability distribution function of the most relevant cluster properties, is best achieved within the framework of Bayesian statistics.


2020 ◽  
Author(s):  
Xin Zhang ◽  
Andrew Curtis

In a variety of geoscientific applications we require maps of subsurface properties, together with corresponding maps of uncertainties to assess their reliability. Seismic tomography is a method that is widely used to generate such maps. Since tomography is significantly nonlinear, Monte Carlo sampling methods are often used for this purpose, but they are generally computationally intractable for large data sets and high-dimensional parameter spaces. To extend uncertainty analysis to larger systems, we introduce variational inference methods to conduct seismic tomography. In contrast to Monte Carlo sampling, variational methods solve the Bayesian inference problem as an optimization problem, yet still provide fully nonlinear, probabilistic results. This is achieved by minimizing the Kullback-Leibler (KL) divergence between approximate and target probability distributions within a predefined family of probability distributions.

We introduce two variational inference methods: automatic differentiation variational inference (ADVI) and Stein variational gradient descent (SVGD). In ADVI a Gaussian probability distribution is assumed and optimized to approximate the posterior probability distribution. In SVGD a smooth transform is iteratively applied to an initial probability distribution to obtain an approximation to the posterior probability distribution. At each iteration the transform is determined by seeking the steepest descent direction that minimizes the KL divergence.

We apply the two variational inference methods to 2D travel-time tomography using both synthetic and real data, and compare the results to those obtained from two different Monte Carlo sampling methods: Metropolis-Hastings Markov chain Monte Carlo (MH-McMC) and reversible jump Markov chain Monte Carlo (rj-McMC). The results show that ADVI provides a biased approximation because of its Gaussian assumption, whereas SVGD produces more accurate approximations to the results of MH-McMC. In comparison, rj-McMC produces smoother mean velocity models and lower standard deviations because the parameterization used in rj-McMC (Voronoi cells) imposes prior restrictions on the pixelated form of models: all pixels within each Voronoi cell have identical velocities. This suggests that the results of rj-McMC need to be interpreted in the light of the specific prior information imposed by the parameterization. Both variational methods estimate the posterior distribution at significantly lower computational cost, provided that gradients of parameters with respect to data can be calculated efficiently. We therefore expect that these methods can be applied fruitfully to many other types of geophysical inverse problems.
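A minimal SVGD sketch (our own toy example: a 1D bimodal target, RBF kernel with a median heuristic) showing the iterative particle transform described above; each update combines a kernel-weighted score term with a repulsive term that maintains particle spread.

```python
# Toy SVGD on a 1D bimodal target; parameters and target are assumptions.
import numpy as np

rng = np.random.default_rng(2)

def dlogp(x):
    # score of a toy bimodal target: 0.5 N(-2,1) + 0.5 N(2,1)
    p1 = np.exp(-0.5 * (x + 2) ** 2)
    p2 = np.exp(-0.5 * (x - 2) ** 2)
    return (-(x + 2) * p1 - (x - 2) * p2) / (p1 + p2)

x = rng.normal(0, 1, size=(100, 1))              # initial particles
eps = 0.1
for _ in range(500):
    diff = x - x.T                               # diff[i, j] = x_i - x_j
    h = np.median(diff ** 2) / np.log(x.size + 1) + 1e-8
    K = np.exp(-diff ** 2 / h)                   # RBF kernel matrix
    # phi(x_i) = (1/n) sum_j [k(x_j, x_i) dlogp(x_j) + d k(x_j, x_i)/d x_j]
    phi = (K @ dlogp(x)
           + (2.0 / h) * (diff * K).sum(axis=1, keepdims=True)) / x.shape[0]
    x += eps * phi                               # steepest-descent transform

print("particle mean, std:", x.mean(), x.std())  # spread across both modes
```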


2020 ◽  
Vol 76 (3) ◽  
pp. 238-247 ◽  
Author(s):  
Randy J. Read ◽  
Robert D. Oeffner ◽  
Airlie J. McCoy

The information gained by making a measurement, termed the Kullback–Leibler divergence, assesses how much more precisely the true quantity is known after the measurement was made (the posterior probability distribution) than before (the prior probability distribution). It provides an upper bound for the contribution that an observation can make to the total likelihood score in likelihood-based crystallographic algorithms. This makes information gain a natural criterion for deciding which data can legitimately be omitted from likelihood calculations. Many existing methods use an approximation for the effects of measurement error that breaks down for very weak and poorly measured data. For such methods a different (higher) information threshold is appropriate compared with methods that account well for even large measurement errors. Concerns are raised about a current trend to deposit data that have been corrected for anisotropy, sharpened and pruned without including the original unaltered measurements. If not checked, this trend will have serious consequences for the reuse of deposited data by those who hope to repeat calculations using improved new methods.
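In the conjugate-Gaussian case the information gain has a closed form, which makes the point about weak data concrete; this is a simplified stand-in for the crystallographic likelihoods actually used in the paper.

```python
# Sketch: information gain (KL divergence from prior to posterior) for a
# Gaussian prior N(mu0, s0^2) updated by a measurement y with error sy.
import numpy as np

def info_gain_nats(mu0, s0, y, sy):
    # conjugate update: precisions add, means combine precision-weighted
    s1 = 1.0 / np.sqrt(1 / s0**2 + 1 / sy**2)
    mu1 = s1**2 * (mu0 / s0**2 + y / sy**2)
    # KL(N(mu1, s1^2) || N(mu0, s0^2))
    return np.log(s0 / s1) + (s1**2 + (mu1 - mu0)**2) / (2 * s0**2) - 0.5

print(info_gain_nats(0.0, 1.0, 0.5, 0.1))   # precise datum: large gain
print(info_gain_nats(0.0, 1.0, 0.5, 10.0))  # weak datum: gain near zero
```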


2019 ◽  
Vol 485 (3) ◽  
pp. 4343-4358
Author(s):  
Germán Chaparro-Molano ◽  
Juan Carlos Cuervo ◽  
Oscar Alberto Restrepo Gaitán ◽  
Sergio Torres Arzayús

Abstract We propose the use of robust, Bayesian methods for estimating extragalactic distance errors in multimeasurement catalogues. We seek to improve upon the more commonly used frequentist propagation-of-error methods, as they fail to explain both the scatter between different measurements and the effects of skewness in the metric distance probability distribution. For individual galaxies, the most transparent way to assess the variance of redshift-independent distances is to directly sample the posterior probability distribution obtained from the mixture of reported measurements. However, sampling the posterior can be cumbersome for catalogue-wide precision cosmology applications. We compare the performance of frequentist methods with that of our proposed measures for estimating the true variance of the metric distance probability distribution. We provide pre-computed distance error data tables for galaxies in three catalogues: NED-D, HyperLEDA, and Cosmicflows-3. Additionally, we develop a Bayesian model that considers systematic and random effects in the estimation of errors for Tully–Fisher (TF) relation derived distances in NED-D. We validate this model with a Bayesian p-value computed using the Freeman–Tukey discrepancy measure as a posterior predictive check. We are then able to predict distance errors for 884 galaxies in the NED-D catalogue and 203 galaxies in the HyperLEDA catalogue that do not report TF distance modulus errors. Our goal is that our estimated and predicted errors are used in catalogue-wide applications that require acknowledging the true variance of extragalactic distance measurements.
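A sketch of the direct-sampling approach under simplified assumptions (equal-weight Gaussian mixture in distance modulus, hypothetical measurements): sampling the mixture and transforming to metric distance exposes the skewness that frequentist error propagation misses.

```python
# Toy example: posterior sampling from a mixture of reported distance moduli.
import numpy as np

rng = np.random.default_rng(3)

mu = np.array([31.2, 31.5, 30.9])    # hypothetical distance moduli (mag)
err = np.array([0.3, 0.4, 0.2])      # reported 1-sigma errors (mag)

# Draw from the equal-weight mixture posterior, then map to metric distance
idx = rng.integers(0, mu.size, size=100_000)
mu_samp = rng.normal(mu[idx], err[idx])
d_samp = 10 ** ((mu_samp - 25.0) / 5.0)   # distance in Mpc

print("posterior mean distance: %.2f Mpc" % d_samp.mean())
print("posterior std: %.2f Mpc" % d_samp.std())
# The lognormal-like transform skews the distribution: median < mean
print("median %.2f vs mean %.2f" % (np.median(d_samp), d_samp.mean()))
```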


2001 ◽  
Vol 13 (7) ◽  
pp. 1649-1681 ◽  
Author(s):  
Masa-aki Sato

The Bayesian framework provides a principled way of model selection. This framework estimates a probability distribution over an ensemble of models, and the prediction is done by averaging over the ensemble of models. Accordingly, the uncertainty of the models is taken into account, and complex models with more degrees of freedom are penalized. However, integration over model parameters is often intractable, and some approximation scheme is needed. Recently, a powerful approximation scheme, called the variational Bayes (VB) method, has been proposed. This approach defines the free energy for a trial probability distribution, which approximates a joint posterior probability distribution over model parameters and hidden variables. The exact maximization of the free energy gives the true posterior distribution. The VB method uses factorized trial distributions. The integration over model parameters can be done analytically, and an iterative expectation-maximization-like algorithm, whose convergence is guaranteed, is derived. In this article, we derive an online version of the VB algorithm and prove its convergence by showing that it is a stochastic approximation for finding the maximum of the free energy. Combined with a sequential model selection procedure, the online VB method provides a fully online learning method with a model selection mechanism. In preliminary experiments using synthetic data, the online VB method was able to adapt the model structure to dynamic environments.
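As a loose illustration only (a stepwise-EM-style update of sufficient statistics, not Sato's algorithm): for a toy two-component Gaussian mixture with known variances and weights, online updates with a decaying step size play the role of the stochastic approximation described above.

```python
# Toy online update of factorized mixture statistics; all details assumed.
import numpy as np

rng = np.random.default_rng(4)
data_stream = np.concatenate([rng.normal(-2, 1, 5000), rng.normal(3, 1, 5000)])
rng.shuffle(data_stream)

m = np.array([-1.0, 1.0])                   # current estimates of component means
s1 = np.zeros(2)                            # running weighted sum of x
s0 = np.zeros(2) + 1e-3                     # running responsibility mass
for t, x in enumerate(data_stream, start=1):
    # E-step: responsibilities under the current means (unit variances)
    logr = -0.5 * (x - m) ** 2
    r = np.exp(logr - logr.max()); r /= r.sum()
    # stochastic-approximation update of sufficient statistics
    eta = (t + 10) ** -0.6                  # decaying step size
    s1 = (1 - eta) * s1 + eta * r * x
    s0 = (1 - eta) * s0 + eta * r
    m = s1 / s0                             # M-step-like mean update

print("estimated component means:", np.sort(m))   # ~ [-2, 3]
```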

