Update of Prior Probabilities by Minimal Divergence

Entropy ◽  
2021 ◽  
Vol 23 (12) ◽  
pp. 1668
Author(s):  
Jan Naudts

The present paper investigates the update of an empirical probability distribution with the results of a new set of observations. The update reproduces the new observations and interpolates using prior information. The optimal update is obtained by minimizing either the Hellinger distance or the quadratic Bregman divergence. The results obtained by the two methods differ. Updates with information about conditional probabilities are considered as well.
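As a minimal numerical sketch of this kind of update (the four-state prior, the observed event, and its frequency below are illustrative assumptions, not the paper's examples), one can minimize either divergence to the prior subject to the constraint that the update reproduces the new observation:

```python
# Minimal sketch (not the paper's derivation): update a discrete prior p so that
# the updated q reproduces an observed event frequency, while staying close to p
# in either squared Hellinger distance or the quadratic Bregman divergence
# (squared Euclidean distance). Prior, event, and frequency are assumed values.
import numpy as np
from scipy.optimize import minimize

p = np.array([0.4, 0.3, 0.2, 0.1])        # prior distribution
A = np.array([1, 1, 0, 0], dtype=float)   # indicator of the observed event
f_obs = 0.5                               # newly observed frequency of the event

def hellinger2(q):      # squared Hellinger distance to the prior
    return 0.5 * np.sum((np.sqrt(q) - np.sqrt(p)) ** 2)

def bregman_quad(q):    # quadratic Bregman divergence = squared Euclidean distance
    return np.sum((q - p) ** 2)

constraints = [
    {"type": "eq", "fun": lambda q: q.sum() - 1.0},   # q is a distribution
    {"type": "eq", "fun": lambda q: A @ q - f_obs},   # q reproduces the observation
]
bounds = [(1e-12, 1.0)] * len(p)

for name, obj in [("Hellinger", hellinger2), ("quadratic Bregman", bregman_quad)]:
    res = minimize(obj, p, bounds=bounds, constraints=constraints)
    print(name, np.round(res.x, 4))
```

The two printed updates generally differ, consistent with the abstract's remark that the two optimization criteria lead to different results.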

Author(s):  
О.М. Мелкозьорова ◽  
С.Г. Рассомахін

Knowledge of probability distributions is required for calculating expected values in engineering practice and other fields. However, probability distributions are not always available, and the distribution type may not be reliably determined. In such cases, an empirical distribution should be built directly from the observations. The goal, therefore, is to develop a methodology for accumulating and processing observation data such that the resulting empirical distribution is close enough to the unknown real distribution. For this, criteria for the sufficiency of observations and for the validity of the distribution must be substantiated. As a result, a methodology is presented that assesses the validity of the empirical probability distribution with respect to the parameter's expected value. Values of the parameter are registered over successive periods of observations or measurements. On this basis, empirical probabilities are calculated, where each subsequent period also reuses the registration data of all previous periods. Every period thus yields an approximation of the parameter's expected value from those empirical probabilities. Using moving averages and root-mean-square deviations, the methodology asserts that the empirical distribution is valid (i.e., sufficiently close to the unknown real distribution) if the approximations of the parameter's expected value show very little scatter over three successive windows, for at least three multiple-of-2 window widths. This criterion also implies the sufficiency of observation periods, although the sufficiency of observations per period is not claimed; the validity strongly depends on the volume of observations per period.
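The accumulation-and-windowing idea can be sketched as follows; the data stream, the window widths, and the scatter tolerance are assumptions for illustration, not the paper's exact criterion:

```python
# Illustrative sketch: after each observation period the empirical distribution
# uses all data registered so far, and the resulting estimates of the expected
# value should scatter less and less across trailing windows. The gamma stream,
# the widths (4, 8, 16), and the tolerance are assumed values.
import numpy as np

rng = np.random.default_rng(0)
n_periods, per_period = 60, 25
stream = rng.gamma(shape=2.0, scale=3.0, size=(n_periods, per_period))

estimates = []
pooled = np.empty(0)
for period in stream:
    pooled = np.concatenate([pooled, period])   # each period reuses all earlier data
    estimates.append(pooled.mean())             # expected-value approximation
estimates = np.array(estimates)

def window_scatter(x, width):
    """Root-mean-square deviation of the estimates inside each trailing window."""
    return np.array([x[i - width:i].std() for i in range(width, len(x) + 1)])

tol = 0.05
for width in (4, 8, 16):                        # multiple-of-2 window widths (assumed)
    s = window_scatter(estimates, width)
    ok = np.all(s[-3:] < tol)                   # little scatter over three successive windows
    print(f"width {width:2d}: last scatters {np.round(s[-3:], 4)} -> valid: {ok}")
```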


Open Physics ◽  
2012 ◽  
Vol 10 (3) ◽  
Author(s):  
Preety Aneja ◽  
Ramandeep Johal

The thermal characteristics of a heat cycle are studied from a Bayesian approach. In this approach, we assign a certain prior probability distribution to an uncertain parameter of the system. Based on that prior, we study the expected behaviour of the system and find that, even in the absence of complete information, thermodynamic-like behaviour of the system is obtained. Two models of heat cycles, the quantum Otto cycle and the classical Otto cycle, are studied from this perspective. Various expressions for thermal efficiencies can be obtained with a generalised prior of the form Π(x) ∝ 1/x^b. The predicted thermodynamic behaviour suggests a connection between prior information about the system and its thermodynamic features.
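As a small illustration of working with such a prior (the efficiency function and the interval below are placeholders, not the paper's Otto-cycle expressions), expectations under Π(x) ∝ 1/x^b are computed by normalizing the prior over its support:

```python
# Minimal sketch: expectation of a quantity under the generalised prior
# Π(x) ∝ 1/x^b on an interval [x1, x2]. The "efficiency" function below is a
# placeholder assumption, not the paper's quantum/classical Otto-cycle result.
import numpy as np
from scipy.integrate import quad

def expectation(f, b, x1, x2):
    """E[f] = ∫ f(x) x^(-b) dx / ∫ x^(-b) dx over the prior's support."""
    num, _ = quad(lambda x: f(x) * x ** (-b), x1, x2)
    den, _ = quad(lambda x: x ** (-b), x1, x2)
    return num / den

eta = lambda x: 1.0 - x          # placeholder efficiency of an uncertain parameter x
for b in (0.0, 0.5, 1.0, 2.0):   # b = 1 corresponds to Jeffreys' prior
    print(f"b = {b}: expected efficiency {expectation(eta, b, 0.2, 0.8):.4f}")
```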


Author(s):  
Koichi Yamada ◽  

We propose a way to learn probabilistic causal models using conditional causal probabilities (CCPs) to represent the uncertainty of causalities. The CCP, a probability devised by Peng and Reggia, represents the uncertainty that a cause actually causes an effect given the cause. The main advantages of using CCPs are that they represent the exact probabilities of causation that people recognize mentally, and that the number of probabilities used in the causal model is far smaller than the number of conditional probabilities over all combinations of possible causes. Peng and Reggia therefore assumed that CCPs are given by human experts as subjective estimates and did not discuss how to calculate them from data when a dataset is available. We address this problem, starting from a discussion of the properties of data frequently encountered in practical problems, and show that the prior probabilities to be learned may differ from those derived by counting data. We then propose how to learn prior probabilities and CCPs from data, evaluate the proposed method through numerical experiments, and analyze the results to show that the precision of the learned models is satisfactory.
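The following sketch illustrates why counting-based estimates can deviate from the underlying CCPs in a noisy-OR style causal model, the combination rule associated with Peng and Reggia's framework; the generating parameters and the simple isolation-based estimator are assumptions, not the paper's learning method:

```python
# Sketch: an effect follows a noisy-OR of its present causes,
# P(e | causes) = 1 - Π_i (1 - ccp_i) over the causes that are present.
# Naive counting of P(e | cause present) is biased upward by co-occurring
# causes; all numbers here are assumed for illustration.
import numpy as np

rng = np.random.default_rng(1)
n = 200_000
prior = np.array([0.3, 0.4])          # prior probabilities of the two causes
ccp = np.array([0.6, 0.5])            # true CCPs: P(cause i causes e | cause i)

causes = rng.random((n, 2)) < prior               # which causes are present
caused = causes & (rng.random((n, 2)) < ccp)      # which present causes fire
effect = caused.any(axis=1)                       # noisy-OR combination

# Naive counting: P(e | cause 0 present) is inflated by cause 1 co-occurring.
naive = effect[causes[:, 0]].mean()

# Counting only cases where cause 0 is present alone recovers the CCP.
alone = causes[:, 0] & ~causes[:, 1]
isolated = effect[alone].mean()

print(f"true CCP_0 = {ccp[0]}, naive count = {naive:.3f}, isolated count = {isolated:.3f}")
```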


Author(s):  
Zhenjia (Jerry) Huang ◽  
Yu Zhang

In a wave basin model test of an offshore structure, waves that represent the given sea states have to be generated, qualified, and accepted for the model test. We normally accept waves in wave calibration tests if the significant wave height, spectral peak period, and spectrum match the specified target values. However, for model tests where the responses depend strongly on the local wave motions (wave elevation and kinematics), such as wave impact on hull, green water impact on deck, and air gap tests, additional qualification checks may be required. For instance, we may need to check wave crest probability distributions to avoid unrealistic wave crests in the test. To date, acceptance criteria for wave crest distributions in calibration tests of large and steep waves of three-hour duration (full scale) have not been established. The purposes of the work presented in this paper are twofold: 1. to define and clarify the wave crest probability distribution of a single realization (PDSR) and the probability distribution of wave crests for an ensemble of realizations (PDER) of a given sea state, so that each is used appropriately; and 2. to develop semi-empirical probability distributions of nonlinear waves for both PDSR and PDER for easy, practical use. We found that in current practice the ensemble and single-realization distributions have the potential to be misinterpreted and misused. A clear understanding of the two kinds of distributions will support appropriate offshore design and production unit performance assessments. The semi-empirical formulas proposed in this paper were developed through regression analysis of crest distributions from a large number of sea states and realizations. Wave time series from potential flow simulations, computational fluid dynamics (CFD) simulations, and model test results were used to establish the probability distributions. The nonlinear wave simulations were performed for three-hour durations assuming long-crested waves. The sea states are assumed to be represented by a JONSWAP spectrum, where a wide range of significant wave heights, peak periods, spectral peak parameters, and water depths were considered. Coefficients of the proposed semi-empirical probability distribution formulas and comparisons among crest distributions from the numerical simulations and the semi-empirical formulas are presented in this paper.
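The PDSR/PDER distinction can be illustrated with a toy Monte Carlo experiment using the linear (Rayleigh) crest model P(crest > h) = exp(−8h²/Hs²) as a stand-in; the sea-state numbers are assumptions, and the paper's semi-empirical nonlinear formulas are not reproduced here:

```python
# Sketch: one exceedance curve per three-hour realization (PDSR) scatters
# around the pooled ensemble curve (PDER). Crests are drawn from the linear
# Rayleigh model; Hs, Tp, and the ensemble size are assumed values.
import numpy as np

rng = np.random.default_rng(2)
Hs, Tp, duration = 12.0, 14.0, 3 * 3600.0
n_waves = int(duration / Tp)          # rough wave count per three-hour realization
n_real = 50                           # ensemble of realizations

# Rayleigh-distributed crests: h = Hs * sqrt(-ln(U) / 8)
crests = Hs * np.sqrt(-np.log(rng.random((n_real, n_waves))) / 8.0)

h = np.linspace(0, 1.4 * Hs, 200)
pdsr = np.array([[(c > hi).mean() for hi in h] for c in crests])  # one curve per realization
pder = (crests.reshape(-1) > h[:, None]).mean(axis=1)             # pooled ensemble curve

i = np.searchsorted(h, Hs)            # exceedance probability at h = Hs
print(f"P(crest > Hs): ensemble {pder[i]:.4f}, "
      f"single-realization range [{pdsr[:, i].min():.4f}, {pdsr[:, i].max():.4f}]")
```

The wide single-realization range at large crest heights is exactly why the two distributions must not be interchanged when setting acceptance criteria.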


2020 ◽  
Vol 39 (2) ◽  
pp. 102-109
Author(s):  
John Pendrel ◽  
Henk Schouten

It is common practice to make facies estimations from the outcomes of seismic inversions and their derivatives. Bayesian analysis methods are a popular approach to this. Facies are important indicators of hydrocarbon deposition and geologic processes. They are critical to geoscientists and engineers. The application of Bayes’ rule maps prior probabilities to posterior probabilities when given new evidence from observations. Per-facies elastic probability density functions (ePDFs) are constructed from elastic-log and rock-physics model crossplots, over which inversion results are superimposed. The ePDFs are templates for Bayesian analysis. In the context of reservoir characterization, the new information comes from seismic inversions. The results are volumes of the probabilities of occurrences of each of the facies at all points in 3D space. The concepts of Bayesian inference have been applied to the task of building low-frequency models for seismic inversions without well-log interpolation. Both a constant structurally compliant elastic trend approach and a facies-driven method, where models are constructed from per-facies trends and initial facies estimates, have been tested. The workflows make use of complete 3D prior information and measure and account for biases and uncertainties in the inversions and prior information. Proper accounting for these types of effects ensures that rock-physics models and inversion data prepared for reservoir property analysis are consistent. The effectiveness of these workflows has been demonstrated by using a Gulf of Mexico data set. We have shown how facies estimates can be effectively used to build reasonable low-frequency models for inversion, which obviate the need for well-log interpolation and provide full 3D variability. The results are more accurate probability-based net-pay estimates that correspond better to geology. We evaluate the workflows by using several measures including precision, confidence, and probabilistic net pay.
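A generic sketch of the Bayesian facies step described here might look as follows, assuming Gaussian per-facies elastic PDFs in impedance/Vp-Vs space; the means, covariances, and prior proportions are invented for illustration and are not the paper's ePDFs:

```python
# Minimal sketch of Bayesian facies classification from inversion outputs:
# per-facies elastic PDFs (here Gaussians over [Ip, Vp/Vs]) play the role of
# ePDF templates, and Bayes' rule maps prior proportions to posterior facies
# probabilities at each inverted sample. All numbers are assumed.
import numpy as np
from scipy.stats import multivariate_normal

facies = {
    "shale":      (multivariate_normal([6.5, 2.1], [[0.20, 0.0], [0.0, 0.010]]), 0.6),
    "brine sand": (multivariate_normal([6.0, 1.8], [[0.15, 0.0], [0.0, 0.008]]), 0.3),
    "gas sand":   (multivariate_normal([5.2, 1.6], [[0.15, 0.0], [0.0, 0.008]]), 0.1),
}

def posterior(sample):
    """Bayes' rule: prior proportions -> posterior facies probabilities."""
    like = {k: pdf.pdf(sample) * prior for k, (pdf, prior) in facies.items()}
    z = sum(like.values())
    return {k: v / z for k, v in like.items()}

# One inverted sample (Ip, Vp/Vs); in practice this is applied per voxel to
# produce 3D volumes of facies probabilities.
print({k: round(p, 3) for k, p in posterior([5.4, 1.65]).items()})
```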

