Integrated treatment of model and parameter uncertainties through a Bayesian approach

Author(s):  
Enrique López Droguett ◽  
Ali Mosleh

Bayesian and non-Bayesian approaches have been proposed for treating model uncertainty; in general, however, model and parameter uncertainties have been treated as separate problems. This article discusses a Bayesian framework for the integrated assessment of model and parameter uncertainties. The approach accommodates cases involving multiple dependent models, and we also demonstrate that, under certain conditions, the model-averaging and uncertainty-factor approaches to model uncertainty assessment are special cases of the proposed formulation. These features are illustrated through several examples of interest in the risk and safety domain.
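As a minimal illustration of how such an integrated treatment can work (hypothetical data and candidate models, not the paper's formulation), the sketch below places a joint posterior over a discrete model index and that model's parameter, from which marginal model probabilities and within-model parameter uncertainty both follow.

```python
import numpy as np
from scipy import stats

# Hypothetical failure-time data (in years); illustrative only.
data = np.array([1.2, 0.7, 2.5, 1.9, 0.4, 3.1, 1.1, 0.9])

# Two candidate models, each with one uncertain parameter theta on a grid:
# an exponential model (theta = rate) and a lognormal model (theta = median).
theta = np.linspace(0.05, 3.0, 300)
models = {
    "exponential": lambda t: stats.expon(scale=1.0 / t).logpdf(data[:, None]).sum(axis=0),
    "lognormal":   lambda t: stats.lognorm(s=0.8, scale=t).logpdf(data[:, None]).sum(axis=0),
}

prior_model = {"exponential": 0.5, "lognormal": 0.5}   # prior over models
prior_theta = np.full(theta.size, 1.0 / theta.size)    # flat prior over the grid

# Joint (unnormalised) posterior over (model, parameter), normalised jointly.
joint = {m: np.exp(loglik(theta)) * prior_theta * prior_model[m]
         for m, loglik in models.items()}
norm = sum(j.sum() for j in joint.values())

for m, j in joint.items():
    post_model = j.sum() / norm      # marginal posterior model probability
    post_theta = j / j.sum()         # parameter posterior within this model
    mean_theta = (theta * post_theta).sum()
    print(f"{m:12s} P(model|data)={post_model:.2f}  E[theta|data,model]={mean_theta:.2f}")
```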

2019 ◽  
Vol 51 (02) ◽  
pp. 249-266
Author(s):  
Nicholas D. Payne ◽  
Berna Karali ◽  
Jeffrey H. Dorfman

Basis forecasting is important for producers and consumers of agricultural commodities in their risk management decisions. However, the best-performing forecasting model identified in previous studies varies substantially. Given this inconsistency, we take a Bayesian approach, which addresses model uncertainty by combining forecasts from different models. Results show that model performance differs by location and forecast horizon, but the forecast from the Bayesian approach often performs favorably. In some cases, however, simple moving averages have lower forecast errors. Besides the nearby basis, we also examine the basis in a specific month and find that regression-based models outperform others at longer horizons.
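A minimal sketch of the combination step, assuming made-up in-sample fits and point forecasts for three hypothetical basis models; the weights here use the common BIC approximation to posterior model probabilities, which may differ from the paper's exact Bayesian scheme.

```python
import numpy as np

# Hypothetical in-sample residual sums of squares, parameter counts, and
# point forecasts for three candidate basis models; values are illustrative.
n_obs = 120
models = {
    "moving_average": {"sse": 45.0, "k": 1, "forecast": 0.32},
    "regression":     {"sse": 38.0, "k": 4, "forecast": 0.41},
    "var":            {"sse": 36.5, "k": 7, "forecast": 0.28},
}

# BIC per model, then weights proportional to exp(-BIC/2), a standard
# large-sample approximation to posterior model probabilities.
bic = {m: n_obs * np.log(v["sse"] / n_obs) + v["k"] * np.log(n_obs)
       for m, v in models.items()}
b_min = min(bic.values())
w = {m: np.exp(-(b - b_min) / 2) for m, b in bic.items()}
total = sum(w.values())
weights = {m: wi / total for m, wi in w.items()}

# Model-averaged basis forecast.
combined = sum(weights[m] * models[m]["forecast"] for m in models)
print(weights, round(combined, 3))
```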


Universe ◽  
2020 ◽  
Vol 6 (8) ◽  
pp. 109 ◽  
Author(s):  
David Kipping

The Simulation Argument posed by Bostrom suggests that we may be living inside a sophisticated computer simulation. If posthuman civilizations eventually have both the capability and the desire to generate such Bostrom-like simulations, then the number of simulated realities would greatly exceed the one base reality, ostensibly indicating a high probability that we do not live in that base reality. In this work, it is argued that since the hypothesis that such simulations are technically possible remains unproven, statistical calculations need to consider not just the number of state spaces but also the intrinsic model uncertainty. This is achievable through a Bayesian treatment of the problem, which is presented here. Using Bayesian model averaging, it is shown that the probability that we are sims is in fact less than 50%, tending towards that value in the limit of an infinite number of simulations. This result is broadly indifferent as to whether one conditions upon the fact that humanity has not yet birthed such simulations or ignores it. As argued elsewhere, it is found that if humanity does start producing such simulations, this would radically shift the odds and make it very probable that we are in fact simulated.
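A toy version of this calculation (simplified, with placeholder hypotheses; not the paper's exact derivation) averages over two equally weighted possibilities, that such simulations are never created or that N of them eventually are:

```python
# Illustrative Bayesian model averaging over two hypotheses with equal prior
# weight: simulations are never created, or N of them eventually are.
for n_sims in [1, 10, 1_000, 10**9]:
    p_hyp_sim = 0.5                            # prior on "simulations are created"
    p_sim_given_hyp = n_sims / (n_sims + 1)    # one base reality among N + 1 realities
    p_simulated = p_hyp_sim * p_sim_given_hyp  # model-averaged probability of being simulated
    print(f"N = {n_sims:>12}: P(simulated) = {p_simulated:.6f}")
# P(simulated) approaches 0.5 from below as N grows, never exceeding it.
```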


2021 ◽  
Author(s):  
Carlos R Oliveira ◽  
Eugene D Shapiro ◽  
Daniel M Weinberger

Vaccine effectiveness (VE) studies are often conducted after the introduction of new vaccines to ensure they provide protection in real-world settings. Although susceptible to confounding, the test-negative case-control design is the most efficient method for assessing VE post-licensure. Control of confounding is often needed during the analysis, which is most efficiently done through multivariable modeling. When a large number of potential confounders are being considered, it can be challenging to know which variables need to be included in the final model. This paper highlights the importance of considering model uncertainty by re-analyzing a Lyme VE study using several confounder selection methods. We propose an intuitive Bayesian Model Averaging (BMA) framework for this task and compare the performance of BMA to that of traditional single-best-model selection methods. We demonstrate how BMA can be advantageous in situations where there is uncertainty about model selection, by systematically considering alternative models and increasing transparency.
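The sketch below illustrates the general BMA idea on simulated test-negative-style data (all variable names and effect sizes are hypothetical): every subset of candidate confounders defines a model, models are weighted by a BIC approximation to their posterior probabilities, and the vaccination effect and confounder inclusion probabilities are averaged across them.

```python
import itertools
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)

# Simulated data: y = case status, vacc = vaccination, x1..x3 = candidate confounders.
n = 500
X_all = rng.normal(size=(n, 3))
vacc = rng.binomial(1, 0.5, size=n)
lin = -0.8 * vacc + 0.6 * X_all[:, 0]
y = rng.binomial(1, 1 / (1 + np.exp(-lin)))

candidates = ["x1", "x2", "x3"]
results = []
for r in range(len(candidates) + 1):
    for subset in itertools.combinations(range(len(candidates)), r):
        X = np.column_stack([np.ones(n), vacc] + [X_all[:, j] for j in subset])
        fit = sm.Logit(y, X).fit(disp=0)
        bic = -2 * fit.llf + X.shape[1] * np.log(n)   # BIC approximation to the evidence
        results.append((subset, bic, fit.params[1]))  # params[1] = vaccination log-odds ratio

bics = np.array([b for _, b, _ in results])
w = np.exp(-(bics - bics.min()) / 2)
w /= w.sum()                                          # posterior model weights

beta_avg = sum(wi * beta for wi, (_, _, beta) in zip(w, results))
print("model-averaged log-odds ratio for vaccination:", round(beta_avg, 3))
for j, name in enumerate(candidates):
    pip = sum(wi for wi, (s, _, _) in zip(w, results) if j in s)
    print(f"posterior inclusion probability of {name}: {pip:.2f}")
```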


2021 ◽  
Author(s):  
Tomas Havranek ◽  
Roman Horvath ◽  
Ali Elminejad

The intertemporal substitution (Frisch) elasticity of labor supply governs the predictions of real business cycle models and models of taxation. We show that, for the extensive margin elasticity, two biases conspire to systematically produce large positive estimates when the elasticity is in fact zero. Among 723 estimates in 36 studies, the mean reported elasticity is 0.5. One half of that number is due to publication bias: larger estimates are reported preferentially. The other half is due to identification bias: studies with less exogenous time variation in wages report larger elasticities. Net of the biases, the literature implies a zero mean elasticity and, with 95% confidence, is inconsistent with calibrations above 0.25. To derive these results we collect 23 variables that reflect the context in which the elasticity was obtained, use nonlinear techniques to correct for publication bias, and employ Bayesian and frequentist model averaging to address model uncertainty.
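As a rough illustration of the publication-bias half of this exercise (synthetic data; the paper uses more elaborate nonlinear corrections plus Bayesian and frequentist model averaging over 23 moderators), a funnel-asymmetry-style regression of reported estimates on their standard errors recovers a near-zero corrected elasticity:

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(1)

# Hypothetical meta-analysis data: true elasticity of zero, but estimates with
# larger standard errors are reported preferentially when they come out large.
n = 200
se = rng.uniform(0.05, 0.6, size=n)
estimate = 0.0 + 1.0 * se + rng.normal(scale=se)   # selection inflates high-SE estimates

# FAT-PET style regression: estimate_i = alpha + beta * SE_i + error.
# alpha is the effect corrected for publication bias; beta signals the bias.
X = sm.add_constant(se)
fit = sm.WLS(estimate, X, weights=1.0 / se**2).fit()
print("bias-corrected elasticity (intercept):", round(fit.params[0], 3))
print("publication-bias term (slope on SE):  ", round(fit.params[1], 3))
```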


1999 ◽  
Vol 64 (1) ◽  
pp. 55-70 ◽  
Author(s):  
Robert G. Aykroyd ◽  
David Lucy ◽  
A. Mark Pollard ◽  
Charlotte A. Roberts

It is generally assumed that life expectancy in antiquity was considerably shorter than it is now. In the limited number of cases where skeletal or dental age-at-death estimates have been made on adults for whom there are other reliable indications of age, there appears to be a clear systematic trend towards overestimating the age of young adults and underestimating that of older individuals. We show that this may be a result of using regression-based techniques to convert age indicators into estimated ages. Whilst acknowledging the limitations of most age-at-death indicators in the higher age categories, we show that a Bayesian approach to converting age indicators into estimated ages can reduce this trend of underestimation at the older end. We also show that such a Bayesian approach can always do better than regression-based methods in the sense of giving a smaller average difference between predicted and known age, and a smaller average width of the 95 per cent confidence interval of the estimate. Given these observations, we suggest that Bayesian approaches to converting age indicators into age estimates deserve further investigation. In view of the generality and flexibility of the approach, we also suggest that similar algorithms may have a much wider application.
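A minimal grid sketch of the Bayesian step, with a hypothetical linear calibration between an age indicator and age (not the skeletal or dental indicators used in the study): the posterior over age is the likelihood of the observed indicator at each candidate age times a prior over ages.

```python
import numpy as np
from scipy import stats

# Illustrative sketch with made-up calibration values: an age indicator assumed
# to increase roughly linearly with age, observed with Gaussian noise.
ages = np.arange(15, 91)                       # candidate adult ages at death
prior = np.full(ages.size, 1.0 / ages.size)    # flat prior over adult ages

def posterior_age(indicator, slope=0.08, intercept=1.0, sd=0.6):
    # p(indicator | age): indicator ~ Normal(intercept + slope * age, sd)
    lik = stats.norm(intercept + slope * ages, sd).pdf(indicator)
    post = lik * prior
    return post / post.sum()

post = posterior_age(indicator=6.5)
mean_age = (ages * post).sum()
lo, hi = ages[np.searchsorted(post.cumsum(), [0.025, 0.975])]
print(f"posterior mean age: {mean_age:.1f}, 95% interval: {lo}-{hi}")
```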


Author(s):  
Yanjun Zhang ◽  
Tingting Xia ◽  
Mian Li

Various types of uncertainty, such as parameter uncertainty, model uncertainty, and metamodeling uncertainty, may lead to low robustness. Parameter uncertainty in physical systems can be either epistemic or aleatory, and these two kinds have widely been represented by intervals and probability distributions, respectively. Model uncertainty is formally defined as the difference between the true value of the real-world process and the output of the simulation model at the same value of the inputs. Additionally, metamodeling uncertainty is introduced by the use of metamodels. To reduce the effects of these uncertainties, robust optimization (RO) algorithms have been developed to obtain solutions that are not only optimal but also less sensitive to uncertainties. Based on how parameter uncertainty is modeled, RO approaches fall into two categories: interval-based and probability-based. In real-world engineering problems, interval and probabilistic parameter uncertainties are likely to exist simultaneously in a single problem; however, few works have considered mixed interval and probabilistic parameter uncertainties together with other types of uncertainty. In this work, a general RO framework is proposed to deal with mixed interval and probabilistic parameter uncertainties, model uncertainty, and metamodeling uncertainty simultaneously in design optimization problems using intervals-of-statistics approaches. Considering multiple types of uncertainty improves the robustness of optimal designs and reduces the risk of inappropriate decision-making, low robustness, and low reliability in engineering design. Two test examples demonstrate the applicability and effectiveness of the proposed RO approach.
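A minimal sketch of one common way to fold both kinds of parameter uncertainty into a single robust objective, using a toy analytic function rather than the paper's intervals-of-statistics formulation: take the worst case over the interval parameter and a mean-plus-k-standard-deviations measure over the probabilistic one.

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(2)

# Toy performance function f(x, a, b): x is the design variable,
# a an interval-uncertain parameter, b a probabilistic one.
def f(x, a, b):
    return (x - 2.0) ** 2 + a * x + 0.5 * b * np.sin(x)

A_INTERVAL = (-0.3, 0.3)                       # interval uncertainty on a
B_SAMPLES = rng.normal(0.0, 0.5, size=2000)    # samples of probabilistic b

def robust_objective(x, k=2.0):
    # Worst case over the interval parameter (the toy f is linear in a, so the
    # endpoints suffice), combined with mean + k*std over the probabilistic one.
    vals = []
    for a in A_INTERVAL:
        samples = f(x, a, B_SAMPLES)
        vals.append(samples.mean() + k * samples.std())
    return max(vals)

res = minimize(lambda x: robust_objective(x[0]), x0=[0.0], method="Nelder-Mead")
print("robust design variable:", round(res.x[0], 3))
```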


2019 ◽  
Vol 220 (2) ◽  
pp. 1368-1378
Author(s):  
M Bertin ◽  
S Marin ◽  
C Millet ◽  
C Berge-Thierry

In low-seismicity areas such as Europe, seismic records do not cover the whole range of variable configurations required for seismic hazard analysis. Usually, a set of empirical models established in comparable contexts (the Mediterranean Basin, the northeastern U.S.A., Japan, etc.) is considered through a logic-tree-based selection process. This approach relies mainly on the scientist's expertise and ignores the uncertainty in model selection. One important potential consequence of neglecting model uncertainty is that we assign more precision to our inference than is warranted by the data, which leads to overly confident decisions. In this paper, we investigate the Bayesian model averaging (BMA) approach, using nine ground-motion prediction equations (GMPEs) drawn from several databases. BMA has become an important tool for dealing with model uncertainty, especially in empirical settings with a large number of potential models and a relatively limited number of observations. Two numerical techniques for implementing BMA, based on the Markov chain Monte Carlo method and on maximum likelihood estimation, are presented and applied to around 1000 records from the RESORCE-2013 database. In the example considered, BMA is shown to provide both a hierarchy of GMPEs and improved out-of-sample predictive performance.
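The sketch below shows the basic BMA mechanics on synthetic records with three hypothetical GMPE-like predictors (placeholder names, coefficients, and data; the paper uses nine published GMPEs, the RESORCE-2013 records, and MCMC or maximum-likelihood implementations): each model is weighted by its likelihood on the observations, and predictions are averaged with those weights.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(3)

# Synthetic "observed" log peak ground accelerations for a set of records.
n = 200
magnitude = rng.uniform(4.0, 7.0, size=n)
obs_log_pga = -4.0 + 0.9 * magnitude + rng.normal(scale=0.35, size=n)

# Three hypothetical GMPE-like models: each maps magnitude to a predicted
# log-PGA with an assumed residual standard deviation.
gmpes = {
    "gmpe_A": (lambda m: -4.2 + 0.95 * m, 0.40),
    "gmpe_B": (lambda m: -3.5 + 0.80 * m, 0.45),
    "gmpe_C": (lambda m: -5.0 + 1.10 * m, 0.50),
}

# Log-likelihood of the records under each model, then BMA weights
# proportional to the likelihoods under equal prior model probabilities.
loglik = {name: stats.norm(pred(magnitude), sigma).logpdf(obs_log_pga).sum()
          for name, (pred, sigma) in gmpes.items()}
ll_max = max(loglik.values())
w = {name: np.exp(ll - ll_max) for name, ll in loglik.items()}
total = sum(w.values())
weights = {name: wi / total for name, wi in w.items()}

# Model-averaged prediction for a new magnitude 6.0 event.
m_new = 6.0
bma_pred = sum(weights[name] * pred(m_new) for name, (pred, _) in gmpes.items())
print(weights, round(bma_pred, 2))
```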

