sources of uncertainty
Recently Published Documents

N. Simon Kwong ◽  
Kishor S. Jaiswal ◽  
Jack W. Baker ◽  
Nicolas Luco ◽  
Kristin A. Ludwig ◽  

2022 ◽  
Vol 26 (1) ◽  
pp. 197-220
Emixi Sthefany Valdez ◽  
François Anctil ◽  
Maria-Helena Ramos

Abstract. This study aims to decipher the interactions of a precipitation post-processor and several other tools for uncertainty quantification implemented in a hydrometeorological forecasting chain. We make use of four hydrometeorological forecasting systems that differ in how uncertainties are estimated and propagated. They consider the following sources of uncertainty: system A, forcing; system B, forcing and initial conditions; system C, forcing and model structure; and system D, forcing, initial conditions, and model structure. For each system's configuration, we investigate the reliability and accuracy of post-processed precipitation forecasts in order to evaluate their ability to improve streamflow forecasts for up to 7 d of forecast horizon. The evaluation is carried out across 30 catchments in the province of Quebec (Canada) over the 2011–2016 period. Results are compared using a multicriteria approach, and the analysis is performed as a function of lead time and catchment size. The results indicate that the precipitation post-processor brought large improvements in forecast quality relative to the raw precipitation forecasts, especially in terms of relative bias and reliability. However, its effectiveness in improving the quality of hydrological forecasts varied with the configuration of the forecasting system, the forecast attribute, the forecast lead time, and the catchment size. The combination of the precipitation post-processor and the quantification of uncertainty from initial conditions showed the best results. When all sources of uncertainty were quantified, the contribution of the precipitation post-processor to better streamflow forecasts was not remarkable, and in some cases it even deteriorated the overall performance of the hydrometeorological forecasting system.
Our study provides an in-depth investigation of how improvements brought by a precipitation post-processor to the quality of the inputs to a hydrological forecasting model can be cancelled along the forecasting chain, depending on how the hydrometeorological forecasting system is configured and on how the other sources of hydrological forecasting uncertainty (initial conditions and model structure) are considered and accounted for. This has implications for the choices users might make when designing new or enhancing existing hydrometeorological ensemble forecasting systems.
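Multicriteria verification of ensemble forecasts like the one described above typically relies on scores such as the continuous ranked probability score (CRPS). As a minimal sketch (not the authors' actual evaluation code), the sample-based CRPS of one ensemble forecast against one observation can be computed as:

```python
import numpy as np

def crps_ensemble(members, obs):
    """Sample-based CRPS of an ensemble forecast for one observation.

    CRPS = E|X - y| - 0.5 * E|X - X'|, estimated from the members
    (lower is better; 0 means a perfect deterministic forecast).
    """
    members = np.asarray(members, dtype=float)
    term1 = np.mean(np.abs(members - obs))
    term2 = 0.5 * np.mean(np.abs(members[:, None] - members[None, :]))
    return float(term1 - term2)

# A sharp, well-centred ensemble scores lower than a biased one
# (invented precipitation amounts in mm, for illustration only).
obs = 10.0
sharp = crps_ensemble([9.5, 10.0, 10.5], obs)
biased = crps_ensemble([14.5, 15.0, 15.5], obs)
```

Averaging such a score over catchments and lead times gives one of the accuracy criteria that studies of this kind compare between raw and post-processed forecasts.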

2022 ◽  
Felix Brandt ◽  
Christoph Jacob

While QM/MM studies of enzymatic reactions are widely used in computational chemistry, the results of such studies are subject to numerous sources of uncertainty, and the effect of the different choices a simulation scientist must make when setting up QM/MM calculations is often unclear. In particular, the selection of the QM region is crucial for obtaining accurate and reliable results. Simply selecting amino acids by their distance to the active site is usually not sufficient, as necessary residues may be missing or unimportant residues included without justification. Here, we take a first step towards quantifying uncertainties in QM/MM calculations by assessing the sensitivity of QM/MM reaction energies with respect to variations of the MM point charges. We show that such a point charge variation analysis (PCVA) can be employed to judge the accuracy of QM/MM reaction energies obtained with a selected QM region, and we devise a protocol to systematically construct QM regions that minimize this uncertainty. We apply such a PCVA to the example of catechol O-methyltransferase and demonstrate that it provides a simple and reliable approach for the construction of the QM region. Our PCVA-based scheme is computationally efficient and requires only calculations for a system with a minimal QM region. Our work highlights the promise of applying methods of uncertainty quantification in computational chemistry.

2022 ◽  
Vol 9 ◽  
Luís M. Nunes

Here we compare bioaccumulation factors in marine organisms to partition ratios in marine debris for dichlorodiphenyltrichloroethane (DDT) and polychlorinated biphenyls (PCBs). Both organochlorines are synthetic persistent organic pollutants emitted into the environment since the beginning of the last century in approximately equal amounts. Their vast use and dispersion have resulted in approximately similar median concentrations of the two organochlorines in some pelagic organisms, namely in the liver and muscle tissue of fish. Molluscs, on the other hand, show higher median uptake of PCBs (median = 2.34 ng/g) than of DDTs (median = 1.70 ng/g), probably reflecting more localized conditions. We found that the bioaccumulation factors can be several orders of magnitude higher than the partition ratios. For instance, the median concentrations of organochlorines in the different matrices of fish, birds, and mammals are between one and four orders of magnitude higher than those found in marine debris when lipid-normalized, or up to two orders of magnitude when measured as wet weight. In molluscs, however, the bioaccumulation-to-partition ratio equals unity, which agrees with previous studies using passive samplers. Future research should focus on reducing sources of uncertainty by (1) homogenizing chemical procedures; (2) better assessing chemical partition equilibrium between water and polymers under environmental conditions; and (3) using (multi)polymer passive samplers better aimed at mimicking uptake by specific living tissues.
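The comparison above rests on two simple ratios: lipid normalization of tissue concentrations and the tissue-to-debris concentration ratio. A minimal sketch, reusing the mollusc PCB median from the abstract but with an invented lipid fraction and debris concentration:

```python
def lipid_normalized(conc_wet_ng_per_g, lipid_fraction):
    """Convert a wet-weight tissue concentration (ng/g wet weight)
    to a lipid-normalised one (ng per g of lipid)."""
    return conc_wet_ng_per_g / lipid_fraction

def bio_to_partition_ratio(c_tissue, c_debris):
    """Ratio of tissue to marine-debris concentration; values much
    greater than 1 mean biota accumulate the pollutant well beyond
    what partitioning into plastic alone would predict."""
    return c_tissue / c_debris

# Mollusc PCB median from the study (2.34 ng/g wet weight) with a
# hypothetical 5 % lipid content and a hypothetical debris
# concentration of 0.5 ng/g (illustrative numbers only).
c_lipid = lipid_normalized(2.34, 0.05)
ratio = bio_to_partition_ratio(c_lipid, 0.5)
```

With these assumed inputs the lipid-normalized ratio lands around two orders of magnitude, the upper end of the wet-weight range reported above.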

2021 ◽  
Yishay Mansour ◽  
Alex Slivkins ◽  
Vasilis Syrgkanis ◽  
Zhiwei Steven Wu

In a wide range of recommendation systems, self-interested individuals (“agents”) make decisions over time, using information revealed by other agents in the past and producing information that may help agents in the future. Each agent would like to exploit the best action given the current information but would prefer the previous agents to have explored various alternatives to collect information. A social planner, by means of a well-designed recommendation policy, can incentivize the agents to balance exploration and exploitation in order to maximize social welfare or some other objective. The recommendation policy can be modeled as a multi-armed bandit algorithm under Bayesian incentive-compatibility (BIC) constraints. This line of work has received considerable attention in the “economics and computation” community. Whereas in prior work the planner interacts with a single agent at a time, the present paper allows the agents to affect one another directly in a shared environment. The agents now face two sources of uncertainty: what is the environment, and what will the other agents do? We focus on “explorable” actions: those that can be recommended by some BIC policy. We show how the principal can identify and explore all such actions.

2021 ◽  
Vol 2021 ◽  
pp. 1-12
Huifang Niu ◽  
Jianchao Zeng ◽  
Hui Shi ◽  
Bin Wang ◽  
Tianye Liu

Estimation of the remaining useful life (RUL) is an important component of prognostics and health management (PHM). The accuracy of RUL estimation for complex systems is mainly affected by three sources of uncertainty, i.e., temporal uncertainty, product-to-product uncertainty, and measurement errors. To improve PHM and account for their effects, a nonlinear prognostic model incorporating all three sources of uncertainty is presented here. An approximate analytical expression for the probability density function (PDF) of the RUL is obtained based on the concept of first hitting time (FHT). Model parameters are then obtained by the expectation-maximization (EM) algorithm, and the drift parameter is estimated adaptively using a Bayesian procedure. Finally, to illustrate the practical applications of the presented approach, a comparative study of real data on fatigue crack propagation is presented. Results demonstrate that our method improves model fit and increases the accuracy of lifetime estimation.
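For the simplest linear special case, a Wiener degradation process with constant drift, the first hitting time of the failure threshold has a closed-form inverse Gaussian density. The sketch below shows only that textbook baseline; the paper's model is nonlinear and additionally handles unit-to-unit variability and measurement error:

```python
import math

def rul_pdf(t, w, mu, sigma):
    """First-hitting-time (inverse Gaussian) density of a linear
    Wiener degradation process X(t) = mu*t + sigma*B(t) reaching
    the failure threshold w for the first time at t."""
    if t <= 0:
        return 0.0
    return (w / math.sqrt(2 * math.pi * sigma**2 * t**3)
            * math.exp(-(w - mu * t) ** 2 / (2 * sigma**2 * t)))

# Sanity check with invented parameters: the density integrates to
# ~1 (left Riemann sum on a fine grid) and peaks near t = w/mu.
dt = 0.01
mass = sum(rul_pdf(k * dt, w=5.0, mu=1.0, sigma=0.5) * dt
           for k in range(1, 20000))
```

Under this parameterization the mean time to failure is w/mu, which is why adaptively re-estimating the drift (as the abstract describes, via a Bayesian procedure) directly sharpens the RUL prediction.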

2021 ◽  
Vol 6 (4) ◽  
pp. 54-61
Ekaterina A. Fofanova ◽  
Yulia N. Paveleva ◽  
Oksana A. Melnikova ◽  
Boris V. Belozerov ◽  
Natalia Y. Konoshonkina ◽  

Background. The article presents a new approach to assessing geological complexity, a quantitative assessment of areal complexity, as well as an alternative methodology for assessing complexity in 1D. Aim. To develop a numerical metric for assessing geological complexity and an algorithm for constructing complexity maps. Materials and methods. Conventionally, complexity metrics describe the reservoir with a single number, which often underestimates the real complexity of the deposit. The geological complexity presented in the article consists of 5 groups: structural-tectonic, facies-lithological, permeability and porosity, secondary alteration, and fluid properties; 13 characteristics describe the complexity space of these groups. Each of these characteristics can be presented not only in 1D but also in 2D. The proposed methodology was tested on the company's assets. Results. The presented examples of complexity maps for several fields show the advantage of 2D complexity estimation over 1D. The analysis of decomposed complexity estimation (for individual groups) on the company's assets showed that the key complexity groups are the structural-tectonic and facies-lithological characteristics. Therefore, the characteristics describing these groups should be taken into account during decision-making and asset ranking. Conclusion. A methodology for the quantitative assessment of areal geological complexity has been developed. This areal assessment allows identifying the most “problematic” areas, analysing existing sources of uncertainty, and ranking and screening company assets when making strategic decisions.
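A 2D complexity map of the kind described can be sketched as a weighted aggregation of normalized characteristic grids. The min-max normalization and equal default weights below are illustrative assumptions, not the article's exact algorithm:

```python
import numpy as np

def complexity_map(characteristics, weights=None):
    """Aggregate 2D characteristic grids into one complexity map.

    Each characteristic grid is min-max normalised to [0, 1] and the
    grids are combined as a weighted mean (the article groups 13
    characteristics into 5 complexity groups; this sketch treats
    them as a flat list).
    """
    maps = [np.asarray(c, dtype=float) for c in characteristics]
    if weights is None:
        weights = np.ones(len(maps))
    weights = np.asarray(weights, dtype=float) / np.sum(weights)
    normed = []
    for m in maps:
        span = m.max() - m.min()
        normed.append((m - m.min()) / span if span > 0 else np.zeros_like(m))
    return sum(w * n for w, n in zip(weights, normed))

# Two toy 2x2 characteristic grids (e.g. fault density and facies
# variability, invented values): high cells flag "problematic" areas.
cmap = complexity_map([[[0, 1], [2, 3]], [[3, 2], [1, 0]]])
```

Because the output stays on the grid, the map can be inspected cell by cell, which is exactly the advantage of the 2D estimate over a single 1D complexity number.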

James Robert Hunt ◽  
Martin Andrew Ebert ◽  
Pejman Rowshanfarzad ◽  
Hans Lynggaard Riis

Abstract Objective: The objective of this study was to separately quantify the stability of the megavoltage imager (MVI) and radiation head of an Elekta Unity MRL throughout full gantry rotation. Approach: A ball-bearing (BB) phantom was attached to the radiation head of the Unity, while a single BB was placed at the isocentre. Images were acquired during rotation using the MVI. These images were processed with an in-house MATLAB program to reduce errors caused by noise, and the positions of the BBs in the images were analysed to extract MVI and radiation head sag data. Main results: The method showed good reproducibility, with a mean standard deviation of 7 µm for the position of BBs across all gantry angles. The radiation head was found to sag throughout rotation, with a maximum course of movement of 0.59 mm. The sag pattern was stable over a period greater than a year but showed some dependence on the gantry rotation direction. Significance: As the MRL is a relatively new system, it is promising to have data supporting the high level of precision of one Elekta Unity machine. Isolating and quantifying the sources of uncertainty in radiation delivery may allow more sophisticated analysis of how the system performance may be improved.
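The “maximum course of movement” reported above reduces, for each BB trajectory, to the range of its detected positions over a full rotation. A minimal sketch with a synthetic sinusoidal sag (the study's actual MATLAB pipeline additionally de-noises the MVI images before locating the BBs):

```python
import numpy as np

def sag_course(positions_mm):
    """Maximum course of movement of a BB projection over a full
    gantry rotation: the range of its detected positions (mm)."""
    p = np.asarray(positions_mm, dtype=float)
    return float(p.max() - p.min())

# Synthetic gravity-driven sag: sinusoidal over gantry angle with a
# 0.3 mm amplitude (hypothetical numbers, not the Unity's).
angles = np.linspace(0.0, 360.0, 73)  # 5-degree steps
positions = 0.3 * np.sin(np.radians(angles))
course = sag_course(positions)
```

A purely gravity-driven sag would trace one sine cycle per rotation; deviations from that shape, or differences between rotation directions, are what single out mechanical effects such as backlash.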

2021 ◽  
Robin Georg Claus Maack ◽  
Gerik Scheuermann ◽  
Hans Hagen ◽  
Jose Tiberio Hernández Peñaloza ◽  
Christina Gillmann

Abstract In many applications, Visual Analytics (VA) has developed into a standard tool to ease data access and knowledge generation. Unfortunately, many data sources used in the VA process are affected by uncertainty. In addition, the VA cycle itself can introduce uncertainty into the knowledge generation process. The classic VA cycle does not provide a mechanism to handle these sources of uncertainty. In this manuscript, we aim to provide an extended VA cycle that is capable of handling uncertainty by quantification, propagation, and visualization. We describe different data types and application scenarios that can be handled by such a cycle, provide examples, and list open challenges in the area of uncertainty-aware VA.

2021 ◽  
Benjamin-Samuel Schlüter ◽  
Bruno Masquelier ◽  
Carlo Giovanni Camarda

Abstract Background: The COVID-19 pandemic has caused major shocks in mortality trends in many countries. Yet few studies have evaluated the heterogeneity of the mortality shock at the sub-national level while rigorously accounting for the different sources of uncertainty. Methods: Using death registration data from Belgium, we first assess the change in the heterogeneity of subnational standardized mortality ratios in 2020 compared to previous years. We then measure the shock of the pandemic using district-level values of life expectancy, comparing observed and projected district life expectancies and accounting for all sources of uncertainty (related to the life-table construction at the district level and to the projection methods at the country and district levels). The Bayesian modelling approach makes it easy to combine the different sources of uncertainty in the assessment of the shock. This is of particular interest at a finer geographical scale characterized by high stochastic variation in annual death counts. Results: The heterogeneity in the impact of the pandemic on all-cause mortality across districts is substantial, with some districts barely showing any impact, whereas the Bruxelles-Capital and Mons districts experienced decreases in life expectancy at birth of 2.24 (95% CI: 1.33-3.05) and 2.10 (95% CI: 0.86-3.30) years, respectively. The year 2020 was associated with an increase in the heterogeneity of mortality levels at the subnational level in comparison to past years, as measured by both the standardized mortality ratios and the life expectancies at birth. Decisions on uncertainty thresholds have a large bearing on the interpretation of the results. Conclusion: Developing sub-national mortality estimates with their uncertainty is key to understanding why certain areas have been hard hit in comparison to others.
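The standardized mortality ratios used above follow the usual indirect-standardization recipe: observed deaths divided by the deaths expected if national age-specific rates applied to the district's population. A minimal sketch with invented numbers (the paper's Bayesian model additionally propagates the uncertainty of these quantities):

```python
def standardized_mortality_ratio(observed_deaths,
                                 person_years_by_age,
                                 reference_rates_by_age):
    """Indirectly standardized mortality ratio for one district:
    observed deaths over the deaths expected under national
    age-specific rates (SMR = 1 means the district matches the
    national mortality schedule)."""
    expected = sum(py * rate for py, rate in
                   zip(person_years_by_age, reference_rates_by_age))
    return observed_deaths / expected

# Toy district with three age bands; person-years and national
# rates (per person-year) are hypothetical.
smr = standardized_mortality_ratio(
    observed_deaths=120,
    person_years_by_age=[50_000, 30_000, 10_000],
    reference_rates_by_age=[0.0002, 0.001, 0.006],
)
```

With small districts the observed count in the numerator is highly stochastic, which is exactly why the abstract stresses combining sources of uncertainty at fine geographical scales.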
