Decision Analytic and Bayesian Uncertainty Quantification for Decision Support

Author(s):  
D. Warner North
2020
Vol 81 (8)
pp. 1588-1596
Author(s):  
Peter Reichert

Abstract: Uncertainty quantification is very important in environmental management: it allows decision makers to consider the reliability of predictions of the consequences of decision alternatives and to relate them to their risk attitudes and to the uncertainty about their preferences. Nevertheless, uncertainty quantification in environmental decision support is often incomplete, and the robustness of the results with respect to the assumptions made for uncertainty quantification is often not investigated. In this article, an attempt is made to demonstrate how uncertainty can be considered more comprehensively in environmental research and decision support by combining well-established statistical techniques with rarely applied ones. In particular, the following elements of uncertainty quantification are discussed: (i) using stochastic, mechanistic models that consider and propagate uncertainties from their origin to the output; (ii) profiting from modern data-science techniques to increase the diversity of the exploration process, to benchmark mechanistic models, and to find new relationships; (iii) analysing structural alternatives by multi-model and non-parametric approaches; (iv) quantitatively formulating and using societal preferences in decision support; (v) explicitly considering the uncertainty of elicited preferences in addition to the uncertainty of predictions in decision support; and (vi) explicitly considering the ambiguity about prior distributions for predictions and preferences by using imprecise probabilities. Elements (v) and (vi) in particular have mostly been ignored in the past, and a guideline is provided on how these uncertainties can be considered without significantly increasing the computational burden. The methodological approach to (v) and (vi) is based on expected expected utility theory, which extends expected utility theory to the consideration of uncertain preferences, and on imprecise, intersubjective Bayesian probabilities.
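To illustrate elements (i) and (v) of the abstract, the following minimal Python sketch propagates parameter uncertainty through a toy mechanistic model and averages utility over both prediction and preference uncertainty, i.e. an expected expected utility. The model, priors, utility function, and decision alternatives are invented for illustration and are not taken from the article.

```python
import numpy as np

rng = np.random.default_rng(42)

# Hypothetical mechanistic model: pollutant concentration (mg/L) remaining
# after a treatment removing a fraction `reduction`, with an uncertain
# first-order decay rate k (prediction uncertainty).
def concentration(k, reduction):
    return 10.0 * (1.0 - reduction) * np.exp(-k)

# Exponential utility with uncertain risk-aversion parameter r
# (preference uncertainty): u(c) = -(exp(r*c) - 1) / r, decreasing in c.
def utility(c, r):
    return -np.expm1(r * c) / r

k = rng.lognormal(mean=0.0, sigma=0.3, size=5000)  # prior on decay rate
r = rng.uniform(0.05, 0.5, size=200)               # prior on risk aversion

for name, reduction in {"no action": 0.0, "treatment": 0.4}.items():
    c = concentration(k, reduction)                # propagate uncertainty
    # Expected expected utility: average over predictions AND preferences.
    eeu = utility(c[:, None], r[None, :]).mean()
    print(f"{name}: EEU = {eeu:.3f}")
```

Averaging over the preference-parameter samples alongside the prediction samples is what distinguishes expected expected utility from ordinary expected utility; with Monte Carlo samples the extra cost is a single additional array dimension.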


2020
Vol 8
Author(s):  
Brioch Hemmings
Matthew J. Knowling
Catherine R. Moore

Effective decision making for resource management is often supported by combining predictive models with uncertainty analyses. This combination allows quantitative assessment of management strategy effectiveness and risk. Typically, history matching is undertaken to increase the reliability of model forecasts. However, the question of whether the potential benefit of history matching will be realized, or outweigh its cost, is seldom asked. History matching adds complexity to the modeling effort, as information from historical system observations must be appropriately blended with the prior characterization of the system. Consequently, the cost of history matching is often significant. When it is not implemented appropriately, history matching can corrupt model forecasts. Additionally, the available data may offer little decision-relevant information, particularly where data and forecasts are of different types, or represent very different stress regimes. In this paper, we present a decision support modeling workflow where early quantification of model uncertainty guides ongoing model design and deployment decisions. This includes providing justification for undertaking (or forgoing) history matching, so that unnecessary modeling costs can be avoided and model performance can be improved. The workflow is demonstrated using a regional-scale modeling case study in the Wairarapa Valley (New Zealand), where assessments of stream depletion and nitrate-nitrogen contamination risks are used to support water-use and land-use management decisions. The probability of management success/failure is assessed by comparing the proximity of model forecast probability distributions to ecologically motivated decision thresholds. 
This study highlights several important insights that can be gained by undertaking early uncertainty quantification, including: i) validation of the prior numerical characterization of the system, in terms of its consistency with historical observations; ii) validation of model design or indication of areas of model shortcomings; iii) evaluation of the relative proximity of management decision thresholds to forecast probability distributions, providing a justifiable basis for stopping modeling.
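The comparison of a forecast probability distribution with a decision threshold, as described above, can be sketched in a few lines. The ensemble, its distribution, and the threshold value below are hypothetical stand-ins, not the study's Wairarapa Valley model outputs.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical forecast ensemble: nitrate-N concentration (mg/L) produced
# by a prior (not yet history-matched) model parameter ensemble.
forecast = rng.lognormal(mean=np.log(2.0), sigma=0.5, size=10_000)

threshold = 3.8  # hypothetical ecologically motivated decision threshold

# Probability of exceeding the threshold, with a simple Monte Carlo
# standard error to judge whether the answer is already decisive.
p_exceed = (forecast > threshold).mean()
se = np.sqrt(p_exceed * (1.0 - p_exceed) / forecast.size)

print(f"P(exceed threshold) = {p_exceed:.3f} +/- {se:.3f}")
```

If the prior forecast distribution already sits far from the decision threshold relative to its spread, the extra cost and risk of history matching may not change the management decision, which is the justifiable stopping point the workflow describes.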


2020
Vol 5
pp. A477
Author(s):  
Florian Künzner
Tobias Neckel
Hans-Joachim Bungartz
Felix Dietrich
Gerta Köster

Providing live simulation systems for decision support is difficult: the available time is limited, yet uncertainty quantification requires many simulation runs. We combine a surrogate model with the stochastic collocation method to overcome time and storage restrictions, and show a proof of concept for a de-boarding scenario of a train.
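The stochastic collocation idea can be sketched minimally: evaluate the expensive simulation only at quadrature nodes of the uncertain parameter's distribution, then obtain output statistics by quadrature instead of many Monte Carlo runs. The de-boarding "model" and its uncertain parameter below are invented stand-ins, not the authors' simulation.

```python
import numpy as np

# Stand-in for an expensive simulation: de-boarding time (s) as a function
# of one uncertain parameter x (e.g. passenger reaction time), assumed
# standard normal. Model and distribution are hypothetical.
def expensive_model(x):
    return 60.0 + 5.0 * x + 2.0 * x**2

# Stochastic collocation with Gauss-Hermite quadrature (probabilists'
# convention, weight exp(-x^2/2)): 5 nodes suffice here.
nodes, weights = np.polynomial.hermite_e.hermegauss(5)
weights = weights / np.sqrt(2.0 * np.pi)   # normalize for N(0, 1)

samples = expensive_model(nodes)           # only 5 model evaluations
mean = np.sum(weights * samples)
var = np.sum(weights * samples**2) - mean**2

print(f"mean = {mean:.2f} s, std = {np.sqrt(var):.2f} s")
```

Because the quadrature rule with n nodes integrates polynomials up to degree 2n-1 exactly, a handful of runs replaces thousands of Monte Carlo samples whenever the response is smooth in the uncertain parameter, which is what makes a live decision-support setting feasible.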


2013
Vol 46 (2)
pp. 52
Author(s):  
CHRISTOPHER NOTTE ◽  
NEIL SKOLNIK
