epistemic uncertainty
Recently Published Documents

Total documents: 754 (five years: 318)
H-index: 41 (five years: 8)

Author(s): Johannes Mattes

Abstract This paper examines cave environments as unique spaces of knowledge production and shows how visualizations of natural cavities in maps came to be powerful tools in scientific reasoning. Faced with the challenge of limited vision, mapmakers combined empiricism and imagination in an experimental setting and developed specific translation strategies to deal with the uncertain origin of underground objects and the shifting boundaries between the known and the unknown. By deconstructing this type of cartographic representation, which has barely been studied, this paper furnishes surprising insights into the scholarly practices and tools used to deal with this considerable epistemic uncertainty and to signal credibility and trust to potential users. The array of maps used for this study includes both archival and published sources, depicting caves in Europe, America and Siberia.


Author(s): Qiuhan Wang, Mei Cai, Wei Guo

Abstract The increasing frequency and severity of Natech accidents warrant an investigation of the mechanisms by which these events occur. A cascading disaster chain magnifies the impact of natural hazards as it propagates through critical infrastructures and socio-economic networks. To handle the imprecise probabilities of cascading events in Natech scenarios, this work proposes an improved Bayesian network (BN) combined with evidence theory, which deals with epistemic uncertainty in Natech accidents better than traditional BNs do. Effective inference algorithms are developed to propagate system faults through a socio-economic system. The conditional probability table (CPT) of the traditional probabilistic BN is modified by means of OR/AND gates so that belief masses can be propagated within the framework of evidence theory. The improved Bayesian network methodology makes it possible to assess the impact and damage of Natech accidents under complex interdependence among accidents and with insufficient data. Finally, a case study of Guangdong province, an area prone to natural disasters, is presented, and the modified Bayesian network is applied to analyze this area's Natech scenario. Diagnostic and sensitivity analyses of human and natural factors allow us to locate the key nodes in the cascading disaster chain. The findings provide useful theoretical support for urban managers of industrial cities seeking to enhance disaster prevention and mitigation capabilities.
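
As a purely illustrative sketch of the kind of belief-mass propagation described above (not the authors' formulation), the snippet below pushes Dempster-Shafer mass assignments through an OR gate linking two hypothetical hazard events to a failure event; the event names and mass values are assumptions.

```python
# Sketch: belief-mass propagation through an OR gate in an evidential network.
# Each event carries masses over {occurs}, {does not occur}, and the full
# frame {occurs, does not occur}, the latter representing epistemic ignorance.
# All numerical values are invented for demonstration.

def or_gate(m_a, m_b):
    """Combine parent masses (m_occ, m_not, m_unknown) for a child event
    that occurs if either parent occurs (OR logic)."""
    a_occ, a_not, a_unk = m_a
    b_occ, b_not, b_unk = m_b
    # Child definitely occurs if at least one parent definitely occurs.
    c_occ = a_occ + b_occ - a_occ * b_occ
    # Child definitely does not occur only if both parents definitely do not.
    c_not = a_not * b_not
    # Remaining mass stays on the full frame (ignorance).
    c_unk = 1.0 - c_occ - c_not
    return (c_occ, c_not, c_unk)

if __name__ == "__main__":
    earthquake = (0.30, 0.60, 0.10)   # (belief occurs, belief not, ignorance)
    flooding   = (0.20, 0.70, 0.10)
    tank_failure = or_gate(earthquake, flooding)
    print("child masses (occurs, not, unknown):", tank_failure)
```

The mass left on the full frame keeps the epistemic ignorance explicit, so the child event is bracketed by a belief-plausibility interval rather than a single point probability.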


2021, pp. 030631272110625
Author(s): Robert Evans

This article presents a preliminary analysis of the advice provided by the UK government's Scientific Advisory Group for Emergencies (SAGE) at its meetings held between 22 January and 23 March 2020 in response to the emerging coronavirus pandemic. Drawing on the published minutes of the group's meetings, the article examines what was known and not known, the assumptions and working practices that shaped the group's work, and how this knowledge was reflected in the decisions made by the government. In doing so, the article critically examines what it means for policy making to be 'led by the science' when the best available science is provisional and uncertain. Using the ideas of 'externality' and 'evidential significance', the article argues that the apparent desire for high levels of certainty among both scientists and political decision-makers made early action impossible, as the data needed were not, and could not be, available in time. This leads to an argument for changes to the institutions that provide scientific advice, based on sociologically informed expectations of science in which expert judgement plays a more significant role.


2021, Vol 27 (12), pp. 1347-1370
Author(s): Ekaterina Auer, Wolfram Luther

In this paper, we consider genetic risk assessment and genetic counseling for breast cancer from the point of view of reliable uncertainty handling. In medical practice, there exist fairly accurate numerical tools that predict breast cancer (or gene mutation) probability based on factors such as a patient's family history. However, they are too complex to be applied in ordinary doctors' offices, so several simplified, questionnaire-type support tools have appeared. This assessment process is highly affected by uncertainty. At the same time, the reliability of test interpretations and counseling conclusions is especially important, since they directly influence patients and their decisions. We show how expert opinions on mutation probabilities can be combined using the Dempster-Shafer theory. Based on multi-criteria binary decision trees and interval analysis, we combine the referral screening tool, designed to identify patients at risk of breast cancer (and to recommend genetic counseling or testing for them), with three further risk assessment tools available for this purpose. The proposed method can strengthen a patient's confidence in the outcome of a genetic counseling session, since it combines different sources to provide score ranges that convey more information. Finally, based on this approach, a decision tree for assigning a risk category is proposed, which enhances the existing methodology. The great impact of epistemic uncertainty is reflected in large, overlapping intervals for the risk classes.
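
As a hedged illustration of the Dempster-Shafer combination step mentioned in the abstract, the sketch below fuses two hypothetical expert opinions on mutation carrier status with Dempster's rule; the frame, masses, and resulting interval are assumptions chosen for demonstration, not values from the paper.

```python
# Sketch: combining two expert opinions with Dempster's rule of combination.
# Frame of discernment: {"mut"} carrier, {"no"} non-carrier, and the full
# frame {"mut", "no"} representing ignorance. Numbers are illustrative only.
from itertools import product

FRAME = frozenset({"mut", "no"})

def dempster_combine(m1, m2):
    """Combine two basic belief assignments (dicts: frozenset -> mass)."""
    combined, conflict = {}, 0.0
    for (a, ma), (b, mb) in product(m1.items(), m2.items()):
        inter = a & b
        if inter:
            combined[inter] = combined.get(inter, 0.0) + ma * mb
        else:
            conflict += ma * mb
    # Normalize by the non-conflicting mass (Dempster's normalization).
    return {s: v / (1.0 - conflict) for s, v in combined.items()}

expert1 = {frozenset({"mut"}): 0.5, frozenset({"no"}): 0.3, FRAME: 0.2}
expert2 = {frozenset({"mut"}): 0.6, frozenset({"no"}): 0.2, FRAME: 0.2}
fused = dempster_combine(expert1, expert2)
belief = fused.get(frozenset({"mut"}), 0.0)
plausibility = belief + fused.get(FRAME, 0.0)
print(f"fused mutation risk lies in [{belief:.3f}, {plausibility:.3f}]")
```

The belief-plausibility interval produced by the fused assignment is the kind of score range that, as the abstract argues, conveys more information than a single point estimate.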


2021, pp. 875529302110560
Author(s): Yousef Bozorgnia, Norman A Abrahamson, Sean K Ahdi, Timothy D Ancheta, Linda Al Atik, ...

This article summarizes the Next Generation Attenuation (NGA) Subduction (NGA-Sub) project, a major research program to develop a database and ground motion models (GMMs) for subduction regions. A comprehensive database of subduction earthquakes recorded worldwide was developed. The database includes a total of 214,020 individual records from 1,880 subduction events, which is by far the largest database of all the NGA programs. As part of the NGA-Sub program, four GMMs were developed. Three of them are global subduction GMMs with adjustment factors for up to seven worldwide regions: Alaska, Cascadia, Central America and Mexico, Japan, New Zealand, South America, and Taiwan. The fourth GMM is a new Japan-specific model. The GMMs provide median predictions, and the associated aleatory variability, of RotD50 horizontal components of peak ground acceleration, peak ground velocity, and 5%-damped pseudo-spectral acceleration (PSA) at oscillator periods ranging from 0.01 to 10 s. Three GMMs also quantified “within-model” epistemic uncertainty of the median prediction, which is important in regions with sparse ground motion data, such as Cascadia. In addition, a damping scaling model was developed to scale the predicted 5%-damped PSA of horizontal components to other damping ratios ranging from 0.5% to 30%. The NGA-Sub flatfile, which was used for the development of the NGA-Sub GMMs, and the NGA-Sub GMMs coded on various software platforms, have been posted for public use.


2021, Vol 50 (4), pp. 607-626
Author(s): Egidijus Rytas Vaidogas

Two alternative Bayesian approaches are proposed for predicting the fragmentation of pressure vessels triggered by accidental explosions (bursts) of these containment structures. It is shown how to carry out this prediction with post-mortem data on fragment numbers counted after past explosion accidents. The results of the prediction are estimates of the probabilities of individual fragment numbers, expressed by means of Bayesian prior or posterior distributions. It is demonstrated how to elicit the prior distributions from relatively scarce post-mortem data on vessel fragmentations. Specifically, it is suggested to develop the priors with two Bayesian models: the compound Poisson-gamma and the multinomial-Dirichlet probability distributions. The available data are used to specify a non-informative prior for the Poisson parameter, which is subsequently transformed into priors of the individual fragment-number probabilities. Alternatively, the data are applied to the specification of Dirichlet concentration parameters; the latter priors directly express epistemic uncertainty in the fragment-number probabilities. Example calculations presented in the study demonstrate that the suggested non-informative prior distributions are responsive to updates with scarce data on vessel explosions. It is shown that priors specified with the Poisson-gamma and multinomial-Dirichlet models differ tangibly; however, this difference decreases as the amount of new data increases. For the sake of brevity and concreteness, the study is limited to fire-induced vessel bursts known as boiling liquid expanding vapour explosions (BLEVEs).
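
A minimal sketch of the two conjugate updates named in the abstract, under assumed fragment counts and hyperparameters (none of the numbers come from the study):

```python
# Sketch: conjugate Bayesian updates for fragment-number data, in the spirit
# of the two models named above. Counts and hyperparameters are invented.
import math
import numpy as np

counts = np.array([2, 3, 2, 4, 3])    # fragments counted after past bursts (assumed)

# --- Compound Poisson-gamma ---
# Gamma(a0, b0) prior on the Poisson rate; posterior is Gamma(a0 + sum, b0 + n).
a0, b0 = 0.5, 0.0                      # Jeffreys-type non-informative prior
a_post, b_post = a0 + counts.sum(), b0 + len(counts)
rate = np.random.gamma(a_post, 1.0 / b_post, size=100_000)   # posterior samples
k = 3                                  # fragment number of interest
p_k = np.mean(np.exp(-rate) * rate**k / math.factorial(k))   # predictive P(N = k)

# --- Multinomial-Dirichlet ---
# Fragment numbers 1..6 treated as categories with a Dirichlet(1, ..., 1) prior.
categories = np.arange(1, 7)
alpha_post = 1.0 + np.array([(counts == c).sum() for c in categories])
p_cat = alpha_post / alpha_post.sum()  # posterior mean probability per category

print(f"Poisson-gamma predictive P(N={k}) ≈ {p_k:.3f}")
print("Dirichlet posterior mean probabilities:", np.round(p_cat, 3))
```

With only a handful of observations the two posteriors can differ noticeably, which mirrors the abstract's observation that the gap shrinks as more data arrive.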


Author(s): Berkcan Kapusuzoglu, Paromita Nath, Matthew Sato, Sankaran Mahadevan, Paul Witherell

Abstract This work presents a data-driven methodology for multi-objective optimization under uncertainty of process parameters in the fused filament fabrication (FFF) process. The proposed approach optimizes the process parameters with the objectives of minimizing the geometric inaccuracy and maximizing the filament bond quality of the manufactured part. First, experiments are conducted to collect data pertaining to the part quality. Then, Bayesian neural network (BNN) models are constructed to predict the geometric inaccuracy and bond quality as functions of the process parameters. The BNN model captures the model uncertainty caused by the lack of knowledge about model parameters (neuron weights) and the input variability due to the intrinsic randomness in the input parameters. Using the stochastic predictions from these models, different robustness-based design optimization formulations are investigated, wherein process parameters such as nozzle temperature, nozzle speed, and layer thickness are optimized under uncertainty for different multi-objective scenarios. Epistemic uncertainty in the prediction model and the aleatory uncertainty in the input are considered in the optimization. Finally, Pareto surfaces are constructed to estimate the trade-offs between the objectives. Both the BNN models and the effectiveness of the proposed optimization methodology are validated using actual manufacturing of the parts.
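
The following sketch illustrates, under assumptions, how a robustness-based objective can be built from stochastic model predictions; `predict_quality` is a stand-in for a trained Bayesian neural network, and the parameter bounds, weights, and coefficients are invented for demonstration.

```python
# Sketch: a generic robustness-based objective built from stochastic
# predictions, in the spirit of the formulation described above.
import numpy as np

rng = np.random.default_rng(0)

def predict_quality(params, n_samples=200):
    """Placeholder for BNN predictions: returns samples of
    (geometric inaccuracy, bond quality) reflecting model + input uncertainty."""
    temp, speed, layer = params
    geo = rng.normal(0.1 * layer + 0.001 * speed, 0.02, n_samples)
    bond = rng.normal(0.004 * temp - 0.002 * speed, 0.05, n_samples)
    return geo, bond

def robust_objective(params, k=2.0, w=0.5):
    """Mean +/- k*std robustness measure folded into one scalar:
    minimize geometric inaccuracy, maximize bond quality."""
    geo, bond = predict_quality(params)
    geo_term = geo.mean() + k * geo.std()     # penalize worst-case inaccuracy
    bond_term = bond.mean() - k * bond.std()  # reward conservatively high bond
    return w * geo_term - (1.0 - w) * bond_term

# Crude random search over assumed process-parameter bounds
# (nozzle temperature, nozzle speed, layer thickness).
candidates = [(rng.uniform(190, 240), rng.uniform(20, 80), rng.uniform(0.1, 0.3))
              for _ in range(500)]
best = min(candidates, key=robust_objective)
print("best (temperature, speed, layer thickness):", np.round(best, 3))
```

Sweeping the weight w (or the factor k) and keeping the non-dominated solutions would trace an approximate Pareto front between geometric inaccuracy and bond quality.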


Author(s): Jonas Busk, Peter Bjørn Jørgensen, Arghya Bhowmik, Mikkel N. Schmidt, Ole Winther, ...

Abstract Data-driven methods based on machine learning have the potential to accelerate the computational analysis of atomic structures. In this context, reliable uncertainty estimates are important for assessing confidence in predictions and enabling decision making. However, machine learning models can produce poorly calibrated uncertainty estimates, and it is therefore crucial to detect and handle uncertainty carefully. In this work, we extend a message passing neural network designed specifically for predicting properties of molecules and materials with a calibrated probabilistic predictive distribution. The method presented in this paper differs from previous work by considering both aleatoric and epistemic uncertainty in a unified framework, and by recalibrating the predictive distribution on unseen data. Through computer experiments, we show that our approach results in accurate models for predicting molecular formation energies with well-calibrated uncertainty, both in and out of the training data distribution, on two public molecular benchmark datasets, QM9 and PC9. The proposed method provides a general framework for training and evaluating neural network ensemble models that produce accurate predictions of molecular properties with well-calibrated uncertainty estimates.
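
As a generic, hedged illustration (not the authors' exact procedure), the snippet below shows one common way an ensemble of probabilistic regressors can separate aleatoric and epistemic uncertainty and combine them via the law of total variance; all numbers are invented.

```python
# Sketch: combining aleatoric and epistemic uncertainty from an ensemble of
# probabilistic regressors. Each member predicts a mean and a variance for a
# molecular property (e.g. formation energy). Values below are illustrative.
import numpy as np

# Predictions from M ensemble members for one molecule: (mean, variance).
means = np.array([-1.02, -0.98, -1.05, -1.00])      # eV, invented
variances = np.array([0.010, 0.012, 0.009, 0.011])  # aleatoric variance per member

mean_pred = means.mean()
aleatoric = variances.mean()       # average predicted noise variance
epistemic = means.var()            # spread of member means (model uncertainty)
total_var = aleatoric + epistemic  # law of total variance

print(f"prediction: {mean_pred:.3f} eV, "
      f"aleatoric var: {aleatoric:.4f}, epistemic var: {epistemic:.4f}, "
      f"total std: {np.sqrt(total_var):.3f} eV")

# A simple post-hoc recalibration could rescale the predicted std by a factor
# fitted on held-out data so that empirical coverage matches the nominal level.
```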


2021, pp. 875529302110552
Author(s): Mario Ordaz, Danny Arroyo

The current practice of Probabilistic Seismic Hazard Analysis (PSHA) includes the epistemic uncertainties involved in different parts of the analysis via the logic-tree approach. Given the complexity of modern PSHA models, numerous branches are needed, which in some cases raises concerns about computational performance. We introduce the use of a magnitude exceedance rate which, following Bayesian conventions, we call the predictive exceedance rate. This rate is the original Gutenberg-Richter relation after the effect of the epistemic uncertainty in the parameter β has been included. The predictive exceedance rate was first proposed by Campbell but, to the best of our knowledge, is seldom used in current PSHA. We show that the predictive exceedance rate is as accurate as the typical logic-tree approach but allows much faster computations, a very useful property given the complexity of some modern PSHA models.
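
A hedged numerical sketch of the idea, with assumed Gutenberg-Richter parameters, logic-tree branches, and a gamma distribution for β (none taken from the article):

```python
# Sketch: logic-tree vs. predictive magnitude exceedance rate for a simple
# (unbounded) Gutenberg-Richter model  lambda(m) = lambda0 * exp(-beta*(m - m0)).
# Branch values, weights, and gamma hyperparameters for beta are assumptions.
import numpy as np

lambda0, m0 = 5.0, 4.0                  # rate of m >= m0 events per year (assumed)
magnitudes = np.linspace(4.0, 8.0, 5)

# Logic-tree approach: discrete branches for beta with weights.
betas = np.array([1.6, 2.0, 2.4])
weights = np.array([0.3, 0.4, 0.3])
rate_tree = np.array([np.sum(weights * lambda0 * np.exp(-betas * (m - m0)))
                      for m in magnitudes])

# Predictive approach: integrate over a continuous (here gamma) distribution
# for beta; for a gamma(a, b) density the expectation has a closed form:
#   E[exp(-beta * x)] = (b / (b + x))**a.
a, b = 25.0, 12.5                        # gamma hyperparameters (mean beta = 2.0)
rate_pred = lambda0 * (b / (b + (magnitudes - m0)))**a

for m, rt, rp in zip(magnitudes, rate_tree, rate_pred):
    print(f"M>={m:.1f}: logic-tree {rt:.4f}/yr, predictive {rp:.4f}/yr")
```

Because the integral over β collapses to a closed form, the predictive rate is evaluated once per magnitude instead of once per branch, which illustrates where a computational saving of the kind described can come from.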

