Uncertainty Quantification of Complement Sensitivity Indices in Dynamic Computer Models

Author(s): Dorin Drignei, Zissimos Mourelatos, Zhen Hu

This paper addresses the sensitivity analysis of time-dependent computer models. Often, in practice, we partition the inputs into a subset of inputs relevant to the application studied and a complement subset of nuisance inputs that are not of interest. We propose sensitivity measures for the relevant inputs of such dynamic computer models. The subset of nuisance inputs is used to create replication-type information that helps quantify the uncertainty of the sensitivity measures (or indices) for the relevant inputs. The method is first demonstrated on an analytical example. We then apply it to an application concerning the safety of restraint systems in light tactical vehicles, where it indicates that chest deflection curves are more sensitive to the addition of pretensioners and load limiters than to the type of seatbelt.
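As a rough illustration of the replication idea, the sketch below (a hypothetical toy setup, not the authors' estimator) recomputes total-effect Sobol' indices for two relevant inputs of a small dynamic model at several random settings of a nuisance input; the spread of the indices across those settings then plays the role of replication-based uncertainty.

```python
# Hypothetical sketch: total Sobol' indices for the "relevant" inputs of a toy
# time-dependent model, recomputed at several random settings of the "nuisance"
# (complement) input to produce replication-type uncertainty estimates.
import numpy as np

rng = np.random.default_rng(0)

def model(x_rel, x_nui, t):
    # Toy dynamic model standing in for the computer model: the output is a curve over t.
    return np.sin(t * x_rel[0]) + x_rel[1] * t + 0.1 * x_nui[0] * np.cos(t)

def total_indices(f, d, n=2048):
    """Jansen-type total-effect Sobol' indices for a scalar-valued f on [0, 1]^d."""
    A = rng.random((n, d))
    B = rng.random((n, d))
    fA = np.array([f(a) for a in A])
    fB = np.array([f(b) for b in B])
    var = np.var(np.concatenate([fA, fB]))
    ST = np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]
        fABi = np.array([f(x) for x in ABi])
        ST[i] = 0.5 * np.mean((fA - fABi) ** 2) / var
    return ST

t = np.linspace(0.0, 1.0, 50)
replicates = []
for _ in range(10):                      # 10 random nuisance settings act as "replicates"
    x_nui = rng.random(1)
    f = lambda x_rel: model(x_rel, x_nui, t).mean()   # scalar summary of the output curve
    replicates.append(total_indices(f, d=2))

replicates = np.array(replicates)
print("total-effect index estimates:", replicates.mean(axis=0))
print("replication-based std. dev. :", replicates.std(axis=0))
```

Here the scalar summary is simply the time-averaged response; any other functional of the output curve could be substituted.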

2012, Vol 134 (8)
Author(s): Dorin Drignei, Zissimos P. Mourelatos

Computer, or simulation, models are ubiquitous in science and engineering. Two research topics in building computer models, generally treated separately, are sensitivity analysis and computer model calibration. In sensitivity analysis, one quantifies the effect of each input factor on the outputs, whereas in calibration, one finds the values of the input factors that provide the best match to a set of test data. In this article, we show a connection between these two seemingly separate concepts for problems with transient signals. We use global sensitivity analysis for computer models with transient signals to screen out inactive input factors, thus making the calibration algorithm numerically more stable. We show that the computer model does not vary with respect to parameters having zero total sensitivity indices, indicating that such parameters are impossible to calibrate and must be screened out. Because the computer model can be computationally intensive, we construct a fast statistical surrogate of the computer model, which is used for both sensitivity analysis and calibration. We illustrate our approach with both a simple example and an automotive application involving a road load data acquisition (RLDA) computer model.
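A minimal sketch of the screening-then-calibration workflow is given below, assuming a toy transient simulator and a Jansen-type estimator of total sensitivity indices; it is not the article's RLDA model or surrogate. Parameters whose total index is numerically zero are dropped before a least-squares calibration of the remaining ones.

```python
# Assumed toy setup: screen out calibration parameters with (near-)zero total
# Sobol' indices, then calibrate only the active ones against test data.
import numpy as np
from scipy.optimize import least_squares

rng = np.random.default_rng(1)
t = np.linspace(0.0, 1.0, 40)

def simulator(theta):
    # Toy transient signal; theta[2] is deliberately inert (zero total sensitivity).
    return theta[0] * np.sin(2 * np.pi * t) + theta[1] * t

def total_indices(d, n=1024):
    summary = lambda th: np.linalg.norm(simulator(th))   # scalar summary of the signal
    A, B = rng.random((n, d)), rng.random((n, d))
    fA = np.array([summary(a) for a in A])
    fB = np.array([summary(b) for b in B])
    var = np.var(np.concatenate([fA, fB]))
    ST = np.empty(d)
    for i in range(d):
        ABi = A.copy()
        ABi[:, i] = B[:, i]
        ST[i] = 0.5 * np.mean((fA - np.array([summary(x) for x in ABi])) ** 2) / var
    return ST

ST = total_indices(d=3)
active = ST > 1e-3                                       # screen out inactive parameters
print("total indices:", np.round(ST, 3), "active:", active)

# Calibrate only the active parameters against (synthetic) test data.
data = simulator(np.array([0.7, 0.3, 0.0])) + 0.01 * rng.standard_normal(t.size)
theta_fixed = np.full(3, 0.5)                            # inactive parameters stay at a nominal value

def residuals(theta_active):
    theta = theta_fixed.copy()
    theta[active] = theta_active
    return simulator(theta) - data

fit = least_squares(residuals, theta_fixed[active])
print("calibrated active parameters:", np.round(fit.x, 3))
```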


2012, Vol 12 (6), pp. 2003-2018
Author(s): A. Sarri, S. Guillas, F. Dias

Abstract. Due to the catastrophic consequences of tsunamis, early warnings need to be issued quickly in order to mitigate the hazard. In addition, there is a need to represent the uncertainty in predictions of tsunami characteristics that arises from uncertain trigger features (e.g., the position, shape, and speed of a landslide, or the sea floor deformation associated with an earthquake). Unfortunately, the computer models are expensive to run, which leads to significant delays in predictions and makes uncertainty quantification impractical. Statistical emulators run almost instantaneously and can represent the outputs of the computer model well. In this paper, we use the outer product emulator to build a fast statistical surrogate of a landslide-generated tsunami computer model. This Bayesian framework enables us to build the emulator by combining prior knowledge of the computer model's properties with a few carefully chosen model evaluations. The good performance of the emulator is validated using the leave-one-out method.
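Purely to illustrate the emulation and leave-one-out validation loop described above, the sketch below substitutes a generic Gaussian-process emulator (via scikit-learn) for the outer product emulator and an analytic placeholder for the tsunami code; all names and settings are assumptions, not the paper's.

```python
# Hedged sketch: generic GP emulator with leave-one-out validation, standing in
# for the outer product emulator; the "expensive model" is an analytic placeholder.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(2)

def expensive_model(x):
    # Placeholder for the landslide-generated tsunami simulator
    # (scalar output, e.g. maximum wave height at a gauge).
    return np.sin(3 * x[0]) + x[1] ** 2

X = rng.random((20, 2))                       # small, carefully chosen design
y = np.array([expensive_model(x) for x in X])

loo_errors = []
for i in range(len(X)):
    keep = np.arange(len(X)) != i             # leave design point i out
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.3), normalize_y=True)
    gp.fit(X[keep], y[keep])
    pred = gp.predict(X[i:i + 1])[0]
    loo_errors.append(pred - y[i])

print("LOO RMSE:", np.sqrt(np.mean(np.square(loo_errors))))
```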


2021, pp. 126268
Author(s): Menberu B. Meles, Dave C. Goodrich, Hoshin V. Gupta, I. Shea Burns, Carl L. Unkrich, ...

Algorithms, 2020, Vol 13 (7), pp. 162
Author(s): Marion Gödel, Rainer Fischer, Gerta Köster

Microscopic crowd simulation can help to enhance the safety of pedestrians in situations that range from museum visits to music festivals. To obtain a useful prediction, the input parameters must be chosen carefully. In many cases, a lack of knowledge or limited measurement accuracy adds uncertainty to the input. In addition, for meaningful parameter studies, we first need to identify the most influential parameters of our parametric computer models. The field of uncertainty quantification offers standardized and fully automated methods that we believe to be beneficial for pedestrian dynamics. In addition, many of these methods come at a comparatively low cost, even for computationally expensive problems, which allows them to be applied to larger scenarios. We aim to identify and adapt suitable methods for microscopic crowd simulation in order to explore their potential in pedestrian dynamics. In this work, we first perform a variance-based sensitivity analysis using Sobol’ indices and then cross-check the results with a derivative-based measure, the activity scores. We apply both methods to a typical scenario in crowd simulation, a bottleneck. Because constrictions can lead to high crowd densities and delays in evacuations, several experiments and simulation studies have been conducted for this setting. We show qualitative agreement between the results of both methods. Additionally, we identify a one-dimensional subspace in the input parameter space and discuss its impact on the simulation. Moreover, we analyze and interpret the sensitivity indices with respect to the bottleneck scenario.
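The toy example below (an assumed stand-in, not the crowd-simulation model) shows the two measures side by side: total Sobol' indices from a Jansen-type estimator, and activity scores computed from the eigendecomposition of the matrix of averaged outer products of the model gradient, restricted to a one-dimensional subspace.

```python
# Hedged sketch on a toy function: compare variance-based total Sobol' indices
# with derivative-based activity scores from the active-subspace matrix
# C = E[grad f grad f^T], keeping a one-dimensional subspace.
import numpy as np

rng = np.random.default_rng(3)
d = 3

def f(x):
    # Toy stand-in: x[0] dominates, x[1] is moderate, x[2] is nearly inert.
    return np.exp(0.9 * x[0] + 0.4 * x[1] + 0.01 * x[2])

def grad_f(x):
    return f(x) * np.array([0.9, 0.4, 0.01])

# Total Sobol' indices (Jansen estimator).
n = 4096
A, B = rng.random((n, d)), rng.random((n, d))
fA, fB = np.apply_along_axis(f, 1, A), np.apply_along_axis(f, 1, B)
var = np.var(np.concatenate([fA, fB]))
ST = np.empty(d)
for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]
    ST[i] = 0.5 * np.mean((fA - np.apply_along_axis(f, 1, ABi)) ** 2) / var

# Activity scores from the dominant eigenpairs of C.
grads = np.apply_along_axis(grad_f, 1, rng.random((n, d)))
C = grads.T @ grads / n
eigval, eigvec = np.linalg.eigh(C)
eigval, eigvec = eigval[::-1], eigvec[:, ::-1]           # sort in descending order
k = 1                                                    # one-dimensional subspace
activity = (eigvec[:, :k] ** 2) @ eigval[:k]

print("total Sobol' indices:", np.round(ST, 3))
print("activity scores     :", np.round(activity, 3))
```

For this construction the two measures rank the inputs in the same order, mirroring the qualitative agreement reported in the abstract.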


2021
Author(s): Michael Prime, Gavin Jones, Vicente Romero, Justin Winokur, Benjamin Schroeder
