Statistical emulation of a tsunami model for sensitivity analysis and uncertainty quantification

2012 · Vol. 12 (6) · pp. 2003–2018
Author(s): A. Sarri, S. Guillas, F. Dias

Abstract. Due to the catastrophic consequences of tsunamis, early warnings must be issued quickly in order to mitigate the hazard. In addition, the uncertainty in predicted tsunami characteristics arising from uncertain trigger features (e.g. the position, shape, and speed of a landslide, or the sea-floor deformation associated with an earthquake) needs to be represented. Unfortunately, computer models are expensive to run; this leads to significant delays in predictions and makes uncertainty quantification impractical. Statistical emulators run almost instantaneously and can represent the outputs of the computer model well. In this paper, we use the outer product emulator to build a fast statistical surrogate of a landslide-generated tsunami computer model. This Bayesian framework enables us to build the emulator by combining prior knowledge of the computer model's properties with a few carefully chosen model evaluations. The emulator's good performance is validated using the leave-one-out method.
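The paper's outer product emulator is not reproduced here; as an illustration of the general workflow the abstract describes (fit a fast surrogate to a few expensive model runs, then validate it by leave-one-out), the following is a minimal sketch using a generic zero-mean-adjusted Gaussian-process surrogate in place of the outer product emulator. The `simulator` function is a hypothetical toy stand-in for the tsunami model, and the kernel length scale is an arbitrary assumption.

```python
import numpy as np

# Toy stand-in for an expensive tsunami simulator: a smooth scalar
# output (say, peak wave height) of two landslide inputs in [0, 1].
def simulator(x):
    return np.sin(3.0 * x[0]) + 0.5 * x[1] ** 2

def rbf(A, B, length=0.3):
    # Squared-exponential covariance between two point sets.
    d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(axis=-1)
    return np.exp(-0.5 * d2 / length ** 2)

rng = np.random.default_rng(0)
X = rng.uniform(0.0, 1.0, size=(20, 2))   # design of 20 model runs
y = np.array([simulator(x) for x in X])

# Leave-one-out validation: refit the surrogate without run i,
# then predict the held-out output and record the error.
errors = []
for i in range(len(X)):
    m = np.arange(len(X)) != i            # leave run i out
    K = rbf(X[m], X[m]) + 1e-8 * np.eye(m.sum())   # small nugget
    k = rbf(X[i:i + 1], X[m])
    mu = y[m].mean()                      # constant prior mean
    pred = mu + (k @ np.linalg.solve(K, y[m] - mu))[0]
    errors.append(pred - y[i])

rmse = float(np.sqrt(np.mean(np.square(errors))))
```

A small leave-one-out RMSE relative to the output range is the kind of evidence of emulator adequacy the abstract refers to.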

2012 · Vol. 134 (8)
Author(s): Dorin Drignei, Zissimos P. Mourelatos

Computer, or simulation, models are ubiquitous in science and engineering. Two research topics in building computer models, generally treated separately, are sensitivity analysis and computer model calibration. In sensitivity analysis, one quantifies the effect of each input factor on the outputs, whereas in calibration, one finds the values of the input factors that provide the best match to a set of test data. In this article, we show a connection between these two seemingly separate concepts for problems with transient signals. We use global sensitivity analysis for computer models with transient signals to screen out inactive input factors, thus making the calibration algorithm numerically more stable. We show that the computer model does not vary with respect to parameters having zero total sensitivity indices, indicating that such parameters are impossible to calibrate and must be screened out. Because the computer model can be computationally intensive, we construct a fast statistical surrogate of the computer model, which is used for both sensitivity analysis and computer model calibration. We illustrate our approach with both a simple example and an automotive application involving a road load data acquisition (RLDA) computer model.
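The article's own estimators and surrogate are not given in the abstract; the sketch below only illustrates the screening idea it describes, using a standard Monte Carlo (Jansen-type) estimate of total Sobol' sensitivity indices for a transient output, aggregated over time. The `model` function is a hypothetical toy with three inputs, the third of which is deliberately inactive, so its total index comes out zero and it would be screened out before calibration.

```python
import numpy as np

# Toy transient model: a time signal driven by inputs x1 and x2;
# x3 is deliberately inactive (zero total sensitivity index).
def model(X, t):
    return X[:, [0]] * np.sin(t) + X[:, [1]] * np.cos(2.0 * t)

rng = np.random.default_rng(1)
n, d = 2000, 3
t = np.linspace(0.0, 1.0, 50)
A = rng.uniform(0.0, 1.0, (n, d))        # two independent sample blocks
B = rng.uniform(0.0, 1.0, (n, d))
YA = model(A, t)
var_t = YA.var(axis=0)                   # output variance at each time step

ST = []
for j in range(d):
    AB = A.copy()
    AB[:, j] = B[:, j]                   # resample only input j
    YAB = model(AB, t)
    num_t = 0.5 * np.mean((YA - YAB) ** 2, axis=0)   # Jansen estimator
    ST.append(num_t.sum() / var_t.sum()) # aggregate over the whole signal

# ST[2] is (numerically) zero: the model does not vary with x3,
# so x3 cannot be calibrated and is screened out.
```

In practice, as the abstract notes, `model` would itself be replaced by a cheap statistical surrogate of the expensive simulation before running such an analysis.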


Author(s): Dorin Drignei, Zissimos Mourelatos, Zhen Hu

This paper addresses the sensitivity analysis of time-dependent computer models. Often, in practice, we partition the inputs into a subset of inputs relevant to the application studied and a complementary subset of nuisance inputs that are not of interest. We propose sensitivity measures for the relevant inputs of such dynamic computer models. The subset of nuisance inputs is used to create replication-type information that helps quantify the uncertainty of the sensitivity measures (or indices) for the relevant inputs. The method is first demonstrated on an analytical example. We then apply it to the safety of restraint systems in light tactical vehicles. The method indicates that chest deflection curves are more sensitive to the addition of pretensioners and load limiters than to the type of seatbelt.


1997 · Vol. 36 (04/05) · pp. 237–240
Author(s): P. Hammer, D. Litvack, J. P. Saul

Abstract: A computer model of cardiovascular control has been developed, based on the response characteristics of cardiovascular control components derived from experiments in animals and humans. Results from the model were compared to those obtained experimentally in humans, and the similarities and differences were used to identify both the strengths and the inadequacies of the concepts used to form the model. The findings confirmed some concepts but contradicted others that are firmly held in the literature, indicating that understanding the complexity of cardiovascular control probably requires combining experiments with computer models that integrate multiple systems and allow sufficiency and necessity to be determined.

