Experimental Assessment, Model Validation, and Uncertainty Quantification of a Pilot-Scale Gasifier

2016 ◽  
Vol 55 (25) ◽  
pp. 6961-6970 ◽  
Author(s):  
M. Hossein Sahraei ◽  
Marc A. Duchesne ◽  
Robin W. Hughes ◽  
Luis A. Ricardez-Sandoval

2021 ◽  
Vol 191 ◽  
pp. 106891
Author(s):  
Ângelo Casaleiro ◽  
Rodrigo Amaro e Silva ◽  
Bruno Teixeira ◽  
João M. Serra

Author(s):  
George A. Hazelrigg ◽  
Georgia-Ann Klutke

Abstract: The purpose of this paper is not to present new results; rather, it is to show that the current approach to model validation is not consistent with the accepted mathematics of probability theory. Specifically, we argue that the Sandia V&V Challenge Problem is ill-posed in that the answers sought do not, mathematically, exist. We apply our arguments to show the types of mistakes present in the papers presented in the Journal of Verification, Validation and Uncertainty Quantification, Volume 1, along with the challenge problem. Further, we argue that, when the problem is properly posed, both the applicable methodology and the solution techniques are easily drawn from the well-developed mathematics of probability and decision theory. The unfortunate aspect of the challenge problem as currently stated is that it leads to incorrect and inappropriate mathematical approaches that should be avoided and corrected in the current literature.
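
The decision-theoretic framing the abstract points to can be illustrated with a minimal sketch: given a posterior over the uncertain quantity of interest, each candidate decision is scored by its expected utility and the maximizer is selected. The posterior, actions, and utility values below are illustrative assumptions, not quantities from the challenge problem.

```python
# Minimal sketch of expected-utility decision making under a posterior.
# All numbers and action names are assumed for illustration only.
import numpy as np

rng = np.random.default_rng(1)

# Posterior samples for an uncertain performance quantity theta (assumed form).
theta_posterior = rng.normal(loc=0.9, scale=0.1, size=10_000)

# Hypothetical actions and a utility function u(action, theta).
actions = {
    "deploy":   lambda th: np.where(th >= 1.0, 1.0, -5.0),  # payoff if spec met, penalty otherwise
    "redesign": lambda th: np.full_like(th, -0.5),           # fixed cost, no performance risk
}

# Expected utility of each action under the posterior; choose the maximizer.
expected_utility = {a: u(theta_posterior).mean() for a, u in actions.items()}
best_action = max(expected_utility, key=expected_utility.get)
print(expected_utility, "->", best_action)
```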


2016 ◽  
Vol 97 (2) ◽  
pp. 427-449
Author(s):  
Weston M. Eldredge ◽  
Pál Tóth ◽  
Laurie Centauri ◽  
Eric G. Eddings ◽  
Kerry E. Kelly ◽  
...  

2002 ◽  
Vol 128 (6) ◽  
pp. 522-532 ◽  
Author(s):  
Jae-Hong Kim ◽  
Jason L. Rennecker ◽  
Robert B. Tomiak ◽  
Benito J. Mariñas ◽  
Richard J. Miltner ◽  
...  

2012 ◽  
Vol 134 (3) ◽  
Author(s):  
Zhenfei Zhan ◽  
Yan Fu ◽  
Ren-Jye Yang ◽  
Yinghong Peng

Validation of computational models with multiple, repeated, and correlated functional responses for a dynamic system requires consideration of uncertainty quantification and propagation, multivariate data correlation, and objective, robust metrics. This paper presents a new method of model validation under uncertainty to address these critical issues. The three key technologies of the method are uncertainty quantification and propagation using statistical data analysis, probabilistic principal component analysis (PPCA), and interval-based Bayesian hypothesis testing. Statistical data analysis is used to quantify the variabilities of the repeated tests and of the computer-aided engineering (CAE) model results. The differences between the mean values of the test and CAE data are extracted as validation features, and PPCA is employed to handle multivariate correlation and to reduce the dimension of the multivariate difference curves. The variabilities of the repeated test and CAE data are propagated through the data transformation to the PPCA space. In addition, physics-based thresholds are defined and transformed to the PPCA space. Finally, interval-based Bayesian hypothesis testing is conducted on the reduced difference data to assess model validity under uncertainty. A real-world dynamic-system example, with one set of repeated test data and two stochastic CAE models, is used to demonstrate the new approach.
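
As a rough illustration of the workflow in this abstract, the sketch below quantifies run-to-run variability, reduces the correlated difference curves with scikit-learn's PCA (used here as a stand-in for PPCA, whose maximum-likelihood solution it matches), maps a pointwise physics-based threshold into the reduced space, and approximates the interval-based Bayesian test by Monte Carlo. All data shapes, threshold values, and prior odds are assumed for illustration and are not taken from the paper.

```python
# Minimal sketch of the validation workflow, with assumed data and thresholds.
import numpy as np
from sklearn.decomposition import PCA  # ML PCA coincides with the PPCA model

rng = np.random.default_rng(0)

# Hypothetical repeated test runs and stochastic CAE runs for one functional
# response: shape (n_repeats, n_time_points).
test_runs = rng.normal(loc=1.00, scale=0.05, size=(50, 200))
cae_runs  = rng.normal(loc=1.02, scale=0.04, size=(50, 200))

# 1) Validation features: difference curves between CAE and test results,
#    retaining run-to-run variability (paired here purely for illustration).
diff_samples = cae_runs - test_runs

# 2) PPCA-style reduction of the correlated difference curves.
ppca = PCA(n_components=2)
ppca.fit(diff_samples)                      # learn loadings from the difference curves
scores = diff_samples @ ppca.components_.T  # project raw differences so the mean bias is retained

# 3) Pointwise physics-based threshold (assumed +/-0.1 band) mapped
#    conservatively into score space via the absolute loadings.
threshold_curve = np.full(200, 0.1)
threshold_score = np.abs(ppca.components_) @ threshold_curve

# 4) Interval-based Bayesian test, Monte Carlo style:
#    H0 = the reduced difference lies inside the threshold interval.
inside = np.all(np.abs(scores) <= threshold_score, axis=1)
p_inside = inside.mean()
prior_odds = 1.0                            # assumed indifferent prior odds
posterior_odds = prior_odds * p_inside / max(1.0 - p_inside, 1e-12)
print(f"P(inside interval) = {p_inside:.2f}, posterior odds of validity = {posterior_odds:.2f}")
```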

