A Sequential Calibration and Validation Framework for Model Parameter Updating and Bias Correction

2021 ◽  
Author(s):  
Chen Jiang ◽  
Zhen Hu ◽  
Yixuan Liu ◽  
Zissimos Mourelatos ◽  
David Gorsich ◽  
...  

Abstract Model parameter updating and bias correction play an essential role in improving the validity of Modeling and Simulation (M&S) in engineering design and analysis. However, existing methods may either be misled by potentially wrong information if the computer model cannot adequately capture the underlying true physics, or be affected by the prior distributions of the unknown model parameters. In this paper, a sequential model calibration and validation (SeCAV) framework is proposed to improve the efficacy of both model parameter updating and model bias correction, in which model validation and Bayesian calibration are implemented in a sequential manner. In each iteration, the model validation assessment is employed as a filter to select the best experimental data for Bayesian calibration, and to update the prior distributions of the uncertain model parameters for the next iteration. The calibrated parameters are then integrated with model bias correction to improve the prediction accuracy of the M&S. A mathematical example demonstrates the advantages of the SeCAV method.
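The filter-then-calibrate iteration described in the abstract can be sketched as follows. This is an illustrative reconstruction, not the paper's implementation: the validation metric (distance of an observation from the prior-predictive mean), the Gaussian likelihood, and the `secav_step` helper are all assumptions.

```python
import numpy as np

def secav_step(theta_samples, model, experiments, sigma=0.1):
    """One SeCAV-style iteration (illustrative sketch, not the paper's code).

    1. Validation filter: score each experiment by how close it lies to the
       prior-predictive mean, and keep the best-supported one.
    2. Bayesian calibration: importance-weight the prior samples by a Gaussian
       likelihood of that experiment, then resample to form the next prior.
    """
    preds = np.array([[model(t, x) for (x, _) in experiments] for t in theta_samples])
    # Validation metric (stand-in for the paper's validation assessment):
    # absolute deviation of each observation from the prior-predictive mean.
    scores = [abs(y - preds[:, j].mean()) for j, (_, y) in enumerate(experiments)]
    best = int(np.argmin(scores))
    _, y_best = experiments[best]
    # Gaussian likelihood weights, then sequential importance resampling.
    w = np.exp(-0.5 * ((preds[:, best] - y_best) / sigma) ** 2)
    w /= w.sum()
    rng = np.random.default_rng(0)
    idx = rng.choice(len(theta_samples), size=len(theta_samples), p=w)
    return theta_samples[idx], best
```

The resampled parameter set would serve as the prior for the next iteration; the paper then combines the calibrated parameters with a bias-correction term for prediction.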


2020 ◽  
Vol 142 (8) ◽  
Author(s):  
Alaa Olleak ◽  
Zhimin Xi

Abstract There are significant quality and reliability problems for components/products made by additive manufacturing (AM), for various reasons. The selective laser melting (SLM) process is one of the popular AM techniques, and it likewise suffers from low quality and reliability. Among the many reasons, the lack of accurate and efficient models to simulate the SLM process may be the most important, because reliability and quality quantification rely on accurate models; without them, a large number of experiments must be conducted for reliability and quality assurance. To date, modeling techniques for the SLM process are either computationally expensive, based on finite element (FE) modeling, or economically expensive, requiring a significant amount of experiment data for data-driven modeling. This paper proposes the integration of FE and data-driven modeling within a systematic calibration and validation framework for the SLM process based on limited experiment data. The multi-fidelity models are an FE model of the SLM process and a machine learning model constructed from the FE model instead of real experiment data. The machine learning model, after incorporating the physics learned from the FE model, is then further improved with limited real experiment data through the calibration and validation framework. The proposed work enables the development of highly efficient and accurate models for melt pool prediction in the SLM process under various configurations. The effectiveness of the framework is demonstrated by real experiment data under 14 different printing configurations.
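The two-stage idea (abundant FE runs train a surrogate; a few real experiments fit a discrepancy term) can be sketched as below. The polynomial surrogate, the linear bias term, and the `fit_multifidelity` name are illustrative assumptions, not the paper's actual models.

```python
import numpy as np

def fit_multifidelity(fe_x, fe_y, exp_x, exp_y, degree=2):
    """Two-stage surrogate sketch in the spirit of the paper.

    Stage 1: fit a polynomial surrogate to abundant FE-model runs
    (fe_x, fe_y), learning the simulated physics.
    Stage 2: fit a low-order bias-correction term to the residuals on the
    few real experiments (exp_x, exp_y).
    Returns a callable predictor (e.g., of a melt-pool characteristic).
    """
    lf = np.poly1d(np.polyfit(fe_x, fe_y, degree))       # low-fidelity surrogate
    bias = np.poly1d(np.polyfit(exp_x, exp_y - lf(exp_x), 1))  # linear discrepancy
    return lambda x: lf(x) + bias(x)
```

The design choice here mirrors the abstract: the cheap FE-trained surrogate carries the physics, while the small experimental dataset only needs to identify a low-dimensional correction.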


2020 ◽  
Vol 12 (13) ◽  
pp. 2102
Author(s):  
Pari-Sima Katiraie-Boroujerdy ◽  
Matin Rahnamay Naeini ◽  
Ata Akbari Asanjan ◽  
Ali Chavoshian ◽  
Kuo-lin Hsu ◽  
...  

High-resolution, real-time, satellite-based precipitation estimation datasets can play an essential role in flood forecasting and infrastructure risk analysis. This is particularly true for extensive desert or mountainous areas with sparse rain gauge networks, such as Iran. However, there are discrepancies between these satellite-based estimates and ground measurements, and adjustment methods must be applied to reduce the systematic bias in these products. In this study, we apply a quantile mapping method with gauge information to reduce the systematic error of the Precipitation Estimation from Remotely Sensed Information using Artificial Neural Networks-Cloud Classification System (PERSIANN-CCS). Given the availability and quality of the ground-based measurements, we divide Iran into seven climate regions to increase the sample size for generating cumulative probability distributions within each region. The cumulative distribution functions (CDFs) are then employed with a quantile mapping 0.6° × 0.6° filter to adjust the values of PERSIANN-CCS. We use eight years (2009–2016) of historical data to calibrate our method, generating nonparametric cumulative distribution functions of ground-based measurements and satellite estimates for each climate region, and two years (2017–2018) of additional data to validate our approach. The results show that the bias correction approach improves PERSIANN-CCS data when aggregated to monthly, seasonal, and annual scales for both the calibration and validation periods. The areal averages of the annual bias and annual root mean square error are reduced by 98% and 56% during the calibration and validation periods, respectively. Furthermore, the averages of the bias and root mean square error of the monthly time series decrease by 96% and 26% during the calibration and validation periods, respectively. Bias correction remains limited in the southern Caspian Sea region because of shortcomings of the satellite-based products in recognizing orographic clouds.
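Quantile mapping with nonparametric CDFs, as described above, can be sketched with empirical distributions. This minimal version omits the paper's regional pooling and 0.6° × 0.6° filter, and the `quantile_mapping` helper is a hypothetical name.

```python
import numpy as np

def quantile_mapping(sat, gauge, sat_new):
    """Map satellite values onto the gauge distribution via empirical CDFs.

    sat, gauge: calibration-period samples (satellite estimates, gauge obs).
    sat_new:    satellite values to correct (e.g., validation period).
    """
    sat_sorted = np.sort(sat)
    # Empirical non-exceedance probability of each new satellite value
    # under the calibration-period satellite distribution.
    p = np.searchsorted(sat_sorted, sat_new, side="right") / len(sat_sorted)
    p = np.clip(p, 0.0, 1.0)
    # Invert the empirical gauge CDF at the same probabilities.
    return np.quantile(gauge, p)
```

For example, if the satellite systematically doubles every gauge value during calibration, the mapping sends a new satellite value back to roughly half its magnitude, matching the gauge distribution's quantile.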


2020 ◽  
Author(s):  
Hadush Meresa ◽  
Conor Murphy ◽  
Rowan Fealy ◽  
Saeed Golian

Abstract. The assessment of future climate change impacts is associated with a cascade of uncertainty linked to the modelling chain employed in assessing local-scale changes. Understanding and quantifying this cascade is essential to developing effective adaptation actions. We evaluate and quantify uncertainties in future flood quantiles associated with climate change for four Irish catchments, incorporating within our modelling chain the uncertainties associated with 12 Global Climate Models from the Coupled Model Intercomparison Project Phase 6, five different bias correction approaches, hydrological model parameter uncertainty, and the use of three different extreme value distributions for flood frequency analysis. Results indicate increased flood risk in all catchments under the different Shared Socioeconomic Pathways (SSPs), with changes in flooding related to changes in annual maximum precipitation. We use a sensitivity test based on the analysis of variance (ANOVA) to decompose the uncertainties, and their interactions, in estimating selected flood quantiles in the 2080s for each catchment. We find that the dominant sources of uncertainty vary between catchments, calling into question the ability to generalise about the importance of different components of the cascade of uncertainty in future flood risk. For two of our catchments, uncertainties associated with bias correction methods and extreme value distributions outweigh the uncertainty associated with the ensemble of climate models. For all catchments and flood quantiles examined, hydrological model parameter uncertainty is the least important component of our modelling chain, while the uncertainties derived from the interaction of components are substantial (>20 percent of overall uncertainty in two catchments). While our sample is small, there is evidence that the dominant components of the cascade of uncertainty may be linked to catchment characteristics and rainfall-runoff processes. Future work that further explores the dominant components of uncertainty as they relate to catchment characteristics may provide insight into identifying, a priori, the key components of modelling chains to include in climate change impact assessments.
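An ANOVA-based decomposition of this kind can be sketched for a small factorial ensemble of flood-quantile estimates. The three factors and the `anova_fractions` name below are illustrative assumptions (the study's chain also includes hydrological parameter uncertainty), and interactions are lumped into a single remainder term.

```python
import numpy as np

def anova_fractions(q):
    """First-order ANOVA variance fractions for a 3-factor factorial ensemble.

    q: array of shape (n_gcm, n_bc, n_dist) holding one flood-quantile
    estimate per combination of climate model, bias-correction method,
    and extreme value distribution.
    """
    grand = q.mean()
    total_ss = ((q - grand) ** 2).sum()
    fracs = {}
    for axis, name in enumerate(["gcm", "bias_correction", "evd"]):
        other = tuple(i for i in range(q.ndim) if i != axis)
        level_means = q.mean(axis=other)          # mean over the other factors
        n_per_level = q.size / q.shape[axis]      # cells behind each level mean
        fracs[name] = n_per_level * ((level_means - grand) ** 2).sum() / total_ss
    # Remainder attributable to interactions among the factors.
    fracs["interactions"] = 1.0 - sum(fracs.values())
    return fracs
```

A fraction near 1.0 for one factor would indicate that factor dominates the cascade of uncertainty for that catchment, while a large remainder signals substantial interaction effects, as reported for two of the four catchments.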


2019 ◽  
Vol 176 ◽  
pp. 19-32 ◽  
Author(s):  
C. Kale ◽  
P. Garg ◽  
B. Gholami Bazehhour ◽  
S. Srinivasan ◽  
M.A. Bhatia ◽  
...  

2020 ◽  
Vol 368 ◽  
pp. 113172 ◽  
Author(s):  
Chen Jiang ◽  
Zhen Hu ◽  
Yixuan Liu ◽  
Zissimos P. Mourelatos ◽  
David Gorsich ◽  
...  
