Modelling automotive warranty claims with build-to-sale data uncertainty

2008 ◽  
Vol 2 (3) ◽  
pp. 179 ◽  
Author(s):  
Andre Kleyner ◽  
Keith Sanborn
Author(s):  
Jarkko P. P. Jääskelä ◽  
Anthony Yates

Author(s):  
Mythili K. ◽  
Manish Narwaria

Quality assessment of audiovisual (AV) signals is important from the perspective of system design, optimization, and management of a modern multimedia communication system. However, automatic prediction of AV quality via the use of computational models remains challenging. In this context, machine learning (ML) appears to be an attractive alternative to the traditional approaches. This is especially true when such assessment needs to be made in a no-reference (i.e., the original signal is unavailable) fashion. While the development of ML-based quality predictors is desirable, we argue that proper assessment and validation of such predictors is also crucial before they can be deployed in practice. To this end, we raise some fundamental questions about the current approach of ML-based model development for AV quality assessment, and for signal processing in multimedia communication in general. We also identify specific limitations associated with the current validation strategy which have implications for the analysis and comparison of ML-based quality predictors. These include a lack of consideration of: (a) data uncertainty, (b) domain knowledge, (c) the explicit learning ability of the trained model, and (d) the interpretability of the resultant model. Therefore, the primary goal of this article is to shed some light on these factors. Our analysis and proposed recommendations are of particular importance in light of the significant interest in ML methods for multimedia signal processing (specifically in cases where human-labeled data is used), and the lack of discussion of these issues in the existing literature.
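The data-uncertainty point (a) can be illustrated with a toy validation check: instead of scoring a quality predictor by RMSE against mean opinion scores (MOS) alone, one can also ask how many predictions fall within the confidence interval implied by the spread of individual ratings. A minimal NumPy sketch, with all stimuli, ratings, and predictions invented for illustration (not from the article):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical subjective study: 8 AV stimuli, 15 raters each (values invented).
true_quality = np.array([1.5, 2.0, 2.8, 3.1, 3.6, 4.0, 4.4, 4.8])
ratings = true_quality[:, None] + rng.normal(0.0, 0.5, size=(8, 15))

mos = ratings.mean(axis=1)  # mean opinion score per stimulus
# 95% confidence interval half-width of the MOS (rating spread = data uncertainty)
ci = 1.96 * ratings.std(axis=1, ddof=1) / np.sqrt(ratings.shape[1])

# Hypothetical no-reference model predictions for the same stimuli.
pred = true_quality + rng.normal(0.0, 0.2, size=8)

rmse = np.sqrt(np.mean((pred - mos) ** 2))
inside_ci = np.abs(pred - mos) <= ci  # prediction within the rating uncertainty?

print(f"RMSE: {rmse:.3f}")
print(f"predictions within MOS 95% CI: {inside_ci.sum()} / {len(pred)}")
```

A predictor whose errors sit largely inside the rating confidence intervals is, in practice, indistinguishable from the noise floor of the subjective data, which a bare RMSE comparison cannot reveal.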


2014 ◽  
Vol 136 (3) ◽  
Author(s):  
Lei Shi ◽  
Ren-Jye Yang ◽  
Ping Zhu

In the literature, the Bayesian metric has been used to select the best available response surface. One major drawback of this method is the lack of a rigorous way to quantify data uncertainty, which is required as an input. In addition, the accuracy of any response surface is inherently unpredictable. This paper employs the Gaussian process based model bias correction method to quantify the data uncertainty and subsequently improve the accuracy of a response surface model. An adaptive response surface updating algorithm is then proposed for large-scale problems to select the best response surface. The proposed methodology is demonstrated on a mathematical example and then applied to a vehicle design problem.
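As a rough illustration of the bias-correction idea (a generic sketch, not the authors' implementation), one can fit a deliberately low-order response surface, model its residuals with a simple Gaussian process, and add the predicted bias back. A self-contained NumPy example with an invented test function, noise level, and kernel settings:

```python
import numpy as np

rng = np.random.default_rng(1)

def true_response(x):
    # Toy stand-in for an expensive simulation.
    return np.sin(3 * x) + 0.5 * x

# Training data with observation noise (the data uncertainty).
x_train = np.linspace(0.0, 2.0, 12)
y_train = true_response(x_train) + rng.normal(0.0, 0.05, x_train.size)

# Low-order polynomial response surface (deliberately biased).
coeffs = np.polyfit(x_train, y_train, 2)
surrogate = lambda x: np.polyval(coeffs, x)

# Gaussian process on the residuals: RBF kernel; the noise variance on the
# diagonal absorbs the data uncertainty.
def rbf(a, b, ell=0.3, sf=1.0):
    return sf**2 * np.exp(-0.5 * (a[:, None] - b[None, :])**2 / ell**2)

resid = y_train - surrogate(x_train)
K = rbf(x_train, x_train) + 0.05**2 * np.eye(x_train.size)
alpha = np.linalg.solve(K, resid)

x_test = np.linspace(0.0, 2.0, 50)
bias = rbf(x_test, x_train) @ alpha         # GP posterior mean of the bias
corrected = surrogate(x_test) + bias        # bias-corrected response surface

rmse_raw = np.sqrt(np.mean((surrogate(x_test) - true_response(x_test))**2))
rmse_corr = np.sqrt(np.mean((corrected - true_response(x_test))**2))
print(f"surrogate RMSE {rmse_raw:.3f} -> corrected {rmse_corr:.3f}")
```

The corrected model inherits the smooth global trend from the polynomial and lets the GP soak up the systematic discrepancy, which is the essence of model bias correction.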


2002 ◽  
Vol 69 (2) ◽  
pp. 239 ◽  
Author(s):  
Eric Ghysels ◽  
Norman R. Swanson ◽  
Myles Callan

2021 ◽  
Vol 11 (14) ◽  
pp. 6499
Author(s):  
Matthias Frankl ◽  
Mathieu Hursin ◽  
Dimitri Rochman ◽  
Alexander Vasiliev ◽  
Hakim Ferroukhi

Presently, a criticality safety evaluation methodology for the final geological disposal of Swiss spent nuclear fuel is under development at the Paul Scherrer Institute in collaboration with the Swiss National Technical Competence Centre in the field of deep geological disposal of radioactive waste. In essence, this methodology pursues a best estimate plus uncertainty approach and includes burnup credit. Burnup credit is applied by means of a computational scheme called BUCSS-R (Burnup Credit System for the Swiss Reactors–Repository case), which is complemented by the quantification of uncertainties from various sources. BUCSS-R consists of depletion, decay, and criticality calculations with CASMO5, SERPENT2, and MCNP6, respectively, determining the keff eigenvalues of the disposal canister loaded with the Swiss spent nuclear fuel assemblies. However, the depletion calculation in the first step and the criticality calculation in the third step, in particular, are subject to uncertainties in the nuclear data input. In previous studies, the effects of these nuclear data-related uncertainties on the obtained keff values, stemming from each of the two steps, were quantified independently. Both contributions to the overall uncertainty in the calculated keff values were therefore treated as fully correlated, leading to an overly conservative estimate of the total uncertainty. This study presents a consistent approach that eliminates the need to assume unrealistically strong correlations in the keff results: the nuclear data uncertainty quantification for both the depletion and the criticality calculation is performed at once, using one and the same set of perturbation factors for uncertainty propagation through the corresponding calculation steps of the evaluation method.
The present results reveal that the previous approach overestimated the nuclear data-related uncertainties, in particular for spent nuclear fuel with a high burnup, and underline the importance of consistent nuclear data uncertainty quantification methods. However, only canister loadings with UO2 fuel assemblies are considered, so the study offers no insight into potentially different trends in nuclear data-related uncertainties for mixed oxide fuel assemblies.
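The effect of sharing one set of perturbation factors across both calculation steps can be illustrated with a toy linear propagation model (all sensitivities and uncertainty values invented, not taken from BUCSS-R): when both steps respond to the same nuclear-data perturbations, sampling those perturbations once and propagating them jointly yields a smaller, more realistic spread than adding the two separately computed standard deviations as if they were fully correlated.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 50_000
sigma_nd = 0.02  # toy 2% relative uncertainty on each nuclear-data parameter

# Invented sensitivities of each calculation step to 3 nuclear-data parameters.
s_depl = np.array([0.3, -0.1, 0.2])   # depletion step (via isotopic inventory)
s_crit = np.array([0.1, 0.4, -0.05])  # criticality step (direct effect)

# Consistent approach: one and the same set of perturbation factors
# drives BOTH steps, so correlations are captured automatically.
d = rng.normal(0.0, sigma_nd, size=(n, 3))
keff = 0.95 * (1.0 + d @ (s_depl + s_crit))
sigma_consistent = keff.std()

# Previous approach: propagate each step separately, then add the two
# standard deviations as if the contributions were fully correlated.
sigma_sep = 0.95 * (np.linalg.norm(s_depl) + np.linalg.norm(s_crit)) * sigma_nd

print(f"consistent propagation:      {sigma_consistent:.5f}")
print(f"fully-correlated assumption: {sigma_sep:.5f}")
```

Because the two sensitivity vectors are not parallel, the fully-correlated sum strictly overestimates the spread obtained by joint sampling, mirroring the conservatism the study removes.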


2012 ◽  
Vol 27 (3) ◽  
pp. 1503-1510 ◽  
Author(s):  
Yang Wang ◽  
Wenyuan Li ◽  
Peng Zhang ◽  
Bing Wang ◽  
Jiping Lu
