Quantifying Uncertainty in Analytical Measurements

2018 ◽  
Vol 41 (2) ◽  
pp. 145-163 ◽  
Author(s):  
SB Rasul ◽  
A Monsur Kajal ◽  
AH Khan

In providing chemical, biochemical and agricultural materials testing services for quality specification, analytical chemists are increasingly required to address the fundamental issues related to the modern concepts of chemical metrology, such as Method Validation, Traceability and Uncertainty of Measurements. Without this knowledge, the results cannot be recognized as scientific facts with a defined level of acceptability. According to ISO/IEC 17025:2005, this is an essential requirement for all testing laboratories seeking competence to test materials for the desired purpose. Of these three concepts of chemical metrology, the most complex is the calculation of uncertainties from the different sources associated with a single measurement and their incorporation into the final result(s) as the expanded uncertainty (UE) with a defined level of reliability (e.g., at the 95% confidence level). In this paper the concepts and practice of uncertainty calculation in analytical measurements are introduced using the principles of statistics. The calculation procedure identifies the primary sources of uncertainty and quantifies their respective contributions to the total uncertainty of the final results. The calculations are performed using experimental data from lead (Pb) analysis in soil by GF-AAS and pesticide analysis in wastewater by GC-MS. The final result of the analytical measurement is expressed as: Result (mg/kg) = Measured Value of Analyte (mg/kg) ± Uncertainty (mg/kg), where the uncertainty is the parametric value associated with the individual steps in the measurement, such as sample weighing (Um), extraction of the analyte (Ue) (Pb from soil or pesticides from water), volumetry (Uv), concentration calibration (Ux), etc.
The propagation of these individual uncertainties from different sources is expressed as the combined relative uncertainty (Uc), which is calculated from the formula:

Uc/c = {(Ux/x)^2 + (Um/m)^2 + (Uv/v)^2 + (Ue/e)^2 + …}^(1/2)

The overall uncertainty associated with the final result of the analyte is expressed as the expanded uncertainty (UE) at a stated level of confidence (e.g. 95%). The expanded uncertainty is calculated by multiplying the combined uncertainty (Uc) by a coverage factor (K) chosen for the desired level of confidence. In general, for large data sets the level of confidence is taken as the 95% CL, for which K = 2. Hence, the final result of the analyte is expressed as: X ± UE (unit) at 95% CL, where UE = 2Uc.

Journal of Bangladesh Academy of Sciences, Vol. 41, No. 2, 145-163, 2017
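The combined and expanded uncertainty calculation described above can be sketched in a few lines of Python; the component values below are hypothetical illustrations, not the paper's data:

```python
import math

def combined_relative_uncertainty(components):
    """Root-sum-of-squares of the relative uncertainties u_i / x_i."""
    return math.sqrt(sum((u / x) ** 2 for u, x in components))

# Hypothetical Pb-in-soil budget: (uncertainty, measured value) pairs
components = [
    (0.0002, 1.0000),  # Um, m: sample weighing, g
    (0.05, 25.0),      # Uv, v: volumetry, mL
    (0.8, 42.0),       # Ux, x: concentration calibration, ug/L
    (0.01, 0.95),      # Ue, e: extraction recovery
]

c = 12.4                 # measured Pb concentration, mg/kg
uc_rel = combined_relative_uncertainty(components)
uc = c * uc_rel          # combined uncertainty Uc, mg/kg
ue = 2 * uc              # expanded uncertainty UE, K = 2 (~95% CL)
print(f"Result: {c:.1f} ± {ue:.1f} mg/kg at 95% CL")
```

Note how the largest relative component (here, the calibration term) dominates the root-sum-of-squares, which is why identifying the primary sources matters.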

2017 ◽  
Vol 68 (11) ◽  
pp. 2482-2487
Author(s):  
George Lazar ◽  
Claudiu Campureanu ◽  
Ioan Cirneanu ◽  
Danut Ionel Vaireanu

This paper presents the theoretical background as well as practical illustrations of good laboratory practice in conductivity measurements, ways to increase the accuracy of conductivity measurements, and how one may evaluate the uncertainty of conductivity measurements for electrolyte solutions. Practical measurements on prepared standards of 1 M KCl and 0.1 M KCl solutions are carried out, and the values of repeatability, combined uncertainty and expanded uncertainty are presented.


2021 ◽  
Author(s):  
Vu-Linh Nguyen ◽  
Mohammad Hossein Shaker ◽  
Eyke Hüllermeier

Abstract. Various strategies for active learning have been proposed in the machine learning literature. In uncertainty sampling, which is among the most popular approaches, the active learner sequentially queries the label of those instances for which its current prediction is maximally uncertain. The predictions as well as the measures used to quantify the degree of uncertainty, such as entropy, are traditionally of a probabilistic nature. Yet, alternative approaches to capturing uncertainty in machine learning, along with corresponding uncertainty measures, have been proposed in recent years. In particular, some of these measures seek to distinguish different sources and to separate different types of uncertainty, such as the reducible (epistemic) and the irreducible (aleatoric) part of the total uncertainty in a prediction. The goal of this paper is to elaborate on the usefulness of such measures for uncertainty sampling, and to compare their performance in active learning. To this end, we instantiate uncertainty sampling with different measures, analyze the properties of the sampling strategies thus obtained, and compare them in an experimental study.
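A minimal sketch of entropy-based uncertainty sampling, the probabilistic baseline the paper compares against (the class probabilities below are made up for illustration):

```python
import math

def entropy(probs):
    """Shannon entropy of a predictive class distribution (in nats)."""
    return -sum(p * math.log(p) for p in probs if p > 0)

def query_most_uncertain(pool_predictions):
    """Uncertainty sampling: pick the unlabeled instance whose
    predicted class distribution has maximal entropy."""
    return max(range(len(pool_predictions)),
               key=lambda i: entropy(pool_predictions[i]))

# Hypothetical predictions over 3 classes for 4 unlabeled instances
pool = [
    [0.90, 0.05, 0.05],
    [0.40, 0.35, 0.25],   # nearly uniform, hence most uncertain
    [0.70, 0.20, 0.10],
    [0.98, 0.01, 0.01],
]
print(query_most_uncertain(pool))  # → 1
```

Alternative measures, such as the epistemic/aleatoric decompositions studied in the paper, would replace `entropy` in the selection criterion while the query loop stays the same.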


2017 ◽  
Vol 17 ◽  
pp. 236-245
Author(s):  
V. V. Kozhevnikov

Today one of the priority problems is obtaining an accreditation certificate under the international standard ISO/IEC 17025:2006 for the measurement laboratories of the Expert Service subdivisions of the Ministry of Internal Affairs of Ukraine. One of the requirements imposed on accredited testing laboratories is the presence of an uncertainty estimation procedure and the ability to apply it. As ballistic research is one of the important directions of research carried out in the expert subdivisions, this paper is devoted to the question of uncertainty calculation in such measurements. In mathematical statistics, two types of parameters characterizing the dispersion of uncorrelated random variables are known: the root-mean-square deviation and the confidence interval. As characteristics of uncertainty they are applied under the names standard and expanded uncertainty. An elementary estimation of a measurement result and its uncertainty is carried out in the following order: description of the measured quantity; identification of uncertainty sources; quantitative description of the uncertainty constituents (the estimated uncertainty constituents can be obtained a posteriori or a priori); calculation of the standard uncertainty of each source, the total standard uncertainty and the expanded uncertainty. An a posteriori estimation is possible only when multiple observations of the measured quantity are carried out (type A standard uncertainty). An a priori estimation is carried out when multiple observations are not performed; in this case it is necessary to use information obtained from measurements performed before, from the passport data of the measuring equipment or from reference books (type B standard uncertainty).
This short consideration of the uncertainty concept and elucidation of the basic stages of estimating a measurement result and its uncertainty make it possible to turn the theoretical knowledge into practical application, illustrated by examples of measurement uncertainty calculation carried out during ballistic ammunition research in two different ways.
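The type A (a posteriori) and type B (a priori) evaluations described above can be sketched as follows; the velocity readings and the instrument limit are hypothetical:

```python
import math
import statistics

def type_a_standard_uncertainty(observations):
    """Type A: experimental standard deviation of the mean,
    s / sqrt(n), from repeated observations."""
    return statistics.stdev(observations) / math.sqrt(len(observations))

def type_b_from_limits(half_width):
    """Type B: rectangular distribution within ±a taken from
    passport data or reference books: u = a / sqrt(3)."""
    return half_width / math.sqrt(3)

# Hypothetical repeated muzzle-velocity observations, m/s
v = [812.1, 811.4, 813.0, 812.6, 811.9]
u_a = type_a_standard_uncertainty(v)
u_b = type_b_from_limits(0.5)          # ±0.5 m/s from instrument passport
u_c = math.sqrt(u_a**2 + u_b**2)       # total standard uncertainty
U = 2 * u_c                            # expanded uncertainty, k = 2 (~95%)
print(f"{statistics.mean(v):.1f} ± {U:.1f} m/s")
```

The divisor sqrt(3) is the standard rectangular-distribution assumption for type B components; a triangular or normal assumption would use a different divisor.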


2016 ◽  
Vol 20 (5) ◽  
pp. 1809-1825 ◽  
Author(s):  
Antoine Thiboult ◽  
François Anctil ◽  
Marie-Amélie Boucher

Abstract. Seeking more accuracy and reliability, the hydrometeorological community has developed several tools to decipher the different sources of uncertainty in relevant modeling processes. Among them, the ensemble Kalman filter (EnKF), multimodel approaches and meteorological ensemble forecasting proved to have the capability to improve upon deterministic hydrological forecast. This study aims to untangle the sources of uncertainty by studying the combination of these tools and assessing their respective contribution to the overall forecast quality. Each of these components is able to capture a certain aspect of the total uncertainty and improve the forecast at different stages in the forecasting process by using different means. Their combination outperforms any of the tools used solely. The EnKF is shown to contribute largely to the ensemble accuracy and dispersion, indicating that the initial conditions uncertainty is dominant. However, it fails to maintain the required dispersion throughout the entire forecast horizon and needs to be supported by a multimodel approach to take into account structural uncertainty. Moreover, the multimodel approach contributes to improving the general forecasting performance and prevents this performance from falling into the model selection pitfall since models differ strongly in their ability. Finally, the use of probabilistic meteorological forcing was found to contribute mostly to long lead time reliability. Particular attention needs to be paid to the combination of the tools, especially in the EnKF tuning to avoid overlapping in error deciphering.
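A minimal scalar version of the EnKF analysis step at the heart of the first tool (the streamflow values and observation below are invented for illustration):

```python
import random

def enkf_update(ensemble, obs, obs_err_std):
    """Stochastic EnKF analysis step for a scalar state: each member
    is nudged toward a perturbed observation, with the Kalman gain
    computed from the ensemble variance."""
    n = len(ensemble)
    mean = sum(ensemble) / n
    var = sum((x - mean) ** 2 for x in ensemble) / (n - 1)
    gain = var / (var + obs_err_std ** 2)
    return [x + gain * (obs + random.gauss(0.0, obs_err_std) - x)
            for x in ensemble]

random.seed(0)
# Hypothetical prior streamflow ensemble (m3/s) and a gauge observation
prior = [95.0, 102.0, 110.0, 98.0, 105.0]
posterior = enkf_update(prior, obs=100.0, obs_err_std=2.0)
```

Repeating this update as observations arrive corrects the initial conditions; the multimodel and meteorological-ensemble layers discussed in the abstract would then wrap around such a filter to cover structural and forcing uncertainty.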


2021 ◽  
Author(s):  
Henry Zumbrun

In the metrology community, there is an ongoing debate over which contributors to the Unit Under Test (UUT) belong in the expanded uncertainty calculation of the measurement process used for calibration. This is also known as Calibration Process Uncertainty (CPU); the CPU is the denominator when calculating a Test Uncertainty Ratio (TUR). This paper presents examples that illustrate why the best practices outlined in documents such as ILAC-P14:09/2020 and the ANSI/NCSLI Z540.3 Handbook should be followed regarding the contributors to the CPU. Instead of drafting their own test protocols and standards, calibration laboratories and manufacturers are advised to correctly calculate both uncertainty and risk. Performing these calculations is part of an ethical approach to calibration that avoids shifting more risk to industry and ultimately mitigates global consumers' risk. Furthermore, outdated approaches to calculations, such as the Test Accuracy Ratio (TAR), must be discontinued, and efforts to change the agreed-upon definition of the Test Uncertainty Ratio (TUR) should cease, since modern computing can provide measurements that are more accurate and reliable.


2020 ◽  
Vol 21 (3) ◽  
pp. 265-274
Author(s):  
Yu.V. Khomutinin ◽  
S.E. Levchuk ◽  
V.P. Protsak ◽  
V.O. Kashparov

Standard approaches to the construction of maps of radioactive contamination do not report the errors in the map data, so such maps do not, in fact, guarantee the accuracy of the mapped information. In this paper, based on the fact that the characteristics of radioactive contamination at a particular point in the territory follow a lognormal probability distribution, a methodology for creating maps with a guaranteed confidence level of the provided information is proposed and tested. Two ways of creating maps are considered: one based on the results of "direct" measurements of the radioactive contamination characteristics, and one based on a combination of "direct" and "indirect" measurements of values statistically related to the mapped characteristic. The approaches and kriging methods proposed in the article allow one to create maps with a given level of confidence and, accordingly, to take into account the risks caused by the uncertainty of the measurements of the radioactive contamination characteristics and the uncertainty of their approximation.
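Under the lognormal assumption above, a per-node map value with a guaranteed one-sided confidence level can be sketched as follows (the activity readings are invented, and this simple quantile bound stands in for the paper's kriging-based procedure):

```python
import math
import statistics

def lognormal_upper_bound(samples, z=1.645):
    """Upper one-sided bound for a lognormally distributed
    contamination value: estimate mean and spread in log space,
    then exponentiate. z = 1.645 gives ~95 % one-sided confidence."""
    logs = [math.log(s) for s in samples]
    mu = statistics.mean(logs)
    sigma = statistics.stdev(logs)
    return math.exp(mu + z * sigma)

# Hypothetical 137Cs activity measurements at one map node, kBq/m2
activity = [12.0, 15.5, 9.8, 13.2, 11.1]
print(round(lognormal_upper_bound(activity), 1))
```

Mapping this bound rather than the sample mean is what makes the map "guaranteed": the true value exceeds the plotted one only with the stated small probability.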


2019 ◽  
pp. 51-55
Author(s):  
A. Korobko ◽  
O. Nazarko

The article offers a new way of estimating the influence of random and methodical errors on the result of a measurement through a measurement uncertainty index. The ratio of the difference between theoretical and experimental data to the average error of their determination is proposed as the quantitative indicator of the influence of the methodical error. The ratio of the uncertainty in the measurement of experimental data to the uncertainty in measuring theoretical data is proposed as a quantitative measure of the effect of a random error. These indicators are based on the assumption that the theoretical and experimental data are normally distributed. The theoretical distribution varies within the total type B measurement uncertainty of the parameter under study. The physical essence of the indicator of the influence of the methodical error is the probability with which the results of measuring the average value of the indicator (determined experimentally) are within the limits of a possible deviation of the theoretical value of this indicator. Figures: 3. Tables: 1. References: 14.


2015 ◽  
Vol 12 (7) ◽  
pp. 7179-7223 ◽  
Author(s):  
A. Thiboult ◽  
F. Anctil ◽  
M.-A. Boucher

Abstract. Seeking more accuracy and reliability, the hydrometeorological community has developed several tools to decipher the different sources of uncertainty in relevant modeling processes. Among them, the Ensemble Kalman Filter, multimodel approaches and meteorological ensemble forecasting proved to have the capability to improve upon deterministic hydrological forecasts. This study aims at untangling the sources of uncertainty by studying the combination of these tools and assessing their contribution to the overall forecast quality. Each of these components is able to capture a certain aspect of the total uncertainty and improve the forecast at different stages in the forecasting process by using different means. Their combination outperforms any of the tools used solely. The EnKF is shown to contribute largely to the ensemble accuracy and dispersion, indicating that the initial condition uncertainty is dominant. However, it fails to maintain the required dispersion throughout the entire forecast horizon and needs to be supported by a multimodel approach to take into account structural uncertainty. Moreover, the multimodel approach contributes to improving the general forecasting performance and prevents this performance from falling into the model selection pitfall since models differ strongly in their ability. Finally, the use of probabilistic meteorological forcing was found to contribute mostly to long lead time reliability. Particular attention needs to be paid to the combination of the tools, especially in the Ensemble Kalman Filter tuning to avoid overlapping in error deciphering.

