Uncertainty Contribution
Recently Published Documents

TOTAL DOCUMENTS: 42 (FIVE YEARS: 14)
H-INDEX: 5 (FIVE YEARS: 1)

Author(s):  
Victor Merza ◽  
Christian Hranitzky ◽  
Andreas Steurer ◽  
Franz Josef Maringer

Abstract In this article, the ICRU/ICRP proposal that the ISO slab phantom should continue to be used as the calibration phantom for the new ICRU Report 95 operational quantity personal dose is substantiated by simulations and experiments determining backscatter factors on the ISO slab phantom and, for comparison, on an anthropomorphic Alderson Rando phantom. The scope of this work was restricted to the photon energy range of radiation qualities commonly used in X-ray diagnostics. For this purpose, a shadow-free diagnostic (SFD) ionization chamber was used to measure backscatter factors for X radiation in the energy range of 24 keV to 118 keV. The Monte Carlo code MCNP 6.2 was used to validate the measurement results on the ISO slab phantom. Additionally, the influence of varying the SFD position on the Rando phantom on the backscatter factor was determined. Since backscatter factors on the ISO slab phantom differ by no more than 5 % from those on the Rando phantom, it can be concluded that it is not necessary to develop a new phantom for calibrations in terms of personal dose. Varying the detector position by a few centimeters on the surface of the Rando phantom causes similarly large deviations and thus by itself represents an uncertainty contribution in practical personal dosimetry as large as that arising from the dissimilarity of the real human body to the ISO slab phantom.
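As a hedged illustration of the kind of comparison this abstract describes, the sketch below contrasts backscatter factors on the two phantoms and converts a positional spread into a standard uncertainty; all numerical values are hypothetical placeholders, not the paper's data.

```python
import numpy as np

# Hypothetical backscatter factors B(E) on the two phantoms
# (placeholder values, not the paper's data).
energy_keV = np.array([24, 48, 83, 118])
B_iso = np.array([1.10, 1.25, 1.35, 1.38])    # ISO slab phantom
B_rando = np.array([1.08, 1.21, 1.30, 1.33])  # Alderson Rando phantom

# Relative deviation between phantoms (the paper reports at most 5 %).
dev = (B_iso - B_rando) / B_rando
print(f"max phantom deviation: {100 * np.abs(dev).max():.1f} %")

# Spread of B when the SFD chamber position is varied by a few
# centimeters on the Rando surface, treated as a rectangular
# distribution and converted to a standard uncertainty.
B_positions = np.array([1.28, 1.30, 1.32, 1.29])  # hypothetical readings
u_position = (B_positions.max() - B_positions.min()) / (2 * np.sqrt(3))
print(f"positional standard uncertainty: {u_position:.4f}")
```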


2021 ◽  
Vol 12 (7) ◽  
pp. 1529
Author(s):  
Aleksandra Polyakova ◽  
Dmitry Zavyalov ◽  
Vladimir Kolmakov

Sensors ◽  
2021 ◽  
Vol 21 (18) ◽  
pp. 6133
Author(s):  
Alessandro Mingotti ◽  
Federica Costa ◽  
Diego Cavaliere ◽  
Lorenzo Peretto ◽  
Roberto Tinarelli

In recent years, the introduction of real-time simulators (RTS) has changed the way the power network is researched. In particular, researchers and system operators (SOs) can now simulate the complete network and make it interact with the real world thanks to the hardware-in-the-loop (HIL) and digital twin (DT) concepts. Such tools create countless scenarios in which the network can be tested and virtually monitored, for example, to predict and avoid faults or energy shortages. Furthermore, real-time monitoring of the network allows estimating the status of the electrical assets and, consequently, undertaking their predictive maintenance. The success of HIL and DT applications relies on the simulated network elements (cables, generation, accessories, converters, etc.) being correctly modeled and characterized. This is particularly true if the RTS acquisition capabilities are used to enable the HIL and the DT. To this end, this work emphasizes the role of a preliminary characterization of the virtual elements inside the RTS system, experimentally verifying how significantly they affect the overall performance. For this purpose, a virtual phasor measurement unit (PMU) is tested and characterized to understand its uncertainty contribution. To achieve that, the characterization of a virtual PMU calibrator is described first. Afterward, the virtual PMU calibration is performed, and the results clearly highlight its key role in the overall uncertainty. It can then be concluded that the characterization of the virtual elements, or models, inside RTS systems (omitted most of the time) is fundamental to avoiding wrong results. The same concepts can be extended to all fields that exploit HIL and DT capabilities.
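A minimal sketch of how a virtual PMU's error might be quantified, using the total vector error (TVE) figure of merit from the PMU standard (IEC/IEEE 60255-118-1); the phasor values below are hypothetical placeholders, and this is not the authors' actual test procedure.

```python
import numpy as np

def tve(est, ref):
    """Total vector error between an estimated and a reference phasor,
    the standard PMU figure of merit (IEC/IEEE 60255-118-1)."""
    return np.abs(est - ref) / np.abs(ref)

# Hypothetical reference phasor and virtual-PMU estimates (placeholders).
ref = 230.0 * np.exp(1j * np.deg2rad(0.0))
est = np.array([230.1 * np.exp(1j * np.deg2rad(0.02)),
                229.9 * np.exp(1j * np.deg2rad(-0.03))])

tve_values = tve(est, ref)
print("TVE [%]:", 100 * tve_values)

# The spread of repeated measurements gives the virtual PMU's type-A
# uncertainty contribution to the overall HIL/DT measurement chain.
print("type-A std of TVE [%]:", 100 * tve_values.std(ddof=1))
```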


Author(s):  
Matthew T. Spidell ◽  
Anna K. Vaskuri

To calibrate laser power and energy meters, the National Institute of Standards and Technology (NIST) uses several detector-based realizations of the scale for optical radiant flux; these realizations are appropriate for specific laser power/energy ranges and optical coupling configurations. Calibrations from 1 μW to 2 W are currently based upon calorimeters. Validation by comparisons against other primary representations of the optical watt over the last two decades suggests that the instruments operate well within their typical reported uncertainty level of 0.86 % with 95 % confidence. The dominant uncertainty contribution in the instrument, not previously recognized, is attributable to light scattered by the legacy window. The inherent electro-optical inequivalence in the calorimeter's response was reassessed by thermal modeling to be 0.03 %. The principal contributions to the overall inequivalence were corrected, yielding a shift in scale representation of under 0.2 % for typical calibrations. With updates to several uncertainty contributions resulting from this reassessment, the combined expanded uncertainty (k = 2) is 0.84 %, essentially unchanged from the previous result provided to calibration customers.
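For illustration, a root-sum-square combination of uncertainty components with a k = 2 coverage factor, the standard way an expanded uncertainty like the quoted 0.84 % is formed; the component names and values below are invented placeholders, not NIST's actual budget.

```python
import numpy as np

# Hypothetical uncertainty budget (relative standard uncertainties, %).
# Component names and values are illustrative placeholders; only the
# k = 2 combination rule reflects standard practice.
budget = {
    "window scatter":            0.30,
    "electro-optical inequiv.":  0.03,
    "electrical power standard": 0.10,
    "repeatability":             0.20,
}

u_c = np.sqrt(sum(u**2 for u in budget.values()))  # combined standard unc.
U = 2 * u_c                                        # expanded, k = 2 (~95 %)
print(f"combined standard uncertainty: {u_c:.2f} %")
print(f"expanded uncertainty (k=2):    {U:.2f} %")
```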


ACTA IMEKO ◽  
2020 ◽  
Vol 9 (5) ◽  
pp. 319
Author(s):  
W. Sabuga ◽  
A. S. Hashad ◽  
S. Ehlers

A 2D flow model is described for calculating the effective area (A) of pressure-measuring piston-cylinder units (PCUs) from their dimensional properties. With the 2D model, the uncertainty contribution associated with a PCU's deviation from axial symmetry can be eliminated and the uncertainty of A can be reduced. The 2D model is applied to several primary PCUs operated in absolute and gauge pressure modes with different pressure-transmitting media. The benefit of the 2D model as a function of the PCU's geometrical perfection is discussed.
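As a rough illustration (assuming the classical axisymmetric theory that the paper's 2D model generalizes), the sketch below evaluates the first-order effective area A0 = π·r_p·r_c and a simple profile-averaged variant; the radii and profiles are hypothetical, and the plain z-average is a crude stand-in for the pressure-weighted Dadson integral.

```python
import numpy as np

# First-order effective area of an ideal piston-cylinder unit:
# A0 = pi * r_p * (r_p + h) = pi * r_p * r_c  (piston area plus half
# the annular gap area), per the classical axisymmetric theory.
r_p = 5.000_000e-3  # piston radius / m (hypothetical)
r_c = 5.000_500e-3  # cylinder bore radius / m (hypothetical)
A0 = np.pi * r_p * r_c
print(f"A0   = {A0:.9e} m^2")

# With measured radius profiles r_p(z), r_c(z) along the engagement
# length, a 1D model averages the local product; this plain average is
# a crude stand-in for the pressure-weighted Dadson integral.
z = np.linspace(0.0, 25e-3, 101)
rp_z = r_p + 50e-9 * np.sin(2 * np.pi * z / z[-1])  # hypothetical profile
rc_z = r_c + 30e-9 * np.cos(2 * np.pi * z / z[-1])
A_1d = np.pi * np.mean(rp_z * rc_z)
print(f"A_1d = {A_1d:.9e} m^2")
```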


2020 ◽  
Vol 2020 ◽  
pp. 1-21
Author(s):  
Yizhen Wang ◽  
Menglei Cui ◽  
Jiong Guo ◽  
Jinlin Niu ◽  
Yingjie Wu ◽  
...  

Uncertainty analyses of fission product yields are indispensable for evaluating the credibility of reactor burnup and decay heat calculations. Compared with neutron cross sections, fewer uncertainty analyses have been conducted for fission yields, and the topic remains controversial owing to the lack of a properly estimated covariance matrix and an adequate sampling method. Specifically, the conventional normal-based sampling method inevitably generates nonphysical negative samples when sampling independent fission yields with large uncertainties, and cutting off these samples introduces bias into the uncertainty results. Here, we evaluate the covariance matrix of thermal neutron-induced U-235 independent fission yields by the Bayesian updating method, and we then use a lognormal-based sampling method to overcome the negative-sample issue. The fission yields' uncertainty contributions to the effective multiplication factor and to several fission products' atomic densities at the equilibrium core of a pebble-bed HTGR are quantified and investigated. The results show that the lognormal-based sampling method prevents the generation of negative yield samples and maintains the skewness of the fission yields distribution. Compared with the zero cut-off normal-based sampling method, the lognormal-based sampling method evaluates larger uncertainties for the effective multiplication factor and the atomic densities. This implies that the zero cut-off effect in the normal-based sampling method underestimates the fission yields' uncertainty contribution. Therefore, adopting the lognormal-based sampling method is crucial for providing reliable results in fission product yields uncertainty analysis.
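A minimal sketch of the lognormal sampling idea the abstract describes: the lognormal parameters are chosen so its first two moments match the evaluated mean and standard deviation, guaranteeing positive samples, and the comparison shows the bias introduced by cutting negative normal samples. The yield value is a hypothetical placeholder.

```python
import numpy as np

rng = np.random.default_rng(42)

def sample_yield(mean, std, n):
    """Sample a fission yield from a lognormal distribution whose first
    two moments match the evaluated mean and standard deviation, so all
    samples are strictly positive (no zero cut-off bias)."""
    sigma2 = np.log(1.0 + (std / mean) ** 2)
    mu = np.log(mean) - 0.5 * sigma2
    return rng.lognormal(mu, np.sqrt(sigma2), n)

# Hypothetical independent yield with large relative uncertainty.
mean, std, n = 1.0e-4, 8.0e-5, 100_000
ln_samples = sample_yield(mean, std, n)

# Naive normal sampling with negative samples cut off at zero biases
# both the mean and the variance of the retained ensemble.
nrm = rng.normal(mean, std, n)
nrm_cut = nrm[nrm > 0.0]

print("lognormal:  mean %.3e, std %.3e" % (ln_samples.mean(), ln_samples.std()))
print("cut normal: mean %.3e, std %.3e, kept %d %%"
      % (nrm_cut.mean(), nrm_cut.std(), 100 * nrm_cut.size // n))
```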


2020 ◽  
Author(s):  
Mostafa Tarek ◽  
François Brissette ◽  
Richard Arsenault

Abstract. Climate change impact studies typically require a reference climatological dataset providing a baseline period to assess future changes. The reference dataset is also used to perform bias correction of climate model outputs. Various reliable precipitation datasets are now available over regions with a high-density network of weather stations, such as most parts of Europe and the United States. In many of the world's regions, however, the low density of observation stations (or lack thereof) renders gauge-based precipitation datasets highly uncertain. Satellite, reanalysis, and merged products can be used to overcome this limitation, but each dataset brings additional uncertainty to the reference climate. This study compares ten precipitation datasets over 1091 African catchments to evaluate the dataset uncertainty contribution in climate change studies. The precipitation datasets include two gauge-only products (GPCC, CPC), four satellite products (TRMM, CHIRPS, PERSIANN-CDR and TAMSAT) corrected using ground-based observations, three reanalysis products (ERA5, ERA-I, and CFSR) and one merged product of gauge, satellite, and reanalysis data (MSWEP).

Each of these datasets was used to assess changes in future streamflows. The climate change impact study used a top-down modelling chain with 10 CMIP5 GCMs under RCP8.5. Each climate projection was bias-corrected and fed to a lumped hydrological model to generate future streamflows over the 2071-2100 period. A variance decomposition was performed to compare GCM uncertainty and reference dataset uncertainty for 51 streamflow metrics over each catchment. Results show that dataset uncertainty is much larger than GCM uncertainty for most of the streamflow metrics and over most of Africa. A selection of the best-performing reference datasets (credibility ensemble) significantly reduced the uncertainty attributed to datasets, although it remained comparable to that of the GCMs in most cases. Results also show that relatively small differences between datasets over a reference period can propagate into large amounts of uncertainty in the future climate.
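A hedged sketch of a two-way, ANOVA-style variance decomposition of the kind the abstract describes, partitioning the spread of one streamflow metric across reference datasets versus GCMs; the data matrix is random placeholder input, not the study's results.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical matrix of one streamflow metric (e.g. change in mean
# annual flow) for 10 reference datasets x 10 GCMs at one catchment.
n_ds, n_gcm = 10, 10
metric = rng.normal(0.0, 1.0, (n_ds, n_gcm))  # placeholder data

# Two-way decomposition: variance of the row (dataset) means vs. the
# column (GCM) means, with the rest assigned to interaction/residual.
grand = metric.mean()
var_dataset = metric.mean(axis=1).var(ddof=1)  # spread across datasets
var_gcm = metric.mean(axis=0).var(ddof=1)      # spread across GCMs
var_resid = (metric
             - metric.mean(axis=1, keepdims=True)
             - metric.mean(axis=0, keepdims=True)
             + grand).var(ddof=1)

total = var_dataset + var_gcm + var_resid
for name, v in [("dataset", var_dataset), ("GCM", var_gcm),
                ("interaction/residual", var_resid)]:
    print(f"{name:>20}: {100 * v / total:.1f} % of variance")
```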

