calibration program
Recently Published Documents


TOTAL DOCUMENTS

87
(FIVE YEARS 11)

H-INDEX

12
(FIVE YEARS 0)

2022 ◽  
Vol 17 (01) ◽  
pp. C01030
Author(s):  
D. Durnford ◽  
M.-C. Piro

Abstract Bubble chambers using liquid xenon have been operated, and chambers using liquid argon are planned, by the Scintillating Bubble Chamber (SBC) collaboration for GeV-scale dark matter searches and the detection of CEνNS from reactors. These will require a robust calibration program for the nucleation efficiency of low-energy nuclear recoils in the target media. Such a program has been carried out by the PICO collaboration, which aims to directly detect dark matter using C3F8 bubble chambers. Neutron calibration data from mono-energetic neutron beams and an Am-Be source have been collected and analyzed, leading to a global fit of a generic nucleation efficiency model for carbon and fluorine recoils at thermodynamic thresholds of 2.45 and 3.29 keV. Fitting the many-dimensional model (34 free parameters) to the data is a non-trivial computational challenge, addressed with a custom Markov Chain Monte Carlo approach, which is presented here. Parametric MC studies undertaken to validate this methodology are also discussed. The fit paradigm demonstrated for the PICO calibration will be applied to existing and future scintillating bubble chamber calibration data.
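The collaboration's custom MCMC machinery is not reproduced in the abstract; as a rough illustration of the general idea, the sketch below fits a two-parameter sigmoid nucleation-efficiency curve to synthetic bubble counts with a random-walk Metropolis sampler. All functions, parameter values, and data here are invented for illustration and are not the PICO fit.

```python
# Hypothetical sketch: Metropolis MCMC fit of a sigmoid nucleation-efficiency
# curve to synthetic calibration counts (not the PICO collaboration's code).
import numpy as np

rng = np.random.default_rng(0)

def efficiency(E, E50, width):
    """Sigmoid nucleation efficiency vs. recoil energy E (keV)."""
    return 1.0 / (1.0 + np.exp(-(E - E50) / width))

# Synthetic "calibration data": trials and observed bubble counts per energy bin.
E_bins = np.linspace(1.0, 10.0, 10)
true_eff = efficiency(E_bins, E50=3.3, width=0.8)
n_trials = np.full_like(E_bins, 500, dtype=int)
n_obs = rng.binomial(n_trials, true_eff)

def log_posterior(params):
    E50, width = params
    if width <= 0:            # flat prior with a positivity constraint
        return -np.inf
    p = np.clip(efficiency(E_bins, E50, width), 1e-9, 1 - 1e-9)
    # Binomial log-likelihood (constant terms dropped).
    return np.sum(n_obs * np.log(p) + (n_trials - n_obs) * np.log(1 - p))

# Random-walk Metropolis sampling.
chain = []
x = np.array([5.0, 1.5])      # starting guess for (E50, width)
lp = log_posterior(x)
for _ in range(20000):
    prop = x + rng.normal(scale=0.05, size=2)
    lp_prop = log_posterior(prop)
    if np.log(rng.uniform()) < lp_prop - lp:   # Metropolis accept/reject
        x, lp = prop, lp_prop
    chain.append(x.copy())

burned = np.array(chain[5000:])               # discard burn-in
E50_est = burned[:, 0].mean()
print(round(E50_est, 2))  # should land near the true threshold of 3.3 keV
```

The real fit has 34 free parameters rather than 2, which is why the collaboration needed a carefully tuned sampler and parametric MC validation, but the accept/reject core is the same.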


2021 ◽  
Vol 2136 (1) ◽  
pp. 012004
Author(s):  
Lei Cheng ◽  
Yang Su ◽  
Lifang Wang

Abstract Remote calibration operates in real time across distant locations, allowing the calibration method to be controlled remotely for the instrument under calibration; it is an effective way to improve calibration efficiency. This paper studies remote calibration methods for torque sensors. Because traditional calibration technology suffers from long cycles and low efficiency, remote calibration technology is combined here with automatic torque sensor calibration. The basic methods and characteristics of remote calibration technology are studied; based on torque-meter calibration practice and the characteristics of the torque sensor, a suitable calibration program and method are selected, which addresses the technical problems in the system design and analysis of remote torque sensor calibration. In addition, electromagnetic interference is analyzed and a method to suppress it is proposed. Finally, the remote calibration technology for the torque sensor is validated and improved through calibration experiments during system debugging.


2021 ◽  
Vol 97 ◽  
pp. 1-196
Author(s):  
Caroline Wickham-Jones ◽  
Richard Bates ◽  
Alison Cameron ◽  
Ann Clarke ◽  
Diane Collinson ◽  
...  

This volume presents the results of archaeological fieldwork undertaken along the River Dee, Aberdeenshire, north-east Scotland, by the Mesolithic Deeside voluntary community archaeology group between 2017 and 2019. A total of 42 fields were investigated, from which over 11,000 lithics were recovered, representing at least 15 archaeological sites and a span of human activity covering some 10,000 years from around 12,000 BC to c 2000 BC. Finds from the Late Upper Palaeolithic, Mesolithic, Neolithic and Bronze Age were present. Work comprised fieldwalking, test pitting, specialist analysis, and small-scale excavation. The investigation described here is significant not just for the light it throws on the early prehistoric populations along the River Dee but also for the methodology by which investigation was undertaken, as this provides a potential model for work in other areas. Both aspects are covered in the report. The River Dee flows between postglacial gravel and sand terraces, the structure of which has played an important role for the early settlers of the area, and this is covered in some detail in order to provide the physical background framework for the sites. There are also sections on more specialised geophysical and geoscience techniques where these were undertaken, together with a summary of research on the palaeoenvironmental conditions throughout the millennia of prehistory. The artefactual evidence comprises lithic assemblages which were all catalogued as fieldwork progressed; the contents of each site are presented, together with more detailed analysis of the finds from test pitted sites. Finally, given the rich archaeological record from the area, the results of the present project are set into the wider context of the evidence for prehistoric settlement along the river, and there is consideration of future directions for further fieldwork. 
While all authors have contributed to the whole volume, individual sections that present specialist work by specific teams have been attributed. The distribution maps and GIS are the work of Irvine Ross. Dates given are calibrated BC dates. The Nethermills Farm NM4 dates are calibrated using the Oxford Radiocarbon Accelerator Unit calibration program OxCal 4 (Bronk Ramsey 2009) and their date ranges are calibrated using the IntCal13 atmospheric calibration curve (Reimer et al 2013). Optically stimulated luminescence (OSL) was used to profile sediment accumulations on some of the sites and obtain information relating to site formation, but it was not used for dating in any of the projects.


Water ◽  
2021 ◽  
Vol 13 (21) ◽  
pp. 3061
Author(s):  
Daniel Philippus ◽  
Jordyn M. Wolfand ◽  
Reza Abdi ◽  
Terri S. Hogue

While automatic calibration programs exist for many hydraulic models, no user-friendly and broadly reusable automatic calibration system currently exists for steady-state HEC-RAS models. This study highlights development of Raspy-Cal, an automatic HEC-RAS calibration program based on a genetic algorithm and implemented in Python. It includes a graphical user interface and an interactive command-line interface, as well as libraries readily usable by other programs. As a case study, Raspy-Cal was used to calibrate a model of the Los Angeles River in California and its two major tributaries. We found that Raspy-Cal matched the accuracy of manual calibrations in much less time and without manual intervention, producing a Nash–Sutcliffe Efficiency of 0.89 or greater within several hours when run for 100 iterations. Our analysis showed that the open-source freeware facilitates fast and precise calibration of HEC-RAS models and could serve as a basis for future software development. Raspy-Cal is available online in source and executable form as well as through the Python Package Index.
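Raspy-Cal's actual interfaces are not described in the abstract; the following is a minimal sketch of the genetic-algorithm idea behind such a calibrator, with a toy stage-discharge function standing in for a HEC-RAS run. All names, constants, and the model itself are hypothetical.

```python
# Hypothetical sketch of genetic-algorithm calibration in the spirit of
# Raspy-Cal (not its actual API): evolve a Manning's roughness coefficient n
# so a toy stage model matches "observed" water-surface elevations.
import random

random.seed(42)

flows = [10.0, 40.0, 90.0]    # discharges (m^3/s)
true_n = 0.035                # roughness used to fabricate observations

def stage(n, q):
    """Toy stage-discharge relation standing in for a HEC-RAS simulation."""
    return (n * q) ** 0.6

observed = [stage(true_n, q) for q in flows]

def fitness(n):
    """Negative sum of squared stage errors (higher is better)."""
    return -sum((stage(n, q) - h) ** 2 for q, h in zip(flows, observed))

# Basic GA: truncation selection, blend crossover, Gaussian mutation.
pop = [random.uniform(0.01, 0.10) for _ in range(30)]
for generation in range(60):
    parents = sorted(pop, key=fitness, reverse=True)[:10]
    children = []
    while len(children) < len(pop):
        a, b = random.sample(parents, 2)
        child = 0.5 * (a + b)                  # blend crossover
        child += random.gauss(0, 0.002)        # mutation
        children.append(min(max(child, 0.01), 0.10))  # keep n in bounds
    pop = children

best = max(pop, key=fitness)
print(round(best, 3))   # converges toward the "true" roughness of 0.035
```

In the real tool each fitness evaluation is a full steady-state HEC-RAS run scored against gauge data (e.g. by Nash-Sutcliffe Efficiency), which is what makes automating the loop worthwhile.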


2021 ◽  
Author(s):  
Mark Kuster

Where practicable, the total end-to-end test-and-calibration program cost would serve as the ultimate measurement quality metric (MQM). Total cost includes both the capitalization and ongoing costs that support product quality (sometimes called cost of quality) and the consequence costs (sometimes called cost of poor quality) that result from imperfect measurement and products. End-to-end means capturing costs from the entire traceability chain: from measurement standards to end products. Minimizing this MQM, the total end-to-end cost (TETEC), equates to optimizing measurement quality assurance (MQA). Lacking easily available measurement and performance data automatically fed to modeling software, organizations have found cost metrics unimaginable or impracticable, so their measurement programs instead target more easily computed MQMs, such as false-accept risk or simpler proxies thereof, setting minimum, but sub-optimal, quality levels. However, modern computing systems and software, such as laboratory management systems with testpoint-level traceability, rapidly approach the point at which the TETEC MQM will become practicable. Preparing for this eventuality, the NCSLI 173 Metrology Practices Committee has developed models that relate costs to measurement program information such as product specifications, test and measurement uncertainties, calibration intervals, and reliability targets. Applications include not only optimizing overall program MQA but also estimating the value of metrology and return on equipment investments, selecting instruments, designing test and calibration processes, and designing products. This paper applies the cost models to case studies and examples to illustrate some applications.
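The committee's models are not given in the abstract; purely as an illustration of the trade-off it describes, the sketch below balances a calibration cost that grows as test uncertainty shrinks against a consequence cost driven by false-accept risk, and picks the candidate uncertainty with the lowest total cost. Every constant and the risk proxy are invented for illustration.

```python
# Purely illustrative sketch (not the NCSLI 173 committee's actual models):
# total cost = calibration cost (rises as test uncertainty u shrinks)
#            + consequence cost (rises with false-accept risk, which grows with u).
import math

SPEC = 1.0                  # product tolerance limit (arbitrary units)
PROCESS_SIGMA = 0.4         # spread of true product values about nominal
CAL_COST_SCALE = 50.0       # hypothetical cost of achieving uncertainty u
CONSEQUENCE_COST = 2.0e5    # hypothetical cost of one shipped bad unit

def norm_cdf(x):
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def false_accept_risk(u, n=2001):
    """P(true value out of tolerance AND measured value in tolerance),
    integrated numerically over a normal process distribution."""
    lo, hi = -4 * PROCESS_SIGMA, 4 * PROCESS_SIGMA
    dx = (hi - lo) / n
    risk = 0.0
    for i in range(n):
        t = lo + (i + 0.5) * dx                  # a candidate true value
        if abs(t) <= SPEC:
            continue                             # in tolerance: not a false accept
        p_meas_in = norm_cdf((SPEC - t) / u) - norm_cdf((-SPEC - t) / u)
        density = math.exp(-0.5 * (t / PROCESS_SIGMA) ** 2) / (
            PROCESS_SIGMA * math.sqrt(2 * math.pi))
        risk += density * p_meas_in * dx
    return risk

def total_cost(u):
    return CAL_COST_SCALE / u + CONSEQUENCE_COST * false_accept_risk(u)

candidates = [0.02 + 0.01 * k for k in range(30)]   # candidate test uncertainties
best_u = min(candidates, key=total_cost)
print(round(best_u, 2))
```

Targeting false-accept risk alone (the easier MQM the abstract mentions) would fix a risk ceiling and stop there; minimizing the total cost instead lets the consequence and calibration costs find their own balance.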


2021 ◽  
Author(s):  
Collin Delker

Where practicable, the total end-to-end test-and-calibration program cost would serve as the ultimate measurement quality metric (MQM). Total cost includes both the capitalization and ongoing costs that support product quality (sometimes called cost of quality) and the consequence costs (sometimes called cost of poor quality) that result from imperfect measurement and products. End-to-end means capturing costs from the entire traceability chain: from measurement standards to end products. Minimizing this MQM, the total end-to-end cost (TETEC), equates to optimizing measurement quality assurance (MQA). Lacking easily available measurement and performance data automatically fed to modeling software, organizations have found cost metrics unimaginable or impracticable, so their measurement programs instead target more easily computed MQMs, such as false-accept risk or simpler proxies thereof, setting minimum, but sub-optimal, quality levels. However, modern computing systems and software, such as laboratory management systems with testpoint-level traceability, rapidly approach the point at which the TETEC MQM will become practicable. Preparing for this eventuality, the NCSLI 173 Metrology Practices Committee has developed models that relate costs to measurement program information such as product specifications, test and measurement uncertainties, calibration intervals, and reliability targets. Applications include not only optimizing overall program MQA but also estimating the value of metrology and return on equipment investments, selecting instruments, designing test and calibration processes, and designing products. This paper applies the cost models to case studies and examples to illustrate some applications.


Author(s):  
Rahmatullah Sediqi ◽  
Mustafa Tombul

The Soil and Water Assessment Tool (SWAT), a semi-distributed, physically based hydrological model, is broadly used for simulating streamflow and analyzing hydrological processes at the basin scale. The SWAT model was applied to analyze the hydrological processes in the Göksu Himmetli, Zamanti-Ergenuşağı, Göksun Poskoflu, and Hurman-Gözler Üstü sub-basins in the upper region of the Seyhan and Ceyhan watersheds in southern Turkey. Model sensitivity analysis, calibration, and validation were performed using the SWAT-CUP automatic calibration program and the SUFI-2 algorithm. According to the sensitivity analysis results, the most sensitive parameters in these basins were CN2, ALPHA_BNK, CH_K2, and GW_DELAY. In this study, 11 years (1994-2004) of meteorological data and eight years (1997-2004) of observed flow data were used: three years for the model warm-up period, five years (1997-2001) for calibration, and three years (2002-2004) for validation. Model performance was evaluated statistically using the Nash-Sutcliffe Efficiency (NSE) as the objective function. After calibration and validation, the NSE in the four sub-basins varied between 0.70 and 0.90. The results show a relatively high correlation between the observed and simulated discharge data.
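The objective function used above is the standard Nash-Sutcliffe Efficiency, NSE = 1 - Σ(obs - sim)² / Σ(obs - mean(obs))², which equals 1 for a perfect fit and 0 for a model no better than the observed mean. A minimal computation (with made-up discharge values) is:

```python
# Nash-Sutcliffe Efficiency: 1 - SSE(sim, obs) / variance of obs about its mean.
def nse(observed, simulated):
    mean_obs = sum(observed) / len(observed)
    sse = sum((o - s) ** 2 for o, s in zip(observed, simulated))
    var = sum((o - mean_obs) ** 2 for o in observed)
    return 1.0 - sse / var

# Made-up observed and simulated discharges for illustration.
obs = [3.0, 5.0, 9.0, 6.0, 4.0]
sim = [2.8, 5.4, 8.5, 6.2, 4.1]
print(round(nse(obs, sim), 3))  # → 0.976
```

Values in the 0.70-0.90 range reported above are conventionally read as good to very good model performance.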


Author(s):  
I.V. Chechushkov ◽  
A.V. Epimakhov

By means of Bayesian analysis of radiocarbon dates, a comparison of the chronologies of the Kamennyi Ambar settlement and the Kamennyi Ambar-5 cemetery of the Late Bronze Age Sintashta-Petrovka period has been carried out. Both sites are situated in the valley of the Karagaily-Ayat River in the Kartalinsky district of Chelyabinsk Region (Russia). Comparison of the pottery assemblages of the settlement and the cemetery demonstrates their similarity, which suggests the existence of a genetic link between the sites. The purpose of this work is the development of a generalized chronological model of the two monuments. This is achieved by comparison of uncalibrated intervals of radiocarbon dates and calculation of the chronological boundaries of the existence of the settlement and cemetery by means of Bayesian modeling of the calibrated dates. The method consists in first determining the stratigraphic position of each date, then arranging the dates suitable for analysis in chronological order and calibrating them, while the algorithm of the OxCal 4.4 calibration program is queried to calculate the boundaries of the given periods and their duration. The paper also reports the complete sets of radiocarbon dates: 61 dates have been obtained from the materials of the Kamennyi Ambar settlement, while 19 measurements originate from the Kamennyi Ambar-5 cemetery. Correlation of the radiocarbon dates and development of the Bayesian chronological models have demonstrated the contemporaneousness of the settlement and the cemetery, with a slightly later beginning of activity at the latter. This observation is in agreement with the concept of a genetic link between the sites and, arguably, can be extended to other fortified-settlement and kurgan-cemetery pairs attributed to the Sintashta-Petrovka period.
Our conclusion is also consistent with the concept of the complex of monuments being built by a newly arrived population, who founded a settlement and occupied the new territory for some time, with the first deaths occurring some time afterwards. That said, the settlement of Kamennyi Ambar existed for no longer than a century, in the 1950s-1860s BC, while the cemetery of Kamennyi Ambar-5 was used for 70-80 years within the same chronological interval.
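OxCal's Bayesian machinery (sequence and phase models, boundary estimation) is far richer than can be shown here, but the elementary step of calibrating a single radiocarbon determination can be illustrated with a toy linear calibration curve. The curve, the measurement, and all numbers below are hypothetical; real curves such as IntCal are empirical and non-linear.

```python
# Toy illustration of calibrating one radiocarbon date (not OxCal itself):
# map a 14C measurement onto calendar years through a simplified linear
# calibration curve and normalize the resulting probability distribution.
import math

def curve(cal_bc):
    """Hypothetical linear stand-in for a calibration curve: 14C age (BP)
    as a function of calendar year BC."""
    return 1000 + 0.9 * cal_bc

def calibrate(c14_age, c14_err, years):
    """Likelihood of each candidate calendar year given the measurement,
    normalized to a probability distribution."""
    pdf = [math.exp(-0.5 * ((c14_age - curve(y)) / c14_err) ** 2) for y in years]
    total = sum(pdf)
    return [p / total for p in pdf]

years = list(range(1700, 2201))          # candidate calendar years BC
post = calibrate(c14_age=2750, c14_err=30, years=years)
mode_year = years[max(range(len(post)), key=post.__getitem__)]
print(mode_year)  # → 1944 (the year BC where the toy curve best matches 2750 BP)
```

A Bayesian model like the one in the study then multiplies such per-date likelihoods by prior constraints from stratigraphic order before estimating phase boundaries and durations.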


2021 ◽  
Vol 2021 (3) ◽  
Author(s):  
Angel Abusleme ◽  
Thomas Adam ◽  
Shakeel Ahmad ◽  
Rizwan Ahmed ◽  
...  

Abstract We present the calibration strategy for the 20 kton liquid scintillator central detector of the Jiangmen Underground Neutrino Observatory (JUNO). By utilizing a comprehensive multiple-source, multiple-position calibration program, in combination with a novel dual calorimetry technique exploiting two independent photosensor and readout systems, we demonstrate that the JUNO central detector can achieve better than 1% energy linearity and a 3% effective energy resolution, as required for the neutrino mass ordering determination.


2021 ◽  
Author(s):  
Maarten Soudijn ◽  
Sebastiaan van Rossum ◽  
Ane de Boer

In this paper we present weight measurements of urban heavy traffic comparing two different Weigh In Motion (WIM) systems. One is a WIM-ROAD system using Lineas quartz pressure sensors in the road surface. The other is a WIM-BRIDGE system using optical fibre-based strain sensors which are applied under the bridge to the bottom fibre of a single span of the bridge deck. We have designed our tests to determine which system is most suited to Amsterdam. We put special focus on the accuracy that each system can achieve and have set up an extensive calibration program to determine this. Our ultimate goal is to draw up a realistic traffic load model for Amsterdam. This model would lead to a recommendation that can be used to re-examine the structural safety of existing historic bridges and quay walls, in addition to the current traffic load recommendations.

