Expanding HadISD: quality-controlled, sub-daily station data from 1931



2016 ◽  
Vol 5 (2) ◽  
pp. 473-491 ◽  
Author(s):  
Robert J. H. Dunn ◽  
Kate M. Willett ◽  
David E. Parker ◽  
Lorna Mitchell

Abstract. HadISD is a sub-daily, station-based, quality-controlled dataset designed to study past extremes of temperature, pressure and humidity and to allow comparisons with future projections. Herein we describe the first major update to the HadISD dataset. The temporal coverage of the dataset has been extended to span 1931 to the present, doubling the time range over which data are provided. Improvements made to the station selection and merging procedures result in 7677 stations being provided in version 2.0.0.2015p of this dataset. The selection of stations to merge into composites has also been improved and made more robust. The underlying structure of the quality control procedure is the same as for HadISD.1.0.x, but a number of improvements have been implemented in individual tests, and more detailed quality control tests for wind speed and direction have been added. The data will be made available as netCDF files at http://www.metoffice.gov.uk/hadobs/hadisd and updated annually.
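The abstract does not reproduce the wind tests themselves, but the kind of logical consistency check it alludes to can be sketched as follows. This is a minimal illustration with made-up rules (negative speed, out-of-range direction, calm winds coded with a nonzero direction), not the HadISD implementation:

```python
def flag_wind_records(records):
    """Flag wind observations failing simple consistency checks.

    Each record is a (speed, direction) tuple: speed in m/s,
    direction in degrees. Returns a list of booleans (True = flagged).
    Checks are illustrative only.
    """
    flags = []
    for speed, direction in records:
        bad = (
            speed < 0                           # physically impossible speed
            or not (0 <= direction <= 360)      # direction out of range
            or (speed == 0 and direction != 0)  # calm should be coded 0/0
        )
        flags.append(bad)
    return flags
```

A real sub-daily QC suite would also apply climatological, spike and streak checks across the time series rather than judging each record in isolation.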



2015 ◽  
Vol 11 (5) ◽  
pp. 4569-4600 ◽  
Author(s):  
R. J. H. Dunn ◽  
K. M. Willett ◽  
D. E. Parker ◽  
L. Mitchell

Abstract. We describe the first major update to the sub-daily, station-based HadISD dataset. The temporal coverage of the dataset has been extended to span 1931 to the present, doubling the time range over which data are provided. Improvements made to the station selection and merging procedures result in 8113 stations being provided in version 2.0.0.2014f of this dataset. This station selection will be reassessed at every annual update, which is likely to increase station numbers over time. The selection of stations to merge into composites has also been improved and made more robust. The underlying structure of the quality control procedure is the same as for HadISD.1.0.x, but a number of improvements have been implemented in individual tests, and more detailed quality control tests for wind speed and direction have been added. The data will be made available as netCDF files at www.metoffice.gov.uk/hadobs/hadisd and updated annually.
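The merging step pairs nearby station records into composites. The published selection criteria also involve station identifiers and metadata, but the geographic part of such a candidate search can be sketched as below; the `max_km` threshold is hypothetical, not the paper's value:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def merge_candidates(stations, max_km=1.0):
    """Return pairs of station ids closer than max_km.

    Candidates only; a real compositing scheme would also compare
    names, elevations and the overlap of the data records.
    """
    pairs = []
    for i, (id1, lat1, lon1) in enumerate(stations):
        for id2, lat2, lon2 in stations[i + 1:]:
            if haversine_km(lat1, lon1, lat2, lon2) <= max_km:
                pairs.append((id1, id2))
    return pairs
```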



Author(s):  
Z. Lari ◽  
K. Al-Durgham ◽  
A. Habib

Over the past few years, laser scanning systems have been acknowledged as the leading tools for collecting high-density 3D point clouds over physical surfaces for many different applications. However, no interpretation or scene classification is performed during the acquisition of these datasets. Consequently, the collected data must be processed to extract the required information. The segmentation procedure is usually considered the fundamental step in information extraction from laser scanning data. So far, various approaches have been developed for the segmentation of 3D laser scanning data. However, none of them is free of possible anomalies, which can arise from disregarding the internal characteristics of laser scanning data, improper selection of the segmentation thresholds, or other problems during the segmentation procedure. Therefore, quality control procedures are required to evaluate the segmentation outcome and report the frequency of instances of expected problems. A few quality control techniques have been proposed for the evaluation of laser scanning segmentation. These approaches usually require reference data and user intervention for the assessment of segmentation results. To resolve these problems, a new quality control procedure is introduced in this paper. This procedure makes hypotheses regarding potential problems that might take place in the segmentation process, detects instances of such problems, quantifies their frequency, and suggests possible actions to remedy them. The feasibility of the proposed approach is verified through quantitative evaluation of planar and linear/cylindrical segmentation outcomes from two recently developed parameter-domain and spatial-domain segmentation techniques.
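The hypothesise-detect-quantify pattern the authors describe can be illustrated for one plausible problem class, over-segmentation of a single plane into multiple segments. The function and thresholds below are hypothetical stand-ins, not the paper's procedure:

```python
def coplanar_split_instances(segments, angle_cos=0.999, offset_tol=0.05):
    """Count pairs of planar segments that likely belong to one plane.

    Each segment is (nx, ny, nz, d) for the plane n . x = d, with n a
    unit normal. Two segments are flagged as a suspected split of the
    same plane when their normals are near-(anti)parallel and their
    offsets nearly equal. Returns the flagged index pairs, i.e. the
    detected instances of this hypothesised segmentation problem.
    """
    flagged = []
    for i in range(len(segments)):
        nx1, ny1, nz1, d1 = segments[i]
        for j in range(i + 1, len(segments)):
            nx2, ny2, nz2, d2 = segments[j]
            dot = nx1 * nx2 + ny1 * ny2 + nz1 * nz2
            # Sign-correct d2 when the normals point in opposite directions.
            if abs(dot) >= angle_cos and abs(d1 - (dot / abs(dot)) * d2) <= offset_tol:
                flagged.append((i, j))
    return flagged
```

Reporting `len(flagged)` over a segmentation run gives the frequency of this problem without any reference data or user intervention, which is the spirit of the proposed procedure.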



2017 ◽  
Author(s):  
Aristeidis T. Chatzimichail

This doctoral thesis describes a series of tools that have been developed for the design, evaluation, and selection of optimal quality control procedures in a clinical chemistry laboratory setting. These tools include: 1) A simulation program for the design, evaluation, and comparison of alternative quality control procedures. The program allows (a) the definition of a very large number of quality control rules, and (b) the definition of quality control procedures as Boolean propositions of any degree of complexity. The thesis elucidates the ways error is introduced into the measurements and describes the respective methods of simulation. The program therefore allows study of the performance of quality control procedures when (a) there is error in all the measurements, (b) the error is introduced between two consecutive analytical runs, and (c) the error is introduced within an analytical run, between two consecutive control samples. 2) A library of fifty alternative quality control procedures. 3) A library of the power function graphs of these procedures. 4) A program for selecting the optimal quality control procedure from the library, given an analytical process. The optimal quality control procedure is defined as the one that detects the critical errors with the stated probabilities while having the minimum probability of false rejection. A new general system of equations is proposed for the calculation of the critical errors.
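The idea of expressing a QC procedure as a Boolean proposition over individual control rules can be sketched with two classic rules; this is a simplified illustration, whereas the thesis's simulator supports far more general propositions:

```python
def rule_1_3s(values, mean, sd):
    """Fires when any single control value lies beyond mean +/- 3 SD."""
    return any(abs(v - mean) > 3 * sd for v in values)

def rule_2_2s(values, mean, sd):
    """Fires when two consecutive values exceed the same mean +/- 2 SD limit."""
    return any(
        (a - mean > 2 * sd and b - mean > 2 * sd)
        or (mean - a > 2 * sd and mean - b > 2 * sd)
        for a, b in zip(values, values[1:])
    )

def reject_run(values, mean, sd):
    """A QC procedure as a Boolean proposition: reject if either rule fires."""
    return rule_1_3s(values, mean, sd) or rule_2_2s(values, mean, sd)
```

Replacing the `or` with other connectives (or nesting propositions) yields alternative procedures whose power functions can then be estimated by simulation, which is the comparison the described program performs.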



2018 ◽  
Vol 77 (OCE3) ◽  
Author(s):  
S. Cassidy ◽  
B. Phillips ◽  
J. Caldeira Fernandes da Silva ◽  
A. Parle


2013 ◽  
Vol 25 (1) ◽  
pp. 21-32
Author(s):  
Jun Mao ◽  
David R. McDonald ◽  
Mahmoud Zarepour


2015 ◽  
Vol 54 (6) ◽  
pp. 1267-1282 ◽  
Author(s):  
Youlong Xia ◽  
Trent W. Ford ◽  
Yihua Wu ◽  
Steven M. Quiring ◽  
Michael B. Ek

Abstract. The North American Soil Moisture Database (NASMD) was initiated in 2011 to provide support for developing climate forecasting tools, calibrating land surface models, and validating satellite-derived soil moisture algorithms. The NASMD has collected data from over 30 soil moisture observation networks providing millions of in situ soil moisture observations in all 50 states, as well as Canada and Mexico. It is recognized that the quality of measured soil moisture in NASMD is highly variable because of the diversity of climatological conditions, land cover, soil texture, and topographies of the stations, and differences in measurement devices (e.g., sensors) and installation. It is also recognized that error, inaccuracy, and imprecision in the data can have significant impacts on practical operations and scientific studies. Therefore, developing an appropriate quality control procedure is essential to ensure that the data are of the best quality. In this study, an automated quality control approach is developed using the North American Land Data Assimilation System, phase 2 (NLDAS-2), Noah soil porosity, soil temperature, and fraction of liquid and total soil moisture to flag erroneous and/or spurious measurements. Overall results show that this approach is able to flag unreasonable values when the soil is partially frozen. A validation example using NLDAS-2 multiple model soil moisture products at the 20-cm soil layer showed that the quality control procedure had a significant positive impact in Alabama, North Carolina, and west Texas. It had a greater impact in colder regions, particularly during spring and autumn. Over 433 NASMD stations have been quality controlled using the methodology proposed in this study, and the algorithm will be implemented to control data quality from the other ~1200 NASMD stations in the near future.
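The flagging logic described, checking each observation against model soil porosity, soil temperature, and the liquid fraction of soil moisture, might be sketched roughly as below. The thresholds are illustrative assumptions, not the published NLDAS-2-based algorithm:

```python
def flag_soil_moisture(obs_sm, porosity, soil_temp_c, liquid_frac):
    """Flag a volumetric soil moisture observation (m3/m3) as suspect.

    Flags values that are physically impossible (negative, or above the
    model soil porosity) or taken while the soil is partially frozen
    (sub-zero soil temperature, or a liquid fraction well below 1),
    conditions under which many sensors report spurious moisture.
    """
    if obs_sm < 0 or obs_sm > porosity:
        return True            # outside the physically possible range
    if soil_temp_c < 0 or liquid_frac < 0.99:
        return True            # partially frozen soil: reading unreliable
    return False
```

In practice such a check would be driven by gridded model fields matched to each station and depth, with the flag stored alongside, rather than replacing, the observation.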



RADIOISOTOPES ◽  
1992 ◽  
Vol 41 (3) ◽  
pp. i-xvi
Author(s):  
Subcommittee for Standardization of


2007 ◽  
Vol 34 (6Part5) ◽  
pp. 2373-2373
Author(s):  
D Beylin ◽  
S Sherry ◽  
E Anashkin ◽  
D Narayanan ◽  
P Stepanov ◽  
...  

