Data Quality Control Based on Self-Consistency

2011
Vol 139 (12)
pp. 3974-3991
Author(s):  
Reinhold Steinacker ◽  
Dieter Mayer ◽  
Andrea Steiner

Abstract Conducting meteorological measurements, one is always confronted with a wide variety of error types and with the decision of how, if necessary, to correct data for further use. The selection of an adequate quality control (QC) procedure from a wide range of methodologies depends on the properties of the observed parameter, such as its spatial or temporal consistency, but the intended data application (e.g., model-independent data analysis) and the availability of prior knowledge also have to be taken into account. The self-consistent and model-independent QC process presented here makes use of the spatial and temporal consistency of meteorological parameters. It is applicable to measurements featuring a high degree of autocorrelation with regard to the resolution of the observational network in space and time. The QC procedure can be expressed mathematically as an optimization problem that minimizes the curvature of the analyzed field. This results in a matrix equation that can be solved directly, without iterative convergence. Based on the resulting deviations and, if applied, on their impacts on the cost function, station values are accepted, corrected, or identified as outliers and hence dismissed. Furthermore, the method is able to handle complicated station distributions, such as clustered stations or inhomogeneous station densities. This QC method is an appropriate tool not only for case studies but also for model validation, and it has proven itself as a preprocessing tool for operational meso- and micrometeorological analyses.
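The curvature-minimization idea can be illustrated in a few lines. The following is a minimal one-dimensional sketch, not the authors' operational implementation: the analyzed field trades off smoothness against fidelity to the observations, the quadratic cost yields a linear system solved in a single step, and the deviations between observation and analysis are used to flag outliers. The weighting and the handling of irregular station distributions are deliberately simplified assumptions.

```python
# Minimal 1-D sketch of curvature-minimizing QC (illustrative only):
# minimize J(a) = |D2 a|^2 + lam * |a - y|^2, where D2 is the second-
# difference operator. The minimizer satisfies a linear system, so no
# iterative convergence is needed.
import numpy as np

def curvature_qc(y, lam=0.1, k=3.0):
    y = np.asarray(y, dtype=float)
    n = len(y)
    # Second-difference (curvature) operator, shape (n-2, n)
    D2 = np.zeros((n - 2, n))
    for i in range(n - 2):
        D2[i, i:i + 3] = [1.0, -2.0, 1.0]
    # Normal equations of the quadratic cost function
    A = D2.T @ D2 + lam * np.eye(n)
    a = np.linalg.solve(A, lam * y)
    dev = y - a                                 # deviation at each station
    outliers = np.abs(dev) > k * dev.std()      # crude acceptance criterion
    return a, dev, outliers

# Usage: a smooth signal with one corrupted value
y = np.sin(np.linspace(0, 3, 40)) + 0.02 * np.random.randn(40)
y[20] += 1.5                                    # simulated gross error
analysis, deviations, flagged = curvature_qc(y)
print(np.where(flagged)[0])                     # typically flags index 20
```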

2016
Vol 13
pp. 13-19
Author(s):  
Cédric Bertrand ◽  
Luis González Sotelino ◽  
Michel Journée

Abstract. Wind observations are important for a wide range of domains, including, among others, meteorology, agriculture and extreme wind engineering. To ensure the provision of high-quality surface wind data over Belgium, a new semi-automated data quality control (QC) procedure has been developed and applied to wind observations from the automated weather stations operated by the Royal Meteorological Institute of Belgium. The new QC applies to records of 10 min averaged wind speed and direction at 10 m, gust speed and direction at 10 m, 10 min averaged wind speed at 2 m, and 10 min averaged wind speed at 30 m. After an existence test, automated procedures check the data for limits consistency, internal consistency, temporal consistency and spatial consistency. At the end of the automated QC, a decision algorithm assigns a flag to each data point. Each day, the QC staff analyzes the preceding day's observations in the light of the assigned quality flags.
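A chain of such checks followed by a flag decision can be sketched as follows. The thresholds, flag names and record structure are illustrative assumptions, not the operational RMI implementation, and the spatial-consistency test (which requires neighbouring stations) is omitted.

```python
# Illustrative QC chain for one 10 min wind record: existence, limits,
# internal and temporal consistency, then a flag decision.
from dataclasses import dataclass

@dataclass
class WindRecord:
    speed_10m: float      # 10 min averaged wind speed at 10 m (m/s)
    gust_10m: float       # gust speed at 10 m (m/s)
    direction: float      # wind direction (degrees)

def qc_flag(rec, prev_speed=None):
    """Return 'missing', 'bad', 'suspect' or 'good' for one record."""
    if rec is None:
        return 'missing'                           # existence test
    if not (0.0 <= rec.speed_10m <= 75.0):         # limits consistency
        return 'bad'
    if not (0.0 <= rec.direction < 360.0):
        return 'bad'
    if rec.gust_10m < rec.speed_10m:               # internal consistency
        return 'suspect'
    if prev_speed is not None and abs(rec.speed_10m - prev_speed) > 20.0:
        return 'suspect'                           # temporal consistency (step test)
    return 'good'

print(qc_flag(WindRecord(speed_10m=5.2, gust_10m=8.1, direction=230.0)))  # good
print(qc_flag(WindRecord(speed_10m=5.2, gust_10m=3.0, direction=230.0)))  # suspect
```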


2019
Vol 46 (1)
pp. 1
Author(s):  
Hiroyuki Shimono ◽  
Graham Farquhar ◽  
Matthew Brookhouse ◽  
Florian A. Busch ◽  
Anthony O'Grady ◽  
...  

Elevated atmospheric CO2 concentration (e[CO2]) can stimulate the photosynthesis and productivity of C3 species, including food and forest crops. Intraspecific variation in responsiveness to e[CO2] can be exploited to increase productivity under e[CO2]. However, active selection of genotypes to increase productivity under e[CO2] is rarely performed across a wide range of germplasm, because of constraints of space and the cost of CO2 fumigation facilities. If we are to capitalise on recent advances in whole genome sequencing, approaches are required to help overcome these issues of space and cost. Here, we discuss the advantage of applying prescreening as a tool in large genome × e[CO2] experiments, in which a surrogate for e[CO2] is used to select cultivars for more detailed analysis under e[CO2] conditions. We discuss why phenotypic prescreening in population-wide screening for e[CO2] responsiveness is necessary, what approaches could be used for prescreening for e[CO2] responsiveness, and how the data can be used to improve genetic selection of high-performing cultivars. We do this within the framework of understanding the strengths and limitations of genotype–phenotype mapping.


2018
Vol 36 (5)
pp. 435-447
Author(s):  
Roshan Kuruvila ◽  
S. Thirumalai Kumaran ◽  
M. Adam Khan ◽  
M. Uthayakumar

Abstract The efficiency of industry depends upon the working conditions of the equipment and components used in the industrial process. Among the biggest problems faced by industry are erosion and corrosion. Corrosion leads to material loss resulting from the degradation of equipment. Equipment degradation can cause plant breakdowns; moreover, it is a threat to the safety of people and, from the standpoint of resource conservation, it leads to the loss of available resources. Replacing damaged equipment adds expense and can result in the temporary shutdown of the plant. The protection of surfaces from the adverse effects of corrosion and erosion-corrosion is therefore a matter of great concern in most industrial applications. Advancements in technology provide a wide range of techniques to overcome these adverse conditions, and the appropriate technique must be selected with its interaction with the environment in view. This review addresses the adverse effects of erosion-corrosion in the present scenario.


2020
Author(s):  
Aleksey Zhukov ◽  
Vladimir Astashkin ◽  
Vil'en Zholudov ◽  
Vyacheslav Semenov

This monograph summarizes modern experience in protecting industrial buildings and structures against aggressive impacts. It describes the characteristic corrosion processes caused by liquid, solid and gaseous environments acting on the main building materials. It provides a system for rating the degree of aggressiveness for different parts of buildings and structures, basic provisions for the selection of chemically resistant structures and materials, and a design methodology for the corrosion-protection part of a project. Design methods for protecting groundwater and soil against aggressive and toxic media are systematized, as are methods of accounting for the cost of corrosion protection as applied to building elements. The book is intended for a wide range of engineering and technical workers involved in the design, construction and operation of buildings and structures, and can also be used as a textbook for technical schools, colleges and engineer-training programmes.


2010
Vol 11 (3)
pp. 666-682
Author(s):  
Brian R. Nelson ◽  
D-J. Seo ◽  
Dongsoo Kim

Abstract Temporally consistent high-quality, high-resolution multisensor precipitation reanalysis (MPR) products are needed for a wide range of quantitative climatological and hydroclimatological applications. Therefore, the authors have reengineered the multisensor precipitation estimator (MPE) algorithms of the NWS into the MPR package. Owing to the retrospective nature of the analysis, MPR allows for the utilization of additional rain gauge data, more rigorous automatic quality control, post factum correction of radar quantitative precipitation estimation (QPE), and optimization of key parameters in multisensor estimation. To evaluate and demonstrate the value of MPR, the authors designed and carried out a set of cross-validation experiments in the pilot domain of North Carolina and South Carolina. The rain gauge data are from the reprocessed Hydrometeorological Automated Data System (HADS) and the daily Cooperative Observer Program (COOP). The radar QPE data are the operationally produced Weather Surveillance Radar-1988 Doppler digital precipitation array (DPA) products. To screen out bad rain gauge data, quality control steps were taken that use rain gauge and radar data. The resulting MPR products are compared with the stage IV product on a daily scale at the withheld COOP gauge locations. This paper describes the data, the MPR procedure, and the validation experiments, and it summarizes the findings.
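The kind of daily comparison at withheld gauge locations described above can be summarized with a few standard statistics. The sketch below is a hedged illustration with made-up numbers; the data values and variable names are hypothetical, not the authors' data set.

```python
# Hedged sketch: daily validation statistics of a gridded precipitation
# product against withheld gauge totals (bias, RMSE, correlation).
import numpy as np

def daily_validation_stats(gauge_mm, product_mm):
    """Compare product estimates with withheld gauge daily totals (mm)."""
    gauge_mm = np.asarray(gauge_mm, dtype=float)
    product_mm = np.asarray(product_mm, dtype=float)
    diff = product_mm - gauge_mm
    return {
        "bias_mm": diff.mean(),
        "rmse_mm": np.sqrt((diff ** 2).mean()),
        "correlation": np.corrcoef(gauge_mm, product_mm)[0, 1],
    }

# Example with made-up daily totals at withheld gauge locations
gauge = [0.0, 5.1, 12.3, 0.2, 30.4]
product = [0.1, 4.0, 14.0, 0.0, 27.5]
print(daily_validation_stats(gauge, product))
```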


2011
Vol 366 (1580)
pp. 2979-2986
Author(s):  
Ingo Wohlgemuth ◽  
Corinna Pohl ◽  
Joerg Mittelstaet ◽  
Andrey L. Konevega ◽  
Marina V. Rodnina

Speed and accuracy of protein synthesis are fundamental parameters for the fitness of living cells, the quality control of translation, and the evolution of ribosomes. The ribosome has developed complex mechanisms that allow for uniform recognition and selection of any cognate aminoacyl-tRNA (aa-tRNA) and discrimination against any near-cognate aa-tRNA, regardless of the nature or position of the mismatch. This review describes the principles of selection (kinetic partitioning and induced fit) and discusses the relationship between speed and accuracy of decoding, with a focus on bacterial translation. The translational machinery has apparently evolved towards high speed of translation at the cost of fidelity.
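In the standard kinetic framework used in this field, the accuracy contributed by a single selection step can be written as the ratio of catalytic efficiencies for cognate and near-cognate substrates, and the overall accuracy as the product of initial selection and proofreading. The notation below is a sketch in conventional symbols, not an equation quoted from the review.

```latex
% Conventional notation (illustrative): accuracy of one selection step as
% the ratio of catalytic efficiencies for cognate (c) and near-cognate (nc)
% aa-tRNA, and total accuracy as the product of initial selection I and
% proofreading F.
A_{\text{step}} = \frac{(k_{\text{cat}}/K_{\text{M}})_{c}}{(k_{\text{cat}}/K_{\text{M}})_{nc}},
\qquad
A_{\text{total}} = I \cdot F
```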


Author(s):  
Katherine Anderson Aur ◽  
Jessica Bobeck ◽  
Anthony Alberti ◽  
Phillip Kay

Abstract Supplementing an existing high-quality seismic monitoring network with openly available station data could improve coverage and decrease magnitudes of completeness; however, this can present challenges when varying levels of data quality exist. Using openly available data without discerning its quality poses significant data management, analysis, and interpretation issues. Incorporating additional stations without properly identifying and mitigating data quality problems can degrade overall monitoring capability. If openly available stations are to be used routinely, a robust, automated data quality assessment for a wide range of quality control (QC) issues is essential. To meet this need, we developed Pycheron, a Python-based library for QC of seismic waveform data. Pycheron was initially based on the Incorporated Research Institutions for Seismology’s Modular Utility for STAtistical kNowledge Gathering but has been expanded to include more functionality. Pycheron can be implemented at the beginning of a data processing pipeline or can process stand-alone data sets. Its objectives are to (1) identify specific QC issues; (2) automatically assess data quality and instrumentation health; (3) serve as a basic service that all data processing builds on by alerting downstream processing algorithms to any quality degradation; and (4) improve our ability to process orders of magnitude more data through performance optimizations. This article provides an overview of Pycheron, its features, basic workflow, and an example application using a synthetic QC data set.
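As an illustration of the kind of per-trace checks such a library automates, here is a generic waveform-QC sketch in plain NumPy. It is not the Pycheron API (no Pycheron function names are reproduced here); it simply computes three typical indicators on a synthetic trace: missing samples, dead channels, and spikes.

```python
# Generic waveform QC indicators on a synthetic trace (illustrative only).
import numpy as np

def basic_waveform_qc(trace, spike_sigma=8.0):
    """Return simple QC indicators for one waveform segment."""
    trace = np.asarray(trace, dtype=float)
    report = {}
    report["n_gaps"] = int(np.isnan(trace).sum())         # missing samples
    finite = trace[~np.isnan(trace)]
    report["dead_channel"] = bool(finite.std() < 1e-12)   # flat / dead trace
    z = np.abs(finite - finite.mean()) / (finite.std() + 1e-12)
    report["n_spikes"] = int((z > spike_sigma).sum())     # outlier samples
    return report

# Synthetic test trace: noise plus one gap and one spike
rng = np.random.default_rng(0)
trace = rng.normal(size=1000)
trace[100] = np.nan
trace[500] = 50.0
print(basic_waveform_qc(trace))
```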


Author(s):  
F. D. Vescovi ◽  
T. Lankester ◽  
E. Coleman ◽  
G. Ottavianelli

The Copernicus Space Component Data Access system (CSCDA) incorporates data contributions from a wide range of satellite missions. Through EO data handling and distribution, CSCDA serves a set of Copernicus Services related to Land, Marine and Atmosphere Monitoring, Emergency Management and Security, and Climate Change.

The quality of the delivered EO products is the responsibility of each contributing mission, and the Copernicus data Quality Control (CQC) service supports and complements such data quality control activities. The mission of the CQC is to provide a service of quality assessment on the provided imagery, to support the investigation of product quality anomalies, and to guarantee harmonisation and traceability of the quality information.

In terms of product quality control, the CQC carries out analysis of representative sample products for each contributing mission, as well as coordinating data quality investigations related to issues found or raised by Copernicus users. Results from the product analysis are systematically collected, and the derived quality reports are stored in a searchable database.

The CQC service can be seen as a privileged focal point with unique comparison capabilities across the data providers. The comparison among products from different missions suggests the need for a strong, common effort of harmonisation. Technical terms, definitions, metadata, file formats, processing levels, algorithms, cal/val procedures, etc. are far from being homogeneous, and this may generate inconsistencies and confusion among users of EO data.

The CSCDA CQC team plays a significant role in promoting harmonisation initiatives across the numerous contributing missions, so that a common effort can achieve optimal complementarity and compatibility among the EO data from multiple data providers. This effort is coordinated with important initiatives already working towards these goals (e.g. the INSPIRE directive, CEOS initiatives, OGC standards, QA4EO).

This paper describes the main actions being undertaken by the CQC to encourage harmonisation among space-based EO systems currently in service.


Author(s):  
Sheikha Mohammed Ali Al-Balushi ◽  
M Firdouse Rahman Khan

Purpose: The objectives of the study are to analyze the factors which influence patients to choose private hospitals over public hospitals in Oman, and to analyze the expectations of patients from the integrated public hospitals in Oman.

Design/methodology/approach: The study was carried out with a well-defined questionnaire through which 251 survey samples were collected on a random sampling basis.

Findings: The results of the study reveal an association between the selection of a hospital and the services and cost of the services offered, and the cost of services incurred has an impact on the selection of a hospital for medical treatment. The study also revealed that in private hospitals patients can easily approach anyone, including the reception staff, all of whom are helpful; that the private hospitals are equipped with modern equipment; and that doctors treat patients in a friendly manner.

Research limitations/Implications: The majority of the population taken for the study is aged above 20 years, and the samples were collected from selected regions of Oman; a wider collection of samples from all regions would strengthen the findings.

Social implications: The study suggests that sufficient medicines should be provided in all public health centers and that periodic inspections should be conducted at regular intervals to improve the standards of public health centers and government hospitals with respect to cleanliness, treatment and front-line services.

Originality/Value: No previous study has examined the factors behind hospital selection in Oman; this is a first-hand study of its kind, and the results will be useful to the stakeholders.


2020
Author(s):  
Manuela Köllner ◽  
Mayumi Wilms ◽  
Anne-Christin Schulz ◽  
Martin Moritz ◽  
Katrin Latarius ◽  
...  

Reliable data are the basis for successful research and scientific publishing. Open data policies assure the availability of publicly financed field measurements to the public, and thus to all interested scientists. However, the variety of data sources and the availability or lack of detailed metadata demand considerable effort from each scientist to decide whether the data are usable for their own research topic. Data end-user communities have different requirements regarding metadata detail and data handling during processing. For data-providing institutes or agencies, these needs are essential to know if they want to reach a wide range of end-user communities.

The Federal Maritime and Hydrographic Agency (BSH, Bundesamt für Seeschifffahrt und Hydrographie, Hamburg, Germany) collects a large variety of field data in physical and chemical oceanography, regionally focused on the North Sea, Baltic Sea and North Atlantic. Data types range from vertical profiles, time series and underway measurements to real-time or delayed-mode data from moored or ship-based instruments. Along with other oceanographic data, the BSH provides all physical data via the German Oceanographic Data Center (DOD). It is crucial to aim for a maximum of reliability of the published data to enhance their usage, especially in the scientific community.

Here, we present our newly established data processing and quality control procedures, which use agile project management and workflow techniques, and outline their implementation in metadata and accompanying documentation. To enhance the transparency of data quality control, we will apply a detailed quality flag along with the common data quality flag. This detailed quality flag, established by Mayumi Wilms within the research project RAVE Offshore service (research at alpha ventus), enables data end-users to review the results of the individual quality control checks performed during processing and thus to identify easily whether the data are usable for their research.
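A minimal sketch of the idea of a detailed quality flag recorded next to the single common flag follows; the check names and flag encoding are hypothetical assumptions, not the BSH/RAVE specification.

```python
# Sketch: record the outcome of each individual QC check ("detailed" flag)
# next to the single common flag usually distributed with the data.
def flag_observation(value, checks):
    """checks: mapping of check name -> bool (True = passed)."""
    detailed = "".join("1" if passed else "0" for passed in checks.values())
    common = "good" if all(checks.values()) else "bad"
    return {"value": value, "common_flag": common,
            "detailed_flag": detailed, "checks": list(checks.keys())}

result = flag_observation(
    12.7,
    {"range": True, "stuck_value": True, "gradient": False, "neighbour": True},
)
print(result["common_flag"], result["detailed_flag"])   # bad 1101
```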

