Analysis of Quality Control Effect of Reactive Gas Observation Data Based on Multiple Data Quality Control Methods

Author(s):  
Chi Wenxue ◽  
Zhang Yong ◽  
Zhang Xiaochun ◽  
Jing Junshan ◽  
Yan Peng
2016 ◽  
Vol 23 (6) ◽  
pp. 1085-1095 ◽  
Author(s):  
Carlos Sáez ◽  
Oscar Zurriaga ◽  
Jordi Pérez-Panadés ◽  
Inma Melchor ◽  
Montserrat Robles ◽  
...  

Abstract
Objective: To assess the variability in data distributions among data sources and over time through a case study of a large multisite repository, as a systematic approach to data quality (DQ).
Materials and Methods: Novel probabilistic DQ control methods based on information theory and geometry are applied to the Public Health Mortality Registry of the Region of Valencia, Spain, with 512 143 entries from 2000 to 2012, disaggregated into 24 health departments. The methods provide DQ metrics and exploratory visualizations for (1) assessing the variability among multiple sources and (2) monitoring and exploring changes over time. The methods are suited to big data and to multitype, multivariate, and multimodal data.
Results: The repository was partitioned into 2 probabilistically separated temporal subgroups following a change in the Spanish National Death Certificate in 2009. Isolated temporal anomalies were detected, caused by a one-off increase in missing data, along with outlying and clustered health departments reflecting differences in populations or in practices.
Discussion: Changes in protocols, differences in populations, biased practices, or other systematic DQ problems affected data variability. Even when semantic and integration aspects are addressed in data-sharing infrastructures, probabilistic variability may still be present. Solutions include fixing or excluding data and analyzing different sites or time periods separately. A systematic approach to assessing temporal and multisite variability is proposed.
Conclusion: Multisite and temporal variability in data distributions affects DQ, hindering data reuse; an assessment of such variability should be part of systematic DQ procedures.
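The abstract does not spell out the probabilistic metrics, so the sketch below is only illustrative: it compares discrete data distributions between sources with the Jensen-Shannon divergence, a standard information-theoretic distance consistent with the "information theory and geometry" framing. The distributions and department names are hypothetical, not taken from the registry.

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence KL(p || q) in bits (0 * log 0 treated as 0)."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def js_divergence(p, q):
    """Jensen-Shannon divergence between two discrete distributions, in bits.

    Symmetric, finite, and bounded by 1 bit; 0 means identical distributions.
    """
    m = [(pi + qi) / 2 for pi, qi in zip(p, q)]
    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Hypothetical cause-of-death code distributions for three health departments.
dept_a = [0.50, 0.30, 0.15, 0.05]
dept_b = [0.48, 0.32, 0.14, 0.06]   # similar coding practice to dept_a
dept_c = [0.20, 0.20, 0.20, 0.40]   # outlying department

print(js_divergence(dept_a, dept_b))  # small: the two sources agree closely
print(js_divergence(dept_a, dept_c))  # larger: probabilistic separation
```

Computing this divergence pairwise across sites (or across yearly batches) yields the kind of multisite and temporal variability map the abstract describes: outlying departments and temporal subgroups appear as clusters of large divergences.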


Author(s):  
F. D. Vescovi ◽  
T. Lankester ◽  
E. Coleman ◽  
G. Ottavianelli

The Copernicus Space Component Data Access system (CSCDA) incorporates data contributions from a wide range of satellite missions. Through EO data handling and distribution, CSCDA serves a set of Copernicus Services related to Land, Marine and Atmosphere Monitoring, Emergency Management and Security, and Climate Change.

The quality of the delivered EO products is the responsibility of each contributing mission, and the Copernicus data Quality Control (CQC) service supports and complements these data quality control activities. The mission of the CQC is to provide a quality-assessment service for the provided imagery, to support investigations of product quality anomalies, and to guarantee harmonisation and traceability of the quality information.

In terms of product quality control, the CQC analyses representative sample products from each contributing mission and coordinates data quality investigations into issues found or raised by Copernicus users. Results from the product analysis are systematically collected, and the derived quality reports are stored in a searchable database.

The CQC service can be seen as a privileged focal point with a unique capacity for comparison across data providers. Comparison of products from different missions suggests the need for a strong, common harmonisation effort: technical terms, definitions, metadata, file formats, processing levels, algorithms, cal/val procedures, etc. are far from homogeneous, and this can generate inconsistencies and confusion among users of EO data.

The CSCDA CQC team plays a significant role in promoting harmonisation initiatives across the numerous contributing missions, so that a common effort can achieve optimal complementarity and compatibility among EO data from multiple providers. This work is coordinated with major initiatives already working towards these goals (e.g. the INSPIRE directive, CEOS initiatives, OGC standards, QA4EO, etc.).

This paper describes the main actions being undertaken by the CQC to encourage harmonisation among space-based EO systems currently in service.


2006 ◽  
Vol 15 (5) ◽  
pp. 497-504 ◽  
Author(s):  
Claudia Golz ◽  
Thomas Einfalt ◽  
Gianmario Galli

Author(s):  
Antonella D. Pontoriero ◽  
Giovanna Nordio ◽  
Rubaida Easmin ◽  
Alessio Giacomel ◽  
Barbara Santangelo ◽  
...  

2001 ◽  
Vol 27 (7) ◽  
pp. 867-876 ◽  
Author(s):  
Pankajakshan Thadathil ◽  
Aravind K Ghosh ◽  
J.S Sarupria ◽  
V.V Gopalakrishna

2014 ◽  
Vol 926-930 ◽  
pp. 4254-4257 ◽  
Author(s):  
Jin Xu ◽  
Da Tao Yu ◽  
Zhong Jie Yuan ◽  
Bo Li ◽  
Zi Zhou Xu

Traditional manual quality control of marine environment monitoring data has many disadvantages, including high labor costs and errors in data review. Based on GIS spatial analysis technology, a Marine Environment Monitoring Data Quality Control System was established in accordance with the Bohai Sea monitoring regulations. In practical application, the system improves the efficiency of quality control and saves manpower and financial resources. It also provides an important guarantee for the comprehensive analysis and management of marine environment data.
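The abstract does not describe the system's checks, but automated QC of monitoring data typically starts with region-specific range checks of the kind a GIS-driven system would apply per station. The sketch below is a minimal illustration under that assumption; the parameter names, limits, and record layout are hypothetical, not taken from the Bohai Sea regulations.

```python
# Hypothetical expected ranges for a monitored region; an operational system
# would load these per parameter and per area from its GIS layers.
BOHAI_LIMITS = {
    "sea_surface_temp_c": (-2.0, 32.0),
    "salinity_psu": (24.0, 33.0),
}

def range_check(record):
    """Return (parameter, value) pairs that fall outside the expected range,
    replacing a manual review of each submitted monitoring record."""
    flags = []
    for param, (lo, hi) in BOHAI_LIMITS.items():
        value = record.get(param)
        if value is not None and not (lo <= value <= hi):
            flags.append((param, value))
    return flags

record = {"station": "B01", "sea_surface_temp_c": 18.5, "salinity_psu": 45.0}
print(range_check(record))  # -> [('salinity_psu', 45.0)]
```

Flagged records would then be routed to an analyst instead of having every record reviewed by hand, which is where the labor savings described above come from.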

