Improving data quality monitoring via a partnership of technologies and resources between the CMS experiment at CERN and industry

2019 ◽  
Vol 214 ◽  
pp. 01007
Author(s):  
Virginia Azzolin ◽  
Michael Andrews ◽  
Gianluca Cerminara ◽  
Nabarun Dev ◽  
Colin Jessop ◽  
...  

The Compact Muon Solenoid (CMS) experiment dedicates significant effort to assessing the quality of its data, both online and offline. A real-time data quality monitoring system is in place to spot and diagnose problems as promptly as possible and avoid data loss, while the a posteriori evaluation of processed data categorizes it in terms of its usability for physics analysis. These activities produce data quality metadata. The data quality evaluation relies on visual inspection of the monitoring features. This practice has a cost in terms of human resources and is naturally subject to human judgement. Its limitations include the difficulty of spotting a problem among the overwhelming number of quantities to monitor and an incomplete understanding of evolving detector conditions. In view of Run 3, CMS aims to integrate deep learning techniques into the online workflow to promptly recognize and identify anomalies and to improve the precision of the data quality metadata. The CMS experiment has engaged in a partnership with IBM with the objective of supporting the online operations through automation and of generating benchmark technological results. The research goals, agreed within the CERN Openlab framework, how they matured into a demonstration application, and how they are achieved through a collaborative contribution of technologies and resources are presented.
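A minimal sketch of the kind of deep-learning anomaly detection the abstract describes: an autoencoder trained on monitoring histograms from known-good runs flags new histograms whose reconstruction error is unusually large. The histogram length, network shape, and threshold are illustrative assumptions, not taken from the CMS online DQM software.

```python
# Sketch only: autoencoder-based anomaly flagging on flattened DQM histograms.
# All sizes and thresholds are hypothetical placeholders.
import torch
import torch.nn as nn

N_BINS = 100  # assumed length of a flattened monitoring histogram


class HistogramAutoencoder(nn.Module):
    def __init__(self, n_bins: int = N_BINS, latent: int = 8):
        super().__init__()
        self.encoder = nn.Sequential(
            nn.Linear(n_bins, 32), nn.ReLU(),
            nn.Linear(32, latent), nn.ReLU(),
        )
        self.decoder = nn.Sequential(
            nn.Linear(latent, 32), nn.ReLU(),
            nn.Linear(32, n_bins),
        )

    def forward(self, x):
        return self.decoder(self.encoder(x))


def train(model, good_histograms, epochs: int = 50):
    """Train on histograms taken from runs certified as good."""
    opt = torch.optim.Adam(model.parameters(), lr=1e-3)
    loss_fn = nn.MSELoss()
    for _ in range(epochs):
        opt.zero_grad()
        loss = loss_fn(model(good_histograms), good_histograms)
        loss.backward()
        opt.step()
    return model


def is_anomalous(model, histogram, threshold: float = 0.05) -> bool:
    """Flag a histogram whose reconstruction error exceeds a tuned threshold."""
    with torch.no_grad():
        err = nn.functional.mse_loss(model(histogram), histogram).item()
    return err > threshold


if __name__ == "__main__":
    reference = torch.rand(500, N_BINS)  # placeholder for good-run histograms
    model = train(HistogramAutoencoder(), reference)
    print(is_anomalous(model, torch.rand(1, N_BINS)))
```

In practice the threshold would be calibrated on held-out good-run data so that the false-alarm rate stays acceptable for the online shift crew.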

Author(s):  
Farid Flici ◽  
Nacer-Eddine Hammouda

Mortality in Algeria has declined significantly since the country declared its independence in 1962. This trend has been accompanied by improvements in data quality and changes in estimation methodology, both of which are scarcely documented, and may distort the natural evolution of mortality as reported in official statistics. In this paper, our aim is to detect these methodological and data quality changes by means of the visual inspection of mortality surfaces, which represent the evolution of mortality rates, mortality improvement rates and the male-female mortality ratio over age and time. Data quality problems are clearly visible during the 1977–1982 period. The quality of mortality data has improved after 1983, and even further since the population census of 1998, which coincided with the end of the civil war. Additional inexplicable patterns have also been detected, such as a changing mortality age pattern during the period before 1983, and a changing pattern of excess female mortality at reproductive ages, which suddenly appears in 1983 and disappears in 1992.
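A small sketch of the quantities inspected on the mortality surfaces: improvement rates and the male/female ratio of death rates over age and time. The input arrays here are synthetic placeholders; the paper's analysis uses the official Algerian life-table series.

```python
# Sketch only: computing the surfaces described in the abstract from a
# hypothetical age-by-year matrix of death rates.
import numpy as np

ages = np.arange(0, 80)            # rows: age
years = np.arange(1977, 2014)      # columns: calendar year

rng = np.random.default_rng(0)
m_male = np.exp(-4 + 0.07 * ages[:, None]) * (
    1 + 0.05 * rng.standard_normal((ages.size, years.size))
)
m_female = 0.8 * m_male * (1 + 0.05 * rng.standard_normal((ages.size, years.size)))

# Mortality improvement rate: relative decline of m(x, t) from one year to the next.
improvement = 1.0 - m_male[:, 1:] / m_male[:, :-1]

# Sex ratio of mortality; values below 1 indicate excess female mortality.
sex_ratio = m_male / m_female

print(improvement.shape, sex_ratio.shape)
```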


2013 ◽  
Vol 318 ◽  
pp. 572-575
Author(s):  
Li Li Yu ◽  
Yu Hong Li ◽  
Ai Feng Wang

In this paper, a quality monitoring system for seismic while drilling (SWD) that covers the whole data acquisition process was developed. The acquisition equipment, the network status and the accelerometer and geophone signals are monitored in real time. With fast signal analysis and quality evaluation, the acquisition parameters and drilling engineering parameters can be adjusted in a timely manner. Application of the system can improve the quality of data acquisition and provide subsequent processing and interpretation with highly reliable data.
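A minimal sketch of the kind of fast, real-time quality check such a system might apply to each acquired accelerometer or geophone trace. The metrics and thresholds are illustrative assumptions, not taken from the described system.

```python
# Sketch only: cheap per-trace quality metrics for real-time SWD monitoring.
import numpy as np


def trace_quality(trace: np.ndarray, sample_rate_hz: float,
                  min_rms: float = 1e-3, max_clip_fraction: float = 0.01) -> dict:
    """Return simple quality metrics for one acquired trace."""
    rms = float(np.sqrt(np.mean(trace ** 2)))
    peak = float(np.max(np.abs(trace)))
    clip_fraction = float(np.mean(np.abs(trace) >= 0.99 * peak))
    # Dominant frequency via FFT, as a cheap sanity check on the sensor.
    spectrum = np.abs(np.fft.rfft(trace))
    freqs = np.fft.rfftfreq(trace.size, 1.0 / sample_rate_hz)
    dominant_hz = float(freqs[np.argmax(spectrum)])
    return {
        "rms": rms,
        "clip_fraction": clip_fraction,
        "dominant_hz": dominant_hz,
        "ok": rms >= min_rms and clip_fraction <= max_clip_fraction,
    }


if __name__ == "__main__":
    t = np.linspace(0.0, 1.0, 2000, endpoint=False)
    synthetic = 0.1 * np.sin(2 * np.pi * 40 * t) + 0.01 * np.random.randn(t.size)
    print(trace_quality(synthetic, sample_rate_hz=2000.0))
```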


2021 ◽  
Vol 251 ◽  
pp. 04010
Author(s):  
Thomas Britton ◽  
David Lawrence ◽  
Kishansingh Rajput

Data quality monitoring is critical to all experiments, since it directly impacts the quality of any physics result. Traditionally, it is done through an alarm system that detects low-level faults, leaving higher-level monitoring to human shift crews. Artificial intelligence is beginning to find its way into scientific applications, but it comes with difficulties, as it relies on acquiring new skill sets in data science, either through training or hiring. This paper discusses the development and deployment of the Hydra monitoring system in production at GlueX. It shows how “off-the-shelf” technologies can be rapidly developed into a working system, and discusses the sociological hurdles that must be overcome to deploy such a system successfully. Early results from production running of Hydra are shared, together with an outlook for its future development.
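A minimal sketch of the general pattern such a system follows: an “off-the-shelf” image classifier labels rendered monitoring plots, and human crews review only what the model flags. The network, image size and labels here are illustrative assumptions; the actual Hydra implementation is not reproduced.

```python
# Sketch only: a small CNN classifying monitoring-plot images as good or bad.
import torch
import torch.nn as nn

LABELS = ["good", "bad"]  # assumed two-class labelling of monitoring plots


class PlotClassifier(nn.Module):
    def __init__(self, n_classes: int = len(LABELS)):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(), nn.Linear(32, n_classes),
        )

    def forward(self, x):  # x: (batch, 3, H, W) rendered monitoring plots
        return self.head(self.features(x))


def classify(model: PlotClassifier, image: torch.Tensor) -> str:
    """Return the predicted label for one rendered monitoring plot."""
    with torch.no_grad():
        logits = model(image.unsqueeze(0))
    return LABELS[int(logits.argmax(dim=1))]


if __name__ == "__main__":
    model = PlotClassifier().eval()
    print(classify(model, torch.rand(3, 128, 128)))  # untrained: arbitrary label
```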


Author(s):  
Bin Lu

This study investigates differences in attitudes towards the construction of a quality monitoring system for the linguistic landscape of Chinese tourism and surveys the current situation of regional linguistic landscape programmes. Drawing on 520 responses, it analyses the degree of participation in improving the quality of the local linguistic landscape and carries out a quantitative analysis of attitudes towards benchmark indicators, programme management, process control and quality evaluation; it explores a sustainable development mode for linguistic landscape assessment in national tourism; and it supports the formulation, implementation and promotion of a quality monitoring system for the tourism linguistic landscape. The objectives of this research were 1) to investigate attitudes towards the social influence and implementation of the series of Standards and Guidance for English Translation and Usage in Public Service (2017-2019); 2) to study the factors that shape different attitudes and opinions; and 3) to explore a quality evaluation system for the linguistic landscape and to promote linguistic landscape evaluation indicators and modes. The conclusion is that governments should build a common understanding of the programme mode and pursue collaborative development of the quality monitoring system.


2016 ◽  
Vol 45 (2) ◽  
pp. 3-14 ◽  
Author(s):  
Eva-Maria Asamer ◽  
Franz Astleithner ◽  
Predrag Cetkovic ◽  
Stefan Humer ◽  
Manuela Lenk ◽  
...  

In 2011, Statistics Austria carried out its first register-based census. The use of administrative data for statistical purposes brings various advantages, such as a reduced burden for respondents and lower costs for the NSI. However, new challenges arise, such as assessing the quality of this kind of data. Statistics Austria therefore developed a comprehensive standardized framework for evaluating the data quality of register-based statistics. In this paper, we present the principles of the quality framework and detailed results from the quality evaluation of the 2011 Austrian census. For each attribute in the census, a quality measure is derived from four hyperdimensions. The first three hyperdimensions focus on the documentation of the data, the usability of the records and the comparison of the data with an external source. The fourth hyperdimension assesses the quality of the imputations. Within the framework, all the available information on each attribute can be combined into one final quality indicator. This procedure makes it possible to track changes in quality during data processing and to compare the quality of different census generations.
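A minimal sketch of combining per-attribute scores from the four hyperdimensions into a single quality indicator. The weights and the simple weighted average are illustrative assumptions; the paper's framework defines its own combination rules.

```python
# Sketch only: rolling hyperdimension scores up into one per-attribute indicator.
from typing import Dict

# Assumed weights for: documentation, usability, external comparison, imputations.
WEIGHTS: Dict[str, float] = {
    "documentation": 0.2,
    "usability": 0.3,
    "external_comparison": 0.3,
    "imputation": 0.2,
}


def attribute_quality(scores: Dict[str, float]) -> float:
    """Weighted average of available hyperdimension scores (each in [0, 1])."""
    total_weight = sum(WEIGHTS[k] for k in scores)
    return sum(WEIGHTS[k] * v for k, v in scores.items()) / total_weight


if __name__ == "__main__":
    # Example: one census attribute before and after the imputation step.
    before = {"documentation": 0.9, "usability": 0.8, "external_comparison": 0.85}
    after = dict(before, imputation=0.7)
    print(round(attribute_quality(before), 3), round(attribute_quality(after), 3))
```

Recomputing the indicator at each processing step is what allows quality changes to be tracked through the pipeline and compared across census generations.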


2015 ◽  
Vol 24 (3) ◽  
pp. 361-369
Author(s):  
Saúl Fagúndez ◽  
Joaquín Fleitas ◽  
Adriana Marotta

The use of sensors has increased enormously in recent years, making them a valuable tool in many different areas. In this kind of scenario, the quality of the data becomes an extremely important issue; however, not much attention has been paid to this specific topic, and only a few existing works focus on it. In this paper, we present a proposal for managing data streams from sensors that are installed in patients’ homes in order to monitor their health. It focuses on processing the sensors’ data streams while taking data quality into account. To achieve this, a data quality model for this kind of data stream and an architecture for the monitoring system are proposed. Moreover, our work introduces a mechanism for avoiding false alarms generated by data quality problems.
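A minimal sketch of a quality-aware filter placed in front of the alarm logic: readings that fail simple accuracy and timeliness checks are discarded instead of triggering health alerts. Field names and thresholds are illustrative assumptions, not the model proposed in the paper.

```python
# Sketch only: filtering low-quality sensor readings before alarm evaluation.
from dataclasses import dataclass
from typing import Optional


@dataclass
class Reading:
    sensor_id: str
    timestamp: float   # seconds since epoch
    heart_rate: float  # beats per minute


def passes_quality(reading: Reading, last: Optional[Reading]) -> bool:
    """Basic accuracy and timeliness checks before alarm evaluation."""
    if not (20.0 <= reading.heart_rate <= 250.0):  # physically implausible value
        return False
    if last is not None and reading.timestamp <= last.timestamp:  # out of order
        return False
    return True


def should_alarm(reading: Reading, last: Optional[Reading]) -> bool:
    """Raise an alarm only for readings that pass the quality checks."""
    return passes_quality(reading, last) and reading.heart_rate > 150.0


if __name__ == "__main__":
    prev = Reading("hr-01", 1000.0, 72.0)
    glitch = Reading("hr-01", 1001.0, 400.0)  # sensor glitch: suppressed
    real = Reading("hr-01", 1002.0, 165.0)    # genuine high value: alarm
    print(should_alarm(glitch, prev), should_alarm(real, prev))
```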

