DAQUA-MASS: An ISO 8000-61 Based Data Quality Management Methodology for Sensor Data

Sensors, 2018, Vol 18 (9), pp. 3105
Author(s):  
Ricardo Perez-Castillo ◽  
Ana Carretero ◽  
Ismael Caballero ◽  
Moises Rodriguez ◽  
Mario Piattini ◽  
...  

The Internet-of-Things (IoT) introduces several technical and managerial challenges when it comes to the use of data generated and exchanged by and between various Smart, Connected Products (SCPs) that are part of an IoT system (i.e., physical, intelligent devices with sensors and actuators). Beyond the sheer volume and the heterogeneous exchange and consumption of data, it is paramount to ensure that data quality levels are maintained at every step of the data chain/lifecycle. Otherwise, the system may fail to meet its expected function. While Data Quality (DQ) is a mature field, existing solutions are highly heterogeneous. Therefore, we propose that companies, developers and vendors should align their data quality management mechanisms and artefacts with well-known best practices and standards, such as those provided by ISO 8000-61. This standard enables a process approach to data quality management, overcoming the difficulties of isolated data quality activities. This paper introduces DAQUA-MASS, a methodology based on ISO 8000-61 for data quality management in sensor networks. The methodology consists of four steps according to the Plan-Do-Check-Act cycle by Deming.
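The Plan-Do-Check-Act cycle underlying DAQUA-MASS can be illustrated with a minimal sketch. The rule set, threshold, and `SensorReading` type below are hypothetical illustrations, not part of the methodology itself:

```python
from dataclasses import dataclass

@dataclass
class SensorReading:
    sensor_id: str
    value: float

def plan(readings):
    # Plan: define target data quality rules (here, a hypothetical valid range)
    return {"min": 0.0, "max": 100.0}

def do(readings, rules):
    # Do: apply the rules, keeping only readings that satisfy them
    return [r for r in readings if rules["min"] <= r.value <= rules["max"]]

def check(readings, cleaned):
    # Check: measure the achieved quality level as the fraction of valid readings
    return len(cleaned) / len(readings) if readings else 1.0

def act(quality, threshold=0.95):
    # Act: decide whether another improvement cycle is needed
    return quality < threshold
```

In a real deployment each step would map to the corresponding ISO 8000-61 data quality management processes rather than a single range check.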

Sensors, 2021, Vol 21 (17), pp. 5834
Author(s):  
Lina Zhang ◽  
Dongwon Jeong ◽  
Sukhoon Lee

Nowadays, IoT is being used in more and more application areas, and the importance of IoT data quality is widely recognized by practitioners and researchers. The requirements for data and its quality vary across applications and organizations in different contexts. Many methodologies and frameworks include techniques for defining, assessing, and improving data quality. However, due to the diversity of requirements, it can be a challenge to choose the appropriate technique for an IoT system. This paper surveys data quality frameworks and methodologies for IoT data, along with related international standards, comparing them in terms of data types, data quality definitions, dimensions and metrics, and the choice of assessment dimensions. The survey is intended to help narrow down the possible choices of IoT data quality management techniques.


Author(s):  
Suranga C. H. Geekiyanage ◽  
Dan Sui ◽  
Bernt S. Aadnoy

Drilling industry operations heavily depend on digital information. Data analysis is a process of acquiring, transforming, interpreting, modelling, displaying and storing data with the aim of extracting useful information, so that decision-making, action execution, event detection and incident management in a system can be handled in an efficient and reliable manner. This paper aims to provide an approach to understand, cleanse, improve and interpret post-well or real-time data to preserve or enhance data features such as accuracy, consistency, reliability and validity. Data quality management is a process with three major phases. Phase I is a pre-data quality evaluation to identify data issues such as missing or incomplete data, non-standard or invalid data, and redundant data. Phase II is an implementation of different data quality management practices, such as filtering, data assimilation and data reconciliation, to improve data accuracy and discover useful information. The third and final phase is a post-data quality evaluation, which is conducted to assure data quality and enhance system performance. In this study, a laboratory-scale drilling rig with a control system capable of drilling is utilized for data acquisition and quality improvement. Safe and efficient performance of such a control system relies heavily on the quality and sufficient availability of the data obtained while drilling. Pump pressure, top-drive rotational speed, weight on bit, drill string torque and bit depth are the available measurements. The data analysis is challenged by issues such as corruption of data due to noise, time delays, missing or incomplete data and external disturbances. To solve such issues, different data quality improvement practices are applied in the testing. These techniques help the intelligent system to achieve better decision-making and quicker fault detection.
The study from the laboratory-scale drilling rig clearly demonstrates the need for a proper data quality management process and a clear understanding of signal processing methods to carry out intelligent digitalization in the oil and gas industry.
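The three-phase process described above can be sketched in a few functions. This is a simplified illustration, not the authors' implementation: gap filling uses a plain mean fallback and smoothing uses a trailing moving average, both chosen here only as stand-ins for the filtering and reconciliation practices named in the abstract:

```python
from statistics import mean

def pre_quality_check(series):
    # Phase I: identify missing samples (None) in the raw sensor signal
    return [i for i, v in enumerate(series) if v is None]

def fill_and_smooth(series, window=3):
    # Phase II: fill gaps with the mean of the valid samples (a crude stand-in
    # for data assimilation), then smooth with a trailing moving average
    valid = [v for v in series if v is not None]
    fallback = mean(valid)
    filled = [v if v is not None else fallback for v in series]
    smoothed = []
    for i in range(len(filled)):
        lo = max(0, i - window + 1)
        smoothed.append(mean(filled[lo:i + 1]))
    return smoothed

def post_quality_check(series):
    # Phase III: confirm no missing values remain after improvement
    return all(v is not None for v in series)
```

In practice, each measurement channel (pump pressure, torque, bit depth, etc.) would get filters and reconciliation steps matched to its noise and delay characteristics.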


Author(s):  
Tarik Chafiq ◽  
Mohammed Ouadoud ◽  
Hassane Jarar Oulidi ◽  
Ahmed Fekri

The aim of this research work is to ensure the integrity and correction of a geotechnical database which contains anomalies. These anomalies occurred mainly in the phase of inputting and/or transferring data. The algorithm created in the framework of this paper was tested on a dataset of 70 core drillings. It is based on a multi-criteria analysis qualifying geotechnical data integrity using a sequential approach. The implementation of this algorithm produced a relevant set of output values, which will minimize processing time and manual verification. The application of the methodology used in this paper could be useful to define the type of foundation adapted to the nature of the subsoil, and thus estimate an adequate budget.
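A sequential multi-criteria integrity check of the kind described can be sketched as an ordered list of rules applied until the first violation. The rule names and record fields below are hypothetical examples, not the paper's actual criteria:

```python
# Hypothetical integrity criteria for a core-drilling record,
# applied sequentially in priority order
RULES = [
    ("depth_positive", lambda r: r["depth"] > 0),
    ("top_above_bottom", lambda r: r["top"] < r["bottom"]),
    ("coords_present", lambda r: r.get("x") is not None and r.get("y") is not None),
]

def qualify(record):
    # Apply criteria one by one; report the first violated rule, or None if all pass
    for name, rule in RULES:
        if not rule(record):
            return name
    return None
```

Records flagged by `qualify` would then be routed to correction instead of manual verification of the whole dataset.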

