data quality control
Recently Published Documents


TOTAL DOCUMENTS

473
(FIVE YEARS 168)

H-INDEX

26
(FIVE YEARS 4)

2022 ◽  
Vol 12 ◽  
Author(s):  
Anna Bánki ◽  
Martina de Eccher ◽  
Lilith Falschlehner ◽  
Stefanie Hoehl ◽  
Gabriela Markova

Online data collection with infants presents particular opportunities and challenges for developmental research. One of the most prevalent methods in infancy research is eye-tracking, which has been widely applied in laboratory settings to assess cognitive development. Technological advances now allow eye-tracking to be conducted online with various populations, including infants. However, the accuracy and reliability of online infant eye-tracking remain to be comprehensively evaluated. No research to date has directly compared webcam-based and in-lab eye-tracking data from infants, as has been done with adult data. The present study provides a direct comparison of in-lab and webcam-based eye-tracking data from infants who completed an identical looking-time paradigm in two different settings (in the laboratory or online at home). We assessed 4- to 6-month-old infants (n = 38) in an eye-tracking task that measured the detection of audio-visual asynchrony. Webcam-based and in-lab eye-tracking data were compared on eye-tracking and video data quality, infants’ viewing behavior, and experimental effects. Results revealed no differences between the in-lab and online settings in the frequency of technical issues or in participant attrition rates. Video data quality was comparable between settings in terms of completeness and brightness, despite lower frame rate and resolution online. Eye-tracking data quality was higher in the laboratory than online, except in the case of relative sample loss. The quantity of gaze data recorded by eye-tracking was significantly lower than that recorded by video in both settings. In valid trials, eye-tracking and video data captured infants’ viewing behavior uniformly, irrespective of setting. Despite the challenges common to infant eye-tracking across experimental settings, our results point to the need to further improve the precision of online eye-tracking with infants.
Taken together, online eye-tracking is a promising tool for assessing infants’ gaze behavior, but it requires careful data quality control. The demographic composition of both samples differed from the general population in caregiver education: our samples comprised caregivers with higher-than-average education levels, challenging the notion that online studies will per se reach more diverse populations.
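As an illustration of one of the quality metrics compared above, relative sample loss can be computed as the fraction of recorded gaze samples that carry no valid coordinates. The sketch below is a hypothetical example, not the study’s analysis code; the gaze values are invented.

```python
def sample_loss(gaze_samples):
    """Relative sample loss: fraction of recorded samples with no valid gaze."""
    valid = sum(1 for s in gaze_samples if s is not None)
    return 1.0 - valid / len(gaze_samples)

# None marks a lost sample (blink, tracking failure, dropped webcam frame).
in_lab = [(0.40, 0.50), None, (0.41, 0.52), (0.42, 0.50), (0.40, 0.49)]
online = [(0.40, 0.50), None, None, (0.43, 0.50), (0.39, 0.51)]

print(f"in-lab: {sample_loss(in_lab):.0%}, online: {sample_loss(online):.0%}")
# in-lab: 20%, online: 40%
```

A higher relative loss online, as in this toy comparison, is consistent with the lower frame rates and less controlled recording conditions of webcam-based setups.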


2022 ◽  
Author(s):  
Chen Li ◽  
Lei Gu ◽  
Zi Yi Li ◽  
Qin Qin Wang ◽  
Hui Ping Zhang ◽  
...  

Protein analysis from an average cell population often overlooks the cellular heterogeneity of expressed effector molecules, and knowledge about the regulation of key biological processes may remain obscure. Hence the need for single-cell proteomics (SCP) technologies. Without a microfluidic chip, expensive ultrasonic equipment, or a modified liquid chromatography (LC) system, we established an Ultra-sensitive and Easy-to-use multiplexed Single-Cell Proteomic workflow (UE-SCP). Specifically, the flexible sorting system ensured outstanding cell activity, high accuracy, remarkable efficiency, and robustness during single-cell isolation. Multiplexed isobaric labeling enabled high-throughput analysis on a trapped ion mobility spectrometry quadrupole time-of-flight mass spectrometer (timsTOF MS). Using this pipeline, we achieved single-cell proteome depths of over 2,000 protein groups in two human cell lines, HeLa and HEK-293T. A small-batch experiment identified and quantified more than 3,200 protein groups in 32 single cells, while a large-batch experiment identified and quantified about 4,000 protein groups in 96 single cells. All 128 single cells from the different cell lines could be clustered, unsupervised, on the basis of their proteomes. With integrated data quality control, data cleaning, and data analysis, we are confident that the UE-SCP platform will be straightforward to popularize and will promote biological applications of single-cell proteomics.
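The unsupervised clustering step mentioned above can be sketched with a minimal k-means over per-cell protein-group intensity vectors. This is an illustrative toy, not the authors’ pipeline: the cell profiles below are invented, and real SCP data would carry thousands of protein groups per cell rather than three.

```python
import math
import random

def kmeans(cells, k, iters=50, seed=0):
    """Minimal k-means: assign each cell to its nearest centroid, then
    recompute centroids as group means, repeating for a fixed iteration count."""
    rng = random.Random(seed)
    centroids = [list(c) for c in rng.sample(cells, k)]
    for _ in range(iters):
        groups = [[] for _ in range(k)]
        for c in cells:
            j = min(range(k), key=lambda i: math.dist(c, centroids[i]))
            groups[j].append(c)
        # Keep the old centroid if a group went empty this round.
        centroids = [[sum(col) / len(g) for col in zip(*g)] if g else centroids[i]
                     for i, g in enumerate(groups)]
    return centroids, groups

# Two hypothetical cell lines with distinct proteome profiles (toy intensities).
hela_like = [[9.1, 2.0, 7.2], [8.9, 2.2, 7.0], [9.3, 1.8, 7.4]]
hek_like = [[2.1, 8.8, 3.0], [1.9, 9.1, 2.8], [2.2, 9.0, 3.1]]
_, groups = kmeans(hela_like + hek_like, k=2)
print([len(g) for g in groups])  # the two cell lines separate: [3, 3]
```

With well-separated proteome profiles, even this crude sketch recovers the cell-line identity of every cell without labels, which is the essence of the unsupervised clustering result reported.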


MAUSAM ◽  
2021 ◽  
Vol 63 (1) ◽  
pp. 77-88
Author(s):  
J.K.S. YADAV ◽  
R.K. GIRI ◽  
L.R. MEENA

The Global Navigation Satellite System (GNSS) is now widely used in a variety of applications. The observation file received at a ground-based receiver for near-real-time estimation of Integrated Precipitable Water Vapour (IPWV) is mixed with ambiguities. Multi-path effects degrade both the positional accuracy and the satellite-to-receiver range of the system. The antenna design suppresses multi-path effects and cycle slips and influences the number of observations, the signal strength, and the data gaps within the data streams. This paper presents preliminary data quality control findings for a patch antenna (Leica X1202), a 3D choke ring antenna (Leica AR25 GNSS), and a Trimble Zephyr antenna (TRM 39105.00). The results show that the choke ring antenna has the fewest data gaps, cycle slips, and multi-path effects, along with an improvement in IPWV. The signal strength and the number of observations are also highest for the 3D choke ring antenna.
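Two of the quality measures discussed, data gaps and cycle slips, can be sketched as simple screens over an epoch stream. The sampling interval, threshold, and values below are illustrative assumptions, not the paper’s actual QC procedure.

```python
from statistics import median

def count_gaps(epochs, interval=30.0):
    """Number of missing epochs, given a nominal sampling interval in seconds."""
    return sum(int(round((b - a) / interval)) - 1
               for a, b in zip(epochs, epochs[1:]))

def flag_cycle_slips(phase, threshold=5.0):
    """Indices whose epoch-to-epoch carrier-phase difference deviates sharply
    from the typical (median) difference -- a crude cycle-slip screen."""
    diffs = [b - a for a, b in zip(phase, phase[1:])]
    med = median(diffs)
    return [i + 1 for i, d in enumerate(diffs) if abs(d - med) > threshold]

epochs = [0, 30, 60, 150, 180]         # epochs at 90 s and 120 s are missing
print(count_gaps(epochs))              # 2
phase = [0.0, 1.2, 2.4, 103.6, 104.8]  # ~100-cycle jump before index 3
print(flag_cycle_slips(phase))         # [3]
```

An antenna that suppresses multipath and signal dropouts, as the choke ring does here, would simply produce fewer flags from screens of this kind.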


2021 ◽  
Author(s):  
Francesco Battocchio ◽  
Jaijith Sreekantan ◽  
Arghad Arnaout ◽  
Abed Benaichouche ◽  
Juma Sulaiman Al Shamsi ◽  
...  

Abstract Drilling data quality is notoriously a challenge for any analytics application, owing to the complexity of the real-time data acquisition system, which routinely generates: (i) time-related issues caused by irregular sampling; (ii) channel-related issues in the form of non-uniform names and units and missing or wrong values; and (iii) depth-related issues caused by block position resets and depth compensation (for floating rigs). Artificial-intelligence drilling applications, on the other hand, typically require a consistent stream of high-quality data as input to their algorithms and for visualization. In this work we present an automated workflow, enhanced by data-driven techniques, that resolves complex quality issues, harmonizes sensor drilling data, and reports the quality of the dataset to be used for advanced analytics. The approach formalizes the characteristics, requirements, and constraints of sensor data within the context of drilling operations. The workflow leverages machine learning algorithms, statistics, signal processing, and rule-based engines to detect data quality issues including erroneous values, outliers, bias, drifts, noise, and missing values. Once classified, quality issues are scored and treated on a context-specific basis in order to recover the maximum volume of data while avoiding information loss. The result is a data quality and preparation engine that organizes drilling data for further advanced analytics and reports dataset quality through key performance indicators. This novel processing workflow recovered more than 90% of a drilling dataset of 18 offshore wells that otherwise could not have been used for analytics. This was achieved by resolving specific issues, including resampling of time series with gaps and differing sampling rates, and smart imputation of wrong or missing data while preserving the consistency of the dataset across all channels. A further improvement would be recovering data values that fell outside a meaningful range because of sensor drift or depth resets. The present work automates the end-to-end workflow for quality control of drilling sensor data using advanced artificial intelligence (AI) algorithms. It detects and classifies patterns of wrong or missing data and recovers them through a context-driven approach that prevents information loss, so that the maximum amount of data is recovered for AI drilling applications. The workflow also enables optimal time synchronization of different sensors streaming data at different frequencies within discontinuous time intervals.
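The gap-aware resampling step described above can be sketched as follows. This is an assumed implementation for illustration, not the paper’s engine: it maps an irregularly sampled channel onto a uniform grid, interpolating only across gaps shorter than a `max_gap` threshold, and leaves longer outages as missing values rather than inventing data.

```python
def resample(times, values, step, max_gap):
    """Resample (times, values) onto a uniform time grid. Emits None where the
    surrounding data gap exceeds max_gap, so long outages are never bridged."""
    grid = [times[0] + i * step
            for i in range(int((times[-1] - times[0]) / step) + 1)]
    out, j = [], 0
    for t in grid:
        while j + 1 < len(times) and times[j + 1] <= t:
            j += 1
        if t == times[j]:                        # exact hit on a real sample
            out.append(values[j])
        elif j + 1 == len(times):                # past the last sample
            out.append(None)
        elif times[j + 1] - times[j] > max_gap:  # long outage: do not bridge
            out.append(None)
        else:                                    # interpolate short gaps only
            w = (t - times[j]) / (times[j + 1] - times[j])
            out.append(values[j] * (1 - w) + values[j + 1] * w)
    return grid, out

times = [0, 1, 2, 10, 11]        # seconds; an 8 s outage between t=2 and t=10
values = [5.0, 5.2, 5.4, 7.0, 7.2]
grid, out = resample(times, values, step=1.0, max_gap=3.0)
```

Here the grid points inside the 8 s outage come back as `None` instead of a fabricated ramp between 5.4 and 7.0, which is the information-preserving behavior the workflow aims for.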


Author(s):  
Farkhondeh Asadi ◽  
Nahid Ramezanghorbani ◽  
Sohrab Almasi ◽  
Mehrnaz Hajiabedin Rangraz

Background: Data management related to eye injuries is vital to improving the care process, improving treatment, and implementing preventive programs. Implementing a registry to manage these data is an integral part of this process. This systematic review aimed to identify processes related to eye injury registries. Methods: Databases including PubMed, Web of Science, Embase, and Scopus were searched for articles from 2010 to October 2020 using the keywords “eye injuries” and “registry”. The identified registry processes, such as case finding, data collection, abstracting, reporting, follow-up, and data quality control, are presented in this review. Results: Of 1,493 articles retrieved, 30 were selected for this study based on the inclusion and exclusion criteria. The majority of these studies were conducted in the United States. All registries performed case finding, and the most common resources for case finding were medical documents, reports, and screening results. Moreover, the majority of registries collected data electronically; however, few used data quality attributes to improve the collected data. Conclusion: An eye injury registry plays an important role in the management of eye injury data and, as a result, enables better management of these data. Given that the quality of collected data plays a vital role in adopting prevention strategies, it is essential to use high-quality data and quality control methods when planning and designing eye injury registries.


Sensors ◽  
2021 ◽  
Vol 21 (22) ◽  
pp. 7659
Author(s):  
Mikhail Makarov ◽  
Ilya Aslamov ◽  
Ruslan Gnatovsky

An automatic hydro-meteorological station (AHMS) was designed to monitor the littoral zone of Lake Baikal in areas with high anthropogenic pressure. The developed AHMS was installed near the Bolshiye Koty settlement (southern basin). This AHMS is a first step toward acquiring the competencies necessary for developing a monitoring network of the Baikal natural territory. To increase flexibility of adjustment and repeatability, we developed the AHMS as a low-cost modular system. It is equipped with a weather station and sensors measuring water temperature, pH, dissolved oxygen, redox potential, conductivity, chlorophyll-a, and turbidity. This article describes the main AHMS functions (hardware and software) and the measures taken to ensure data quality control. We present the results of the first two periods of its operation. The data acquired during these periods demonstrated that, to obtain accurate measurements and to detect and correct errors caused mainly by biofouling of the sensors and calibration bias, cross-checking AHMS readings against laboratory studies is necessary for parameters such as pH and chlorophyll-a. The experience gained should become the basis for further development of the monitoring network of the Baikal natural territory.
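Automatic QC of the kind described, for example flagging implausible or stuck sensor readings before they reach laboratory cross-checks, can be sketched as below. The pH limits and run-length threshold are illustrative assumptions, not the station’s actual configuration.

```python
PH_RANGE = (6.0, 10.0)  # assumed plausibility limits for lake surface water

def range_flags(series, lo, hi):
    """Flag samples outside the plausible physical range."""
    return [not (lo <= v <= hi) for v in series]

def stuck_flags(series, n=3):
    """Flag runs where the sensor repeats the same value n or more times,
    a common symptom of a fouled or failed probe."""
    flags = [False] * len(series)
    run = 1
    for i in range(1, len(series)):
        run = run + 1 if series[i] == series[i - 1] else 1
        if run >= n:
            for j in range(i - run + 1, i + 1):
                flags[j] = True
    return flags

ph = [8.1, 8.2, 11.5, 8.3, 8.3, 8.3, 8.2]
print(range_flags(ph, *PH_RANGE))  # index 2 is out of range
print(stuck_flags(ph))             # indices 3-5 repeat the same value
```

Checks like these catch gross sensor faults automatically; the slow drifts from biofouling and calibration bias noted above still require comparison with laboratory measurements.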

