Sampling Errors and Control of Assay Data Quality in Exploration and Mining Geology

Author(s):  
Marat Abzalov


2021 ◽
Author(s):  
Jaqueline Driemeyer Correia Horvath ◽  
Marina Bessel ◽  
Natalia Luiza Kops ◽  
Flávia Moreno Alves Souza ◽  
Gerson Fernando Mendes Pereira ◽  
...  

BACKGROUND The credibility of a study and its internal and external validity depend crucially on the quality of the data produced. Quality control aims to monitor sampling and measurement errors during the execution of a study and rests mainly on two pillars: planning and standardization of procedures. OBJECTIVE The present article aimed to describe the stages of quality control in the POP-Brazil study and to present an analysis of its quality indicators. METHODS Quality assurance and control comprised several phases and processes that began with the planning of the study and continued throughout the development of the project; accordingly, all centers were trained on site. RESULTS The data were collected through a structured questionnaire and the collection of biological samples, both performed by more than 250 trained and certified health professionals. To correct possible inadequacies, all 119 centers (public health units) received at least one monitoring visit, which evaluated the professionals' performance and the process of completing the online data platform. The data were monitored daily and audited through double data entry performed by the central team. Data reliability was assessed with the test-retest method, comparing data from the online platform with a second administration of the interview, conducted by telephone, also by the central team. Agreement between test and retest was considered good (kappa between 0.59 and 0.74). Large multicenter clinical trials are the basis of evidence-based medicine and health prevention, so their design, logistics, and quality processes should always be carefully considered. CONCLUSIONS This article presents the processes and quality indicators of the POP-Brazil study so that other studies can draw on them to generate reliable data.
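
The test-retest agreement reported above is quantified with Cohen's kappa. As a minimal illustration only (not the POP-Brazil analysis code), the sketch below computes kappa for a small set of hypothetical paired answers from the two interview applications; the function name and example data are ours.

```python
from collections import Counter

def cohens_kappa(test, retest):
    """Cohen's kappa for two paired lists of categorical answers."""
    n = len(test)
    # Observed agreement: fraction of pairs where both applications match.
    p_o = sum(a == b for a, b in zip(test, retest)) / n
    # Chance agreement, from the marginal frequency of each answer category.
    freq_test, freq_retest = Counter(test), Counter(retest)
    p_e = sum(freq_test[c] / n * freq_retest[c] / n
              for c in set(test) | set(retest))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical paired answers: online platform (test) vs. telephone retest.
test   = ["yes", "no", "yes", "yes", "no", "no", "yes", "no"]
retest = ["yes", "no", "yes", "no",  "no", "no", "yes", "yes"]
print(f"kappa = {cohens_kappa(test, retest):.2f}")
```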


2011 ◽  
pp. 218-225 ◽  
Author(s):  
Karolyn Kerr ◽  
Tony Norris

The increasingly information-intensive nature of health care demands a proactive and strategic approach to data quality to ensure the right information is available to the right person at the right time in the right format. The approach must also encompass the rights of the patient to have their health data protected and used in an ethical way. This article describes the principles that establish good practice and overcome practical barriers in defining and controlling data quality in health data collections, and the mechanisms and frameworks that can be developed to achieve and sustain quality. The experience of a national health data quality project in New Zealand is used to illustrate the issues.


2021 ◽  
Author(s):  
Ghamdan Gamal Alkholidy ◽  
Labiba Saeed Anam ◽  
Ali Ali Al Mahaqri ◽  
Yousef S. Khader

BACKGROUND The national Severe Acute Respiratory Illness (SARI) surveillance system in Yemen was established in 2010 to monitor SARI occurrence in humans and provide a foundation for detecting SARI outbreaks. OBJECTIVE To ensure that the objectives of the national surveillance are being met, this study aimed to determine the level of usefulness and assess the performance of the SARI surveillance system in Yemen. METHODS The updated Centers for Disease Control and Prevention (CDC) guidelines were used for the evaluation. Related documents and reports were reviewed. Data were collected from four central-level managers and stakeholders and from ten focal points at four sentinel sites using a semi-structured questionnaire. For each attribute, the percent score was calculated and ranked as follows: very poor (≤20%), poor (>20% to ≤40%), average (>40% to ≤60%), good (>60% to ≤80%), and excellent (>80%). RESULTS As rated by the evaluators, the SARI surveillance system achieved its targets. The system attributes flexibility (percent score: 86%) and acceptability (percent score: 82%) were rated as excellent, and simplicity (percent score: 74%) and stability (percent score: 75%) were rated as good. The percent score for timeliness was 23% in 2018, indicating poor timeliness. The overall data quality percent score of the SARI system was 98.5%. Despite its many strengths, the SARI system has a notable weakness: it depends on irregular external financial support. CONCLUSIONS Overall, the SARI surveillance system was useful in estimating morbidity and mortality, monitoring disease trends, and stimulating research to inform prevention and control measures. The system was excellent in flexibility, acceptability, and data quality; its simplicity and stability were good; its timeliness was poor. It is recommended to expand the system and engage private sites in SARI surveillance, establish an electronic database at the central and peripheral levels, and provide the laboratory with the reagents needed for confirmatory testing.
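
The rating thresholds described in the methods translate directly into a simple lookup. The sketch below is a minimal illustration of that rubric; the function name and the dictionary are ours, with the attribute scores copied from the abstract.

```python
def rate(score_percent: float) -> str:
    """Map a percent score to the rating scale used in the evaluation."""
    if score_percent <= 20:
        return "very poor"
    if score_percent <= 40:
        return "poor"
    if score_percent <= 60:
        return "average"
    if score_percent <= 80:
        return "good"
    return "excellent"

# Attribute scores reported in the abstract.
scores = {
    "flexibility": 86,
    "acceptability": 82,
    "simplicity": 74,
    "stability": 75,
    "timeliness (2018)": 23,
    "data quality": 98.5,
}
for attribute, score in scores.items():
    print(f"{attribute}: {score}% -> {rate(score)}")
```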


2018 ◽  
Author(s):  
Rosela Golloshi ◽  
Jacob Sanders ◽  
Rachel Patton McCord

The 3D organization of eukaryotic chromosomes affects key processes such as gene expression, DNA replication, cell division, and the response to DNA damage. The genome-wide chromosome conformation capture (Hi-C) approach can characterize the landscape of 3D genome organization by measuring interaction frequencies between all genomic regions. Hi-C protocol improvements and rapid advances in DNA sequencing power have made Hi-C useful for diverse biological systems, not only to elucidate the role of 3D genome structure in proper cellular function, but also to characterize genomic rearrangements, assemble new genomes, and consider chromatin interactions as potential biomarkers for disease. Yet the Hi-C protocol is still complex and subject to variations at numerous steps that can affect the resulting data. Thus, there is still a need for better understanding and control of the factors that contribute to Hi-C experiment success and data quality. Here, we evaluate recently proposed Hi-C protocol modifications as well as often overlooked variables in sample preparation and examine their effects on Hi-C data quality. We examine artifacts that can occur during Hi-C library preparation, including microhomology-based artificial template copying and chimera formation, which can add noise to the downstream data. Exploring the mechanisms underlying Hi-C artifacts pinpoints steps that should be further optimized in the future. To improve the utility of Hi-C in characterizing the 3D genome of specialized populations of cells or small samples of primary tissue, we identify steps prone to DNA loss that should be optimized to adapt Hi-C to lower cell numbers.

Highlights:
- Variability in Hi-C libraries can arise from early steps of cell preparation
- Hi-C 2.0 changes to interaction capture steps also benefit 6-cutter libraries
- Artificial molecule fusions can arise during end repair and PCR, increasing noise
- Common causes of Hi-C DNA loss identified for future optimization
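
A routine Hi-C library check related to the protocol variables discussed above is counting reads that contain the expected ligation-junction sequence, which reflects fill-in and ligation efficiency. The sketch below is a generic illustration, not code from this paper; it assumes a HindIII (6-cutter) library whose filled-in junction is AAGCTAGCTT, and the FASTQ file name is hypothetical.

```python
import gzip

# Filled-in ligation junction for a HindIII (6-cutter) Hi-C library.
# A 4-cutter library (e.g., MboI/DpnII) would use "GATCGATC" instead.
HINDIII_JUNCTION = "AAGCTAGCTT"

def junction_fraction(fastq_path: str, junction: str = HINDIII_JUNCTION) -> float:
    """Return the fraction of reads in a FASTQ(.gz) file containing the junction.

    A very low fraction can point to inefficient fill-in or ligation,
    one of the protocol variables that affect Hi-C data quality.
    """
    opener = gzip.open if fastq_path.endswith(".gz") else open
    total = with_junction = 0
    with opener(fastq_path, "rt") as fq:
        for i, line in enumerate(fq):
            if i % 4 == 1:  # the sequence line of each 4-line FASTQ record
                total += 1
                if junction in line:
                    with_junction += 1
    return with_junction / total if total else 0.0

# Example (hypothetical file name):
# print(f"{junction_fraction('hic_R1.fastq.gz'):.1%} of reads contain the junction")
```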

