Automatic Quality Control of Cardiac MRI Segmentation in Large-Scale Population Imaging

Author(s):  
Robert Robinson ◽  
Vanya V. Valindria ◽  
Wenjia Bai ◽  
Hideaki Suzuki ◽  
Paul M. Matthews ◽  
...  
2020 ◽  
Author(s):  
Shirin Moossavi ◽  
Kelsey Fehr ◽  
Theo J. Moraes ◽  
Ehsan Khafipour ◽  
Meghan B. Azad



2018 ◽  
Vol 43 ◽  
pp. 129-141 ◽  
Author(s):  
Xènia Albà ◽  
Karim Lekadir ◽  
Marco Pereañez ◽  
Pau Medrano-Gracia ◽  
Alistair A. Young ◽  
...  

Microbiome ◽  
2021 ◽  
Vol 9 (1) ◽  
Author(s):  
Shirin Moossavi ◽  
Kelsey Fehr ◽  
Ehsan Khafipour ◽  
Meghan B. Azad

Abstract
Background: Quality control, including assessment of batch variability and confirmation of repeatability and reproducibility, is an integral component of high-throughput omics studies, including microbiome research. Batch effects can mask true biological results and/or lead to irreproducible conclusions and interpretations. Low-biomass samples in microbiome research are prone to reagent contamination; yet quality control procedures for low-biomass samples in large-scale microbiome studies are not well established.
Results: In this study, we propose a framework for an in-depth, step-by-step approach to address this gap. The framework consists of three independent stages: (1) verification of sequencing accuracy by assessing technical repeatability and reproducibility of the results using mock communities and biological controls; (2) contaminant removal and batch variability correction by applying a two-tier strategy using statistical algorithms (e.g. decontam) followed by comparison of the data structure between batches; and (3) corroboration of the repeatability and reproducibility of microbiome composition and downstream statistical analysis. Using this approach on milk microbiota data from the CHILD Cohort generated in two batches (extracted and sequenced in 2016 and 2019), we identified potential reagent contaminants that were missed by standard algorithms and substantially reduced contaminant-induced batch variability. Additionally, we confirmed the repeatability and reproducibility of our results in each batch before merging them for downstream analysis.
Conclusion: This study provides important insight to advance quality control efforts in low-biomass microbiome research. Within-study quality control that takes advantage of the data structure (i.e. differential prevalence of contaminants between batches) would enhance the overall reliability and reproducibility of research in this field.
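
To illustrate the idea behind stage (2), below is a minimal Python sketch of prevalence-based contaminant flagging, the general approach used by tools such as decontam (the sketch is not the decontam package or the study's actual pipeline). It assumes a hypothetical taxa-by-samples count table `counts` (pandas DataFrame) and a boolean Series `is_control` marking negative-control samples.

```python
import pandas as pd
from scipy.stats import fisher_exact

def flag_prevalence_contaminants(counts, is_control, alpha=0.05):
    """Flag taxa that are more prevalent in negative controls than in true samples."""
    present = counts > 0                                 # presence/absence matrix
    ctrl = is_control[is_control].index                  # negative-control sample IDs
    samp = is_control[~is_control].index                 # true-sample IDs
    flags = {}
    for taxon in counts.index:
        in_ctrl = int(present.loc[taxon, ctrl].sum())
        in_samp = int(present.loc[taxon, samp].sum())
        # 2x2 table of presence/absence in controls vs. true samples
        table = [[in_ctrl, len(ctrl) - in_ctrl],
                 [in_samp, len(samp) - in_samp]]
        # one-sided Fisher test: contaminants should be over-represented in controls
        _, p = fisher_exact(table, alternative="greater")
        flags[taxon] = p < alpha
    return pd.Series(flags, name="is_contaminant")
```

Consistent with the two-tier strategy described above, taxa flagged this way would then be cross-checked against their differential prevalence between batches (a reagent contaminant introduced in one batch should appear predominantly in that batch) before being removed.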


1966 ◽  
Vol 05 (02) ◽  
pp. 67-74 ◽  
Author(s):  
W. I. Lourie ◽  
W. Haenszel

Quality control of data collected in the United States by the Cancer End Results Program, using punchcards prepared by participating registries in accordance with a Uniform Punchcard Code, is discussed. Existing arrangements decentralize responsibility for editing and related data processing to the local registries, with centralization of tabulating and statistical services in the End Results Section, National Cancer Institute. The most recent deck of punchcards represented over 600,000 cancer patients; approximately 50,000 newly diagnosed cases are added annually. Mechanical editing and inspection of punchcards and field audits are the principal tools for quality control. Mechanical editing of the punchcards includes testing for blank entries and detection of inadmissible or inconsistent codes. Highly improbable codes are subjected to special scrutiny. Field audits include the drawing of a 1-10 percent random sample of punchcards submitted by a registry; the charts are then reabstracted and recoded by an NCI staff member, and differences between the punchcard and the results of independent review are noted.
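
As a rough illustration of the mechanical editing and field-audit sampling described above, here is a small Python sketch. The field names and admissible code list are hypothetical stand-ins; the actual Uniform Punchcard Code layout is not given in the abstract.

```python
import random

# Hypothetical admissible codes for a "stage" field (stand-in for the real code lists).
ADMISSIBLE_STAGE = {"1", "2", "3", "4", "9"}

def edit_record(record):
    """Mechanical edit of one punchcard record: flag blank entries and inadmissible codes."""
    errors = []
    for field in ("primary_site", "stage", "year_of_diagnosis"):  # hypothetical fields
        if not record.get(field):                                 # blank entry
            errors.append(f"{field}: blank entry")
    if record.get("stage") and record["stage"] not in ADMISSIBLE_STAGE:
        errors.append("stage: inadmissible code")
    return errors

def draw_audit_sample(records, fraction=0.05):
    """Draw a 1-10 percent random sample of records for field-audit reabstraction."""
    k = max(1, round(fraction * len(records)))
    return random.sample(records, k)
```

In the program described above, records failing such edits would be queried back to the submitting registry, while the audit sample would be independently reabstracted and recoded and compared against the original punchcards.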


2021 ◽  
Vol 71 ◽  
pp. 102029
Author(s):  
Evan Hann ◽  
Iulia A. Popescu ◽  
Qiang Zhang ◽  
Ricardo A. Gonzales ◽  
Ahmet Barutçu ◽  
...  

2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Youn Young Park ◽  
Kil‑yong Lee ◽  
Seong Taek Oh ◽  
Sang Hyun Park ◽  
Kyung Do Han ◽  
...  

An amendment to this paper has been published and can be accessed via a link at the top of the paper.

