Application of Sigma metrics in the quality control strategies of immunology and protein analytes

Author(s):  
Yanfen Luo ◽  
Xingxing Yan ◽  
Qian Xiao ◽  
Yifei Long ◽  
Jieying Pu ◽  
...  
Water ◽  
2019 ◽  
Vol 11 (5) ◽  
pp. 898 ◽  
Author(s):  
Zhongqing Wei ◽  
Xiangfeng Huang ◽  
Lijun Lu ◽  
Haidong Shangguan ◽  
Zhong Chen ◽  
...  

Traditional intercepting weirs control combined sewer pollution poorly, and urban drainage models, which require large amounts of basic data, have seen only limited adoption. To address these problems, this paper studied the characteristics of combined-flow pollution using an urban drainage model, simulated and optimized six interception control strategies, and proposed a water-quality-based interception strategy driven by the pollutant concentration of the combined sewage. The results showed that, compared with the traditional interception weir, interception control of stormwater discharge based on the combined pipe-network model can markedly improve the control rate of the various pollutants and reduce the interception volume required for pollution control. When interception was optimized for water quality control using the combination of chemical oxygen demand (COD) and NH4-N, the interception rate improved by 10.9% to 56.1% relative to the traditional interception weir and the intercepted water volume was reduced by 1432–6154 m³, which effectively improved the reliability and economy of the interception.
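For illustration only, a minimal Python sketch of a concentration-based interception rule of the kind the abstract describes: combined sewage is routed to the interceptor only when COD and NH4-N exceed chosen thresholds. The threshold values, the AND combination of the two pollutants, and the record layout are assumptions, not values or logic taken from the paper.

```python
# Sketch of a water-quality-based interception rule (illustrative assumptions).
from dataclasses import dataclass

@dataclass
class TimeStep:
    flow_m3: float    # combined-sewage volume in this interval (m3)
    cod_mg_l: float   # chemical oxygen demand (mg/L)
    nh4n_mg_l: float  # ammonia nitrogen (mg/L)

def water_quality_interception(series, cod_limit=120.0, nh4n_limit=8.0):
    """Return (intercepted volume in m3, fraction of COD load intercepted)."""
    intercepted_volume = 0.0
    cod_load_total = cod_load_intercepted = 0.0
    for step in series:
        cod_load = step.flow_m3 * step.cod_mg_l  # relative load units
        cod_load_total += cod_load
        # Intercept only when both pollutant concentrations are high (assumed rule).
        if step.cod_mg_l >= cod_limit and step.nh4n_mg_l >= nh4n_limit:
            intercepted_volume += step.flow_m3
            cod_load_intercepted += cod_load
    rate = cod_load_intercepted / cod_load_total if cod_load_total else 0.0
    return intercepted_volume, rate

# Example: a short, hypothetical storm event
event = [TimeStep(500, 200, 12), TimeStep(800, 90, 5), TimeStep(600, 150, 10)]
volume, rate = water_quality_interception(event)
print(f"intercepted {volume:.0f} m3, COD interception rate = {rate:.1%}")
```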


2018 ◽  
Vol 12 (4) ◽  
pp. 786-791 ◽  
Author(s):  
Curtis A. Parvin ◽  
Nikola A. Baumann

Background: Current laboratory risk management principles emphasize the importance of assessing laboratory quality control (QC) practices in terms of the risk of patient harm. Limited practical guidance or examples on how to do this are available. Methods: The patient risk model described in a published laboratory risk management guideline was combined with a recently reported approach to computing the predicted probability of patient harm to produce a risk management index (RMI) that compares the predicted probability of patient harm for a QC strategy to the acceptable probability of patient harm based on the expected severity of harm caused by an erroneously reported patient result. Results: Measurement procedure capability and quality control performance for two instruments measuring HbA1c in a laboratory were assessed by computing the RMI for each instrument individually and for the laboratory as a whole. Conclusions: This assessment provides a concrete example of how laboratory QC practices can be directly correlated to the risk of patient harm from erroneously reported patient results.
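As a rough illustration of the idea, the sketch below treats the risk management index (RMI) as the ratio of the predicted probability of patient harm under a QC strategy to the acceptable probability of harm for the expected severity of harm; the function name, instrument labels, and probability values are assumptions for demonstration, not figures from the study.

```python
# Sketch of a risk management index (RMI) comparison (illustrative values).

def risk_management_index(p_harm_predicted: float, p_harm_acceptable: float) -> float:
    """RMI <= 1 suggests the QC strategy keeps patient risk within the acceptable
    level; RMI > 1 flags a strategy that needs improvement."""
    return p_harm_predicted / p_harm_acceptable

# Example: two hypothetical HbA1c instruments and the laboratory as a whole
predicted = {"HbA1c instrument 1": 2.1e-4, "HbA1c instrument 2": 6.3e-4}
p_acceptable = 5.0e-4  # assumed acceptable probability of harm for this severity class

for name, p_pred in predicted.items():
    rmi = risk_management_index(p_pred, p_acceptable)
    print(f"{name}: RMI = {rmi:.2f} ({'acceptable' if rmi <= 1 else 'needs action'})")

lab_rmi = risk_management_index(sum(predicted.values()) / len(predicted), p_acceptable)
print(f"Laboratory overall: RMI = {lab_rmi:.2f}")
```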


2011 ◽  
Vol 54 (5) ◽  
pp. 951-957 ◽  
Author(s):  
Susanne P. Boyle ◽  
Paul J. Doolan ◽  
Clare E. Andrews ◽  
Raymond G. Reid

Geophysics ◽  
2017 ◽  
Vol 82 (4) ◽  
pp. B135-B146 ◽  
Author(s):  
Hung Nho Dinh ◽  
Mirko van der Baan ◽  
Martin Landrø

Many vintage reflection seismic surveys have nonrepeated acquisition geometries, or their final stacked/migrated sections were produced with different or inconsistent processing flows. This may lead to derived time-lapse attributes that are not internally consistent or are even conflicting. As a case study, we focused on a subsurface gas blowout that occurred in 1989 in the Norwegian sector of the North Sea. The 2D site survey data were acquired in 1988 and 1990, and the 3D data were acquired in 1991 and 2005. The various sets of legacy data are plagued by poor repeatability among acquisitions, different processing strategies, missing prestack data, and the presence of multiples, all of which severely complicate even qualitative interpretation of the gas migration associated with the underground blowout. Careful time-lapse processing can nonetheless extract useful information from such challenging legacy data: we first computed numerous attributes, including instantaneous amplitude differences, time shifts, time-lapse attenuation, and impedance inversions, and then applied judicious quality control, comparing the various attributes, to check for internally consistent results.
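For context, a minimal Python sketch of one of the attributes mentioned above, the instantaneous-amplitude difference between a baseline and a monitor trace computed from the Hilbert envelope. Trace alignment, amplitude balancing, and the other attributes (time shifts, attenuation, impedance inversion) are omitted; the array names and synthetic traces are illustrative assumptions, not data or code from the study.

```python
# Sketch of an instantaneous-amplitude time-lapse attribute.
import numpy as np
from scipy.signal import hilbert

def instantaneous_amplitude(trace: np.ndarray) -> np.ndarray:
    """Envelope of a seismic trace via the analytic signal."""
    return np.abs(hilbert(trace))

def amplitude_difference(baseline: np.ndarray, monitor: np.ndarray) -> np.ndarray:
    """Sample-by-sample envelope difference; assumes the traces are already
    aligned and amplitude-balanced."""
    return instantaneous_amplitude(monitor) - instantaneous_amplitude(baseline)

# Example with synthetic traces (the monitor carries an extra late, low-frequency event)
t = np.linspace(0.0, 1.0, 500)
baseline = np.sin(2 * np.pi * 30 * t) * np.exp(-3 * t)
monitor = baseline + 0.3 * np.sin(2 * np.pi * 10 * t) * (t > 0.5)
d_amp = amplitude_difference(baseline, monitor)
print(f"max |envelope difference| = {np.abs(d_amp).max():.3f}")
```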


2021 ◽  
Author(s):  
Jennifer Monereo Sánchez ◽  
Joost J.A. de Jong ◽  
Gerhard S. Drenthen ◽  
Magdalena Beran ◽  
Walter H. Backes ◽  
...  

Background: Quality control of brain segmentation is a fundamental step to ensure data quality. Manual quality control is the current gold standard, although it is unfeasible in large neuroimaging samples. Several options for automated quality control have been proposed, providing potentially time-efficient and reproducible alternatives. However, these have never been compared side by side, which prevents consensus on the appropriate QC strategy to use. This study aims to elucidate the changes that manual editing of brain segmentations produces in morphological estimates, and to analyze and compare the effects of different quality control strategies on the reduction of measurement error. Methods: We used structural MR images from 259 participants of The Maastricht Study. Morphological estimates were automatically extracted using FreeSurfer 6.0. A subsample of the brain segmentations with inaccuracies was manually edited, and morphological estimates were compared before and after editing. In parallel, 11 quality control strategies were applied to the full sample. These included: a manual strategy, manual-QC, in which images were visually inspected and manually edited; five automated strategies in which outliers were excluded based on the tools MRIQC and Qoala-T and the metrics global morphological measures, Euler numbers, and contrast-to-noise ratio; and five semi-automated strategies, in which the outliers detected by the same tools and metrics were not excluded but were visually inspected and manually edited. We used a regression of morphological brain measures against age as a test case to compare the changes in relative unexplained variance produced by each quality control strategy, using the reduction of relative unexplained variance as a measure of the increase in quality. Results: Manually editing brain surfaces produced changes that were particularly large in subcortical brain volumes and moderate in cortical surface area, cortical thickness, and hippocampal volumes. The exclusion of outliers based on Euler numbers yielded a larger reduction of relative unexplained variance for measurements of cortical area, subcortical volumes, and hippocampal subfields, while manual editing of brain segmentations performed best for cortical thickness. MRIQC produced a smaller reduction in relative unexplained variance, but one that was consistent across all types of measures. Unexpectedly, the exclusion of outliers based on global morphological measures produced an increase in relative unexplained variance, potentially removing more morphological information than noise from the sample. Conclusion: Overall, the automatic exclusion of outliers based on Euler numbers or MRIQC provides reliable and time-efficient quality control strategies that can be applied in large neuroimaging cohorts.
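For illustration, a minimal Python sketch of the Euler-number QC strategy discussed above: flag scans whose FreeSurfer surface Euler number is a low outlier relative to the sample, then compare the relative unexplained variance of an age regression before and after exclusion. The column names, outlier cutoff, and synthetic data are assumptions for demonstration, not the study's protocol or data.

```python
# Sketch of Euler-number outlier exclusion and its effect on unexplained variance.
import numpy as np
import pandas as pd

def euler_outliers(euler: pd.Series, n_mads: float = 3.5) -> pd.Series:
    """True where the Euler number is unusually low (many surface defects)."""
    med = euler.median()
    mad = (euler - med).abs().median()
    return euler < med - n_mads * mad

def relative_unexplained_variance(measure: pd.Series, age: pd.Series) -> float:
    """Residual variance of a linear age regression divided by total variance."""
    slope, intercept = np.polyfit(age, measure, deg=1)
    residuals = measure - (slope * age + intercept)
    return residuals.var() / measure.var()

# Example usage with a hypothetical table of FreeSurfer outputs (synthetic values)
rng = np.random.default_rng(0)
df = pd.DataFrame({
    "age": rng.uniform(40, 75, 259),
    "euler_number": rng.normal(-40, 15, 259),
    "total_surface_area": rng.normal(1.8e5, 1.5e4, 259),
})
keep = ~euler_outliers(df["euler_number"])
before = relative_unexplained_variance(df["total_surface_area"], df["age"])
after = relative_unexplained_variance(df.loc[keep, "total_surface_area"], df.loc[keep, "age"])
print(f"relative unexplained variance: {before:.3f} -> {after:.3f} after Euler-number QC")
```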

