Design and evaluation of statistical control procedures: applications of a computer "quality control simulator" program.

1981 ◽  
Vol 27 (9) ◽  
pp. 1536-1545 ◽  
Author(s):  
J O Westgard ◽  
T Groth

Abstract A computer simulation program has been developed to aid in designing and evaluating statistical control procedures. This "QC Simulator" (quality control) program permits the user to study the effects of different factors on the performance of quality-control procedures. These factors may be properties of the analytical procedure, characteristics of the instrument system, or conditions of the quality-control procedure. The performance of a control procedure is characterized by its probability for rejection, estimated at several different magnitudes of random and systematic error. These performance characteristics are presented graphically as power functions: plots of the probability for rejection vs. the size of the analytical error. The utility of this simulation tool is illustrated by application to multi-rule single-value procedures, mean and range procedures, and a trend analysis procedure. Control rules must be chosen carefully to minimize false rejections and optimize error detection with multi-rule procedures; control limits must be calculated carefully for optimum performance of mean and range procedures; and the level of significance for testing control must be selected carefully for the trend analysis procedure.
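The power-function idea described above is easy to reproduce: the following is a hypothetical Monte Carlo sketch (not Westgard and Groth's program) that estimates the probability of rejection for a simple 1:3s rule at a given systematic error, expressed in SD units.

```python
import random

def power_1_3s(shift_sd, n_per_run=2, runs=20000, seed=1):
    """Estimate the probability of run rejection for a 1:3s rule by
    Monte Carlo. shift_sd is a systematic error added to every control
    value, in SD units; a run is rejected if any of its n_per_run
    control values falls outside +/-3 SD of the target."""
    rng = random.Random(seed)
    rejected = 0
    for _ in range(runs):
        values = [rng.gauss(shift_sd, 1.0) for _ in range(n_per_run)]
        if any(abs(v) > 3.0 for v in values):
            rejected += 1
    return rejected / runs
```

Evaluating this over a grid of shifts traces one power function: the value at shift 0 is the false-rejection probability, and the curve rises toward 1 as the systematic error grows.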

1992 ◽  
Vol 38 (2) ◽  
pp. 204-210 ◽  
Author(s):  
Aristides T Hatjimihail

Abstract I have developed an interactive microcomputer simulation program for the design, comparison, and evaluation of alternative quality-control (QC) procedures. The program estimates the probabilities for rejection under different conditions of random and systematic error when these procedures are used and plots their power function graphs. It also estimates the probabilities for detection of critical errors, the defect rate, and the test yield. To allow a flexible definition of the QC procedures, it includes an interpreter. Various characteristics of the analytical process and the QC procedure can be user-defined. The program extends the concepts of the probability for error detection and of the power function to describe the results of the introduction of error between runs and within a run. The usefulness of this approach is illustrated with some examples.
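The interpreter-based definition of procedures can be pictured with a minimal sketch (a hypothetical illustration, not Hatjimihail's actual interpreter): individual control rules become predicates over control values in SD units, and a QC procedure is any Boolean combination of them.

```python
def rule_1ks(values, k):
    """1:ks rule: reject if any single value exceeds k SD from target."""
    return any(abs(v) > k for v in values)

def rule_nxs(values, n, x):
    """n:xs rule: reject if n consecutive values all exceed x SD on the
    same side of the target."""
    for i in range(len(values) - n + 1):
        window = values[i:i + n]
        if all(v > x for v in window) or all(v < -x for v in window):
            return True
    return False

def multirule(values):
    """Example procedure: reject if 1:3s OR 2:2s fires."""
    return rule_1ks(values, 3.0) or rule_nxs(values, 2, 2.0)
```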


Author(s):  
Z. Lari ◽  
K. Al-Durgham ◽  
A. Habib

Over the past few years, laser scanning systems have been acknowledged as the leading tools for collecting high-density 3D point clouds over physical surfaces for many different applications. However, no interpretation or scene classification is performed during the acquisition of these datasets; the collected data must therefore be processed to extract the required information. Segmentation is usually considered the fundamental step in information extraction from laser scanning data, and various approaches have been developed for segmenting 3D laser scanning data. None of them, however, is exempt from anomalies caused by disregarding the internal characteristics of laser scanning data, improper selection of segmentation thresholds, or other problems arising during the segmentation procedure. Quality control procedures are therefore required to evaluate the segmentation outcome and report how often the expected problems occur. The few quality control techniques proposed so far for evaluating laser scanning segmentation usually require reference data and user intervention to assess segmentation results. To resolve these problems, a new quality control procedure is introduced in this paper. It makes hypotheses regarding potential problems that might occur in the segmentation process, detects instances of such problems, quantifies their frequency, and suggests possible actions to remedy them. The feasibility of the proposed approach is verified through quantitative evaluation of planar and linear/cylindrical segmentation outcomes from two recently developed parameter-domain and spatial-domain segmentation techniques.
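As one concrete illustration of detecting and counting instances of a hypothesized segmentation problem, a planar segment can be checked by fitting z = ax + by + c in the least-squares sense and flagging segments whose RMS point-to-plane residual exceeds a threshold; the flag count quantifies the problem frequency. This is a hypothetical sketch, not the paper's actual procedure.

```python
from math import sqrt

def fit_plane(points):
    """Least-squares fit of z = a*x + b*y + c to (x, y, z) points,
    via the 3x3 normal equations solved with Cramer's rule."""
    n = len(points)
    sxx = sum(x * x for x, _, _ in points)
    sxy = sum(x * y for x, y, _ in points)
    syy = sum(y * y for _, y, _ in points)
    sx = sum(x for x, _, _ in points)
    sy = sum(y for _, y, _ in points)
    sz = sum(z for _, _, z in points)
    sxz = sum(x * z for x, _, z in points)
    syz = sum(y * z for _, y, z in points)

    def det(m):
        return (m[0][0] * (m[1][1] * m[2][2] - m[1][2] * m[2][1])
                - m[0][1] * (m[1][0] * m[2][2] - m[1][2] * m[2][0])
                + m[0][2] * (m[1][0] * m[2][1] - m[1][1] * m[2][0]))

    d = det([[sxx, sxy, sx], [sxy, syy, sy], [sx, sy, n]])
    a = det([[sxz, sxy, sx], [syz, syy, sy], [sz, sy, n]]) / d
    b = det([[sxx, sxz, sx], [sxy, syz, sy], [sx, sz, n]]) / d
    c = det([[sxx, sxy, sxz], [sxy, syy, syz], [sx, sy, sz]]) / d
    return a, b, c

def planarity_rms(points):
    """RMS point-to-plane residual of the fitted plane."""
    a, b, c = fit_plane(points)
    return sqrt(sum((a * x + b * y + c - z) ** 2
                    for x, y, z in points) / len(points))

def count_problem_segments(segments, threshold):
    """Count 'planar' segments whose residual exceeds the threshold,
    i.e. instances of one hypothesized segmentation problem."""
    return sum(1 for seg in segments if planarity_rms(seg) > threshold)
```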


1996 ◽  
Vol 42 (3) ◽  
pp. 392-396 ◽  
Author(s):  
E Olafsdottir ◽  
J O Westgard ◽  
S S Ehrmeyer ◽  
K D Fallon

Abstract We have assessed how variation in the matrix of control materials would affect the error-detection and false-rejection characteristics of quality-control (QC) procedures used to monitor PO2 in blood gas measurements. To determine the expected QC performance, we generated power curves for S(mat)/S(meas) ratios of 0.0-4.0. These curves were used to estimate the probabilities of rejecting analytical runs having medically important errors, calculated from the quality required by the CLIA '88 proficiency testing criterion and the precision and accuracy expected for a typical analytical system. When S(mat)/S(meas) ratios are low, the effects of matrix on QC performance are not serious, permitting selection of QC procedures based on simple power curves for a single component of variation. As S(mat)/S(meas) ratios increase, single-rule procedures generally show a loss in error detection, whereas multirule procedures, including the 3:1s control rule, show an increase in false rejections. An optimized QC design is presented.
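The effect of matrix variation on a power curve can be sketched with a hypothetical simulation (not the authors' model): control values pick up extra variation S(mat) = r·S(meas), while the 1:ks limits remain scaled to S(meas) alone.

```python
import math
import random

def p_reject_1ks(shift, k=3.0, r=0.0, n=1, trials=20000, seed=7):
    """Probability that a 1:ks rule with limits at +/- k*S(meas)
    rejects a run when the control material adds matrix variation
    r*S(meas). Control values ~ N(shift, sqrt(1 + r^2)), in units
    of S(meas)."""
    rng = random.Random(seed)
    sd = math.sqrt(1.0 + r * r)
    hits = 0
    for _ in range(trials):
        if any(abs(rng.gauss(shift, sd)) > k for _ in range(n)):
            hits += 1
    return hits / trials
```

At r = 0 the false-rejection rate of a 1:3s rule is about 0.3%; as r grows, the same limits reject far more often, mirroring the increase in false rejections the abstract reports.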


2009 ◽  
Vol 102 (09) ◽  
pp. 593-600 ◽  
Author(s):  
Per Petersen ◽  
Una Sølvik ◽  
Sverre Sandberg ◽  
Anne Stavelin

Summary Many primary care laboratories use point-of-care (POC) instruments to monitor patients on anticoagulant treatment. The internal analytical quality control of these instruments is often assessed by analysing lyophilised control materials and/or by sending patient samples to a local hospital laboratory for comparison (split sample). The aim of this study was to evaluate the utility of these two models of prothrombin time quality control. The models were evaluated by power functions created by computer simulations based on empirical data from 18 primary care laboratories using the POC instruments Thrombotrack, CoaguChek S, or Hemochron Jr. Signature. The control rules 1:2s and 1:3s, the exponentially weighted moving average, and deviation limits of ±10% and ±20% were evaluated by their probabilities of error detection and false rejection. The total within-laboratory coefficient of variation for the control sample and split sample measurements, respectively, was 3.8% and 6.9% for Thrombotrack, 8.9% and 10.5% for CoaguChek S, and 9.4% and 14.8% for Hemochron Jr. Signature. The probability of error detection was higher with a lyophilised control material than with a patient split sample for all three instruments, whereas the probability of false rejection was similar; lyophilised control material should therefore be used for internal analytical quality control of prothrombin time in primary health care.
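The exponentially weighted moving average rule evaluated in the study can be sketched in a generic textbook form (with assumed parameters λ and L, not necessarily those used by the authors):

```python
import math

def ewma_reject(values, target=0.0, sd=1.0, lam=0.2, L=3.0):
    """Exponentially weighted moving average control rule: returns True
    if the EWMA statistic ever leaves its control limits.
    lam is the smoothing weight, L the limit width in EWMA SDs."""
    z = target
    for t, x in enumerate(values, start=1):
        z = lam * x + (1 - lam) * z
        # exact variance of the EWMA statistic after t observations
        var = sd * sd * lam / (2 - lam) * (1 - (1 - lam) ** (2 * t))
        if abs(z - target) > L * math.sqrt(var):
            return True
    return False
```

Because each EWMA value pools information from all earlier observations, the rule is sensitive to small, persistent shifts that a single-value rule would miss.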


2017 ◽  
Author(s):  
Aristeidis T. Chatzimichail

This doctoral thesis describes a series of tools developed for the design, evaluation, and selection of optimal quality control procedures in a clinical chemistry laboratory setting. These tools include: 1) A simulation program for the design, evaluation, and comparison of alternative quality control procedures. The program allows (a) the definition of a very large number of quality control rules and (b) the definition of quality control procedures as Boolean propositions of any degree of complexity. The thesis elucidates the ways error is introduced into the measurements and describes the respective methods of simulation; the program therefore supports studying the performance of quality control procedures when (a) there is error in all the measurements, (b) the error is introduced between two consecutive analytical runs, and (c) the error is introduced within an analytical run, between two consecutive control samples. 2) A library of fifty alternative quality control procedures. 3) A library of the power function graphs of these procedures. 4) A program for selecting the optimal quality control procedure from the library, given an analytical process. The optimal quality control procedure is defined as the one that detects the critical errors with stated probabilities and has the minimum probability for false rejection. A new general system of equations is proposed for calculating the critical errors.
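Selecting the optimal procedure reduces to a simple search once each candidate's error-detection and false-rejection probabilities are known. A hypothetical sketch (names and numbers illustrative, not from the thesis):

```python
def select_optimal(procedures, p_ed_required=0.90):
    """Return the name of the procedure that detects the critical error
    with at least the stated probability and has the minimum
    probability for false rejection; None if no candidate qualifies.
    Each candidate: {'name': ..., 'p_ed': ..., 'p_fr': ...}."""
    eligible = [p for p in procedures if p['p_ed'] >= p_ed_required]
    if not eligible:
        return None
    return min(eligible, key=lambda p: p['p_fr'])['name']

# Illustrative candidates with assumed performance characteristics
candidates = [
    {'name': '1:3s, N=2', 'p_ed': 0.85, 'p_fr': 0.002},
    {'name': '1:2.5s, N=2', 'p_ed': 0.93, 'p_fr': 0.025},
    {'name': '1:2s, N=2', 'p_ed': 0.97, 'p_fr': 0.090},
]
```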


1991 ◽  
Vol 37 (10) ◽  
pp. 1720-1724 ◽  
Author(s):  
C A Parvin

Abstract The concepts of the power function for a quality-control rule, the error detection rate, and the false rejection rate were major advances in evaluating the performance characteristics of quality-control procedures. Most early articles published in this area evaluated the performance characteristics of quality-control rules with the assumption that an intermittent error condition occurred only within the current run, as opposed to a persistent error that continued until detection. Difficulties occur when current simulation methods are applied to the persistent error case. Here, I examine these difficulties and propose an alternative method that handles persistent error conditions effectively when evaluating and quantifying the performance characteristics of a quality-control rule.
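Parvin's distinction can be illustrated with a hypothetical simulation sketch: a persistent shift stays in effect run after run until the rule fires, so the relevant performance measure becomes cumulative detection within m runs rather than per-run rejection.

```python
import random

def runs_to_detection(shift, k=3.0, n=2, max_runs=50, rng=None):
    """Simulate a persistent systematic error: the shift (in SD units)
    affects every run until a 1:ks rule with n control values rejects.
    Returns the 1-based run index at detection, or None if the error
    is never caught within max_runs."""
    rng = rng or random.Random()
    for run in range(1, max_runs + 1):
        if any(abs(rng.gauss(shift, 1.0)) > k for _ in range(n)):
            return run
    return None

def p_detect_within(shift, m, trials=5000, seed=3):
    """Probability that a persistent error is caught within m runs."""
    rng = random.Random(seed)
    hits = 0
    for _ in range(trials):
        r = runs_to_detection(shift, rng=rng)
        if r is not None and r <= m:
            hits += 1
    return hits / trials
```

Under an intermittent-error assumption only the single-run probability matters; under a persistent error, detection accumulates across runs, which is why the two framings give different performance characteristics for the same rule.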


2020 ◽  
Vol 58 (9) ◽  
pp. 1517-1523
Author(s):  
Martín Yago ◽  
Carolina Pla

Abstract Background: Statistical quality control (SQC) procedures generally use rejection limits centered on the stable mean of the results obtained for a control material on the analyzing instrument. For instruments with significant bias, however, re-centering the limits on a different value could improve the control procedure from the viewpoint of patient safety. Methods: A statistical model was used to assess how shifting the rejection limits of the control procedure relative to the instrument mean affects the number of erroneous results reported when the systematic error of the measurement procedure increases because of an out-of-control condition. The behavior of control procedures of type 1:ks (k = 2, 2.5, 3) was studied as applied to analytical processes of different capability (σ = 3, 4, 6). Results: For measuring instruments with bias, shifting the rejection limits in the direction opposite to the bias improves the ability of the quality control procedure to limit the risk posed to patients by a systematic out-of-control condition. The maximum benefit is obtained when the displacement equals the bias of the instrument, that is, when the rejection limits are centered on the reference mean of the control material. The strategy is sensitive to error in estimating the bias: shifting the limits by more than the instrument's bias disproportionately increases the risk to patients. This effect should be considered in SQC planning for systems running the same test on multiple instruments. Conclusions: Centering the control rule on the reference mean is a potentially useful strategy for risk-management-based SQC planning for measuring instruments with significant, stable, uncorrected bias. Low uncertainty in the bias estimate is necessary for this approach not to be counterproductive.
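The effect of re-centering the limits can be reproduced with a small closed-form sketch (hypothetical parameter values; the reference mean is taken as 0 and all quantities are in SD units):

```python
from math import erf, sqrt

def phi(z):
    """Standard normal cumulative distribution function."""
    return 0.5 * (1.0 + erf(z / sqrt(2.0)))

def p_reject(bias, shift, center, k=2.0):
    """Rejection probability of a 1:ks rule whose limits are centered
    at `center`, for an instrument whose control values are distributed
    N(bias + shift, 1): bias is the stable instrument bias, shift the
    out-of-control systematic error."""
    mu = bias + shift
    return 1.0 - (phi(center + k - mu) - phi(center - k - mu))
```

For an instrument with a bias of 1 SD and an out-of-control shift of 2 SD in the same direction, limits centered on the reference mean (center = 0) detect the condition far more often than limits centered on the instrument mean (center = 1), at the cost of a higher rejection rate while the process is in control.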


Doklady BGUIR ◽  
2021 ◽  
Vol 19 (5) ◽  
pp. 13-20
Author(s):  
D. I. Kazlouski ◽  
Y. V. Titovich ◽  
Y. I. Kazlouskaya

The technical and dosimetric characteristics of brachytherapy afterloaders and applicators were studied. Taking a ring applicator as an example, the correctness of the positioning of the radiation source (RS) inside the applicator was tested as part of the commissioning and quality control procedure for brachytherapy applicators. The magnitudes of the inconsistencies in RS position were established during the planning and delivery of radiation therapy treatment plans. The discrepancies were identified from X-ray images of the applicator acquired while the irradiation plans were delivered. The treatment plan consisted of sequentially positioning the RS in the body of the applicator at each active position, with the minimum step, from the tip to the vaginal part of the applicator. An X-ray image was obtained with the source at each active position of the applicator. Three applicator reconstruction methods were used during dosimetric planning. The analysis revealed that the reconstruction method affects the magnitude of the discrepancy in the determined source position within the lumen of the applicator ring. The mean, median, maximum, and minimum values of the detected deviations were calculated using statistical analysis, and the results are presented as tables and graphs for all investigated stop positions of the RS. Based on the results of the study, we consider it expedient to carry out quality control procedures when putting applicators into clinical operation, and acceptable to check the positioning accuracy of radiation sources in the applicators at least once per month.
Taking these results into account during dosimetric planning will improve the quality of irradiation sessions delivered with brachytherapy, and thereby the quality of oncological care for the population.


1990 ◽  
Vol 36 (2) ◽  
pp. 230-233 ◽  
Author(s):  
D D Koch ◽  
J J Oryall ◽  
E F Quam ◽  
D H Feldbruegge ◽  
D E Dowd ◽  
...  

Abstract Quality-control (QC) procedures (i.e., the decision rules used and the number of control measurements collected per run) have been selected for individual tests of a multitest analyzer, to ensure that clinical or "medical usefulness" requirements for quality are met. The approach for designing appropriate QC procedures includes the following steps: (a) defining requirements for quality in the form of the "total allowable analytical error" for each test, (b) determining the imprecision of each measurement procedure, (c) calculating the medically important systematic and random errors for each test, and (d) assessing the probabilities of error detection and false rejection for candidate control procedures. In applying this approach to the Hitachi 737 analyzer, a design objective of 90% (or greater) detection of systematic errors was met for most tests (sodium, potassium, glucose, urea nitrogen, creatinine, phosphorus, uric acid, cholesterol, total protein, total bilirubin, gamma-glutamyltransferase, alkaline phosphatase, aspartate aminotransferase, lactate dehydrogenase) by use of 3.5s control limits with two control measurements per run (N = 2). For the remaining tests (albumin, chloride, total CO2, calcium), requirements for QC procedures were more stringent, and 2.5s limits (with N = 2) were selected.
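Step (c) is commonly computed with the critical systematic error formula; the following is a sketch of the widely used form, with illustrative numbers rather than the paper's data:

```python
def critical_systematic_error(tea, bias, sd):
    """Critical systematic error in SD units: the shift that would push
    5% of results beyond the total allowable error TEa.
    Common form: (TEa - |bias|) / SD - 1.65, with TEa, bias, and SD
    all in the same units (e.g., percent)."""
    return (tea - abs(bias)) / sd - 1.65

# Illustrative example: TEa = 10%, bias = 1%, CV = 2%
delta_se_crit = critical_systematic_error(10.0, 1.0, 2.0)  # 2.85 SD
```

A candidate QC procedure is then judged by its probability of detecting a shift of this size, which is what step (d) assesses.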


Author(s):  
Hua Younan

Abstract In wafer fabrication (fab), fluorine (F)-based gases are used for the Al bondpad opening process, so even a regular Al bondpad carries a low level of F contamination. The F level must nevertheless be kept low: if it exceeds the control/spec limits, it can cause F-induced corrosion and Al-F defects, resulting in pad discoloration and non-stick-on-pad (NSOP) problems. In our previous studies [1-5], the theories, characteristics, chemical and physical failure mechanisms, and root causes of F-induced corrosion and Al-F defects on Al bondpads were examined. In this paper, we further study F-induced corrosion and propose establishing an Auger monitoring system to track the F contamination level on Al bondpads in wafer fabrication. Auger monitoring frequency, sample preparation, wafer life, Auger analysis points, control/spec limits, and OOC/OOS quality control procedures are also discussed.

