Quality assurance in the laboratory testing process: Indirect estimation of the reference intervals for platelet parameters in neonates

2014 ◽  
Vol 47 (15) ◽  
pp. 33-37 ◽  
Author(s):  
Daniela Stefania Grecu ◽  
Eugenia Paulescu


2021 ◽  
Vol 0 (0) ◽  
Author(s):  
Antje Torge ◽  
Rainer Haeckel ◽  
Mustafa Özcürümez ◽  
Alexander Krebs ◽  
Ralf Junker

Abstract It has been observed that indirect estimation of the reference intervals of leukocytes in whole venous blood yields higher upper reference limits (uRLs) than the direct approaches reported in the literature. This phenomenon was reinvestigated with a newer, more advanced indirect method and was confirmed. Furthermore, a diurnal variation was observed, with lower values in the morning and higher values in the late afternoon and at night. This observation can explain why indirect approaches using samples collected over 24 h lead to higher uRLs than direct methods applied to samples presumably collected in the morning.


1970 ◽  
Vol 2 (2) ◽  
pp. 3-5 ◽  
Author(s):  
Md Tahminur Rahman

DOI: http://dx.doi.org/10.3329/akmmcj.v2i2.8163 ◽  
AKMMC J 2011; 2(2): 03-05


2019 ◽  
Vol 57 (12) ◽  
pp. 1933-1947 ◽  
Author(s):  
Werner Wosniok ◽  
Rainer Haeckel

Abstract All known direct and indirect approaches for the estimation of reference intervals (RIs) have difficulties in processing very skewed data with a high percentage of values at or below the detection limit. A new model for the indirect estimation of RIs is proposed, which can be applied even to extremely skewed data distributions with a relatively high percentage of data at or below the detection limit. Furthermore, it fits some simulated data sets better than other indirect methods do. The approach starts with a quantile-quantile plot providing preliminary estimates for the parameters (λ, μ, σ) of the assumed power normal distribution. These are iteratively refined by a truncated minimum chi-square (TMC) estimation. The final parameter estimates are used to calculate the 95% reference interval. Confidence intervals for the interval limits are calculated by the asymptotic formula for quantiles, and tolerance limits are determined via bootstrapping. If age intervals are given, the procedure is applied per age interval, and a spline function describes the age dependency of the reference limits as a continuous function. The approach can be performed in the statistical package R and on the Excel platform.
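The core idea of the abstract — fit a power normal (Box-Cox transformed normal) distribution and read the 95% reference interval off its quantiles — can be sketched as follows. This is a simplified illustration, not the authors' method: λ is fitted here by ordinary maximum likelihood, whereas the paper refines (λ, μ, σ) with the truncated minimum chi-square (TMC) iteration to exclude non-reference values.

```python
import numpy as np
from scipy import stats

def boxcox_inverse(y, lam):
    """Invert the Box-Cox (power) transform y = (x**lam - 1) / lam."""
    if lam == 0:
        return np.exp(y)
    return (lam * y + 1.0) ** (1.0 / lam)

def power_normal_ri(values, coverage=0.95):
    """Reference interval under an assumed power normal distribution.

    Simplified sketch: no TMC refinement, no handling of values at or
    below the detection limit beyond dropping non-positive entries.
    """
    values = np.asarray(values, dtype=float)
    values = values[values > 0]                 # Box-Cox needs positive data
    transformed, lam = stats.boxcox(values)     # MLE for lambda
    mu = transformed.mean()
    sigma = transformed.std(ddof=1)
    z = stats.norm.ppf(0.5 + coverage / 2.0)    # 1.96 for a 95% interval
    lower = boxcox_inverse(mu - z * sigma, lam)
    upper = boxcox_inverse(mu + z * sigma, lam)
    return lower, upper
```

The interval is the central 95% of the fitted distribution, back-transformed to the original scale; the published approach additionally supplies confidence limits (asymptotic quantile formula) and tolerance limits (bootstrap), which are omitted here.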


2014 ◽  
Vol 60 (07/2014) ◽  
Author(s):  
Neda Milinković ◽  
Svetlana Ignjatović ◽  
Miloš Žarković ◽  
Branimir Radosavljević ◽  
Nada Majkić-Singh


Author(s):  
P. Paige-Green

The relationships among strength, moisture, and density in pavement subgrades and layerworks are well known, but they have particular significance in low-volume roads. In these roads, the specified density is frequently not achieved (quality assurance testing tends to be reduced), and moisture fluctuations are more severe, with greater consequences. Traditional studies of material strength for these roads in southern Africa are based almost entirely on the California bearing ratio (CBR), a test with inherent problems. The test is routinely carried out to identify whether the materials under consideration have the required soaked strengths (typically CBRs of 80% or 45% for bases of different standards) at the design compaction density. Studies of the CBR at different moisture contents and densities should be carried out to identify the implications of variations in these properties for the behavior of pavement materials. A simple technique to be carried out during conventional laboratory testing was developed. Aspects pertaining to this type of study were evaluated, and the findings were related to low-volume road behavior.


1997 ◽  
Vol 43 (5) ◽  
pp. 908-912 ◽  
Author(s):  
Kenneth E Blick

Abstract Areas other than the analytical process should be the focus of concern about quality issues in the laboratory, because nearly 95% of errors occur at the nonanalytical front and back ends of the testing process. Until now, computer systems have been designed to handle the more predictable aspects of laboratory testing, leaving the infrequent and unpredictable data events to manual systems. These manual systems are termed “workarounds,” and because the triggering events occur sporadically, they are frequently not handled predictably. Here, I describe and give examples of an expert laboratory computer system designed to handle both predictable and unpredictable data events without manual workarounds. This expert system works in concert with a dynamic database, allowing such data events to be detected in real time and handled predictably, thus providing a tool to address quality assurance issues throughout the testing process. The system performs up to 31 separate actions or tasks based on data events that in the past were handled by human workarounds.
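The pattern described — an expert system that maps data events to predefined actions instead of manual workarounds — can be illustrated with a minimal rule engine. All rule names, result fields, and thresholds below are hypothetical examples, not the 31 actions of the actual system.

```python
# Hypothetical rules: each pairs a predicate on a result record with an action.
RULES = [
    (lambda r: r["value"] >= r["critical_high"], "notify_clinician"),
    (lambda r: r["value"] > r["linearity_limit"], "rerun_with_dilution"),
    (lambda r: r["previous"] is not None
               and abs(r["value"] - r["previous"]) > r["delta_limit"],
     "hold_for_delta_check"),
]

def handle_result(result):
    """Return every action triggered by a result, evaluated as it arrives;
    when no rule fires, the result is auto-verified with no human step."""
    actions = [action for predicate, action in RULES if predicate(result)]
    return actions if actions else ["auto_verify"]
```

Because every event, frequent or rare, flows through the same rule table, the sporadic cases are handled as predictably as the routine ones — which is the point the abstract makes against manual workarounds.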

