On the calculation of a "reference change" for comparing two consecutive measurements.

1983 ◽  
Vol 29 (1) ◽  
pp. 25-30 ◽  
Author(s):  
E K Harris ◽  
T Yasaka

Abstract We describe a statistical method for calculating a "reference change," defined as that difference between two consecutive test results in an individual that is statistically significant in a given proportion of all similar persons. By allowing for variation in within-person variances, this procedure computes a reference change that is more specific (i.e., less prone to false positives) than that obtained directly from the distribution of observed differences between measurements. Moreover, the method may easily be extended to a test for trend in three successive measurements. The method has been applied to semi-annual measurements of serum calcium and alkaline phosphatase in 698 men and women enrolled in a large health-maintenance program. We believe that these ideas may also be usefully applied to successive laboratory tests in carefully defined patient populations, but this introduces special problems, which are discussed briefly.
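The widely used, simpler form of this calculation (which the paper refines by allowing the within-person variance itself to vary between individuals) combines analytical and within-person biological variation into a single reference change value (RCV). A minimal sketch, using hypothetical coefficients of variation rather than any figures from the paper:

```python
import math

def reference_change_value(cv_analytical, cv_within, z=1.96):
    """Two-sided reference change value (RCV), in percent.

    cv_analytical: analytical coefficient of variation (%)
    cv_within: within-person biological coefficient of variation (%)
    z: standard-normal quantile for the chosen significance level
    """
    # sqrt(2) accounts for variability entering through both measurements
    return math.sqrt(2) * z * math.sqrt(cv_analytical**2 + cv_within**2)

# Hypothetical CVs for a serum analyte: 1.0% analytical, 1.9% within-person
rcv = reference_change_value(1.0, 1.9)
```

A consecutive change exceeding the RCV (roughly 6% with these hypothetical CVs) would be flagged as significant at the two-sided 5% level.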

2020 ◽  
Vol 3 (1) ◽  
pp. 87-89
Author(s):  
David IMB

Some years ago – let’s say, some decades ago – a “good physician” was considered the one who would order a few laboratory tests for a patient and find them all abnormal. “Yes,” one would say, “he hit the mark as far as the disease is concerned.” In the 21st century, however, now that the focus of preventive care has shifted from the prevention of disease to health maintenance, we believe the “good physician” may be considered the one who aims at keeping all test results basically normal. Actually, more than that: optimal. In fact, keeping an eye on optimal test results is a major goal for Age Management practitioners. For that, even though we still have the established reference ranges, we must interpret the tests we are used to from a different, more detailed standpoint than in general practice. Some normal results must be seen as subnormal, which is, indeed, still different from abnormal. Some results may also be considered critical – that is, those which demand urgent and immediate action.


ORL ◽  
1983 ◽  
Vol 45 (4) ◽  
pp. 203-215 ◽  
Author(s):  
Toshio Imoto ◽  
Yoshiaki Nakai

2021 ◽  
Author(s):  
Camilo E. Valderrama ◽  
Daniel J. Niven ◽  
Henry T. Stelfox ◽  
Joon Lee

BACKGROUND Redundancy in laboratory blood tests is common in intensive care units (ICUs), affecting patients' health and increasing healthcare expenses. Medical communities have recommended ordering laboratory tests more judiciously. Wise selection can rely on modern data-driven approaches, which have been shown to help identify redundant laboratory blood tests in ICUs. However, most of this work has been developed for highly selected clinical conditions such as gastrointestinal bleeding. Moreover, features based on conditional entropy and conditional probability distributions have not been used to inform the need for performing a new test. OBJECTIVE We aimed to address the limitations of previous work by adapting conditional entropy and conditional probability to extract features for predicting abnormal laboratory blood test results. METHODS We used an ICU dataset collected across Alberta, Canada, which included 55,689 ICU admissions from 48,672 patients with different diagnoses. We investigated conditional entropy and conditional probability-based features by comparing the performance of two machine learning approaches in predicting normal and abnormal results for 18 laboratory blood tests. Approach 1 used patients' vitals, age, sex, admission diagnosis, and other laboratory blood test results as features. Approach 2 used the same features plus the new conditional entropy and conditional probability-based features. RESULTS Across the 18 laboratory blood tests, both Approach 1 and Approach 2 achieved a median F1-score, AUC, precision-recall AUC, and G-mean above 80%. We found that including the new features yielded a statistically significant improvement in predicting abnormal results for between 10 and 15 of the laboratory blood tests, depending on the machine learning model. CONCLUSIONS Our novel approach, with promising prediction results, can help reduce over-testing in ICUs, as well as risks for patients and healthcare systems.
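As a rough illustration of the kind of conditional-entropy feature described (not the authors' actual implementation), one can estimate H(next result | previous result) from a hypothetical history of normal/abnormal transitions for a given test; a low conditional entropy suggests the next result is highly predictable from the previous one, hinting that the repeat test may add little information:

```python
import math
from collections import Counter

def conditional_entropy(pairs):
    """H(Y | X) in bits from observed (x, y) pairs, e.g. x = previous
    test state ('normal'/'abnormal'), y = next test state."""
    joint = Counter(pairs)            # counts of (x, y) pairs
    marg_x = Counter(x for x, _ in pairs)  # counts of x alone
    n = len(pairs)
    h = 0.0
    for (x, y), c in joint.items():
        p_xy = c / n                  # joint probability P(x, y)
        p_y_given_x = c / marg_x[x]   # conditional probability P(y | x)
        h -= p_xy * math.log2(p_y_given_x)
    return h

# Hypothetical transition history for one laboratory test
history = ([("normal", "normal")] * 8 + [("normal", "abnormal")] * 2
           + [("abnormal", "abnormal")] * 6 + [("abnormal", "normal")] * 4)
h = conditional_entropy(history)  # lower value -> next result more predictable
```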


2020 ◽  
Author(s):  
Sabe Mwape ◽  
Victor Daka ◽  
Scott Matafwali ◽  
Kapambwe Mwape ◽  
Jay Sikalima ◽  
...  

Background Medical laboratory diagnosis is a critical component of patient management in the healthcare setting. Despite the availability of laboratory tests, clinicians may not utilise them to make clinical decisions. We investigated the utilisation of laboratory tests for patient management among clinicians at Ndola Teaching Hospital (NTH) and Arthur Davison Children's Hospital (ADCH), two large referral hospitals in the Copperbelt Province, Ndola, Zambia. Methods We conducted a descriptive cross-sectional study among clinicians. The study deployed self-administered questionnaires to evaluate clinicians' utilisation of, querying of, and confidence in laboratory results. Additional data on demographics and possible laboratory improvements were also obtained. Data were entered in Microsoft Excel and exported to SPSS version 16 for statistical analysis. Results Of the 80 clinicians interviewed, 96.2% (77) reported using laboratory tests and their results in patient management, and 77.5% (62) indicated they always used laboratory results to influence their patient management decisions. Of the selected laboratory tests, clinicians were most confident in using haemoglobin test results (91.2%). There was no statistically significant association between clinicians' gender or qualification and the use of test results in patient management. Conclusion Our findings show that, although the majority query laboratory results, most clinicians use laboratory results for patient management. There is a need for interaction between the laboratory and the clinical area to assure clinician confidence in laboratory results. Key words: utilisation, clinicians, laboratory tests, Ndola Teaching Hospital, Arthur Davison Children's Hospital


Author(s):  
Bijender Kumar Bairwa ◽  
Mamta Sagar ◽  
R. C. Gupta ◽  
Madhuri Gupta

Background: This study was undertaken to investigate changes in salivary and serum calcium and alkaline phosphatase in osteoporosis patients. The objective was to compare the changes in serum levels with those in saliva. Methods: The study was conducted in the Department of Biochemistry, National Institute of Medical Sciences and Hospital, Shobha Nagar, Jaipur, Rajasthan, India; subjects were selected from the Department of Orthopedics of the same institution. One hundred adult osteoporosis patients, confirmed by DEXA, were enrolled. Calcium and alkaline phosphatase were measured in the serum and saliva of each patient, and the data obtained were statistically analyzed. Results: Serum calcium had a strong positive correlation with salivary calcium (r=0.726), while serum ALP and salivary ALP had a weak positive correlation (r=0.453). Conclusions: Saliva can be used instead of serum to measure calcium levels, as it is a non-invasive, quick and easy method.
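The reported r values are plain Pearson correlation coefficients between paired serum and salivary measurements. A minimal sketch with hypothetical paired values (not the study's data):

```python
import statistics

def pearson_r(xs, ys):
    """Pearson correlation coefficient between paired measurements."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    num = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    den = (sum((x - mx) ** 2 for x in xs)
           * sum((y - my) ** 2 for y in ys)) ** 0.5
    return num / den

# Hypothetical paired serum and salivary calcium values (mg/dl)
serum = [9.1, 9.4, 8.8, 9.9, 9.2, 8.6]
saliva = [5.8, 6.1, 5.5, 6.6, 5.9, 5.4]
r = pearson_r(serum, saliva)  # close to 1 -> strong positive correlation
```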


2013 ◽  
Vol 12 (3) ◽  
pp. 193-200
Author(s):  
Izabela Skrzypczak

The use of cores is an integral part of the assessment of existing structures that are modernized, redesigned or have been damaged. Evaluation of the test results and estimation of characteristic values of compressive strength can be performed according to the statistical method proposed in Annex D of the PN-EN 1990 [1] standard, and also according to PN-EN 13791 [2]. The procedures recommended in these two documents differ, which can lead to different assessments of the characteristic values. The author verified whether the empirical relationships defined in PN-EN 13791 [2] lead to larger values of characteristic strength and, consequently, to estimates on the unsafe side. The characteristic compressive strength was determined in accordance with the recommendations of the PN-EN 13791 code [2] and PN-EN 1990, Annex D [1].
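As an illustration of why the two procedures can disagree, the following sketch applies a simplified version of each to a hypothetical set of core results. The coefficients used (the 1.48 factor and +4 MPa floor from EN 13791 Approach A, and k_n = 2.00 from EN 1990 Table D.1 for n = 8 with unknown coefficient of variation) follow the standards as commonly cited, but the actual codes impose further conditions (e.g. minimum numbers of cores) that are omitted here:

```python
import statistics

# Hypothetical core compressive strengths in MPa
cores = [32.1, 29.8, 34.5, 31.0, 28.7, 33.2, 30.4, 31.9]

m = statistics.mean(cores)
s = statistics.stdev(cores)

# EN 13791 Approach A (simplified): characteristic in-situ strength is
# the lower of (mean - 1.48 * s) and (lowest result + 4 MPa)
f_ck_13791 = min(m - 1.48 * s, min(cores) + 4)

# EN 1990 Annex D, "Vx unknown" case: mean - k_n * s, where k_n is a
# tabulated coefficient depending on n (k_n = 2.00 for n = 8)
k_n = 2.00
f_ck_1990 = m - k_n * s
# The two procedures generally give different characteristic values
```

With these hypothetical data the Annex D estimate comes out lower (more conservative) than the EN 13791 one, consistent with the concern that the empirical relationships may place the estimate on the unsafe side.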


2017 ◽  
Vol 4 (5) ◽  
pp. 1595
Author(s):  
Gomathi Priya J ◽  
Seenivasan Venkatasamy ◽  
Karamath S Pyarejan ◽  
Jayachandran K.

Background: Deficiency of 25-hydroxyvitamin D has been linked with predisposition to autoimmune disorders, and vitamin D has been proposed as a causal factor in many autoimmune diseases. The objective of the study was to investigate vitamin D status in children with autoimmune thyroiditis attending the endocrinology OPD at a tertiary centre in southern India. Methods: This was a case-control study in which 75 children with autoimmune thyroiditis (70 female, 5 male) and age- and sex-matched healthy controls were enrolled. Free thyroxine, TSH, anti-TPOAb, anti-TGAb, 25-hydroxyvitamin D, serum calcium, phosphorus and alkaline phosphatase levels were estimated in both cases and controls. Children with anti-TPO or anti-TG positivity were divided into four groups based on their antibody titers. Results: The mean age of the cases was 9.8±0.34 years. 25(OH)D levels were significantly lower in cases (15.07±1.14 ng/ml) compared to controls (17.82±1.13 ng/ml) (p<0.0006). Mean serum calcium levels in cases (9.35±0.16 mg/dl) were significantly lower than in controls (9.73±0.14 mg/dl) (p<0.0005). Similarly, mean serum alkaline phosphatase levels in cases (184.97±11.10 IU/L) were significantly elevated compared with controls (122.37±6.82 IU/L) (p<0.0001). However, there was no significant difference in serum phosphorus levels between cases (4.42±0.10 mg/dl) and controls (4.43±0.14 mg/dl) (p=0.83). There was no significant difference in vitamin D levels among the antibody-titer groups for either anti-TPO (p=0.283) or anti-TG (p=0.148). Conclusions: The significant decrease in vitamin D levels in cases suggests that 25(OH)D may be an independent causal factor in thyroid autoimmunity.


2021 ◽  
Vol 87 (12) ◽  
pp. 36-41
Author(s):  
A. S. Fedorov ◽  
E. L. Alekseeva ◽  
A. A. Alkhimenko ◽  
N. O. Shaposhnikov ◽  
M. A. Kovalev

Carbon dioxide (CO2) corrosion is one of the most dangerous types of degradation of metal products in the oil and gas industry; field steel pipelines and tubing run the highest risk. Laboratory tests are carried out to assess the resistance of steels to carbon dioxide corrosion. However, unified requirements for certain test parameters are currently absent from the regulatory documentation. We present the results of studying the effect of laboratory test parameters on the assessment of the resistance of steels to CO2 corrosion. It is shown that changes in CO2 concentration, the chemical composition of the water/brine system, buffer properties and pH, the roughness of the sample surface, etc., even within the same laboratory technique, can lead to different test results. The main contributions to the repeatability and reproducibility of test results come from the CO2 concentration, the pH of the water/brine system, and the surface roughness of the samples. The results obtained can be used to develop recommendations for the choice of test parameters that ensure satisfactory convergence of results obtained in different laboratories, as well as to elaborate a unified method for assessing the resistance of steels to carbon dioxide corrosion.

