Improving Pediatric Outcomes through Intravenous and Oral Medication Standardization

2009 ◽  
Vol 14 (4) ◽  
pp. 226-235
Author(s):  
Mark W. MacKay ◽  
Jared Cash ◽  
Fred Farr ◽  
Marc Holley ◽  
Kevin Jones ◽  
...  

BACKGROUND Standardization is an invaluable tool to promote safety, improve care, and decrease costs, which ultimately improves outcomes. However, a pediatric setting presents unique challenges because of its wide range of patient weights, medications, and needs. Our goal was to develop and implement standards in complex, high-risk areas that demonstrate improved outcomes and safety. PROGRAM DESCRIPTION A computerized prescriber order entry program with pediatric decision support was developed for parenteral nutrition prescribing. The program included dosing, calculations, calcium phosphate compatibility checks, an automated IV compounder interface, osmolarity calculation for route of administration, end product testing verification, aluminum exposure tracking, and many other quality improvements. The same electronic order program, sterile compounder interface, and end product testing were used to standardize commonly used, non-manufactured intravenous solutions. The drip compounding process was reengineered to include standard concentrations, label changes, and beta-testing of a smart syringe pump with pediatric dosing ranges. Common standard oral doses were developed along with standard oral formulations. CONCLUSIONS Total parenteral nutrition (TPN) error rates decreased from 7% to less than 1%, and compatibility issues decreased from 36 to 1 per year. Neonatal osteopenia rates decreased from 15% to 2%. Results from end product testing of TPN solutions were within USP standards, with statistically significant correlation (p<0.001). Intravenous standardization decreased error rates by 15%, and compounding time decreased by 12 minutes (64%). Drip standardization allowed for drug concentration and smart pump standardization and decreased drip errors by 73%, from 3.1 to 0.8 per 1000 doses. Compounding errors decreased from 0.66 to 0.16 per 1000 doses, and ten-fold errors decreased from 0.41 to 0.08 per 1000 doses. Eleven oral liquids, encompassing 329 different doses, were standardized, decreasing the number of doses to 59 (an 83% reduction). This decreased workload by 15% and wastage by 90%, improved turnaround time by 32%, and saved $15,000 per year. One hundred evidence-based standard oral formulations were developed and used in 22 different hospitals.
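The abstract does not detail the decision-support logic, but a minimal sketch of the kind of weight-based dose-range and ten-fold-error check a pediatric order-entry system might run is shown below. The reference ranges, drug names, and function are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only: a weight-based dose-range check of the kind a
# pediatric CPOE system might perform. The reference ranges below are invented
# placeholders, not clinical guidance or the authors' actual rules.

# Hypothetical per-kg dosing limits (mg/kg/dose), for illustration only.
REFERENCE_RANGES = {
    "drug_a": (5.0, 10.0),
    "drug_b": (0.1, 0.5),
}

def check_dose(drug: str, ordered_dose_mg: float, weight_kg: float) -> list[str]:
    """Return a list of warnings for an ordered dose, or an empty list if it passes."""
    warnings = []
    low, high = REFERENCE_RANGES[drug]
    dose_per_kg = ordered_dose_mg / weight_kg
    if dose_per_kg < low or dose_per_kg > high:
        warnings.append(
            f"{drug}: {dose_per_kg:.2f} mg/kg outside reference range {low}-{high} mg/kg"
        )
    # Flag a likely ten-fold (decimal-point) error, one of the error types the
    # standardization effort reduced from 0.41 to 0.08 per 1000 doses.
    if dose_per_kg >= 10 * high or (0 < dose_per_kg <= low / 10):
        warnings.append(f"{drug}: possible ten-fold dosing error")
    return warnings

if __name__ == "__main__":
    for warning in check_dose("drug_a", ordered_dose_mg=350, weight_kg=3.5):
        print(warning)  # 100 mg/kg -> out of range and flagged as a possible ten-fold error
```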

2011 ◽  
Vol 16 (2) ◽  
pp. 92-97
Author(s):  
Robert L. Poole ◽  
Kevin P. Pieroni ◽  
Shabnam Gaskari ◽  
Tessa K. Dixon ◽  
KT Park ◽  
...  

ABSTRACT OBJECTIVE Aluminum is a contaminant in all parenteral nutrition solutions. Manufacturers currently label these products with the maximum aluminum content at the time of expiry, but there are no published data establishing the actual measured concentration of aluminum in parenteral nutrition solution products prior to compounding in the clinical setting. This investigation assessed the quantitative aluminum content of products commonly used in the formulation of parenteral nutrition solutions. The objective of this study was to determine the best products to use when compounding parenteral nutrition solutions (i.e., those with the least aluminum contamination). METHODS All products available in the United States from all manufacturers used in the production of parenteral nutrition solutions were identified and collected. Three lots were collected for each identified product. Samples were quantitatively analyzed by Mayo Laboratories, and the measured concentrations were compared to the manufacturers' labeled concentrations. RESULTS Large lot-to-lot and manufacturer-to-manufacturer differences were noted for all products. Measured aluminum concentrations were less than manufacturer-labeled values for all products. CONCLUSIONS The actual aluminum concentrations of all the parenteral nutrition solutions were significantly less than the aluminum content based on manufacturers' labels. These findings indicate that 1) manufacturers should label their products with the actual aluminum content at the time of product release rather than at the time of expiry, 2) some manufacturers' products contribute significantly less aluminum contamination than others, and 3) pharmacists can select the products with the lowest aluminum content and thereby reduce aluminum exposure in their patients.
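As a rough illustration of how measured-versus-labeled data of this kind could guide product selection, the sketch below averages hypothetical lot measurements per product and ranks products by contamination. The product names, lot values, and labels are invented examples, not the study's data.

```python
# Hypothetical measured aluminum concentrations (mcg/L) for three lots of each
# product; the values are invented for illustration, not the study's results.
measured_lots = {
    "calcium_gluconate_vendor_A": [3200.0, 2950.0, 3100.0],
    "calcium_gluconate_vendor_B": [1450.0, 1600.0, 1380.0],
}
labeled = {  # manufacturer-labeled maximum at expiry (mcg/L), also illustrative
    "calcium_gluconate_vendor_A": 5000.0,
    "calcium_gluconate_vendor_B": 5000.0,
}

def mean(values):
    return sum(values) / len(values)

# Compare measured means to labels and rank products from least to most contaminated.
for product, lots in sorted(measured_lots.items(), key=lambda kv: mean(kv[1])):
    avg = mean(lots)
    print(f"{product}: mean {avg:.0f} mcg/L "
          f"({avg / labeled[product]:.0%} of labeled maximum)")
# The lowest-aluminum product would be preferred when compounding PN.
```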


Nutrients ◽  
2012 ◽  
Vol 4 (11) ◽  
pp. 1566-1574 ◽  
Author(s):  
Robert Poole ◽  
Kevin Pieroni ◽  
Shabnam Gaskari ◽  
Tessa Dixon ◽  
John Kerner

Author(s):  
Pierre-Olivier Hétu ◽  
Sacha Hobeila ◽  
François Larivière ◽  
Marie-Claire Bélanger

Abstract Background Serum is commonly used for clinical chemistry testing, but many conditions can affect the clotting process, leading to poor sample quality and impaired workflow. With serum gel tubes, we found a high proportion of sample probe aspiration errors on our Beckman AU5800 analyzers. We decided to implement BD Barricor™ plasma tubes, validated an off-specification centrifugation scheme, and verified that results obtained for 65 chemistry and immunochemistry tests were comparable to those obtained in serum gel tubes. Finally, we evaluated the impact of this new tube on sample error rate and laboratory turnaround time (TAT). Methods To validate centrifugation settings, 50 paired samples were collected in Barricor tubes and centrifuged at 1912 × g for 10 min or 5 min (off-specification). To compare serum gel tubes with Barricor plasma tubes, 119 paired samples were collected from volunteers and results were analyzed using weighted Deming regression. Finally, the proportion of aspiration errors and the laboratory TAT for potassium were measured before and after implementing Barricor tubes. Results Barricor tubes showed clinically acceptable equivalence to serum gel tubes for the studied analytes, and the off-specification centrifugation scheme did not affect the results. Implementing Barricor tubes improved the laboratory workflow by decreasing aspiration error rates (2.01% to 0.77%, P < 0.001) and lowering hemolysis (P < 0.001). The laboratory TAT for potassium was also significantly lower (P < 0.001). Conclusion Use of Barricor tubes instead of serum gel tubes leads to better sample quality, shorter and more reproducible laboratory TAT, and lower costs associated with error management.
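The study used weighted Deming regression for the paired-tube comparison; the sketch below implements the simpler unweighted Deming fit (error-variance ratio assumed to be 1) as a rough stand-in for how paired results from two tube types can be compared. The paired potassium values are made up for illustration.

```python
import numpy as np

def deming_fit(x, y, delta=1.0):
    """Unweighted Deming regression of y on x.

    delta is the assumed ratio of the y-error variance to the x-error variance
    (1.0 gives orthogonal regression). The published study used a weighted
    variant; this is a simplified sketch, not the authors' method.
    """
    x, y = np.asarray(x, float), np.asarray(y, float)
    xm, ym = x.mean(), y.mean()
    sxx = np.sum((x - xm) ** 2) / (len(x) - 1)
    syy = np.sum((y - ym) ** 2) / (len(y) - 1)
    sxy = np.sum((x - xm) * (y - ym)) / (len(x) - 1)
    slope = (syy - delta * sxx
             + np.sqrt((syy - delta * sxx) ** 2 + 4 * delta * sxy ** 2)) / (2 * sxy)
    intercept = ym - slope * xm
    return slope, intercept

# Made-up paired potassium results (mmol/L): serum gel tube vs Barricor plasma tube.
serum = [3.8, 4.1, 4.5, 5.0, 3.6, 4.3, 4.8, 4.0]
plasma = [3.7, 4.0, 4.4, 4.9, 3.6, 4.2, 4.7, 3.9]
slope, intercept = deming_fit(serum, plasma)
print(f"plasma = {slope:.3f} * serum + {intercept:+.3f}")
# Equivalence would be judged by how close the slope is to 1 and the intercept to 0.
```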


2008 ◽  
Vol 42 (10) ◽  
pp. 1410-1415 ◽  
Author(s):  
Rex O Brown ◽  
Laurie M Morgan ◽  
Syamal K Bhattacharya ◽  
Patti L Johnson ◽  
Gayle Minard ◽  
...  

Background: Patients' exposure to and potential toxicity from aluminum in parenteral nutrition (PN) formulations is an important concern of healthcare providers. Objective: To determine the potential for aluminum toxicity caused by PN in hospitalized adults who have the risk factors of both acute kidney injury and PN. Methods: Adults who required PN and had a serum creatinine (SCr) level at least 1.5 times greater than the admission SCr on the first day of PN were studied retrospectively. Protein was administered based on whether hemodialysis was being used (0.6-1 g/kg/day without hemodialysis; 1.2-1.5 g/kg/day with hemodialysis). Aluminum exposure was determined for each patient by multiplying the volume of each PN component by its concentration of aluminum. Unpaired t-tests, Fisher's exact test, and analysis of variance were used for statistical analysis. Data are presented as mean ± SD. Results: Thirty-six patients (aged 50.4 ± 20.4 y; weight 90.2 ± 32.8 kg) were studied. Initial serum urea nitrogen and SCr were 47 ± 23 and 3.3 ± 1.4 mg/dL, respectively. Twelve patients received hemodialysis. The mean aluminum exposure was 3.8 ± 2 μg/kg/day in the 36 patients. Of these, 29 had safe calculated aluminum exposure (<5 μg/kg/day) and 7 had high calculated aluminum exposure (>5 μg/kg/day). Patients with safe aluminum exposure had significantly higher SCr levels than did those with high aluminum exposure (3.5 ± 1.5 vs 2.2 ± 0.7 mg/dL; p < 0.04). Patients with high aluminum exposure received significantly more aluminum from calcium gluconate compared with those who had safe aluminum exposure (357 ± 182 vs 250 ± 56 μg/day; p < 0.02). Limitations of the study include its retrospective design, which resulted in calculated rather than directly measured aluminum exposure. Conclusions: Using our calculations, we believe that most patients with acute kidney injury who require PN do not receive excessive exposure to aluminum from the PN formulation, despite having 2 risk factors (acute kidney injury, PN) for aluminum toxicity.
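A short worked sketch of the exposure calculation described in the Methods (component volume × aluminum concentration, summed and normalized to body weight, compared against the commonly cited 5 μg/kg/day threshold) is given below. The component volumes, concentrations, and patient weight are illustrative, not data from the study.

```python
# Illustrative aluminum-exposure calculation for one day of PN, following the
# method described above: volume of each component times its aluminum
# concentration, summed, then divided by patient weight. All numbers below are
# invented examples, not the study's patient data.

components = [
    # (name, volume in L, aluminum concentration in mcg/L)
    ("amino acids 10%", 0.500, 25.0),
    ("dextrose 70%", 0.350, 30.0),
    ("calcium gluconate 10%", 0.020, 3750.0),
    ("potassium phosphate", 0.010, 1500.0),
]
weight_kg = 70.0
SAFE_LIMIT_MCG_PER_KG_DAY = 5.0  # commonly cited threshold for safe exposure

total_mcg = sum(volume_l * conc_mcg_per_l for _, volume_l, conc_mcg_per_l in components)
exposure = total_mcg / weight_kg
print(f"Total aluminum: {total_mcg:.0f} mcg/day "
      f"({exposure:.2f} mcg/kg/day, limit {SAFE_LIMIT_MCG_PER_KG_DAY})")
print("exceeds limit" if exposure > SAFE_LIMIT_MCG_PER_KG_DAY else "within limit")
```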


2017 ◽  
Vol 7 (1) ◽  
pp. 1103-1110 ◽  
Author(s):  
A Lakhey ◽  
H Shakya

Background: Neonatal sepsis, a clinical syndrome of bacteremia with systemic signs and symptoms of infection in the first 4 weeks of life, is a major cause of morbidity and mortality in newborns. Early diagnosis is critical, as sepsis can progress more rapidly in neonates than in adults. An attempt was made to establish the correlation between an early neonatal sepsis screen and blood culture in neonates presenting with features of sepsis. The aim of this study was to assess the usefulness of the sepsis screen in the early diagnosis of neonatal septicemia. Materials and Methods: The study was done at KIST Medical College and Hospital, Nepal, from October 2015 to October 2016. The statistical correlation between early indicators of the sepsis screen and blood culture (considered the gold standard) was established in clinically suspected cases of neonatal sepsis. Results: Out of 150 cases studied, 72 were culture positive. CRP (77.8%) and the immature-to-total neutrophil (I/T) ratio (73%) showed the highest sensitivity. CRP (66.7%), the I/T ratio (61.5%), and micro-ESR (60.2%) showed the highest specificity. Positive predictive value was highest for CRP (68.2%), followed by the I/T ratio (63.8%) and corrected total leukocyte count (56.2%). Conclusion: Serum CRP is the most sensitive marker of sepsis. Peripheral smear study and CRP can be used effectively as a sepsis screen for early diagnosis of neonatal sepsis. The combination of parameters yielded better results than single tests and proved to be an invaluable tool for early diagnosis of neonatal sepsis.
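For readers unfamiliar with how the reported screen statistics relate to the blood-culture gold standard, the sketch below computes sensitivity, specificity, and predictive values from a 2×2 table. The counts are back-calculated approximately from the reported CRP figures (150 neonates, 72 culture positive, 77.8% sensitivity, 66.7% specificity); they are reconstructions for illustration, not the study's raw data.

```python
# Sensitivity / specificity / predictive values from a 2x2 table against blood
# culture as the gold standard. The counts below are approximate reconstructions
# of the reported CRP results, used only to show the arithmetic.

def screen_metrics(tp: int, fp: int, fn: int, tn: int) -> dict:
    return {
        "sensitivity": tp / (tp + fn),  # positive screen among culture-positive cases
        "specificity": tn / (tn + fp),  # negative screen among culture-negative cases
        "ppv": tp / (tp + fp),          # culture-positive among positive screens
        "npv": tn / (tn + fn),          # culture-negative among negative screens
    }

# Example: roughly 56 true positives, 26 false positives, 16 false negatives,
# and 52 true negatives reproduce the reported CRP sensitivity, specificity,
# and positive predictive value.
for name, value in screen_metrics(tp=56, fp=26, fn=16, tn=52).items():
    print(f"{name}: {value:.1%}")
```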


2015 ◽  
Vol 23 (e1) ◽  
pp. e169-e179 ◽  
Author(s):  
Tobias Hodgson ◽  
Enrico Coiera

Abstract Objective To review literature assessing the impact of speech recognition (SR) on clinical documentation. Methods Studies published prior to December 2014 reporting clinical documentation using SR were identified by searching Scopus, Compendex and Inspec, PubMed, and Google Scholar. Outcome variables analyzed included dictation and editing time, document turnaround time (TAT), SR accuracy, error rates per document, and economic benefit. Twenty-three articles from a pool of 441 met the inclusion criteria. Results Most studies compared SR to dictation and transcription (DT) in radiology, and heterogeneity across studies was high. Document editing time increased using SR compared to DT in four of six studies (+1876.47% to –16.50%). Dictation time similarly increased in three of five studies (+91.60% to –25.00%). TAT consistently improved using SR compared to DT (16.41% to 82.34%); across all studies the improvement was 0.90% per year. SR accuracy was reported in ten studies (88.90% to 96.00%) and appears to be improving by about 0.03% per year as the technology matures. The mean number of errors per report increased using SR (0.05 to 6.66) compared to DT (0.02 to 0.40). Economic benefits were poorly reported. Conclusions SR is steadily maturing and offers some advantages for clinical documentation. However, the evidence supporting the use of SR is weak, and further investigation is required to assess the impact of SR on documentation error types, rates, and clinical outcomes.
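As a note on how the percentage comparisons above are read, the short sketch below computes the relative change in a documentation metric (for example, editing time or TAT) between a transcription baseline and an SR workflow. The sample values are made up; they are not figures from the reviewed studies.

```python
# Relative change between a dictation-and-transcription (DT) baseline and a
# speech recognition (SR) workflow. A positive result means SR took longer or
# performed worse on that metric; a negative result means SR improved it.
# Sample values are invented for illustration.

def percent_change(dt_value: float, sr_value: float) -> float:
    return (sr_value - dt_value) / dt_value * 100

print(f"editing time: {percent_change(dt_value=2.0, sr_value=4.5):+.2f}%")      # +125.00%
print(f"turnaround time: {percent_change(dt_value=19.0, sr_value=4.1):+.2f}%")  # -78.42%
```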


Nutrients ◽  
2017 ◽  
Vol 9 (11) ◽  
pp. 1249 ◽  
Author(s):  
Megan Fortenberry ◽  
Lela Hernandez ◽  
Jacob Morton
