Blood Culturing Practices at an Academic Medical Center

2020 · Vol 41 (S1) · pp. s142-s143
Author(s): Priya Sampathkumar, Kyle Rodino, Stacy (Tram) Ung

Background: Blood cultures (BCs) are part of the evaluation of hospitalized patients with fever. Patients with central lines in place frequently have blood samples for culture drawn through those lines. We sought to assess blood culturing practices at our institution. Methods: We retrospectively reviewed BCs performed in hospitalized patients over a 12-month period (August 2018–July 2019) at an academic, tertiary-care center with 1,297 licensed beds and >62,000 admissions per year. A specialized phlebotomy team performs all peripheral venipuncture draws, whereas blood samples from central lines are obtained by the patient's nurse. Results: Overall, 35,121 BCs were performed, for an incidence of 106 BCs per 1,000 patient-days, or 566 BCs per 1,000 admissions. Most blood samples (67%) were collected via peripheral venipuncture. We detected significant variation in culturing rates and in the proportion of blood samples obtained through central lines among collecting units (Table 1). Overall, the BC contamination rate was 1.6%. Blood samples obtained through a central line had a higher contamination rate (2.2%) than samples obtained through peripheral venipuncture (1.3%; P < .0001). BC rates were highest in intensive care units (ICUs) compared with other types of patient care units (Table 1). The BC positivity rate was significantly lower in ICUs (8.8%) than in hematology-oncology (10%; HR, 0.88; CI, 0.80–0.96; P = .006), general medicine (10%; HR, 0.88; CI, 0.80–0.97; P = .013), and pediatrics (12%; HR, 0.74; CI, 0.59–0.92; P = .008). The ICUs had the lowest BC contamination rate, at 1.3%. Conclusions: Blood samples obtained through central lines for culture are more likely to be contaminated than peripherally drawn samples. Despite a relatively high rate of line-drawn samples, ICUs had the lowest BC contamination rate, possibly reflecting ICU nurses' familiarity with line draws. Line draws were most frequent in pediatrics and hematology-oncology, and these units had correspondingly higher contamination rates. This information will be used to inform institutional guidelines on blood culturing and to identify ways to minimize BC contamination, which often results in additional testing and/or unnecessary antimicrobial use. Funding: None. Disclosures: Consulting fee, Merck (Priya Sampathkumar)
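The headline comparison above (2.2% contamination for line draws vs 1.3% for peripheral draws, P < .0001) is a standard two-proportion test. A minimal sketch follows; the group counts are reconstructed from the abstract's totals and percentages, not taken from the authors' raw data:

```python
# Two-proportion z-test for line-drawn vs peripheral contamination rates.
# Counts are reconstructed from the abstract's 35,121 cultures, 67% peripheral,
# and the 2.2% / 1.3% contamination rates: approximations, not raw data.
from statsmodels.stats.proportion import proportions_ztest

total_cultures = 35_121
peripheral = round(total_cultures * 0.67)     # 67% drawn by venipuncture
line_drawn = total_cultures - peripheral

contaminated = [round(line_drawn * 0.022),    # ~2.2% of line draws
                round(peripheral * 0.013)]    # ~1.3% of peripheral draws
stat, p = proportions_ztest(contaminated, [line_drawn, peripheral])
print(f"z = {stat:.2f}, p = {p:.2e}")         # consistent with reported P < .0001
```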

2013 · Vol 34 (10) · pp. 1042-1047
Author(s): John M. Boyce, Jacqueline Nadeau, Diane Dumigan, Debra Miller, Cindy Dubowsky, ...

Objective. Reduce the frequency of contaminated blood cultures that meet National Healthcare Safety Network definitions for a central line-associated bloodstream infection (CLABSI). Design. An observational study. Setting. A 500-bed university-affiliated hospital. Methods. A new blood culture policy discouraged drawing blood samples from central lines. Phlebotomists were reeducated regarding aseptic technique when obtaining blood samples by venipuncture. The intravenous therapy team was taught how to draw blood samples by venipuncture and served as a backup when phlebotomists were unable to obtain blood samples. A 2-nurse protocol and a special supply kit for obtaining blood samples from catheters were developed. Rates of blood culture contamination were monitored by the microbiology laboratory. Results. The proportion of blood samples obtained for culture from central lines decreased from 10.9% during January–June 2010 to 0.4% during July–December 2012 (P < .001). The proportion of blood cultures that were contaminated decreased from 84 (1.6%) of 5,274 during January–June 2010 to 21 (0.5%) of 4,245 during January–June 2012 (P < .001). Based on estimated excess hospital costs of $3,000 per contaminated blood culture, the reduction in blood culture contaminants yielded an estimated annualized savings of $378,000 in 2012 compared to 2010. In mid-2010, 3 (30%) of 10 reported CLABSIs were suspected to represent blood culture contamination, compared with none of 6 CLABSIs reported from mid-November 2010 through June 2012 (P = .25). Conclusions. Multiple interventions resulted in a reduction in blood culture contamination rates and substantial cost savings to the hospital, and they may have reduced the number of reportable CLABSIs.
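The annualized-savings claim is simple arithmetic: annual culture volume × reduction in contamination proportion × the assumed $3,000 excess cost per contaminant. A back-of-the-envelope sketch follows; the abstract reports only half-year culture volumes, so the annual volume below is an extrapolation and the result approximates rather than reproduces the authors' $378,000 figure:

```python
# Back-of-the-envelope annualized savings from fewer contaminated cultures.
# Half-year volumes are doubled to estimate annual volume (an assumption),
# so the result approximates, not reproduces, the reported $378,000.
COST_PER_CONTAMINANT = 3_000                  # authors' estimated excess cost ($)

rate_2010 = 84 / 5_274                        # Jan-Jun 2010: ~1.6% contaminated
rate_2012 = 21 / 4_245                        # Jan-Jun 2012: ~0.5% contaminated

annual_cultures = 4_245 * 2                   # extrapolated 2012 annual volume
avoided = annual_cultures * (rate_2010 - rate_2012)
print(f"~{avoided:.0f} contaminants avoided/yr, "
      f"~${avoided * COST_PER_CONTAMINANT:,.0f} saved")
```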


Author(s): Vinitha Alex, Trusha Nana, Vindana Chibabhai

Abstract Background: Community-onset bloodstream infection (CO-BSI) is associated with substantial morbidity and mortality. Knowledge of locally prevalent pathogens and antimicrobial susceptibility patterns can promptly guide appropriate empiric therapy and improve outcomes. Objectives: We sought to determine the epidemiology of CO-BSI, the blood culture positivity rate, and the contamination rate. We also sought to establish the appropriateness of current empiric antimicrobial therapy practices. Methods: We retrospectively analyzed blood cultures taken from January 2015 to December 2019 at the emergency departments (EDs) of a tertiary-care academic hospital in South Africa using extracted laboratory data. Results: The overall positivity rate of blood cultures taken at the EDs was 15% (95% confidence interval [CI], 15%–16%), and the contamination rate was 7% (95% CI, 6%–7%). Gram-positive bacteria predominated in the pediatric cohort: neonates, 52 (54%) of 96; infants, 57 (52%) of 109; older children, 63 (61%) of 103. Methicillin-susceptible Staphylococcus aureus was the predominant pathogen among older children: 30 (35%) of 85. Escherichia coli was the most common pathogen isolated among adults and the elderly: 225 (21%) of 1,060 and 62 (29%) of 214, respectively. Among neonates, the susceptibility of E. coli and Klebsiella pneumoniae to the combination of ampicillin and gentamicin was 17 (68%) of 25. Among adults, the susceptibility of the 5 most common pathogens to amoxicillin-clavulanate was 426 (78%) of 546, and their susceptibility to ceftriaxone was 481 (85%) of 565 (P = .20). The prevalence of methicillin-resistant S. aureus, extended-spectrum β-lactamase–producing Enterobacterales, and carbapenem-resistant Enterobacterales was low among all age groups. Conclusions: Review of blood culture collection techniques is warranted to reduce the contamination rate. High rates of resistance to currently prescribed empiric antimicrobial agents for CO-BSI warrant a re-evaluation of local guidelines.
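The positivity and contamination rates are binomial proportions, so 95% CIs of this kind can be recomputed directly from counts. A minimal sketch using a Wilson interval; the denominator below is hypothetical, since the abstract reports rates without raw counts:

```python
# Wilson 95% CI for a blood culture positivity rate. The counts are
# hypothetical (chosen to give the reported 15% rate); the abstract does
# not give denominators.
from statsmodels.stats.proportion import proportion_confint

positives, total = 3_000, 20_000              # hypothetical counts, 15% rate
low, high = proportion_confint(positives, total, alpha=0.05, method="wilson")
print(f"rate = {positives / total:.0%}, 95% CI {low:.3f}-{high:.3f}")
```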


1993 · Vol 14 (6) · pp. 325-330
Author(s): Farrin A. Manian, Lynn Meyer, Joan Jenne

Abstract Objective: To better assess the risk of exposure to bloodborne pathogens following puncture injuries due to needles removed from intravenous (IV) lines. Setting: Tertiary-care community medical center. Patients: A convenience sample of hospitalized patients requiring IV piggyback medications. Methods: Examination of 501 IV ports of peripheral lines, heparin locks, and central venous lines for visible blood, and testing of the residual fluid in the needles removed from these ports for the presence of occult blood using guaiac-impregnated paper. Results: The proximal ports of central venous lines and heparin locks were statistically more likely to contain visible blood than proximal and distal ports of peripheral lines (17% and 20% vs 1% and 3%, respectively; P < .05). Similarly, needles removed from proximal ports of central venous lines and heparin locks were statistically more likely to contain occult blood than those from peripheral lines (11% and 14% vs 2%, respectively; P < .05). Only two needles removed from IV lines without visible blood contained occult blood: one from the proximal port of a central line and another from a heparin lock. None of the needles from peripheral lines without visible blood contained occult blood. Estimation of the risk of transmission of hepatitis B and C and human immunodeficiency virus (HIV) following injury by needles from various IV lines revealed that injury due to needles removed from peripheral IV lines and distal ports of central lines without visible blood was associated with a "near zero" risk of transmission of these bloodborne infections at our medical center. Conclusions: Routine serological testing of source patients after injury due to needles removed from peripheral IV lines and distal ports of central lines without visible blood is not necessary at our medical center. Conversely, given the relatively high rate of occult blood in needles removed from proximal ports of central venous lines and heparin locks, puncture injuries due to these needles are considered significant and managed accordingly.
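The "near zero" conclusion rests on a multiplicative risk model: the chance a needlestick transmits infection is roughly the probability the source is seropositive, times the probability the needle carries blood, times the per-exposure seroconversion rate. A hedged sketch of that arithmetic; every input below is an illustrative placeholder, not a value from the study:

```python
# Illustrative needlestick transmission-risk model:
# P(transmission) ~= P(source seropositive) * P(needle carries blood)
#                    * P(seroconversion per contaminated stick).
# All inputs are placeholders, not values from the study.
def transmission_risk(prevalence: float, p_blood: float, p_seroconvert: float) -> float:
    return prevalence * p_blood * p_seroconvert

# Needle from a peripheral line without visible blood: the study found no
# occult blood in such needles, driving the estimated risk toward zero.
risk = transmission_risk(prevalence=0.005, p_blood=0.0, p_seroconvert=0.003)
print(f"estimated per-injury risk: {risk:.6f}")   # "near zero"
```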


Author(s): Jennifer LeRose, Avnish Sandhu, Jordan Polistico, Joe Ellsworth, Mara Cranis, ...

Abstract A comparative retrospective study to quantify the impact of coronavirus disease 2019 (COVID-19) on patient safety. We found statistically significant increases in central line-associated bloodstream infections and blood culture contamination rates during the pandemic. Increases in length of stay and mortality were also observed during COVID-19.


2021 · Vol 8 (Supplement_1) · pp. S65-S65
Author(s): Jordan Resnick, Emad A Chishti, Mahesh Bhatt, Thein Myint

Abstract Background: Cryptococcal meningitis (CM) is a life-threatening condition that requires prompt recognition and management. Given this high morbidity, we compared key CSF analyses, blood culture, and serum cryptococcal antigen (CrAg) results to prognosticate the probability of mortality in this population (Table 1, comparison of demographics and serum and CSF analyses). Methods: We retrospectively reviewed the charts of all patients admitted to our tertiary-care center from 10/2005 to 10/2017. Inclusion criteria encompassed patients with a positive CSF CrAg; positive CSF cultures, India ink, or cytopathology; a CSF cell count >5 with CNS symptoms; or a positive serum CrAg titer or blood cultures. Results: Sixty patients who met the inclusion criteria were divided into survivor (n=41) and non-survivor (n=19) groups based on inpatient mortality. There was no difference in age, sex, or immune status between the two groups. The median CSF nucleated cell count in the non-survivor group was 39 cells/µL with a median lymphocyte percentage of 59.5%, whereas in the survivor group it was 72 cells/µL with a median lymphocyte percentage of 76% (P < .001 and P = .04, respectively). The median CSF glucose was 27 mg/dL in the non-survivor group compared to 35 mg/dL in the survivor group (P = .02). The median CSF CrAg titer was higher in the non-survivor group (1:1024) than in the survivor group (1:256) (P < .01). CSF opening pressure (cm H2O), blood culture, and serum CrAg level did not differ significantly between the two groups. Conclusion: Low CSF cell count, low glucose, and high CSF CrAg were independently associated with inpatient mortality in CM, in line with prior findings. A novel finding in this study is the significantly decreased median CSF lymphocyte percentage in the non-survivor group. Serum CrAg titer, positive blood cultures, and median CSF protein were not statistically significant between the two groups. A study with a larger sample size may be needed to confirm these findings. Disclosures: All Authors: No reported disclosures
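The abstract compares medians between survivors and non-survivors without naming the statistical test; for skewed CSF values, a nonparametric Mann-Whitney U comparison is a common choice. A sketch with synthetic stand-in data shaped to the reported medians and group sizes:

```python
# Nonparametric comparison of CSF nucleated cell counts between survivors and
# non-survivors. The test choice and the lognormal stand-in data are
# assumptions; the abstract reports only medians, group sizes, and P values.
import numpy as np
from scipy.stats import mannwhitneyu

rng = np.random.default_rng(0)
survivors = rng.lognormal(mean=np.log(72), sigma=0.8, size=41)      # median ~72
non_survivors = rng.lognormal(mean=np.log(39), sigma=0.8, size=19)  # median ~39

stat, p = mannwhitneyu(survivors, non_survivors, alternative="two-sided")
print(f"U = {stat:.0f}, p = {p:.4f}")
```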


2019 · Vol 6 (Supplement_2) · pp. S418-S419
Author(s): Jerry Jacob, Ann Morace, Jisuk Park, Nina Renzi

Abstract Background: Long-term acute-care hospitals (LTACHs) care for chronically critically ill patients with high utilization of central lines and high risk of morbidity from central line-associated bloodstream infections (CLABSIs). Our 38-bed LTACH noted a substantial increase in the incidence of CLABSIs (as defined by the National Healthcare Safety Network) between fiscal year (FY) 2016 and FY 2018 (Figure 1). Detailed case review identified a large number of CLABSIs that were clinically consistent with blood culture contaminants from central lines. Feedback from bedside staff also suggested gaps between practice and evidence-based measures for central-line care. Methods: A three-pronged CLABSI prevention project was implemented in July 2018, consisting of (1) staff education regarding daily chlorhexidine gluconate (CHG) bathing for all patients, combined with an electronic audit report to identify patients without active CHG orders; (2) a change in practice to the use of venipuncture alone for blood culture collection, combined with an electronic audit report to identify blood cultures collected from central lines; and (3) a recurring 6-part educational series for nurses focused on central-line care. The pre-intervention period was defined as the 12-month period from July 1, 2017, through June 30, 2018 (FY 2018). The primary outcome was the fiscal-year CLABSI rate. A secondary outcome was the proportion of blood cultures drawn from central lines. Results: After 9 months of the intervention, one CLABSI had been reported for FY 2019 year-to-date, a rate of 0.4 per 1,000 central-line (CL) days, representing an 86% decrease from the FY 2018 rate of 2.8 per 1,000 CL-days. The 12-month rolling CLABSI rate decreased to 1.6 per 1,000 CL-days (Figure 2). The proportion of blood cultures collected from central lines decreased from 10.5% (69/658) to 4.5% (15/334), a 57% reduction. The proportion of patients ordered and receiving CHG bathing in the intervention period was >95%. Conclusion: A multidisciplinary effort focused on CHG bathing, central-line care, and blood culture collection led to a substantial reduction in CLABSIs in our LTACH. The use of electronic audit reports was particularly useful in achieving high adherence to practice changes. Disclosures: All authors: No reported disclosures.
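The rate arithmetic above is straightforward: CLABSIs per 1,000 central-line days, and the percent change between fiscal years. A minimal sketch; the FY 2019 line-day denominator is back-calculated from the reported rate, not the LTACH's actual count:

```python
# CLABSI rate per 1,000 central-line days and percent reduction between
# fiscal years. The FY 2019 CL-day denominator (~2,500) is back-calculated
# from the reported 0.4 rate, not an actual count.
def clabsi_rate(infections: int, cl_days: float) -> float:
    return infections / cl_days * 1_000

fy2018 = 2.8                                  # reported rate, per 1,000 CL-days
fy2019 = clabsi_rate(1, 2_500)                # 1 CLABSI over ~2,500 CL-days
print(f"FY2019 = {fy2019:.1f}/1,000 CL-days; "
      f"reduction = {(fy2018 - fy2019) / fy2018:.0%}")   # ~86%
```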


2019 · Vol 6 (Supplement_2) · pp. S734-S734
Author(s): Alex Carignan, Kevin Dufour, Catherine Beauregard-Paultre, Philippe Martin

Abstract Background: Transport of blood cultures from satellite sites to a central ("server") laboratory may delay the issuance of results for patients with positive blood cultures. In this study, we aimed to determine the clinical impact of inter-site transport of blood cultures. Methods: We performed a retrospective cohort study of cases with positive blood cultures (1 positive blood culture/species/patient/7 days; not deemed a contaminant) at two sites of a Canadian tertiary-care center between January 1, 2018, and December 31, 2018. Blood cultures from the affiliated site were transported to the laboratory of the primary server site; the two sites are located 8 km apart. The following outcomes were studied: the duration between blood culture sampling and issuance of the first report, and the duration between blood culture sampling and administration of the first effective antibiotic. Results: We observed 349 episodes of bacteremia, including 161 at the affiliated site (45.5%) and 193 at the primary server center (54.5%). Enterobacteriaceae (n = 151, 43%) and Staphylococcus aureus (n = 77, 22%) were the most commonly observed causative bacteria. Median duration to issuance of the first positive report was significantly shorter at the primary server hospital (32.4 h, interquartile range [IQR] 19.8–44.3) than at the affiliated center (37.9 h, IQR 24.1–46.5; P = .004). The median duration between blood culture sampling and administration of the first effective antibiotic was 2.7 h at the server site (IQR 0.75–15.2) and 2.3 h at the affiliated site (IQR 1–8.45) (P = 1.0). Receiving the first effective antibiotic more than 60 minutes after blood culture sampling occurred in 8/189 patients (4.2%) at the affiliated site and 9/158 patients (5.7%) at the primary server site (P = .3). The 30-day mortality was 13.8% (26/189) at the primary server site and 8.9% (14/158) at the affiliated site (P = .16). Conclusion: Inter-site transport of blood cultures was associated with a significant delay in the issuance of positive blood culture reports. However, this delay did not translate into delayed administration of effective antibiotic therapy, likely because sepsis is recognized promptly in bacteremic patients. These results are reassuring in the context of increasing centralization of microbiology services. Disclosures: All authors: No reported disclosures.
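Turnaround metrics like these are typically derived from laboratory timestamps. A sketch of the computation with pandas; the column names and example times are hypothetical (shaped to the reported 32.4 h vs 37.9 h medians), not the study's actual LIS schema:

```python
# Deriving report turnaround from draw and report timestamps. Column names
# and example times are hypothetical, shaped to the reported medians
# (32.4 h server vs 37.9 h affiliated).
import pandas as pd

cultures = pd.DataFrame({
    "site": ["server", "affiliated"],
    "drawn": pd.to_datetime(["2018-03-01 08:00", "2018-03-01 08:00"]),
    "first_report": pd.to_datetime(["2018-03-02 16:24", "2018-03-02 21:54"]),
})
cultures["turnaround_h"] = (
    (cultures["first_report"] - cultures["drawn"]).dt.total_seconds() / 3600
)
print(cultures.groupby("site")["turnaround_h"].median())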


Author(s): Justin M. Klucher, Kevin Davis, Mrinmayee Lakkad, Jacob T. Painter, Ryan K. Dare

Abstract Objective: To determine patient-specific risk factors and clinical outcomes associated with contaminated blood cultures. Design: A single-center, retrospective case-control risk factor and clinical outcome analysis performed on inpatients with blood cultures collected in the emergency department, 2014–2018. Patients with contaminated blood cultures (cases) were compared to patients with negative blood cultures (controls). Setting: A 509-bed tertiary-care university hospital. Methods: Risk factors independently associated with blood-culture contamination were determined using multivariable logistic regression. The impacts of contamination on clinical outcomes were assessed using linear regression, logistic regression, and generalized linear model with γ log link. Results: Of 13,782 blood cultures, 1,504 (10.9%) true positives were excluded, leaving 1,012 (7.3%) cases and 11,266 (81.7%) controls. The following factors were independently associated with blood-culture contamination: increasing age (adjusted odds ratio [aOR], 1.01; 95% confidence interval [CI], 1.01–1.01), black race (aOR, 1.32; 95% CI, 1.15–1.51), increased body mass index (BMI; aOR, 1.01; 95% CI, 1.00–1.02), chronic obstructive pulmonary disease (aOR, 1.16; 95% CI, 1.02–1.33), paralysis (aOR 1.64; 95% CI, 1.26–2.14) and sepsis plus shock (aOR, 1.26; 95% CI, 1.07–1.49). After controlling for age, race, BMI, and sepsis, blood-culture contamination increased length of stay (LOS; β = 1.24 ± 0.24; P < .0001), length of antibiotic treatment (LOT; β = 1.01 ± 0.20; P < .001), hospital charges (β = 0.22 ± 0.03; P < .0001), acute kidney injury (AKI; aOR, 1.60; 95% CI, 1.40–1.83), echocardiogram orders (aOR, 1.51; 95% CI, 1.30–1.75) and in-hospital mortality (aOR, 1.69; 95% CI, 1.31–2.16). Conclusions: These unique risk factors identify high-risk individuals for blood-culture contamination. After controlling for confounders, contamination significantly increased LOS, LOT, hospital charges, AKI, echocardiograms, and in-hospital mortality.
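The risk-factor analysis is a multivariable logistic regression whose exponentiated coefficients are the adjusted odds ratios reported above. A minimal sketch on synthetic data; the effect sizes below are loosely shaped to the reported associations, not fitted to the study's data:

```python
# Multivariable logistic regression on synthetic data; exponentiated
# coefficients are adjusted odds ratios. Effect sizes are loosely shaped to
# the reported associations, not fitted to the study's data.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(42)
n = 5_000
df = pd.DataFrame({
    "age": rng.integers(18, 95, n),
    "bmi": rng.normal(29, 6, n),
    "copd": rng.integers(0, 2, n),
    "paralysis": rng.integers(0, 2, n),
})
logit = -4 + 0.01 * df["age"] + 0.01 * df["bmi"] + 0.15 * df["copd"] + 0.5 * df["paralysis"]
df["contaminated"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(df[["age", "bmi", "copd", "paralysis"]].astype(float))
fit = sm.Logit(df["contaminated"], X).fit(disp=0)
print(np.exp(fit.params))                     # adjusted odds ratios per predictor
```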


2020 · Vol 41 (S1) · pp. s195-s195
Author(s): Josephine Fox, Robert Russell, Lydia Grimes, Heather Gasama, Carrie Sona, ...

Background: Proper care and maintenance of central lines is essential to prevent central line-associated bloodstream infections (CLABSIs). Our facility implemented a hospital-wide central-line maintenance bundle based on CLABSI prevention guidelines. The objective of this study was to determine whether maintenance-bundle adherence was influenced by nursing shift or day of the week. Methods: A central-line maintenance bundle was implemented in April 2018 at a 1,266-bed academic medical center. The bundle components included alcohol-impregnated disinfection caps on all ports and infusion tubing; infusion tubing dated; dressings not damp or soiled; no oozing at the insertion site larger than the size of a quarter; dressings occlusive with all edges intact; transparent dressing change recorded within 7 days; and no gauze dressings in place for >48 hours. To monitor bundle compliance, 4 non-unit-based nurse observers were trained to audit central lines. Observations were collected between August 2018 and October 2019, during all shifts and 7 days per week. Just-in-time feedback was provided for noncompliant central lines. Nursing shifts were defined as day (7:00 a.m. to 3:00 p.m.), evening (3:00 p.m. to 11:00 p.m.), and night (11:00 p.m. to 7:00 a.m.). Central-line bundle compliance between shifts was compared using multinomial logistic regression. Bundle compliance between weekdays and weekends was compared using Mantel-Haenszel χ2 analysis. Results: Of the 25,902 observations collected, 11,135 (42.9%) occurred on the day shift, 11,559 (44.6%) on the evening shift, and 3,208 (12.4%) on the night shift. Overall, 22,114 (85.4%) observations occurred on a weekday versus 3,788 (14.6%) on a Saturday or Sunday (median observations per day of the week, 2,570; range, 1,680–6,800). In total, 4,599 central lines (17.8%) were noncompliant with ≥1 bundle component. The most common reasons for noncompliance were dressing not dated (n = 1,577; 44.0%) and dressings not occlusive with all edges intact (n = 1,340; 37.4%). The noncompliance rates by shift were 12.8% (1,430 of 11,135) on day shift, 20.4% (2,361 of 11,559) on evening shift, and 25.2% (808 of 3,208) on night shift. Compared to day shift, evening shift (OR, 1.74; 95% CI, 1.62–1.87; P < .001) and night shift (OR, 2.29; 95% CI, 2.07–2.52; P < .001) were more likely to have a noncompliant central line. Compared to weekdays, observations on weekend days were more likely to find a noncompliant central line: 914 of 3,788 (24.4%) on weekend days versus 3,685 of 22,114 (16.7%) on weekdays (P < .001). Conclusions: Noncompliance with the central-line maintenance bundle was more likely on evening and night shifts and during weekends. Funding: None. Disclosures: None
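The weekday-versus-weekend comparison can be checked directly from the reported counts (914 of 3,788 weekend vs 3,685 of 22,114 weekday noncompliant observations). The sketch below uses an unstratified chi-square as a simpler stand-in for the authors' Mantel-Haenszel analysis, which additionally stratifies by shift:

```python
# Unstratified chi-square check of weekend vs weekday noncompliance, using
# the counts reported in the abstract (a simpler stand-in for the authors'
# shift-stratified Mantel-Haenszel analysis).
from scipy.stats import chi2_contingency

#        noncompliant  compliant
table = [[914, 3_788 - 914],                  # weekend observations
         [3_685, 22_114 - 3_685]]             # weekday observations
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2e}")   # reported P < .001
```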


2021 · Vol 1 (S1) · pp. s36-s36
Author(s): Connie Schaefer

Background: Blood culture is a crucial diagnostic tool for healthcare systems, but false-positive results drain clinical resources, imperil patients through increased length of stay (and the associated hospital-acquired infection risk), and undermine global health initiatives when broad-spectrum antibiotics are administered unnecessarily. In light of emerging technologies that mitigate human-error factors, we questioned historically acceptable rates of blood culture contamination, which prompted a need to promote and trial these technologies further. In a 3-month trial, 3 emergency departments in a midwestern healthcare system utilized an initial specimen diversion device (ISDD) to draw blood cultures, aiming to bring their blood culture contamination rate (4.4% before the intervention) below the 3% benchmark recommended by the Clinical & Laboratory Standards Institute. Methods: All emergency department nursing staff received operational training on the ISDD for blood culture sample acquisition. From June through August 2019, 1,847 blood cultures were drawn via the ISDD and 862 via the standard method. Results: In total, 16 contamination events occurred with the ISDD (0.9%) and 37 with the standard method (4.3%). ISDD utilization resulted in an 80% reduction in blood culture contamination from the 4.4% rate observed before the intervention. Conclusions: A midwestern healthcare system experienced a dramatic reduction in blood culture contamination across 3 emergency departments while pilot testing an ISDD, conserving laboratory and therapeutic resources while minimizing patient exposure to unnecessary risks and procedures. If the results obtained here were sustained and the ISDD utilized for all blood culture draws, nearly 400 contamination events could be avoided annually in this system. Reducing unnecessary antibiotic use in this manner would lower the rates of associated adverse events such as acute kidney injury and allergic reaction, which are possible topics for further investigation. The COVID-19 pandemic has highlighted both the importance of keeping hospital beds available and the carelessness with which broad-spectrum antibiotics are often administered (escalating the threat posed by multidrug-resistant organisms). As more ambitious healthcare benchmarks become attainable, promoting and adhering to higher standards of patient care will be critical to furthering an antimicrobial stewardship agenda and reducing treatment inequity in the field. Funding: None. Disclosures: None
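Two calculations above can be sketched from the reported counts: the ISDD-versus-standard contamination comparison (16 of 1,847 vs 37 of 862) and the "nearly 400 avoided events" projection. The annual system-wide culture volume is not stated in the abstract, so the figure below is a hypothetical input chosen to illustrate the arithmetic:

```python
# ISDD vs standard-draw contamination comparison from reported counts, plus
# the avoided-events projection. The annual culture volume is a hypothetical
# input; the abstract does not state it.
from statsmodels.stats.proportion import proportions_ztest

stat, p = proportions_ztest([16, 37], [1_847, 862])   # ISDD vs standard method
print(f"z = {stat:.2f}, p = {p:.2e}")

ANNUAL_CULTURES = 11_000                      # hypothetical system-wide volume
rate_drop = 0.044 - 16 / 1_847                # baseline minus ISDD rate
print(f"projected avoided events/yr: {ANNUAL_CULTURES * rate_drop:.0f}")  # ~390
```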

