Impact of a Rapid Blood Culture Diagnostic Panel on Time to Optimal Antimicrobial Therapy at a Veterans Affairs Medical Center

2021 ◽  
pp. 089719002110006
Author(s):  
Jordan M. Chiasson ◽  
Winter J. Smith ◽  
Tomasz Z. Jodlowski ◽  
Marcus A. Kouma ◽  
James B. Cutrell

Purpose: Utilization of rapid diagnostic testing alongside intensive antimicrobial stewardship interventions improves patient outcomes. We sought to determine the clinical impact of a rapid blood culture identification (BCID) panel in an established Antimicrobial Stewardship Program (ASP) with limited personnel resources. Methods: A single-center, retrospective, pre- and post-intervention cohort study was performed on patients admitted with at least 1 positive blood culture during the study period, following implementation of a BCID panel. The primary outcome was time to optimal therapy from blood culture collection. Secondary outcomes included days of therapy (DOT), length of stay, and 30-day mortality and readmission rates. Results: 277 patients were screened and 180 included: 82 in the pre-BCID arm and 98 in the post-BCID arm. Median time to optimal therapy was 73.8 hours (IQR; 1.1-79.6) in the pre-BCID arm and 34.7 hours (IQR; 10.9-71.6) in the post-BCID arm (p < 0.001). Median DOT for vancomycin was 4 and 3 days (p < 0.001), and for piperacillin-tazobactam was 3.5 and 2 days (p = 0.007), for the pre-BCID and post-BCID arms, respectively. Median length of hospitalization decreased from 11 to 9 days (p = 0.031). No significant change in 30-day readmission rate was noted, with a trend toward lower mortality (12% vs 5%; p = 0.086). Conclusion: Introduction of BCID into the daily workflow resulted in a significant reduction in time to optimal therapy for bloodstream infections and in DOT for select broad-spectrum antibiotics, highlighting the potential benefits of rapid diagnostics even in settings with limited personnel resources.

2019 ◽  
Vol 6 (Supplement_2) ◽  
pp. S671-S671
Author(s):  
Jordan Chiasson ◽  
James B Cutrell ◽  
Tomasz Jodlowski ◽  
Winter Smith ◽  
...  

Abstract Background Rapid blood culture diagnostics can improve patient outcomes, particularly when paired with robust interventions such as 24/7 stewardship coverage. We sought to determine the clinical impact of a rapid blood culture identification (BCID) panel (BioFire® FilmArray Multiplex PCR) in an established antimicrobial stewardship program (ASP). In addition to clinician education, BCID results were reviewed by the ASP team during weekday business hours, for an average of 2 hours daily based on availability. Methods Data on demographics, blood cultures, antimicrobial use, length of stay, and mortality were collected on inpatients at the VA North Texas Health Care System with at least one positive blood culture for bacterial or yeast isolates from March 2017 to June 2017 (pre-BCID) and from March 2018 to June 2018 (post-BCID). The primary outcome was a composite of time to optimal therapy from blood culture collection, defined as escalation, de-escalation, discontinuation, or optimization of antimicrobials, retrospectively adjudicated based on final culture results. Secondary outcomes included time to effective therapy, total days of therapy (DOT), length of stay, and 30-day mortality and readmission rates. Results 195 patients were screened, and 130 were included in the study. No significant differences in baseline characteristics were observed between groups (Table 1). Sixty-one patients were included in the pre-BCID arm and 69 in the post-BCID arm. Median time to optimal therapy was 82.9 hours (IQR; 12.8–99.8) in the pre-BCID arm and 33.9 hours (IQR; 11.2–64.8) in the post-BCID arm (P = 0.005) (Table 2). No significant change in 30-day mortality or 30-day readmission rates was noted. Vancomycin DOT was 4 days (IQR; 2–5) and 3 days (IQR; 1–4) (P = 0.024), and piperacillin–tazobactam DOT was 4 (IQR; 0–5) and 2 (IQR; 0–4) (P = 0.043), in the pre-BCID and post-BCID groups, respectively (Figure 1).
Conclusion Introduction of BCID into the daily workflow of our ASP resulted in a significant reduction in time to optimal therapy for bloodstream infections. DOT for select broad-spectrum antibiotics were also significantly reduced. This study highlights the potential benefit of rapid diagnostics without negative impact to patient care even in settings without resources for 24/7 ASP review. Disclosures All authors: No reported disclosures.
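The primary endpoint in the two abstracts above, time to optimal therapy, is simply the elapsed time from blood culture collection to the start of the adjudicated optimal regimen. A minimal sketch of that calculation in Python (the function name and timestamp format are illustrative, not from the studies):

```python
from datetime import datetime

# Hypothetical helper: hours from blood culture collection to the start of
# optimal therapy, given ISO-8601-style timestamps (e.g. "2018-03-01T08:30").
def hours_to_optimal_therapy(culture_collected: str, optimal_started: str) -> float:
    fmt = "%Y-%m-%dT%H:%M"
    delta = datetime.strptime(optimal_started, fmt) - datetime.strptime(culture_collected, fmt)
    return delta.total_seconds() / 3600.0

# Therapy optimized the next evening, 34.7 hours after collection:
print(round(hours_to_optimal_therapy("2018-03-01T08:30", "2018-03-02T19:12"), 1))  # → 34.7
```

Cohort medians (e.g., the 33.9-hour post-BCID figure above) are then just medians over these per-patient values.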


Author(s):  
Evan D Robinson ◽  
Allison M Stilwell ◽  
April E Attai ◽  
Lindsay E Donohue ◽  
Megan D Shah ◽  
...  

Abstract Background Implementation of the Accelerate Pheno™ Gram-negative platform (RDT) paired with antimicrobial stewardship program (ASP) intervention is projected to improve time to institutional-preferred antimicrobial therapy (IPT) for Gram-negative bacilli (GNB) bloodstream infections (BSIs). However, few data describe the impact on antimicrobial prescribing of RDT results that are discrepant from standard-of-care (SOC) methods. Methods A single-center, pre-/post-intervention study of consecutive, nonduplicate blood cultures for adult inpatients with GNB BSI following the combined RDT + ASP intervention was performed. The primary outcome was time to IPT. An a priori definition of IPT was utilized to limit bias and to allow for an assessment of the impact of discrepant RDT results against the SOC reference standard. Results Five hundred fourteen patients (PRE 264; POST 250) were included. Median time to antimicrobial susceptibility testing (AST) results decreased by 29.4 hours (P < .001) post-intervention, and median time to IPT was reduced by 21.2 hours (P < .001). Utilization (days of therapy [DOTs]/1,000 days present) of broad-spectrum agents decreased (PRE 655.2 vs POST 585.8; P = .043) and that of narrow-spectrum beta-lactams increased (69.1 vs 141.7; P < .001). Discrepant results occurred in 69/250 (28%) post-intervention episodes, resulting in incorrect ASP recommendations in 10/69 (14%). No differences in clinical outcomes were observed. Conclusions While implementation of a phenotypic RDT + ASP can improve time to IPT, close coordination with Clinical Microbiology and continued ASP follow-up are needed to optimize therapy. Although uncommon, the potential for erroneous ASP recommendations to de-escalate to inactive therapy following RDT results warrants further investigation.


2019 ◽  
Vol 6 (Supplement_2) ◽  
pp. S704-S704
Author(s):  
David Ha ◽  
Mary Bette Forte ◽  
Victoria Broberg ◽  
Rita Olans ◽  
Richard Olans ◽  
...  

Abstract Background Minimal literature exists to demonstrate the quantitative impact of bedside nurses in antimicrobial stewardship (AMS). We initiated bedside nurse-driven AMS and infection prevention (AMS/IP) rounds on three inpatient telemetry units of a community regional medical center. Rounds were nurse-driven, involved an infectious diseases (ID) pharmacist and an infection preventionist, and were designed to complement traditional ID pharmacist and ID physician AMS rounds. Rounds focused on use of antibiotics, urinary catheters (UCs), and central venous catheters (CVCs). Recommendations from rounds were communicated by the bedside nurse either directly to providers or to the ID pharmacist and ID physician for intervention. Methods This was an observational, multiple-group, quasi-experimental study conducted over 3.5 years (July 2015 to December 2018) to characterize the impact of bedside nurse-driven AMS/IP rounds on antibiotic, UC, and CVC use, hospital-onset C. difficile infection (CDI), catheter-associated urinary tract infections (CAUTI), and central line-associated bloodstream infections (CLABSI). Outcomes were assessed in two cohorts based on time of AMS/IP rounds implementation (Cohort 1 implemented on one telemetry unit in July 2016; Cohort 2 implemented on two telemetry units in January 2018). Results A total of 2,273 patient therapy reviews occurred (Cohort 1: 1,736; Cohort 2: 537). Of these reviews, 1,209 (53%) were antibiotics, 879 (39%) were urinary catheters, and 185 (8%) were CVCs. Pre- vs. post-intervention, significant reductions were observed in both cohorts for mean monthly antibiotic days of therapy per 1,000 patient-days (Cohort 1: 791 vs. 688, P < 0.001; Cohort 2: 615 vs. 492, P < 0.001), UC days per patient-day (Cohort 1: 0.25 vs. 0.16, P < 0.001; Cohort 2: 0.19 vs. 0.14, P < 0.001), CVC days per patient-day (Cohort 1: 0.15 vs. 0.11, P = 0.002; Cohort 2: 0.09 vs. 0.07, P = 0.005), and CDI per 10,000 patient-days (Cohort 1: 17.8 vs. 7.1, P = 0.035; Cohort 2: 19.1 vs. 5.4, P = 0.003). Numerical reductions were observed in CAUTI and CLABSI per 10,000 patient-days. Conclusion Bedside nurses can improve AMS and IP outcomes in a scalable fashion when supported by an interdisciplinary AMS/IP team, and such rounds are complementary to traditional AMS and IP practices. Disclosures All authors: No reported disclosures.
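The antibiotic-use metric reported above, days of therapy (DOT) per 1,000 patient-days, normalizes aggregate therapy days by unit census so that months and units of different sizes are comparable. A small illustration (the dose counts and denominators are hypothetical):

```python
# Hypothetical example of the DOT/1,000-patient-day normalization used above.
def dot_per_1000_patient_days(days_of_therapy: int, patient_days: int) -> float:
    # Multiply first so integer inputs with an exact quotient stay exact.
    return days_of_therapy * 1000.0 / patient_days

# 344 aggregate antibiotic days of therapy accrued over 500 patient-days:
print(dot_per_1000_patient_days(344, 500))  # → 688.0
```

The same normalization (per 10,000 patient-days) underlies the CDI, CAUTI, and CLABSI rates above.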


2021 ◽  
Vol 8 (Supplement_1) ◽  
pp. S423-S424
Author(s):  
Sharon Blum ◽  
Terrence McSweeney ◽  
Samad Tirmizi ◽  
Brian Auditore ◽  
Diane Johnson ◽  
...  

Abstract Background Bloodstream infections are a major cause of morbidity and mortality in hospitalized patients. Prompt initiation of effective antimicrobials is essential to optimize patient outcomes. New diagnostic technologies rapidly identify bacteria, viruses, fungi, and parasites in infections of various body sites. There is a paucity of literature determining whether a stewardship program run by one trained pharmacist with rapid diagnostics decreases time to optimal antimicrobial therapy. Methods This was a retrospective chart review of positive bloodstream infections identified via rapid diagnostic technologies. The EHRs of admitted adult patients with positive BSI identified by the BioFire FilmArray Blood Culture Identification (BCID) Panel™ or Accelerate PhenoTest Blood Culture Kit™ between January 2018 and July 2019 were evaluated and pertinent data were collected. Results Rapid diagnostic technologies identified 108 bloodstream infections due to Gram-positive organisms, 56 due to Gram-negative organisms, and 6 due to Candida species. Mean time to optimal antimicrobial therapy was significantly lower when the pharmacist's recommendation was accepted than when the primary care team consulted ID for a recommendation or did not accept the pharmacist's recommendation: 14.7, 34.3, and 271.3 hours, respectively (p < 0.0001). Median total cost of visit per patient, calculated as the average wholesale price of antibiotics multiplied by the number of doses received, was significantly lower when pharmacist recommendations were accepted ($86.40, $147.95, and $239.41, respectively). [Tables: Baseline characteristics; Microbiological isolates; Primary outcome: time to optimal therapy.] Conclusion The establishment of a pharmacist-run antimicrobial stewardship program in conjunction with rapid diagnostic tools for identifying bacteremia led to a decrease in time to optimal antimicrobial therapy and to cost savings.
Introduction of similar services at community hospitals with limited ASP staffing is justified. Larger studies to further investigate whether ASP partnered with rapid diagnostics has an impact on patient-related outcomes such as mortality and length of stay are warranted. [Tables: Secondary outcomes; Missed cost savings.] Disclosures All Authors: No reported disclosures
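The cost outcome above was computed as the average wholesale price (AWP) of each antibiotic multiplied by the number of doses received, summed over the visit. A sketch of that arithmetic (drug names, dose counts, and AWP values are invented for illustration, not taken from the study):

```python
# Hypothetical visit-level antimicrobial cost: sum over drugs of
# (doses received) x (average wholesale price per dose).
def visit_antibiotic_cost(doses: dict, awp_per_dose: dict) -> float:
    return sum(n * awp_per_dose[drug] for drug, n in doses.items())

doses = {"ceftriaxone": 4, "vancomycin": 6}      # doses received (invented)
awp = {"ceftriaxone": 8.10, "vancomycin": 9.00}  # AWP in $ per dose (invented)
print(round(visit_antibiotic_cost(doses, awp), 2))  # → 86.4
```

The per-group figures reported above are medians of these per-patient totals.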


2019 ◽  
Vol 6 (Supplement_2) ◽  
pp. S687-S687
Author(s):  
Philip Chung ◽  
Kate Tyner ◽  
Scott Bergman ◽  
Teresa Micheels ◽  
Mark E Rupp ◽  
...  

Abstract Background Long-term care facilities (LTCF) often struggle with implementation of antimicrobial stewardship programs (ASP) that meet all CDC core elements (CE). The CDC recommends partnership with infectious diseases (ID)/ASP experts to guide ASP implementation. The Nebraska Antimicrobial Stewardship Assessment and Promotion Program (ASAP) is an initiative funded by NE DHHS via a CDC grant to assist healthcare facilities with ASP implementation. Methods ASAP performed on-site baseline evaluations of ASP in 5 LTCF (42–293 beds) in the spring of 2017 using a 64-item questionnaire based on the CDC CE. After interviewing ASP members, ASAP provided prioritized facility-specific recommendations for ASP implementation. LTCF were periodically contacted over the next 12 months to provide implementation support and evaluate progress. The number of CE met, recommendations implemented, antibiotic starts (AS) and days of therapy (DOT)/1000 resident-days (RD), and incidence of facility-onset Clostridioides difficile infections (FO-CDI) were compared 6 to 12 months before and after on-site visits. Paired t-test and Wilcoxon signed rank test were used for statistical analyses. Results Multidisciplinary ASP existed in all 5 facilities at baseline, with medical directors (n = 2) or directors of nursing (n = 3) designated as team leads. Median CE implemented increased from 3 at baseline to 6 at the end of follow-up (P = 0.06). No LTCF had all 7 CE at baseline. By the end of one year, 2 facilities had implemented all 7 CE, with the remainder implementing 6 CE. LTCF not meeting all CE were deficient only in reporting ASP metrics to providers and staff. Among the 38 recommendations provided by ASAP, 82% were partially or fully implemented. Mean AS/1000 RD decreased by 19% from 10.1 at baseline to 8.2 post-intervention (P = 0.37), and DOT/1000 RD decreased by 21% from 91.7 to 72.5 (P = 0.20). The average incidence of FO-CDI decreased by 75% from 0.53 to 0.13 cases/10,000 RD (P = 0.25).
Conclusion Assessment of LTCF ASP along with feedback for improvement by ID/ASP experts resulted in more programs meeting all 7 CE. Favorable reductions in antimicrobial use and CDI rates were also observed. Moving forward, the availability of these services should be expanded to all LTCFs struggling with ASP implementation. Disclosures All authors: No reported disclosures.


2019 ◽  
Vol 6 (Supplement_2) ◽  
pp. S670-S670
Author(s):  
Hannah Ryan Russo ◽  
Kady Phe ◽  
Mayar Al Mohajer ◽  
Jessica Hirase

Abstract Background The initiation of appropriate antimicrobial therapy is dependent on timely identification of the pathogen. The FilmArray Blood Culture Identification Panel (BCID) is a rapid, multiplex polymerase chain reaction (PCR) panel that identifies 24 pathogens and 3 antibiotic resistance genes associated with bloodstream infections within 1 hour of growth. The purpose of this study was to compare the clinical impact of rapid BCID testing vs. standard blood culture processing, both coupled with real-time ASP, in patients with S. aureus and Enterococcus spp. bacteremia. Methods This was a single-center, retrospective chart review conducted as a pre-/post-intervention quasi-experimental study. The pre-intervention group included adult patients with S. aureus and Enterococcus spp. bacteremia identified by standard blood culture processing (PRE), and the post-intervention group included those identified by rapid BCID testing (POST). The primary endpoint was time in hours from positive Gram stain to initiation of optimal antimicrobial therapy [defined as vancomycin (VAN), linezolid (LZD), daptomycin (DAP), or ceftaroline for methicillin-resistant S. aureus (MRSA); nafcillin or cefazolin for methicillin-susceptible S. aureus (MSSA); DAP or LZD for VAN-resistant Enterococcus (VRE); VAN or ampicillin (if susceptible) for VAN-susceptible Enterococcus (VSE)]. Secondary endpoints included time to active therapy (defined as an antimicrobial to which the organism was susceptible), time to identification of the pathogen, length of hospital stay (LOS) after positive culture, and 30-day mortality. Results 132 patients were included. Mean time to optimal therapy decreased from 21.4 hours PRE to 10.7 hours POST (P = 0.048). Time to optimal therapy was shorter POST for MSSA [59.2 hours PRE vs. 25.8 hours POST (P < 0.001)] and VRE bacteremia [24.6 hours PRE vs. 5.6 hours POST (P = 0.005)]. Time to identification of the pathogen decreased from 75.6 hours PRE to 2.7 hours POST (P < 0.001).
Groups did not differ in time to active therapy, LOS, or 30-day mortality. Conclusion Antimicrobial stewardship coupled with rapid BCID testing significantly decreased time to pathogen identification as well as time to optimal therapy in patients with S. aureus and Enterococcus spp. bacteremia, most notably for MSSA and VRE. Disclosures All authors: No reported disclosures.


2019 ◽  
Vol 6 (Supplement_2) ◽  
pp. S724-S724
Author(s):  
Courtney Pearson ◽  
Katherine Lusardi ◽  
Kelsey McCain ◽  
Jacob Painter ◽  
Mrinmayee Lakkad ◽  
...  

Abstract Background The Accelerate Pheno™ blood culture detection system (AXDX) provides identification (ID) and antimicrobial susceptibility testing (AST) within 8 hours of growth in blood culture. We previously reported that length of stay (LOS), time to optimal therapy (TTOT), and antibiotic days of therapy (DOT) decreased following AXDX implementation alongside an active antimicrobial stewardship program (ASP). It is unclear whether real-time notification (RTN) of results further improves these variables. Methods A single-center, quasi-experimental before/after study of adult bacteremic inpatients was performed after implementation of AXDX. A 2017 historical cohort was compared with two 2018 intervention cohorts. Intervention 1: AXDX performed 24/7, with results reviewed by providers or ASP as part of their normal workflow. Intervention 2: AXDX performed 24/7, with RTN to ASP 7 days per week from 9 AM to 5 PM and overnight results called to ASP at 9 AM. Interventions 1 and 2 were utilized on an alternating weekly basis during the study (February 2018–September 2018). Historical ID/AST were performed using VITEK® MS and VITEK® 2. Exclusion criteria included polymicrobial or off-panel isolates, prior positive culture, and patients not admitted at the time of AST. Clinical outcomes were compared with Wilcoxon rank-sum and χ² analysis. Results 540 (83%) of 650 positive cultures performed on AXDX had on-panel organisms. 308 (57%) of these cultures and 188 (77%) of 244 reviewed historical cultures met inclusion criteria. Baseline illness severity and identified pathogens were similar between cohorts. Clinical outcomes and antimicrobial DOT are reported in Tables 1 and 2. Conclusion Following our implementation of AXDX, clinical outcomes including LOS, TTOT, total DOT, BGN DOT, and frequency of achieving optimal therapy were significantly improved compared with a historical cohort. Addition of RTN for AXDX results in the setting of an already active ASP did not further improve these metrics.
However, compared with the historical arm, AXDX with RTN did significantly impact specific subsets of antibiotic use, while AXDX alone did not. This may be due to earlier vancomycin de-escalation. These results support the benefit of integrating AXDX into healthcare systems with an active ASP, even without the resources to include real-time notification. Disclosures All authors: No reported disclosures.


2021 ◽  
Vol 8 (Supplement_1) ◽  
pp. S145-S145
Author(s):  
Madison Donnelly ◽  
Jennifer Walls ◽  
Katlyn Wood ◽  
Aiman Bandali

Abstract Background Gram-negative bacteremia is associated with significant morbidity and mortality. Development of an algorithm for antimicrobial selection, using institution-specific antibiogram data and rapid diagnostics (RDT), achieves timely and appropriate antimicrobial therapy. The objective of this study was to assess the impact of a pharmacy-driven antimicrobial stewardship initiative in conjunction with the ePlex® BCID on time to optimal antimicrobial therapy for patients with Gram-negative bloodstream infections. Methods This retrospective, observational, single-center study included adult patients with a documented Gram-negative bloodstream infection in whom the ePlex® BCID was employed. A pharmacist-driven antimicrobial stewardship intervention was initiated on December 1, 2020; the pre-intervention period (December 2019 – March 2020) was compared to the post-intervention period (December 2020 – February 2021). The following organisms were included: Citrobacter spp., Escherichia coli, Klebsiella aerogenes/pneumoniae/oxytoca, Proteus spp., Enterobacter spp., Pseudomonas aeruginosa, and Acinetobacter baumannii. Polymicrobial bloodstream infections and patients who had an ePlex® panel performed prior to admission were excluded. The following clinical outcomes were assessed: time to optimal antimicrobial therapy, length of stay (LOS), and inpatient and 30-day mortality. Results One hundred sixty-three patients met criteria for inclusion: 98 in the pre-intervention group and 65 in the post-intervention group. The mean Pitt Bacteremia Score was 1 in both groups (p = 0.741). The most common organism identified by ePlex® BCID was E. coli (65.3% vs 70.8%; p = 0.676). Eight E. coli isolates were CTX-M positive; no other gene targets were detected. The most common suspected source of bacteremia was genitourinary (72.5% vs 72.3%; p = 1.0). Time to optimal therapy was reduced by 29 hours [37 (31–55) vs. 8 (4–28) hours; p = 0.048]. Length of stay and mortality were similar between groups.
Conclusion Implementation of a rapid blood culture identification panel along with an antimicrobial stewardship intervention significantly reduced time to optimal therapy. Further studies are warranted to confirm these results. Disclosures All Authors: No reported disclosures


2019 ◽  
Vol 6 (Supplement_2) ◽  
pp. S115-S116
Author(s):  
Jessica Gerges ◽  
Karan Raja ◽  
Mitesh Patel ◽  
Ruben Patel ◽  
Brandon Chen ◽  
...  

Abstract Background Rapid diagnostic tests (RDT) can identify pathogens in bloodstream infections (BSI) in less than 24 hours. Our institution utilizes an RDT for blood cultures (BCx) that can detect various organisms and resistance determinants. A retrospective evaluation conducted at our institution calculated the negative predictive values (NPV) of various Gram-negative pathogens and susceptibility to target antimicrobials in the absence of detected resistance markers. Resultant NPVs >90% for E. coli and K. pneumoniae to ceftriaxone support the use of RDT with stewardship intervention for more rapid de-escalation of antimicrobial therapy in patients with resistance marker-negative BSI. Methods In our facility, all positive BCx are processed through RDT. In the post-intervention group, pharmacists monitored RDT results and provided recommendations. Our IRB-approved, prospective study assessed time to antimicrobial de-escalation in treatment of resistance marker-negative E. coli and K. pneumoniae BSI before (January 1 to December 31, 2018) and after stewardship intervention (January 1 to March 31, 2019). Secondary outcomes included days of therapy (DOT) of target narrow-spectrum β-lactams, carbapenems, and non-carbapenem anti-pseudomonal (NCAP) β-lactams, length of stay (LOS), and treatment failure. Data were analyzed using the Fisher exact or chi-square test and the t-test for categorical and continuous data, respectively. Results Of the 12,893 evaluated RDT results in the pre-intervention group and 2,238 post-intervention, 41 and 12 patients met inclusion criteria, respectively. Baseline characteristics were similar in both groups. Time to de-escalation to a target agent was decreased by 24 hours after stewardship intervention (50 vs. 74.6 hours; P = 0.14). There were no statistically significant differences in DOT for target agents (5.19 vs. 5.25 DOT; P = 0.48), carbapenems (1.29 vs. 1.08 DOT; P = 0.41), or NCAP β-lactams (1.73 vs. 2.33; P = 0.25).
Treatment failure (2 in each group; P = 0.17) and LOS (10.9 vs. 11.9 days; P = 0.4) were similar between groups. Protocol compliance and intervention acceptance rate were approximately 60%. Conclusion Appreciation of NPVs and utilization of stewardship intervention allowed for early de-escalation of empiric therapy in patients with resistance marker-negative E. coli and K. pneumoniae bacteremia. Disclosures All authors: No reported disclosures.


2019 ◽  
Vol 6 (10) ◽  
Author(s):  
Natasha N Pettit ◽  
Zhe Han ◽  
Cynthia T Nguyen ◽  
Anish Choksi ◽  
Angella Charnot-Katsikas ◽  
...  

Abstract Background Antimicrobial stewardship interventions utilizing real-time alerting through the electronic medical record enable timely implementation of the bundle of care (BOC) for patients with severe infections, such as candidemia. Automated alerting for candidemia using the Epic stewardship module has been in place since July 2015 at our medical center. We sought to assess the impact of these alerts. Methods All adult inpatients with candidemia between April 1, 2011, and March 31, 2012 (pre-intervention), and June 30, 2016, and July 1, 2017 (post-intervention), were evaluated for BOC adherence. We also evaluated the impact on timeliness to initiate targeted therapy, length of stay (LOS), and 30-day mortality. Results Eighty-four patients were included, 42 in the pre- and 42 in the post-intervention group. Adherence to BOC was significantly improved, from 48% (pre-intervention) to 83% (post-intervention; P = .001). The median time to initiation of therapy was 4.8 hours vs 3.3 hours (P = .58), the median LOS was 24 and 18 days (P = .28), and 30-day mortality was 19% and 26% (P = .60) in the pre- and post-intervention groups, respectively. Conclusions Antimicrobial stewardship program review of automated alerts identifying patients with candidemia resulted in significantly improved BOC adherence and was associated with a 1.5-hour reduction in time to initiation of antifungal therapy. No significant change was observed with 30-day mortality or LOS.

