Implementation of a Rapid Phenotypic Susceptibility Platform for Gram-Negative Bloodstream Infections With Paired Antimicrobial Stewardship Intervention: Is the Juice Worth the Squeeze?

Author(s):  
Evan D Robinson ◽  
Allison M Stilwell ◽  
April E Attai ◽  
Lindsay E Donohue ◽  
Megan D Shah ◽  
...  

Abstract Background Implementation of the Accelerate Pheno™ Gram-negative platform (RDT) paired with antimicrobial stewardship program (ASP) intervention is projected to improve time to institutional-preferred antimicrobial therapy (IPT) for Gram-negative bacilli (GNB) bloodstream infections (BSIs). However, few data describe the impact of RDT results that are discrepant from standard-of-care (SOC) methods on antimicrobial prescribing. Methods A single-center, pre-/post-intervention study of consecutive, nonduplicate blood cultures for adult inpatients with GNB BSI following combined RDT + ASP intervention was performed. The primary outcome was time to IPT. An a priori definition of IPT was utilized to limit bias and to allow for an assessment of the impact of discrepant RDT results against the SOC reference standard. Results Five hundred fourteen patients (PRE 264; POST 250) were included. Median time to antimicrobial susceptibility testing (AST) results decreased by 29.4 hours (P < .001) post-intervention, and median time to IPT was reduced by 21.2 hours (P < .001). Utilization (days of therapy [DOT]/1000 days present) of broad-spectrum agents decreased (PRE 655.2 vs POST 585.8; P = .043) and narrow-spectrum beta-lactams increased (69.1 vs 141.7; P < .001). Discrepant results occurred in 69/250 (28%) post-intervention episodes, resulting in incorrect ASP recommendations in 10/69 (14%). No differences in clinical outcomes were observed. Conclusions While implementation of a phenotypic RDT + ASP can improve time to IPT, close coordination with Clinical Microbiology and continued ASP follow-up are needed to optimize therapy. Although uncommon, the potential for erroneous ASP recommendations to de-escalate to inactive therapy following RDT results warrants further investigation.

2020 ◽  
Vol 7 (Supplement_1) ◽  
pp. S61-S61
Author(s):  
Evan D Robinson ◽  
Heather L Cox ◽  
April E Attai ◽  
Lindsay Donohue ◽  
Megan Shah ◽  
...  

Abstract Background Implementation of the Accelerate Pheno™ Gram-negative platform (AXDX) paired with ASP intervention is projected to improve time to definitive institutional-preferred antimicrobial therapy (IPT). However, few data describe the impact of RDT results that are discrepant from standard-of-care (SOC) methods on antimicrobial prescribing. Here we evaluate the prescribing outcomes for discrepant results following the first year of AXDX + ASP implementation. Methods Consecutive, non-duplicate blood cultures for adult inpatients with GNB BSI following combined RDT + ASP intervention were included (July 2018 – July 2019). AXDX results were emailed to the ASP in real time, then released into the EMR upon ASP review and communication with the treating team. SOC identification (ID; Vitek® MS/Vitek® 2) and antimicrobial susceptibility testing (AST; Trek Sensititre™) followed RDT as the reference standard. IPT was defined as the narrowest susceptible beta-lactam, and a discrepancy was characterized when there was categorical disagreement between RDT and SOC methods. When IPT by AXDX was found to be non-susceptible on SOC, this was characterized as “false susceptible.” Conversely, “false resistance” was assessed when a narrower-spectrum agent was susceptible by SOC. Results were also deemed discrepant when the AXDX provided no/incorrect ID for on-panel organisms, no AST, or a polymicrobial specimen was missed. Results Sixty-nine of 250 patients (28%) had a discrepancy in organism ID or AST: false resistance (9%), false susceptible (5%), no AST (5%), no ID (4%), incorrect ID (2%), and missed polymicrobial (2%). A prescribing impact occurred in 55% of cases (Table 1), where unnecessarily broad therapy was continued most often. Erroneous escalation (7%) and de-escalation to inactive therapy (7%) occurred less frequently. In-hospital mortality occurred in 4 cases, none of which followed an inappropriate transition to inactive therapy.
Conclusion Though the AXDX platform provides rapid ID and AST results, close coordination with Clinical Microbiology and continued ASP follow-up are needed to optimize therapy. Although uncommon, the potential for erroneous ASP recommendations to de-escalate to inactive therapy following AXDX results warrants further investigation. Disclosures Amy J. Mathers, MD, D(ABMM), Accelerate Diagnostics (Consultant)
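The discrepancy definitions in the abstract above (IPT as the narrowest susceptible beta-lactam; “false susceptible” when the RDT-guided choice is non-susceptible by SOC; “false resistance” when SOC shows a narrower agent susceptible) can be sketched as follows. This is a minimal illustration only: the agent ordering, category codes, and isolate results below are invented for the example, not taken from the study.

```python
# Hypothetical sketch of the RDT-vs-SOC discrepancy classification described
# in the abstract. "S"/"R" codes and the beta-lactam spectrum ordering are
# illustrative assumptions, not the study's actual panel.

# Beta-lactams ordered narrowest to broadest (illustrative ordering)
AGENT_ORDER = ["ampicillin", "ceftriaxone", "cefepime", "meropenem"]

def narrowest_susceptible(results):
    """Return the narrowest-spectrum beta-lactam reported susceptible."""
    for agent in AGENT_ORDER:
        if results.get(agent) == "S":
            return agent
    return None

def classify_discrepancy(rdt, soc):
    """Compare RDT and SOC categorical AST results for one isolate.

    'false susceptible': the RDT-guided agent is non-susceptible by SOC.
    'false resistance': SOC shows a narrower agent susceptible than RDT did.
    """
    ipt_rdt = narrowest_susceptible(rdt)
    ipt_soc = narrowest_susceptible(soc)
    if ipt_rdt is None or ipt_soc is None:
        return "no AST"
    if soc.get(ipt_rdt) != "S":
        return "false susceptible"
    if AGENT_ORDER.index(ipt_soc) < AGENT_ORDER.index(ipt_rdt):
        return "false resistance"
    return "concordant"

# RDT calls ceftriaxone susceptible, but SOC reports it resistant:
rdt = {"ampicillin": "R", "ceftriaxone": "S", "cefepime": "S"}
soc = {"ampicillin": "R", "ceftriaxone": "R", "cefepime": "S"}
print(classify_discrepancy(rdt, soc))  # false susceptible
```

In this framing, a “false susceptible” call is the clinically riskier direction, since acting on it de-escalates the patient to an inactive agent, which is why the abstract singles it out for continued ASP follow-up.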


2019 ◽  
Vol 19 (1) ◽  
Author(s):  
Gerald Elliott ◽  
Michael Malczynski ◽  
Viktorjia O. Barr ◽  
Doaa Aljefri ◽  
David Martin ◽  
...  

Abstract Background Initiating early effective antimicrobial therapy is the most important intervention demonstrated to decrease mortality in patients with gram-negative bacteremia with sepsis. Rapid MIC-based susceptibility results make it possible to optimize antimicrobial use through both escalation and de-escalation. Methods We prospectively evaluated the performance of the Accelerate Pheno™ system (AXDX) for identification and susceptibility testing of gram-negative species and compared the time to result between AXDX and routine standard of care (SOC) using 82 patient samples and 18 challenge organisms with various confirmed resistance mechanisms. The potential impact of AXDX on time to antimicrobial optimization was investigated with various simulated antimicrobial stewardship (ASTEW) intervention models. Results The overall positive and negative percent agreement of AXDX for identification were 100% and 99.9%, respectively. Compared to VITEK® 2, the overall essential agreement was 96.1% and categorical agreement was 95.4%. No very major or major errors were detected. AXDX reduced the time to identification by an average of 11.8 h and time to susceptibility by an average of 36.7 h. In 27 patients evaluated for potential clinical impact of AXDX on antimicrobial optimization, 18 (67%) could potentially have had therapy optimized sooner, with an average 18.1-h reduction in time to optimal therapy. Conclusion Utilization of AXDX coupled with simulated ASTEW intervention notification substantially shortened the time to potential antimicrobial optimization in this cohort of patients with gram-negative bacteremia. This improvement in time occurred even when ASTEW support was limited to an 8-h coverage model.
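The essential and categorical agreement figures reported above can be computed as sketched below, assuming the standard definitions (essential agreement: test MIC within one doubling dilution of the reference MIC; categorical agreement: matching S/I/R interpretation). The paired isolate results are invented for illustration.

```python
# Illustrative sketch of the AST agreement metrics named in the abstract.
# Definitions follow the conventional usage; the MICs and S/I/R calls below
# are made-up example data, not the study's.
import math

def essential_agreement(test_mic, ref_mic):
    """True if the test MIC is within +/- one two-fold dilution of the reference."""
    return abs(math.log2(test_mic) - math.log2(ref_mic)) <= 1

def categorical_agreement(test_cat, ref_cat):
    """True if the S/I/R interpretations match."""
    return test_cat == ref_cat

# Paired (test, reference) results for a few hypothetical isolate-drug pairs
pairs = [
    {"test_mic": 1.0, "ref_mic": 2.0, "test_cat": "S", "ref_cat": "S"},
    {"test_mic": 8.0, "ref_mic": 2.0, "test_cat": "R", "ref_cat": "S"},
    {"test_mic": 0.5, "ref_mic": 0.5, "test_cat": "S", "ref_cat": "S"},
]
ea = sum(essential_agreement(p["test_mic"], p["ref_mic"]) for p in pairs) / len(pairs)
ca = sum(categorical_agreement(p["test_cat"], p["ref_cat"]) for p in pairs) / len(pairs)
print(f"EA {ea:.1%}, CA {ca:.1%}")  # EA 66.7%, CA 66.7%
```

Note that the second pair is both an essential-agreement miss (two dilutions high) and a categorical miss (R called where the reference is S); in the error taxonomy the abstract uses, a reference-S called R would count toward major errors, of which the study observed none.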


2021 ◽  
pp. 089719002110006
Author(s):  
Jordan M. Chiasson ◽  
Winter J. Smith ◽  
Tomasz Z. Jodlowski ◽  
Marcus A. Kouma ◽  
James B. Cutrell

Purpose: Utilization of rapid diagnostic testing alongside intensive antimicrobial stewardship interventions improves patient outcomes. We sought to determine the clinical impact of a rapid blood culture identification (BCID) panel in an established Antimicrobial Stewardship Program (ASP) with limited personnel resources. Methods: A single-center retrospective pre- and post-intervention cohort study was performed following the implementation of a BCID panel on patients admitted with at least 1 positive blood culture during the study period. The primary outcome was time to optimal therapy from blood culture collection. Secondary outcomes included days of therapy (DOT), length of stay, and 30-day mortality and readmission rates. Results: 277 patients were screened, with 180 included: 82 in the pre-BCID and 98 in the post-BCID arms. Median time to optimal therapy was 73.8 hours (IQR, 1.1-79.6) in the pre-BCID arm and 34.7 hours (IQR, 10.9-71.6) in the post-BCID arm (p ≤ 0.001). Median DOT for vancomycin was 4 and 3 days (p ≤ 0.001), and for piperacillin-tazobactam was 3.5 and 2 days (p ≤ 0.007), for the pre-BCID and post-BCID arms, respectively. Median length of hospitalization decreased from 11 to 9 days (p = 0.031). No significant change in 30-day readmission rate was noted, with a trend toward lower mortality (12% vs 5%; p = 0.086). Conclusion: Introduction of BCID into the daily workflow resulted in a significant reduction in time to optimal therapy for bloodstream infections and in DOT for select broad-spectrum antibiotics, highlighting the potential benefits of rapid diagnostics even in settings with limited personnel resources.


2021 ◽  
Vol 8 (Supplement_1) ◽  
pp. S145-S145
Author(s):  
Madison Donnelly ◽  
Jennifer Walls ◽  
Katlyn Wood ◽  
Aiman Bandali

Abstract Background Gram-negative bacteremia is associated with significant morbidity and mortality. Development of an algorithm for antimicrobial selection, using institution-specific antibiogram data and rapid diagnostics (RDT), achieves timely and appropriate antimicrobial therapy. The objective of this study was to assess the impact of a pharmacy-driven antimicrobial stewardship initiative in conjunction with ePlex® BCID on time to optimal antimicrobial therapy for patients with gram-negative bloodstream infections. Methods This retrospective, observational, single-center study included adult patients with a documented gram-negative bloodstream infection in whom the ePlex® BCID was employed. A pharmacist-driven antimicrobial stewardship intervention was initiated on December 1, 2020; the pre-intervention period (December 2019 – March 2020) was compared to the post-intervention period (December 2020 – February 2021). The following organisms were included: Citrobacter spp., Escherichia coli, Klebsiella aerogenes/pneumoniae/oxytoca, Proteus spp., Enterobacter spp., Pseudomonas aeruginosa, and Acinetobacter baumannii. Patients with polymicrobial bloodstream infections or an ePlex® panel performed prior to admission were excluded. The following clinical outcomes were assessed: time to optimal antimicrobial therapy, length of stay (LOS), and inpatient 30-day mortality. Results One hundred sixty-three patients met criteria for inclusion: 98 in the pre-intervention group and 65 in the post-intervention group. The mean Pitt Bacteremia Score was 1 in both groups (p=0.741). The most common organism identified by ePlex® BCID was E. coli (65.3% vs 70.8%; p=0.676). Eight E. coli isolates were CTX-M positive; no other gene targets were detected. The most common suspected source of bacteremia was genitourinary (72.5% vs 72.3%; p=1.0). Time to optimal therapy was reduced by 29 hours (37 [31 – 55] vs. 8 [4 – 28] hours; p=0.048). Length of stay and mortality were similar between groups.
Conclusion Implementation of a rapid blood culture identification panel along with an antimicrobial stewardship intervention significantly reduced time to optimal therapy. Further studies are warranted to confirm these results. Disclosures All Authors: No reported disclosures


2019 ◽  
Vol 6 (Supplement_2) ◽  
pp. S704-S704
Author(s):  
David Ha ◽  
Mary Bette Forte ◽  
Victoria Broberg ◽  
Rita Olans ◽  
Richard Olans ◽  
...  

Abstract Background Minimal literature exists to demonstrate the quantitative impact of bedside nurses in antimicrobial stewardship (AMS). We initiated bedside nurse-driven AMS and infection prevention (AMS/IP) rounds on three inpatient telemetry units of a community regional medical center. Rounds were nurse-driven, involved an infectious diseases (ID) pharmacist and infection preventionist, and were designed to complement traditional ID pharmacist and ID physician AMS rounds. Rounds were focused on use of antibiotics, urinary catheters (UCs), and central venous catheters (CVCs). Recommendations from rounds were communicated by the bedside nurse either directly to providers or to the ID pharmacist and ID physician for intervention. Methods This was an observational, multiple-group, quasi-experimental study conducted over 3.5 years (July 2015 to December 2018) to characterize the impact of bedside nurse-driven AMS/IP rounds on antibiotic, urinary catheter, and CVC use, hospital-onset C. difficile infection (CDI), catheter-associated urinary tract infections (CAUTI), and central line-associated bloodstream infections (CLABSI). Outcomes were assessed in two cohorts based on time of AMS/IP rounds implementation (Cohort 1 implemented on one telemetry unit in July 2016, Cohort 2 implemented on two telemetry units in January 2018). Results A total of 2,273 patient therapy reviews occurred (Cohort 1: 1,736; Cohort 2: 537). Of these reviews, 1,209 (53%) were antibiotics, 879 (39%) were urinary catheters, and 185 (8%) were CVCs. Pre- vs. post-intervention, significant reductions were observed in both cohorts for mean monthly antibiotic days of therapy per 1,000 patient-days (Cohort 1: 791 vs. 688, P < 0.001; Cohort 2: 615 vs. 492, P < 0.001), UC days per patient day (Cohort 1: 0.25 vs. 0.16, P < 0.001; Cohort 2: 0.19 vs. 0.14, P < 0.001), CVC days per patient day (Cohort 1: 0.15 vs. 0.11, P = 0.002; Cohort 2: 0.09 vs. 0.07, P = 0.005), and CDI per 10,000 patient-days (Cohort 1: 17.8 vs. 7.1, P = 0.035; Cohort 2: 19.1 vs. 5.4, P = 0.003). Numerical reductions were observed in CAUTI and CLABSI per 10,000 patient-days. Conclusion Bedside nurses can improve AMS and IP outcomes in a scalable fashion when supported by an interdisciplinary AMS/IP team and are complementary to traditional AMS and IP practices. Disclosures All authors: No reported disclosures.


2017 ◽  
Vol 61 (9) ◽  
Author(s):  
P. B. Bookstaver ◽  
E. B. Nimmich ◽  
T. J. Smith ◽  
J. A. Justo ◽  
J. Kohn ◽  
...  

ABSTRACT The use of rapid diagnostic tests (RDTs) enhances antimicrobial stewardship program (ASP) interventions in optimization of antimicrobial therapy. This quasi-experimental cohort study evaluated the combined impact of an ASP/RDT bundle on the appropriateness of empirical antimicrobial therapy (EAT) and time to de-escalation of broad-spectrum antimicrobial agents (BSAA) in Gram-negative bloodstream infections (GNBSI). The ASP/RDT bundle consisted of system-wide GNBSI treatment guidelines, prospective stewardship monitoring, and sequential introduction of two RDTs, matrix-assisted laser desorption ionization–time of flight mass spectrometry (MALDI-TOF MS) and the FilmArray blood culture identification (BCID) panel. The preintervention period extended from January 2010 through December 2013, and the postintervention period from January 2014 through June 2015. The postintervention period was conducted in two phases; phase 1 followed the introduction of MALDI-TOF MS, and phase 2 followed the introduction of the FilmArray BCID panel. The interventions resulted in significantly improved appropriateness of EAT (95% versus 91%; P = 0.02). Significant reductions in median time to de-escalation from combination antimicrobial therapy (2.8 versus 1.5 days), antipseudomonal beta-lactams (4.0 versus 2.5 days), and carbapenems (4.0 versus 2.5 days) were observed in the postintervention compared to the preintervention period (P < 0.001 for all). The reduction in median time to de-escalation from combination therapy (1.0 versus 2.0 days; P = 0.03) and antipseudomonal beta-lactams (2.2 versus 2.7 days; P = 0.04) was further augmented during phase 2 compared to phase 1 of the postintervention period. Implementation of an antimicrobial stewardship program and RDT intervention bundle in a multihospital health care system is associated with improved appropriateness of EAT for GNBSI and decreased utilization of BSAA through early de-escalation.


2012 ◽  
Vol 33 (5) ◽  
pp. 500-506 ◽  
Author(s):  
Andrew M. Morris ◽  
Stacey Brener ◽  
Linda Dresser ◽  
Nick Daneman ◽  
Timothy H. Dellit ◽  
...  

Introduction. Antimicrobial stewardship programs are being implemented in health care to reduce inappropriate antimicrobial use, adverse events, Clostridium difficile infection, and antimicrobial resistance. There is no standardized approach to evaluate the impact of these programs. Objective. To use a structured panel process to define quality improvement metrics for evaluating antimicrobial stewardship programs in hospital settings that also have the potential to be used as part of public reporting efforts. Design. A multiphase modified Delphi technique. Setting. Paper-based survey supplemented with a 1-day consensus meeting. Participants. A 10-member expert panel from Canada and the United States was assembled to evaluate indicators for relevance, effectiveness, and the potential to aid quality improvement efforts. Results. There were a total of 5 final metrics selected by the panel: (1) days of therapy per 1000 patient-days; (2) number of patients with specific organisms that are drug resistant; (3) mortality related to antimicrobial-resistant organisms; (4) conservable days of therapy among patients with community-acquired pneumonia (CAP), skin and soft-tissue infections (SSTI), or sepsis and bloodstream infections (BSI); and (5) unplanned hospital readmission within 30 days after discharge from the hospital in which the most responsible diagnosis was one of CAP, SSTI, sepsis, or BSI. The first and second indicators were also identified as useful for accountability purposes, such as public reporting. Conclusion. We have successfully identified 2 measures for public reporting purposes and 5 measures that can be used internally in healthcare settings as quality indicators. These indicators can be implemented across diverse healthcare systems to enable ongoing evaluation of antimicrobial stewardship programs and complement efforts for improved patient safety.
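The panel's first metric, days of therapy (DOT) per 1,000 patient-days, is the denominator-normalized consumption measure that several of the studies above also report. A minimal sketch of its computation follows, under the usual convention that each distinct antimicrobial a patient receives on a calendar day counts as one DOT; the admission records are invented for illustration.

```python
# Minimal sketch of the DOT/1,000 patient-days stewardship metric.
# Convention assumed: one DOT per distinct agent per patient per calendar day.
# The admissions below are hypothetical example data.

def dot_per_1000_patient_days(admissions):
    """admissions: list of dicts with 'patient_days' and per-day drug lists."""
    total_dot = sum(
        len(set(day_drugs))            # each distinct agent per day = 1 DOT
        for adm in admissions
        for day_drugs in adm["daily_antimicrobials"]
    )
    total_patient_days = sum(adm["patient_days"] for adm in admissions)
    return 1000 * total_dot / total_patient_days

admissions = [
    # 5 patient-days; two agents on days 1-2, then no antimicrobials
    {"patient_days": 5,
     "daily_antimicrobials": [["pip-tazo", "vancomycin"],
                              ["pip-tazo", "vancomycin"], [], [], []]},
    # 3 patient-days; ceftriaxone daily
    {"patient_days": 3,
     "daily_antimicrobials": [["ceftriaxone"], ["ceftriaxone"], ["ceftriaxone"]]},
]
print(dot_per_1000_patient_days(admissions))  # 875.0
```

Because combination therapy accrues one DOT per agent, the metric rewards de-escalation from two-drug regimens even when total treatment duration is unchanged, which is one reason the panel preferred it for quality improvement and public reporting.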


2019 ◽  
Vol 6 (Supplement_2) ◽  
pp. S687-S687
Author(s):  
Philip Chung ◽  
Kate Tyner ◽  
Scott Bergman ◽  
Teresa Micheels ◽  
Mark E Rupp ◽  
...  

Abstract Background Long-term care facilities (LTCF) often struggle with implementation of antimicrobial stewardship programs (ASP) that meet all CDC core elements (CE). The CDC recommends partnership with infectious diseases (ID)/ASP experts to guide ASP implementation. The Nebraska Antimicrobial Stewardship Assessment and Promotion Program (ASAP) is an initiative funded by NE DHHS via a CDC grant to assist healthcare facilities with ASP implementation. Methods ASAP performed on-site baseline evaluation of ASP in 5 LTCF (42–293 beds) in the spring of 2017 using a 64-item questionnaire based on CDC CE. After interviewing ASP members, ASAP provided prioritized facility-specific recommendations for ASP implementation. LTCF were periodically contacted over the next 12 months to provide implementation support and evaluate progress. The number of CE met, recommendations implemented, antibiotic starts (AS) and days of therapy (DOT)/1000 resident-days (RD), and incidence of facility-onset Clostridioides difficile infections (FO-CDI) were compared 6 to 12 months before and after on-site visits. Paired t-test and Wilcoxon signed rank test were used for statistical analyses. Results Multidisciplinary ASP existed in all 5 facilities at baseline, with medical directors (n = 2) or directors of nursing (n = 3) designated as team leads. Median CE implemented increased from 3 at baseline to 6 at the end of follow-up (P = 0.06). No LTCF had all 7 CE at baseline. By the end of one year, 2 facilities had implemented all 7 CE, with the remainder implementing 6 CE. LTCF not meeting all CE were deficient only in reporting ASP metrics to providers and staff. Among the 38 recommendations provided by ASAP, 82% were partially or fully implemented. Mean AS/1000 RD decreased by 19% from 10.1 at baseline to 8.2 post-intervention (P = 0.37), and DOT/1000 RD decreased by 21% from 91.7 to 72.5 (P = 0.20). The average incidence of FO-CDI decreased by 75% from 0.53 to 0.13 cases/10,000 RD (P = 0.25).
Conclusion Assessment of LTCF ASP along with feedback for improvement by ID/ASP experts resulted in more programs meeting all 7 CE. Favorable reductions in antimicrobial use and CDI rates were also observed. Moving forward, the availability of these services should be expanded to all LTCFs struggling with ASP implementation. Disclosures All authors: No reported disclosures.


2017 ◽  
Vol 4 (suppl_1) ◽  
pp. S63-S63
Author(s):  
Fabian Andres Romero ◽  
Evette Mathews ◽  
Ara Flores ◽  
Susan Seo

Abstract Background Antibiotic stewardship program (ASP) implementation is paramount across the healthcare spectrum. Nursing homes represent a challenge due to limited resources, complexity of medical conditions, and less controlled environments. National statistics on ASP for long-term care facilities (LTCF) are sparse. Methods A pilot ASP was launched in August 2016 at a 270-bed nursing home with a 50-bed chronic ventilator-dependent unit. The program entailed a bundle of interventions including leadership engagement, a tracking and reporting system for intravenous antibiotics, education for caregivers, Infectious Disease (ID) consultant availability, and implementation of nursing protocols. Data were collected from pharmacy and medical records between January 2016 and March 2017, establishing pre-intervention and post-intervention periods. Collected data included days of therapy (DOT), antibiotic costs, resident-days, hospital transfers, and Clostridium difficile infection (CDI) rates. Variables were adjusted to 1,000 resident-days (RD) and findings between periods were compared by Mann–Whitney U test. Results A total of 47,423 resident-days and 1,959 DOT were analyzed for this study. Antibiotic use decreased from 54.5 DOT/1000 RD pre-intervention to 27.6 DOT/1000 RD post-intervention (P = 0.017). Antibiotic costs were reduced from a monthly median of US $17,113 to US $7,073, though this was not statistically significant (P = 0.39). Analysis stratified by individual antibiotic was done for the five most commonly used antibiotics and found a statistically significant reduction in vancomycin use (14.4 vs. 6.5; P = 0.023). Reductions were also observed for cefepime/ceftazidime (6.9 vs. 1.3; P = 0.07), ertapenem (6.8 vs. 3.6; P = 0.45), and piperacillin/tazobactam (1.8 vs. 0.6; P = 0.38). Meropenem use increased (1.3 vs. 3.2; P = 0.042). Hospital transfers trended slightly upward (6.73 vs. 7.77; P = 0.065), and there was no change in CDI (1.1 vs. 0.94; P = 0.32).
Conclusion A bundle of standardized interventions tailored for LTCF can achieve successful reduction of antibiotic utilization and costs. Subsequent studies are needed to further determine the impact on clinical outcomes such as transfers to hospitals and CDI in these settings. Disclosures All authors: No reported disclosures.

