Cumulative Effect of an Antimicrobial Stewardship and Rapid Diagnostic Testing Bundle on Early Streamlining of Antimicrobial Therapy in Gram-Negative Bloodstream Infections

2017
Vol 61 (9)
Author(s): P. B. Bookstaver, E. B. Nimmich, T. J. Smith, J. A. Justo, J. Kohn, ...

ABSTRACT The use of rapid diagnostic tests (RDTs) enhances antimicrobial stewardship program (ASP) interventions in the optimization of antimicrobial therapy. This quasi-experimental cohort study evaluated the combined impact of an ASP/RDT bundle on the appropriateness of empirical antimicrobial therapy (EAT) and time to de-escalation of broad-spectrum antimicrobial agents (BSAA) in Gram-negative bloodstream infections (GNBSI). The ASP/RDT bundle consisted of system-wide GNBSI treatment guidelines, prospective stewardship monitoring, and sequential introduction of two RDTs, matrix-assisted laser desorption ionization–time of flight mass spectrometry (MALDI-TOF MS) and the FilmArray blood culture identification (BCID) panel. The preintervention period ran from January 2010 through December 2013, and the postintervention period from January 2014 through June 2015. The postintervention period was conducted in two phases: phase 1 followed the introduction of MALDI-TOF MS, and phase 2 followed the introduction of the FilmArray BCID panel. The interventions resulted in significantly improved appropriateness of EAT (95% versus 91%; P = 0.02). Significant reductions in median time to de-escalation from combination antimicrobial therapy (2.8 versus 1.5 days), antipseudomonal beta-lactams (4.0 versus 2.5 days), and carbapenems (4.0 versus 2.5 days) were observed in the postintervention compared to the preintervention period (P < 0.001 for all). The reduction in median time to de-escalation from combination therapy (1.0 versus 2.0 days; P = 0.03) and antipseudomonal beta-lactams (2.2 versus 2.7 days; P = 0.04) was further augmented during phase 2 compared to phase 1 of the postintervention period. Implementation of an antimicrobial stewardship program and RDT intervention bundle in a multihospital health care system is associated with improved appropriateness of EAT for GNBSI and decreased utilization of BSAA through early de-escalation.

2019
Vol 6 (10)
Author(s): Tsubasa Akazawa, Yoshiki Kusama, Haruhisa Fukuda, Kayoko Hayakawa, Satoshi Kutsuna, ...

Abstract Objective We implemented a stepwise antimicrobial stewardship program (ASP). This study evaluated the effect of each intervention and the overall economic impact on carbapenem (CAR) use. Methods Carbapenem days of therapy (CAR-DOT) were calculated to assess the effect of each intervention, and antipseudomonal DOT were calculated to assess changes in use of broad-spectrum antibiotics. We carried out segmented regression analysis of an interrupted time series across 3 periods: Phase 1 (infectious disease [ID] consultation service only), Phase 2 (adding monitoring and e-mail feedback), and Phase 3 (adding postprescription review and feedback [PPRF] led by ID specialist doctors and pharmacists). We also estimated cost savings over the study period due to decreased CAR use. Results The median monthly CAR-DOT, per 100 patient-days, during Phase 1, Phase 2, and Phase 3 was 5.46, 3.69, and 2.78, respectively. CAR-DOT decreased significantly immediately after the start of Phase 2, but no major decrease was observed over the course of that period. Although no immediate change was apparent after Phase 3 started, CAR-DOT decreased significantly over this period. Furthermore, the monthly DOT of 3 alternative antipseudomonal agents also decreased significantly over the study period, but the incidence of antimicrobial resistance did not decrease. Cost savings over the study period, due to decreased CAR use, were estimated at US $150,000. Conclusions Adding PPRF to a conventional ASP may accelerate antimicrobial stewardship. Our CAR stewardship program has had positive results, and implementation is ongoing.
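The interrupted time series analysis above fits an immediate level change and a slope change at each phase boundary. A minimal sketch of that model follows; the series, breakpoints, and effect sizes are synthetic illustrations, not the study's CAR-DOT data.

```python
import numpy as np

def segmented_its_fit(y, breaks):
    """Fit a segmented (interrupted time series) regression with a
    level-change and slope-change term at each breakpoint, via least squares."""
    t = np.arange(len(y), dtype=float)
    cols = [np.ones_like(t), t]  # baseline intercept and pre-intervention trend
    for b in breaks:
        after = (t >= b).astype(float)
        cols.append(after)            # immediate level change at the break
        cols.append(after * (t - b))  # change in slope after the break
    X = np.column_stack(cols)
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return beta  # [intercept, trend, level1, slope1, level2, slope2, ...]

# Synthetic monthly series: flat at 5.5, an immediate drop of 1.8 at month 24
# ("Phase 2"), then a gradual decline of 0.05/month after month 48 ("Phase 3").
t = np.arange(72, dtype=float)
y = 5.5 - 1.8 * (t >= 24) - 0.05 * np.clip(t - 48, 0, None)
beta = segmented_its_fit(y, breaks=[24, 48])
print(np.round(beta, 3))
```

On this noiseless toy series the fit recovers the drop at the first break and the post-break slope at the second, which is the pattern the abstract reports (immediate change after Phase 2, gradual decline during Phase 3).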


Author(s): Evan D Robinson, Allison M Stilwell, April E Attai, Lindsay E Donohue, Megan D Shah, ...

Abstract Background Implementation of the Accelerate Pheno™ Gram-negative platform (RDT) paired with antimicrobial stewardship program (ASP) intervention is expected to improve time to institutional-preferred antimicrobial therapy (IPT) for Gram-negative bacilli (GNB) bloodstream infections (BSIs). However, few data describe the impact on antimicrobial prescribing of RDT results that are discrepant from standard of care (SOC) methods. Methods A single-center, pre-/post-intervention study of consecutive, nonduplicate blood cultures for adult inpatients with GNB BSI following combined RDT + ASP intervention was performed. The primary outcome was time to IPT. An a priori definition of IPT was utilized to limit bias and to allow for an assessment of the impact of discrepant RDT results against the SOC reference standard. Results Five hundred fourteen patients (PRE 264; POST 250) were included. Median time to antimicrobial susceptibility testing (AST) results decreased 29.4 hours (P < .001) post-intervention, and median time to IPT was reduced by 21.2 hours (P < .001). Utilization (days of therapy [DOTs]/1000 days present) of broad-spectrum agents decreased (PRE 655.2 vs POST 585.8; P = .043) and narrow-spectrum beta-lactams increased (69.1 vs 141.7; P < .001). Discrepant results occurred in 69/250 (28%) post-intervention episodes, resulting in incorrect ASP recommendations in 10/69 (14%). No differences in clinical outcomes were observed. Conclusions While implementation of a phenotypic RDT + ASP can improve time to IPT, close coordination with Clinical Microbiology and continued ASP follow-up are needed to optimize therapy. Although uncommon, the potential for erroneous ASP recommendations to de-escalate to inactive therapy following RDT results warrants further investigation.
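Days of therapy per 1,000 days present, the utilization metric reported above, normalizes aggregate antimicrobial exposure by hospital census. A minimal sketch; the counts below are hypothetical and were chosen only so the result lands on the same scale as the reported PRE-period broad-spectrum figure.

```python
def dot_per_1000_days_present(days_of_therapy: float, days_present: float) -> float:
    """Antimicrobial utilization: total days of therapy (DOT)
    normalized per 1,000 patient-days present."""
    return 1000.0 * days_of_therapy / days_present

# Hypothetical aggregate counts (not the study's raw data):
# 3,276 broad-spectrum DOTs accrued over 5,000 days present.
rate = dot_per_1000_days_present(3276, 5000)
print(rate)  # 655.2
```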


2021
Vol 8 (Supplement_1)
pp. S150-S150
Author(s): Carlos M Nunez, Arun Mattappallil, Katie A McCrink, Debbie Rybak, Basil Taha, ...

Abstract Background Fluoroquinolone (FQ) antibiotics are frequently used in hospitalized patients to treat a wide range of infections but are often misused and implicated in antibiotic-associated adverse events. The purpose of this study was to evaluate the impact of Infectious Disease fellow (IDF)-driven antimicrobial stewardship program (ASP) interventions on inpatient FQ use. Methods This is a retrospective study of all admitted patients who received a FQ for greater than 48 hours from 01/01/2019 to 12/31/2020 in an urban academic center. "Phase 1" (pre-intervention phase) covered 01/01/2019 – 03/31/2019. "Phase 2" (intervention phase) covered 03/03/2020 – 12/23/2020. In "Phase 2", our ASP reviewed FQ use 2-3 days per week and an IDF provided feedback interventions that averaged 30-60 minutes of IDF time per day. We categorized FQ use as "appropriate", "appropriate but not preferred", or "inappropriate", as determined by local clinical guidelines and ASP team opinion. We compared FQ use in both phases, indications for FQ use, and new Clostridioides difficile infections (CDI). Results A total of 386 patients were included (76 in "Phase 1" and 310 in "Phase 2"). Patient characteristics were similar (Table 1). Overall, 63% of FQ use was empiric; 50% of FQ use was deemed "appropriate", 28% "appropriate but not preferred", and 22% "inappropriate". In "Phase 2", 126 interventions were conducted, with 86% of these accepted. Appropriate FQ use increased significantly in "Phase 2" vs. "Phase 1" (53.5% vs 35.5%, p = 0.008), with a decrease in mean days of FQ use (4.38 days vs 5.87 days, p = 0.021). Table 2 shows "appropriate" FQ use by clinical indication. New CDIs occurred more frequently in "Phase 1" vs. "Phase 2" (6.6% vs 0.6%, p = 0.001). Conclusion An IDF-driven ASP intervention had a positive impact on appropriate inpatient use of FQs in our hospital.
This highlights a promising ASP model which not only improves appropriate use of FQ but also offers an opportunity for IDF mentorship and use of available resources to promote ASPs. Disclosures Katie A. McCrink, PharmD, ViiV Healthcare (Employee)


2015
Vol 2 (1)
Author(s): Neil M. Vora, Christine J. Kubin, E. Yoko Furuya

Abstract Background.  Practicing antimicrobial stewardship in the setting of widespread antimicrobial resistance among gram-negative bacilli, particularly in urban areas, is challenging. Methods.  We conducted a retrospective cross-sectional study at a tertiary care hospital with an established antimicrobial stewardship program in New York, New York to determine appropriateness of use of gram-negative antimicrobials and to identify factors associated with suboptimal antimicrobial use. Adult inpatients who received gram-negative agents on 2 dates, 1 June 2010 or 1 December 2010, were identified through pharmacy records. Clinical data were collected for each patient. Use of gram-negative agents was deemed optimal or suboptimal through chart review and according to hospital guidelines. Data were compared using χ² or Fisher's exact test for categorical variables and Student t test or Mann–Whitney U test for continuous variables. Results.  A total of 356 patients were included who received 422 gram-negative agents. Administration was deemed suboptimal in 26% of instances, the most common reason being a spectrum of activity that was too broad. In multivariable analysis, being in an intensive care unit (adjusted odds ratio [aOR], .49; 95% confidence interval [CI], .29–.84), having an infectious diseases consultation within the previous 7 days (aOR, .52; 95% CI, .28–.98), and having a history of multidrug-resistant gram-negative bacilli within the past year (aOR, .24; 95% CI, .09–.65) were associated with optimal gram-negative agent use. Beta-lactam/beta-lactamase inhibitor combination drug use (aOR, 2.6; 95% CI, 1.35–5.16) was associated with suboptimal use. Conclusions.  Gram-negative agents were used too broadly despite numerous antimicrobial stewardship program activities.
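The adjusted odds ratios above come from a multivariable model; for intuition, an unadjusted odds ratio with a Wald 95% confidence interval can be computed from a 2×2 table as sketched below. The counts are hypothetical, not taken from the study.

```python
import math

def odds_ratio_wald_ci(a: int, b: int, c: int, d: int, z: float = 1.96):
    """Unadjusted odds ratio and Wald 95% CI for a 2x2 table:
    a = exposed & suboptimal, b = exposed & optimal,
    c = unexposed & suboptimal, d = unexposed & optimal."""
    or_ = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)  # SE of log(OR)
    lower = math.exp(math.log(or_) - z * se_log_or)
    upper = math.exp(math.log(or_) + z * se_log_or)
    return or_, lower, upper

# Hypothetical counts: 20 of 120 ICU antimicrobial courses suboptimal versus
# 90 of 302 non-ICU courses; an OR < 1 would mean ICU status tracks with
# optimal use, the direction the study reports for its adjusted estimate.
or_, lower, upper = odds_ratio_wald_ci(20, 100, 90, 212)
print(f"OR {or_:.2f} (95% CI {lower:.2f}-{upper:.2f})")
```

A multivariable model (as used in the study) would additionally adjust each estimate for the other covariates, so these unadjusted figures are only illustrative.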


2020
Vol 7 (Supplement_1)
pp. S50-S50
Author(s): Cristen A Whittaker, Ethan Nhan, Marc Storb, Shana Szymborski, Manish Trivedi, ...

Abstract Background Antimicrobial stewardship is a priority for hospitals, and utilizing generated reports can enhance stewardship activities. At our institution, a software program was used to help optimize antimicrobial therapy by providing a drug-bug mismatch (DBM) alert, which identifies patients whose culture susceptibilities are not covered by their current antimicrobial therapy. The purpose of this study was to evaluate the utility of this alert feature and determine whether or not an intervention was needed for the patients identified. Methods From August 2019 to March 2020, DBM alerts were reviewed by a pharmacist and interventions pursued when appropriate. Data collection included the patient’s culture results and source, indication for current antibiotics, and potential for intervention. Alerts were stratified into groups based on the type of culture: urine, blood, sputum, bone or bodily fluid, wound or tissue, and stool. Mismatches not resulting in an intervention were categorized as contamination, colonization, or inappropriate. This study was approved by the institutional review board. Results A total of 105 DBM alerts were analyzed from various sources: 51 (47.6%) urine, 17 (16.2%) sputum, 16 (15.2%) wound or tissue, 14 (13.3%) blood, 6 (5.7%) bone or bodily fluid, and 1 stool culture. Overall, 48 of 105 (45.7%) alerts resulted in an intervention. Urine and sputum culture alerts required interventions at the lowest rates, with treatment interventions in 12 of 51 (23.5%) and 5 of 17 (29.4%) of those cases, respectively. Blood culture alerts were the most actionable, as 9 of 14 (64.3%) alerts required an intervention. Alerts for wound or tissue cultures identified gaps in therapy, as 9 of 16 (56.3%) cases required intervention. Colonization or contamination appeared to be the major cause of alerts that did not result in intervention.
Conclusion The DBM alert can be a beneficial tool for pharmacists participating in antimicrobial stewardship activities. However, the alerts had varying value depending on the culture source. The DBM alert can identify real-time patient issues regarding appropriate antimicrobial therapy. Further modifications to our process in utilizing this DBM report are warranted to enhance value and allocate time accordingly. Disclosures All Authors: No reported disclosures


2021
Vol 8 (Supplement_1)
pp. S145-S145
Author(s): Madison Donnelly, Jennifer Walls, Katlyn Wood, Aiman Bandali

Abstract Background Gram-negative bacteremia is associated with significant morbidity and mortality. Development of an algorithm for antimicrobial selection, using institution-specific antibiogram data and rapid diagnostics (RDT), can help achieve timely and appropriate antimicrobial therapy. The objective of this study was to assess the impact of a pharmacy-driven antimicrobial stewardship initiative in conjunction with the ePlex® BCID on time to optimal antimicrobial therapy for patients with gram-negative bloodstream infections. Methods This retrospective, observational, single-center study included adult patients with a documented gram-negative bloodstream infection in whom the ePlex® BCID was employed. A pharmacist-driven antimicrobial stewardship intervention was initiated on December 1, 2020; the pre-intervention period (December 2019 – March 2020) was compared to the post-intervention period (December 2020 – February 2021). The following organisms were included: Citrobacter spp., Escherichia coli, Klebsiella aerogenes/pneumoniae/oxytoca, Proteus spp., Enterobacter spp., Pseudomonas aeruginosa, and Acinetobacter baumannii. Patients with polymicrobial bloodstream infections or an ePlex® panel performed prior to admission were excluded. The following clinical outcomes were assessed: time to optimal antimicrobial therapy, length of stay (LOS), and inpatient 30-day mortality. Results One hundred sixty-three patients met criteria for inclusion: 98 in the pre-intervention group and 65 in the post-intervention group. The mean Pitt Bacteremia Score was 1 in both groups (p = 0.741). The most common organism identified by ePlex® BCID was E. coli (65.3% vs 70.8%; p = 0.676). Eight E. coli isolates were CTX-M positive; no other gene targets were detected. The most common suspected source of bacteremia was genitourinary (72.5% vs 72.3%; p = 1.0). Time to optimal therapy was reduced by 29 hours (37 [31–55] vs. 8 [4–28] hours; p = 0.048). Length of stay and mortality were similar between groups.
Conclusion Implementation of a rapid blood culture identification panel along with an antimicrobial stewardship intervention significantly reduced time to optimal therapy. Further studies are warranted to confirm these results. Disclosures All Authors: No reported disclosures

