Stratification of the Impact of Inappropriate Empirical Antimicrobial Therapy for Gram-Negative Bloodstream Infections by Predicted Prognosis

2014 · Vol 59 (1) · pp. 245-250
Author(s): Sarah E. Cain, Joseph Kohn, P. Brandon Bookstaver, Helmut Albrecht, Majdi N. Al-Hasan

ABSTRACT The bloodstream infection mortality risk score (BSIMRS) predicts the outcome of patients with Gram-negative bloodstream infections (BSI) with high discrimination. This retrospective cohort study examined the impact of inappropriate antimicrobial therapy on mortality in adult patients with Gram-negative BSI admitted to Palmetto Health Hospitals in Columbia, SC, USA, from 1 January 2011 to 31 December 2012 after stratification by predicted prognosis at initial presentation using the BSIMRS. A multivariate Cox regression model was used to identify independent risk factors for 28-day mortality overall and within each predefined BSIMRS category (<5, 5 to 9, and ≥10). Relative risk reduction (RRR), absolute risk reduction (ARR), and number needed to treat (NNT) were calculated from a predictive logistic regression model of mortality. Overall, 390 unique patients with first episodes of Gram-negative BSI were identified. The median age was 66 years, and 229 (59%) were women. There was a significant association between inappropriate antimicrobial therapy and mortality in patients with a BSIMRS of 5 to 9 (adjusted hazard ratio [aHR], 3.55; 95% confidence interval [CI], 1.22 to 8.31; P = 0.02) and a BSIMRS of ≥10 (aHR, 4.99; 95% CI, 1.09 to 22.87; P = 0.04) but not in those with a BSIMRS of <5 (aHR, 3.34; 95% CI, 0.17 to 22.77; P = 0.34). RRR, ARR, and NNT were 0.25, 0.02, and 63 for a BSIMRS of <5; 0.56, 0.32, and 3 for a BSIMRS of 5 to 9; and 0.39, 0.39, and 3 for a BSIMRS of ≥10, respectively. There is a significant benefit from appropriate antimicrobial therapy in patients with Gram-negative BSI with guarded (BSIMRS of 5 to 9) and poor (BSIMRS of ≥10) predicted prognosis. A survival difference remains unclear among those with good predicted prognosis (BSIMRS of <5) at initial presentation.
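The RRR, ARR, and NNT reported above are simple transformations of the predicted mortality risks under inappropriate versus appropriate therapy. A minimal sketch of that arithmetic (a hypothetical helper, not the study's code):

```python
import math

def risk_reduction_metrics(risk_inappropriate, risk_appropriate):
    """Derive effect-size metrics from two predicted mortality risks.

    risk_inappropriate / risk_appropriate: predicted 28-day mortality
    probabilities (0-1) under inappropriate vs. appropriate therapy.
    Returns (RRR, ARR, NNT). Illustrative only.
    """
    arr = risk_inappropriate - risk_appropriate   # absolute risk reduction
    rrr = arr / risk_inappropriate                # relative risk reduction
    # NNT: patients who must receive appropriate therapy to prevent one
    # death; conventionally rounded up to the next whole patient.
    nnt = math.ceil(1 / arr - 1e-9) if arr > 0 else math.inf
    return rrr, arr, nnt
```

For example, predicted risks of 0.75 versus 0.25 give ARR 0.5, RRR 0.67, and NNT 2.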

2021 · Vol 8 (Supplement_1) · pp. S145-S145
Author(s): Madison Donnelly, Jennifer Walls, Katlyn Wood, Aiman Bandali

Abstract Background Gram-negative bacteremia is associated with significant morbidity and mortality. Development of an algorithm for antimicrobial selection, using institution-specific antibiogram data and rapid diagnostics (RDT), can achieve timely and appropriate antimicrobial therapy. The objective of this study was to assess the impact of a pharmacy-driven antimicrobial stewardship initiative in conjunction with the ePlex® BCID on time to optimal antimicrobial therapy for patients with Gram-negative bloodstream infections. Methods This retrospective, observational, single-center study included adult patients with a documented Gram-negative bloodstream infection in whom the ePlex® BCID was employed. A pharmacist-driven antimicrobial stewardship intervention was initiated on December 1, 2020; the pre-intervention period (December 2019 – March 2020) was compared to the post-intervention period (December 2020 – February 2021). The following organisms were included: Citrobacter spp., Escherichia coli, Klebsiella aerogenes/pneumoniae/oxytoca, Proteus spp., Enterobacter spp., Pseudomonas aeruginosa, and Acinetobacter baumannii. Patients with polymicrobial bloodstream infections or an ePlex® panel performed prior to admission were excluded. The following clinical outcomes were assessed: time to optimal antimicrobial therapy, length of stay (LOS), and inpatient 30-day mortality. Results One hundred sixty-three patients met criteria for inclusion: 98 patients in the pre-intervention group and 65 patients in the post-intervention group. The mean Pitt Bacteremia Score was 1 in both groups (p = 0.741). The most common organism identified by the ePlex® BCID was E. coli (65.3% vs 70.8%; p = 0.676). Eight E. coli isolates were CTX-M positive; no other gene targets were detected. The most common suspected source of bacteremia was genitourinary (72.5% vs 72.3%; p = 1.0). Time to optimal therapy was reduced by 29 hours (37 [31–55] vs. 8 [4–28] hours; p = 0.048). Length of stay and mortality were similar between groups.
Conclusion Implementation of a rapid blood culture identification panel along with an antimicrobial stewardship intervention significantly reduced time to optimal therapy. Further studies are warranted to confirm these results. Disclosures All Authors: No reported disclosures
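Time-to-therapy outcomes like the ones above are conventionally summarized as a median with interquartile range. A small stdlib sketch (illustrative data, not the study's):

```python
from statistics import quantiles

def median_iqr(hours):
    """Median and interquartile range (Q1, Q3) of times in hours."""
    q1, med, q3 = quantiles(hours, n=4, method="inclusive")
    return med, (q1, q3)
```

For instance, `median_iqr([1, 2, 3, 4, 5])` returns a median of 3 with an IQR of (2, 4); the same summary applied to each study arm yields the "median [IQR]" figures reported in such abstracts.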


2017 · Vol 61 (6)
Author(s): Joshua T. Thaden, Lawrence P. Park, Stacey A. Maskarinec, Felicia Ruffin, Vance G. Fowler, ...

ABSTRACT The impact of bacterial species on outcome in bloodstream infections (BSI) is incompletely understood. We evaluated the impact of bacterial species on BSI mortality, with adjustment for patient, bacterial, and treatment factors. From 2002 to 2015, all adult inpatients with monomicrobial BSI caused by Staphylococcus aureus or Gram-negative bacteria at Duke University Medical Center were prospectively enrolled. Kaplan-Meier curves and multivariable Cox regression with propensity score models were used to examine species-specific bacterial BSI mortality. Of the 2,659 enrolled patients, 999 (38%) were infected with S. aureus, and 1,660 (62%) were infected with Gram-negative bacteria. Among patients with Gram-negative BSI, Enterobacteriaceae (81% [1,343/1,660]) were most commonly isolated, followed by non-lactose-fermenting Gram-negative bacteria (16% [262/1,660]). Of the 999 S. aureus BSI isolates, 507 (51%) were methicillin resistant. Of the 1,660 Gram-negative BSI isolates, 500 (30%) were multidrug resistant. The unadjusted time-to-mortality among patients with Gram-negative BSI was shorter than that of patients with S. aureus BSI (P = 0.003), due to increased mortality in patients with non-lactose-fermenting Gram-negative BSI generally (P < 0.0001) and Pseudomonas aeruginosa BSI (n = 158) in particular (P < 0.0001). After adjustment for patient demographics, medical comorbidities, bacterial antibiotic resistance, timing of appropriate antibiotic therapy, and source control in patients with line-associated BSI, P. aeruginosa BSI remained significantly associated with increased mortality (hazard ratio = 1.435; 95% confidence interval = 1.043 to 1.933; P = 0.02). P. aeruginosa BSI was associated with increased mortality relative to S. aureus or other Gram-negative BSI. This effect persisted after adjustment for patient, bacterial, and treatment factors.


BMC Cancer · 2021 · Vol 21 (1)
Author(s): Jie Wu, Yu-Chen Wang, Wen-Jie Luo, Bo-Dai, Ding-Wei Ye, ...

Abstract Background Primary urethral carcinoma (PUC) is a rare genitourinary malignancy with a relatively poor prognosis. The aim of this study was to examine the impact of surgery on survival of patients diagnosed with PUC. Methods A total of 1544 PUC patients diagnosed between 2004 and 2016 were identified based on the SEER database. The Kaplan-Meier estimate and the Fine and Gray competing risks analysis were performed to assess overall survival (OS) and cancer-specific mortality (CSM). The multivariate Cox regression model and competing risks regression model were used to identify independent predictors of OS and cancer-specific survival (CSS). Results The 5-yr OS was significantly better in patients who received either local therapy (39.8%) or radical surgery (44.7%) compared to patients receiving no surgery of the primary site (21.5%) (p < 0.001). Both local therapy and radical surgery were each independently associated with decreased CSM, with predicted 5-yr cumulative incidence of 45.4 and 43.3%, respectively, compared to 64.7% for patients receiving no surgery of the primary site (p < 0.001). Multivariate analyses demonstrated that primary site surgery was independently associated with better OS (local therapy, p = 0.037; radical surgery, p < 0.001) and decreased CSM (p = 0.003). Similar results were noted regardless of age, sex, T stage, N stage, and AJCC prognostic groups based on subgroup analysis. However, patients with M1 disease who underwent primary site surgery did not exhibit any survival benefit. Conclusion Surgery for the primary tumor conferred a survival advantage in non-metastatic PUC patients.
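The Kaplan-Meier estimate used above can be sketched in a few lines of pure Python (a toy implementation for intuition only; real analyses use a survival library that also handles confidence bands and competing risks):

```python
from collections import Counter

def kaplan_meier(times, events):
    """Kaplan-Meier survival estimator.

    times:  follow-up times (e.g., months)
    events: 1 = death observed, 0 = censored
    Returns {event_time: S(t)}, the survival probability just after
    each distinct event time. Toy sketch for intuition only.
    """
    deaths = Counter(t for t, e in zip(times, events) if e)
    s = 1.0
    curve = {}
    for t in sorted(deaths):
        at_risk = sum(1 for tt in times if tt >= t)  # still under observation
        s *= 1 - deaths[t] / at_risk                 # conditional survival at t
        curve[t] = s
    return curve
```

Censored patients (events = 0) contribute to the at-risk denominator until their last follow-up but never trigger a step down in the curve.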


Author(s): Evan D Robinson, Allison M Stilwell, April E Attai, Lindsay E Donohue, Megan D Shah, ...

Abstract Background Implementation of the Accelerate Pheno™ Gram-negative platform (RDT) paired with antimicrobial stewardship program (ASP) intervention is projected to improve time to institutional-preferred antimicrobial therapy (IPT) for Gram-negative bacilli (GNB) bloodstream infections (BSIs). However, few data describe the impact of RDT results that are discrepant from standard-of-care (SOC) methods on antimicrobial prescribing. Methods A single-center, pre-/post-intervention study of consecutive, nonduplicate blood cultures for adult inpatients with GNB BSI following combined RDT + ASP intervention was performed. The primary outcome was time to IPT. An a priori definition of IPT was utilized to limit bias and to allow for an assessment of the impact of discrepant RDT results against the SOC reference standard. Results Five hundred fourteen patients (PRE 264; POST 250) were included. Median time to antimicrobial susceptibility testing (AST) results decreased 29.4 hours (P < .001) post-intervention, and median time to IPT was reduced by 21.2 hours (P < .001). Utilization (days of therapy [DOTs]/1,000 days present) of broad-spectrum agents decreased (PRE 655.2 vs POST 585.8; P = .043) and narrow-spectrum beta-lactams increased (69.1 vs 141.7; P < .001). Discrepant results occurred in 69/250 (28%) post-intervention episodes, resulting in incorrect ASP recommendations in 10/69 (14%). No differences in clinical outcomes were observed. Conclusions While implementation of a phenotypic RDT + ASP can improve time to IPT, close coordination with Clinical Microbiology and continued ASP follow-up are needed to optimize therapy. Although uncommon, the potential for erroneous ASP recommendations to de-escalate to inactive therapy following RDT results warrants further investigation.
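The utilization figures above are expressed as days of therapy per 1,000 days present, a standard stewardship rate; the underlying arithmetic is a single ratio (hypothetical helper for illustration):

```python
def dot_per_1000_days_present(days_of_therapy, days_present):
    """Antimicrobial utilization: DOTs per 1,000 patient-days present.

    days_of_therapy: aggregate days of therapy for the agent class
    days_present:    aggregate patient-days present in the same period
    """
    return 1000 * days_of_therapy / days_present
```

For example, 6,552 DOTs accrued over 10,000 patient-days present corresponds to a rate of 655.2 DOTs/1,000 days present, the scale on which the PRE/POST comparison above is made.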


2017 · Vol 61 (9)
Author(s): P. B. Bookstaver, E. B. Nimmich, T. J. Smith, J. A. Justo, J. Kohn, ...

ABSTRACT The use of rapid diagnostic tests (RDTs) enhances antimicrobial stewardship program (ASP) interventions in optimization of antimicrobial therapy. This quasi-experimental cohort study evaluated the combined impact of an ASP/RDT bundle on the appropriateness of empirical antimicrobial therapy (EAT) and time to de-escalation of broad-spectrum antimicrobial agents (BSAA) in Gram-negative bloodstream infections (GNBSI). The ASP/RDT bundle consisted of system-wide GNBSI treatment guidelines, prospective stewardship monitoring, and sequential introduction of two RDTs, matrix-assisted laser desorption ionization–time of flight mass spectrometry (MALDI-TOF MS) and the FilmArray blood culture identification (BCID) panel. The preintervention period ran from January 2010 through December 2013 and the postintervention period from January 2014 through June 2015. The postintervention period was conducted in two phases: phase 1 followed the introduction of MALDI-TOF MS, and phase 2 followed the introduction of the FilmArray BCID panel. The interventions resulted in significantly improved appropriateness of EAT (95% versus 91%; P = 0.02). Significant reductions in median time to de-escalation from combination antimicrobial therapy (2.8 versus 1.5 days), antipseudomonal beta-lactams (4.0 versus 2.5 days), and carbapenems (4.0 versus 2.5 days) were observed in the postintervention compared to the preintervention period (P < 0.001 for all). The reduction in median time to de-escalation from combination therapy (1.0 versus 2.0 days; P = 0.03) and antipseudomonal beta-lactams (2.2 versus 2.7 days; P = 0.04) was further augmented during phase 2 compared to phase 1 of the postintervention period. Implementation of an antimicrobial stewardship program and RDT intervention bundle in a multihospital health care system is associated with improved appropriateness of EAT for GNBSI and decreased utilization of BSAA through early de-escalation.


Author(s): Mariana Chumbita, Pedro Puerta-Alcalde, Carlota Gudiol, Nicole Garcia-Pouton, Júlia Laporte-Amargós, ...

Objectives: We analyzed risk factors for mortality in febrile neutropenic patients with bloodstream infections (BSI) presenting with septic shock and assessed the impact of empirical antibiotic regimens. Methods: Multicenter retrospective study (2010-2019) of two prospective cohorts comparing BSI episodes in patients with or without septic shock. Multivariate analysis was performed to identify independent risk factors for mortality in episodes with septic shock. Results: Of 1563 patients with BSI, 257 (16%) presented with septic shock. Patients with septic shock had higher mortality than those without septic shock (55% vs 15%, p < 0.001). Gram-negative bacilli caused 81% of episodes with septic shock; Gram-positive cocci, 22%; and Candida species, 5%. Inappropriate empirical antibiotic treatment (IEAT) was administered in 17.5% of septic shock episodes. An empirical β-lactam combined with other active antibiotics was associated with the lowest mortality observed. When amikacin was the only active antibiotic, mortality was 90%. Addition of empirical specific Gram-positive coverage had no impact on mortality. Mortality was higher when IEAT was administered (76% vs 51%, p = 0.002). Age >70 years (OR 2.3, 95% CI 1.2-4.7), IEAT for Candida spp. or Gram-negative bacilli (OR 3.8, 1.3-11.1), acute kidney injury (OR 2.6, 1.4-4.9), and amikacin as the only active antibiotic (OR 15.2, 1.7-134.5) were independent risk factors for mortality, while the combination of a β-lactam and amikacin was protective (OR 0.32, 0.18-0.57). Conclusions: Septic shock in febrile neutropenic patients with BSI is associated with extremely high mortality, especially when IEAT is administered. Combination therapy including an active β-lactam and amikacin results in the best outcomes.
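The odds ratios above are adjusted estimates from a multivariate model; the unadjusted version for two observed mortality proportions is a simple ratio of odds (an illustrative helper, not the study's model):

```python
def odds_ratio(p_exposed, p_unexposed):
    """Unadjusted odds ratio from two event proportions (0-1).

    p_exposed:   event proportion in the exposed group (e.g., IEAT)
    p_unexposed: event proportion in the comparison group
    """
    odds_exposed = p_exposed / (1 - p_exposed)
    odds_unexposed = p_unexposed / (1 - p_unexposed)
    return odds_exposed / odds_unexposed
```

Applied to the crude mortality figures above (76% with IEAT vs 51% without), this gives an unadjusted OR of roughly 3, before any adjustment for age, organism, or kidney injury.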


2020 · Vol 7 (Supplement_1) · pp. S61-S61
Author(s): Evan D Robinson, Heather L Cox, April E Attai, Lindsay Donohue, Megan Shah, ...

Abstract Background Implementation of the Accelerate Pheno™ Gram-negative platform (AXDX) paired with ASP intervention is projected to improve time to definitive institutional-preferred antimicrobial therapy (IPT). However, few data describe the impact of RDT results that are discrepant from standard-of-care (SOC) methods on antimicrobial prescribing. Here we evaluate the prescribing outcomes for discrepant results following the first year of AXDX + ASP implementation. Methods Consecutive, non-duplicate blood cultures for adult inpatients with GNB BSI following combined RDT + ASP intervention were included (July 2018 – July 2019). AXDX results were emailed to the ASP in real time, then released into the EMR upon ASP review and communication with the treating team. SOC identification (ID; Vitek® MS/Vitek® 2) and antimicrobial susceptibility testing (AST; Trek Sensititre™) followed RDT as the reference standard. IPT was defined as the narrowest susceptible beta-lactam, and a discrepancy was characterized when there was categorical disagreement between RDT and SOC methods. When IPT by AXDX was found to be non-susceptible on SOC, this was characterized as “false susceptible”. Conversely, “false resistance” was assessed when a narrower-spectrum agent was susceptible by SOC. Results were also deemed discrepant when the AXDX provided no/incorrect ID for on-panel organisms, no AST, or a polymicrobial specimen was missed. Results Sixty-nine of 250 patients (28%) had a discrepancy in organism ID or AST: false resistance (9%), false susceptible (5%), no AST (5%), no ID (4%), incorrect ID (2%), and missed polymicrobial (2%). A prescribing impact occurred in 55% of cases (Table 1), where unnecessarily broad therapy was continued most often. Erroneous escalation (7%) and de-escalation to inactive therapy (7%) occurred less frequently. In-hospital mortality occurred in 4 cases, none of which followed an inappropriate transition to inactive therapy.
Conclusion Though the AXDX platform provides rapid ID and AST results, close coordination with Clinical Microbiology and continued ASP follow up are needed to optimize therapy. Although uncommon, the potential for erroneous ASP recommendations to de-escalate to inactive therapy following AXDX results warrants further investigation. Disclosures Amy J. Mathers, MD, D(ABMM), Accelerate Diagnostics (Consultant)
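The "false susceptible" / "false resistance" terminology defined in the abstract reduces to a categorical comparison between the RDT call and the reference-standard call for the agent chosen as therapy. A sketch of those definitions (function name and labels are illustrative, not vendor or study code):

```python
def classify_discrepancy(rdt_category, soc_category):
    """Compare RDT vs. reference-standard (SOC) susceptibility calls
    for the agent selected as therapy. Categories: 'S' or 'R'.

    Illustrative sketch of the abstract's definitions.
    """
    if rdt_category == soc_category:
        return "concordant"
    if rdt_category == "S" and soc_category == "R":
        # RDT called the agent susceptible but the reference found
        # resistance: risk of de-escalating to inactive therapy.
        return "false susceptible"
    # RDT called resistance the reference did not confirm:
    # risk of continuing unnecessarily broad therapy.
    return "false resistance"
```

Tallying these labels across episodes is what yields the 9% false-resistance and 5% false-susceptible rates reported above.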


2019 · Vol 19 (1)
Author(s): Gerald Elliott, Michael Malczynski, Viktorjia O. Barr, Doaa Aljefri, David Martin, ...

Abstract Background Initiating early effective antimicrobial therapy is the most important intervention demonstrated to decrease mortality in patients with Gram-negative bacteremia with sepsis. Rapid MIC-based susceptibility results make it possible to optimize antimicrobial use through both escalation and de-escalation. Methods We prospectively evaluated the performance of the Accelerate Pheno™ system (AXDX) for identification and susceptibility testing of Gram-negative species and compared the time to result between AXDX and routine standard of care (SOC) using 82 patient samples and 18 challenge organisms with various confirmed resistance mechanisms. The potential impact of AXDX on time to antimicrobial optimization was investigated with various simulated antimicrobial stewardship (ASTEW) intervention models. Results The overall positive and negative percent agreement of AXDX for identification were 100% and 99.9%, respectively. Compared to VITEK® 2, the overall essential agreement was 96.1% and categorical agreement was 95.4%. No very major or major errors were detected. AXDX reduced the time to identification by an average of 11.8 h and the time to susceptibility by an average of 36.7 h. In 27 patients evaluated for potential clinical impact of AXDX on antimicrobial optimization, 18 (67%) could potentially have had therapy optimized sooner, with an average 18.1-h reduction in time to optimal therapy. Conclusion Utilization of AXDX coupled with simulated ASTEW intervention notification substantially shortened the time to potential antimicrobial optimization in this cohort of patients with Gram-negative bacteremia. This improvement occurred even when ASTEW support was limited to an 8-h coverage model.
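Essential and categorical agreement above are standard AST-verification metrics: essential agreement means the test MIC falls within one doubling dilution of the reference MIC, and categorical agreement means the same interpretive (S/I/R) call. A minimal sketch of both (hypothetical helpers):

```python
def essential_agreement(mic_test, mic_ref):
    """MICs agree essentially when within +/- one doubling dilution."""
    return 0.5 <= mic_test / mic_ref <= 2.0

def categorical_agreement(cat_test, cat_ref):
    """Same S/I/R interpretive category."""
    return cat_test == cat_ref

def agreement_rate(pairs, agree):
    """Percent of isolate-drug result pairs satisfying a predicate."""
    return 100 * sum(agree(a, b) for a, b in pairs) / len(pairs)
```

For example, a test MIC of 1 against a reference MIC of 2 is essential agreement, while 8 against 1 is not; summing either predicate over all isolate-drug pairs gives percentages on the same scale as the 96.1% and 95.4% reported above.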


2019 · Vol 50 (2) · pp. 126-132
Author(s): Crystal A. Farrington, Michael Allon

Background: Catheter-related bloodstream infections (CRBSI) are associated with a high burden of morbidity and mortality, but the impact of the infecting organism on clinical outcomes has been poorly studied. Methods: This retrospective analysis of a prospective vascular access database from a large academic dialysis center investigated whether the organism type affected the clinical presentation or complications of CRBSI. Results: Among 339 patients with suspected CRBSI, an alternate source of infection was identified in 50 (15%). Of 289 patients with CRBSI, 249 grew a single organism and 40 were polymicrobial. Fever and/or rigors were presenting signs in ≥90% of patients with Staphylococcus aureus or Gram-negative CRBSI, but in only 61% of Staphylococcus epidermidis infections (p < 0.001). Hospitalization occurred in 67% of patients with S. aureus CRBSI versus 34% of those with S. epidermidis and 40% of those with Gram-negative CRBSI (p < 0.001). Admission to the intensive care unit was required in 14%, 9%, and 2% (p = 0.06); metastatic infection occurred in 10%, 4%, and 4% (p = 0.42); and median length of stay among patients admitted to the hospital was 4, 4, and 5.5 days (p = 0.60), respectively. Death due to CRBSI occurred in only 1% of patients. Conclusion: CRBSI is confirmed in 85% of catheter-dependent hemodialysis patients in whom it is suspected. S. epidermidis CRBSI tends to present with atypical symptoms. S. aureus CRBSI is more likely to require hospitalization or intensive care admission. Metastatic infection is relatively uncommon, and death due to CRBSI is rare.

