Constructing Unit-Specific Empiric Treatment Guidelines for Catheter-Related and Primary Bacteremia by Determining the Likelihood of Inadequate Therapy

2012 · Vol 33 (4) · pp. 416-420
Author(s): Megan E. Davis, Deverick J. Anderson, Michelle Sharpe, Luke F. Chen, Richard H. Drew

This study aimed to determine the feasibility of using the likelihood of inadequate therapy (LIT), a parameter calculated from unit-specific pathogen frequency and in vitro susceptibility, to guide selection of appropriate empiric antibiotic therapy for primary bloodstream infections. Our study demonstrates that LIT may reveal differences not captured by traditional antibiograms.
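The abstract does not reproduce the LIT formula, but its description (pathogen frequency combined with in vitro susceptibility) suggests a frequency-weighted sum of resistance rates. A minimal sketch under that assumption, with entirely hypothetical antibiogram numbers:

```python
# Hypothetical sketch of a LIT-style calculation: the likelihood that an
# empiric agent is inadequate, taken as a frequency-weighted sum of
# resistance rates across a unit's bloodstream isolates.
# All numbers below are illustrative, not from the study.

unit_antibiogram = {
    # pathogen: (relative frequency among BSIs, fraction susceptible to agent X)
    "E. coli":       (0.40, 0.90),
    "K. pneumoniae": (0.25, 0.85),
    "S. aureus":     (0.20, 0.70),
    "E. faecium":    (0.15, 0.10),
}

def likelihood_of_inadequate_therapy(antibiogram):
    """LIT for one agent: sum over pathogens of frequency * (1 - susceptibility)."""
    return sum(freq * (1.0 - susc) for freq, susc in antibiogram.values())

print(f"LIT for agent X: {likelihood_of_inadequate_therapy(unit_antibiogram):.2%}")
```

Two agents with identical overall susceptibility in a traditional antibiogram can yield different LIT values once pathogen frequency is weighted in, which is presumably the kind of difference the authors refer to.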

2017 · Vol 4 (suppl_1) · pp. S13-S14
Author(s): Sameer S Kadri, Yi Ling Lai, Emily Ricotta, Jeffrey Strich, Ahmed Babiker, ...

Abstract

Background: Discordance between in vitro susceptibility and empiric antibiotic therapy is inextricably linked to antibiotic resistance and decreased survival in bloodstream infections (BSI). However, its prevalence, its patient- and hospital-level risk factors, and its impact on outcome across a large cohort and different pathogens remain unclear.

Methods: We examined in vitro susceptibility interpretations for bacterial BSI and corresponding antibiotic therapy among inpatient encounters across 156 hospitals from 2000 to 2014 in the Cerner Healthfacts database. Discordance was defined as nonsusceptibility to initial therapy administered from 2 days before pathogen isolation to 1 day before final susceptibility reporting. Discordance prevalence was compared across taxa; risk factors and the association with in-hospital mortality were evaluated by logistic regression. Adjusted odds ratios (aOR) were estimated for pathogen-, patient-, and facility-level factors.

Results: Of 33,161 unique encounters with BSIs, 4,219 (13%) at 123 hospitals met criteria for discordant antibiotic therapy, ranging from 3% for pneumococci to 55% for E. faecium. Discordance was higher in recent years (2010–2014 vs. 2005–2009) and was associated with older age, lower baseline SOFA score, urinary (vs. abdominal) source, and hospital-onset BSI, as well as with ≥500-bed, Midwestern, non-teaching, and rural hospitals. Discordant antibiotic therapy increased the risk of death (aOR = 1.3 [95% CI 1.1–1.4]). Among Gram-negative taxa, discordant therapy increased the risk of mortality associated with Enterobacteriaceae (aOR = 1.3 [1.0–1.6]) and non-fermenters (aOR = 1.7 [1.1–2.5]). Among Gram-positive taxa, the risk of mortality from discordant therapy was significantly higher for S. aureus (aOR = 1.3 [1.1–1.6]) but unchanged for streptococcal or enterococcal BSIs.

Conclusion: The prevalence of discordant antibiotic therapy displayed extensive taxon-level variability and was associated with patient and institutional factors. Discordance detrimentally impacted survival in Gram-negative and S. aureus BSIs. Understanding the reasons behind the observed differences in discordance risk and their impact on outcomes could inform stewardship efforts and guidelines for empiric therapy in sepsis.

Disclosures: All authors: No reported disclosures.
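For readers reproducing this kind of analysis, the aOR estimation step the abstract describes is a standard multivariable logistic regression. A minimal sketch with synthetic data and hypothetical column names (the study's actual covariate set is far richer):

```python
# Minimal sketch of estimating an adjusted odds ratio (aOR) for in-hospital
# mortality from discordant empiric therapy via logistic regression.
# The data frame and column names are hypothetical, not from Cerner Healthfacts.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

np.random.seed(0)
n = 500
df = pd.DataFrame({
    "died":           np.random.binomial(1, 0.15, n),   # in-hospital death (0/1)
    "discordant":     np.random.binomial(1, 0.13, n),   # discordant therapy (0/1)
    "age":            np.random.normal(62, 15, n),
    "sofa":           np.random.poisson(4, n),          # baseline SOFA score
    "hospital_onset": np.random.binomial(1, 0.3, n),
})

model = smf.logit("died ~ discordant + age + sofa + hospital_onset", data=df).fit()
aor = np.exp(model.params["discordant"])                 # adjusted odds ratio
ci_low, ci_high = np.exp(model.conf_int().loc["discordant"])
print(f"aOR = {aor:.2f} (95% CI {ci_low:.2f}-{ci_high:.2f})")
```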


2019 · Vol 6 (Supplement_2) · pp. S115-S115
Author(s): Brandon J Smith, Abigail Kois, Nathan Gartland, Joseph Tholany, Ricardo Arbulu

Abstract

Background: Appropriate empiric antibiotic therapy is associated with decreased mortality and recurrence in patients with Enterobacteriaceae bacteremia (EB). Increasing bacterial resistance adds a further layer to this complex clinical scenario. Swift utilization of appropriate antibiotics is crucial for improved patient outcomes; however, prolonged and excessively broad antibiotic coverage is not without its own complications. Our study aimed to review the appropriateness of empiric antibiotics for EB.

Methods: We performed a retrospective chart review of all patients >18 years of age admitted to a single academic community hospital during 2018 with EB at any time during their hospitalization. The primary endpoint was the appropriateness of empiric antibiotic therapy, defined as receiving, prior to the return of antimicrobial sensitivities, therapy to which the isolate was ultimately shown to be susceptible. Appropriateness was further adjusted for standard-of-care (SOC) practices; specifically, carbapenem therapy is preferred for ESBL infections despite in vitro susceptibility to piperacillin/tazobactam and cefepime.

Results: Our study identified 178 patients with EB. The most common organisms were E. coli (64.6%), K. pneumoniae (11.8%), and P. mirabilis (7.3%). Resistance patterns included 1 CRE (0.57%) and 17 ESBL (9.7%) isolates. The most common sources of infection were urinary (63.5%) and intraabdominal (13.5%). Based on the sensitivity reports of tested isolates, 83.7% of patients received appropriate empiric antibiotics. After adjustment for SOC, 11.8% of ESBL patients (2/17) and 0% of CRE patients (0/1) received appropriate therapy. Comparatively, 89.0% of patients without ESBL or CRE (137/154) received appropriate care (P < 0.0001).

Conclusion: The results of this study demonstrate that, across our patient population, over 80% of patients received appropriate empiric antibiotics for EB; however, this percentage was dramatically lower for patients with ESBL or CRE infections. This highlights room for improved rapid diagnostics and identification of risk factors predisposing to resistant organisms, thereby decreasing the time to appropriate antibiotic therapy.

Disclosures: All authors: No reported disclosures.
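The SOC adjustment described in the Methods can be read as a two-step rule: the empiric regimen must be active in vitro, and for ESBL producers only a carbapenem counts. A hedged sketch of that logic, with hypothetical agent names and data structures:

```python
# Hypothetical sketch of the study's two-step appropriateness check:
# (1) the empiric regimen must include an agent the isolate tested
#     susceptible to; (2) standard-of-care (SOC) adjustment: for ESBL
#     producers only a carbapenem counts as appropriate, even when
#     piperacillin/tazobactam or cefepime tested susceptible in vitro.
CARBAPENEMS = {"meropenem", "imipenem", "ertapenem", "doripenem"}

def empiric_appropriate(empiric_agents, susceptible_agents, is_esbl=False):
    """True if the empiric regimen counts as appropriate after SOC adjustment."""
    active = set(empiric_agents) & set(susceptible_agents)
    if not active:
        return False
    if is_esbl:
        return bool(active & CARBAPENEMS)
    return True

# ESBL isolate reported susceptible to pip/tazo, treated with pip/tazo alone:
# active in vitro, but inappropriate after the SOC adjustment.
print(empiric_appropriate(
    ["piperacillin/tazobactam"],
    ["piperacillin/tazobactam", "meropenem"],
    is_esbl=True,
))  # -> False
```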


Critical Care · 2021 · Vol 25 (1)
Author(s): Marin H. Kollef, Andrew F. Shorr, Matteo Bassetti, Jean-Francois Timsit, Scott T. Micek, ...

Abstract

Severe or life-threatening infections are common among patients in the intensive care unit (ICU). Most infections in the ICU are bacterial or fungal in origin and require antimicrobial therapy for clinical resolution. Antibiotics are the cornerstone of therapy for infected critically ill patients; however, they are often not optimally administered, resulting in less favorable patient outcomes, including greater mortality. The timing of antibiotics in patients with life-threatening infections, including sepsis and septic shock, is now recognized as one of the most important determinants of survival in this population. Individuals whose antibiotic therapy for serious infection is delayed can have a doubling or more of their mortality. Additionally, the appropriateness of the antibiotic regimen, meaning one active against the offending pathogens based on in vitro susceptibility, also influences survival. Thus, not only is early empiric antibiotic administration important, but the selection of those agents is crucial as well. The duration of antibiotic infusions, especially for β-lactams, can also influence antibiotic efficacy by increasing antimicrobial drug exposure against the offending pathogen. However, due to mounting antibiotic resistance, aggressive antimicrobial de-escalation based on microbiology results is necessary to counterbalance the pressures of early broad-spectrum antibiotic therapy. In this review, we examine time-related variables impacting antibiotic optimization as it relates to the treatment of life-threatening infections in the ICU. In addition to highlighting the importance of antibiotic timing in the ICU, we hope to provide an approach to antimicrobials that also minimizes their unnecessary use. Such approaches will increasingly be linked to advances in molecular microbiology testing and artificial intelligence/machine learning, which should help identify patients needing empiric antibiotic therapy at an earlier time point, as well as the specific antibiotics required, in order to avoid unnecessary administration of broad-spectrum antibiotics.
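The point about infusion duration rests on a standard β-lactam pharmacodynamic target: the fraction of the dosing interval during which free drug concentration exceeds the pathogen's MIC (%fT>MIC). A hedged, single-dose sketch under a one-compartment model, with purely illustrative parameters, shows how extending the infusion of the same dose raises this fraction:

```python
# Hedged single-dose sketch (one-compartment IV infusion model) of why
# prolonging a beta-lactam infusion can raise %fT>MIC, the fraction of the
# dosing interval with free drug concentration above the MIC.
# All parameters (dose, V, CL, fu, MIC) are illustrative, not dosing advice.
import numpy as np

def ft_above_mic(dose_mg, t_inf_h, tau_h, V_L=25.0, CL_Lh=10.0, fu=0.8, mic=8.0):
    k = CL_Lh / V_L                          # elimination rate constant (1/h)
    t = np.linspace(0.0, tau_h, 10_000)      # one dosing interval
    r0 = dose_mg / t_inf_h                   # infusion rate (mg/h)
    c_inf = (r0 / CL_Lh) * (1 - np.exp(-k * t))         # during infusion
    c_end = (r0 / CL_Lh) * (1 - np.exp(-k * t_inf_h))   # level at end of infusion
    c_post = c_end * np.exp(-k * (t - t_inf_h))         # post-infusion decay
    conc = np.where(t <= t_inf_h, c_inf, c_post)
    return np.mean(fu * conc > mic)          # fraction of interval above MIC

for t_inf in (0.5, 3.0):                     # bolus-like vs extended infusion
    print(f"{t_inf} h infusion: fT>MIC = {ft_above_mic(1000, t_inf, 8.0):.0%}")
```

With these assumed parameters, the 3-hour infusion yields a higher fT>MIC (roughly 53% vs 45%) from the identical dose, which is the mechanism by which prolonged infusions increase drug exposure for the offending pathogen.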


2016 · Vol 29 (3) · pp. 159-163
Author(s): Chagai Grossman, Nathan Keller, Gil Bornstein, Ilan Ben-Zvi, Nira Koren-Morag, ...

2020 · Vol 41 (S1) · pp. s44-s45
Author(s): Sameer Kadri, Yi Ling Lai, Sarah Warner, Jeffrey R. Strich, Ahmed Babiker, ...

Background: Delayed or in vitro inactive empiric antibiotic therapy may be detrimental to survival in patients with bloodstream infections (BSIs). Understanding the landscape of delayed or discordant empiric antibiotic therapy (DDEAT) across different patient, pathogen, and hospital types, as well as by baseline resistance milieu, may enable providers, antimicrobial stewardship programs, and policy makers to optimize empiric prescribing.

Methods: Inpatients with clinically suspected serious infection (based on sampling of blood cultures and receipt of systemic antibiotic therapy on the same or next day) found to have BSI were identified in the Cerner Healthfacts EHR database. Patients were considered to have received DDEAT when, on the culture sampling day, they received either no antibiotic(s) or none that displayed in vitro activity against the pathogenic bloodstream isolate. Antibiotic-resistant phenotypes were defined by in vitro resistance to taxon-specific prototype antibiotics (e.g., methicillin/oxacillin resistance in S. aureus) and were used to estimate the baseline resistance prevalence encountered by the hospital. The probability of DDEAT was examined by bacterial taxon, by time of BSI onset, and by presence versus absence of antibiotic-resistant phenotypes, sepsis or septic shock, hospital type, and baseline resistance.

Results: Of 26,036 assessable patients with a BSI at 131 US hospitals between 2005 and 2014, 14,658 (56%) had sepsis, 3,623 (14%) had septic shock, 5,084 (20%) had antibiotic-resistant phenotypes, and 8,593 (33%) received DDEAT. Of the DDEAT recipients, 4,428 (52%) received no antibiotics on the culture sampling day, whereas the remaining 4,165 (48%) received in vitro discordant therapy. DDEAT occurred most often in S. maltophilia (87%) and E. faecium (80%) BSIs; however, 75% of DDEAT cases and 76% of deaths among DDEAT recipients collectively occurred among patients with S. aureus and Enterobacterales BSIs. For every 8 bacteremic patients presenting with septic shock, 1 patient did not receive any antibiotics on culture day (Fig. 1A). Patients with BSIs of hospital (vs community) onset were twice as likely to receive no antibiotics on culture day, whereas those with bloodstream pathogens displaying antibiotic-resistant (vs susceptible) phenotypes were 3 times as likely to receive in vitro discordant therapy (Fig. 1B). The median proportion of DDEAT ranged from 25% (14%–37%) across eight <300-bed teaching hospitals in the lowest baseline-resistance quartile to 40% (31%–50%) across five ≥300-bed teaching hospitals in the third baseline-resistance quartile (Fig. 2).

Conclusions: Delayed or in vitro discordant empiric antibiotic therapy is common among patients with BSI in US hospitals, regardless of hospital size, teaching status, or local resistance patterns. Prompt empiric antibiotic therapy in septic shock and hospital-onset BSI needs more support. Earlier, reliable detection of S. aureus and Enterobacterales bloodstream pathogens and their resistance patterns with rapid point-of-care diagnostics may mitigate the population-level impact of DDEAT in BSI.

Funding: This study was funded in part by the National Institutes of Health Clinical Center, the National Institute of Allergy and Infectious Diseases, the National Cancer Institute (NCI contract no. HHSN261200800001E), and the Agency for Healthcare Research and Quality.

Disclosures: None
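As described in the Methods, DDEAT is a disjunction of two conditions on the culture sampling day: no antibiotics at all ("delayed"), or antibiotics with no in vitro activity against the isolate ("discordant"). A minimal sketch of that classification, with a hypothetical record structure:

```python
# Sketch of the DDEAT definition as described in the abstract: a patient is
# classified DDEAT if, on the culture sampling day, they received either no
# antibiotics or none with in vitro activity against the bloodstream isolate.
# The record structure and agent names are hypothetical.

def classify_ddeat(antibiotics_on_culture_day, in_vitro_active_agents):
    """Return (ddeat, subtype): subtype is 'delayed' (no antibiotics given),
    'discordant' (antibiotics given, but none active), or 'concordant'."""
    if not antibiotics_on_culture_day:
        return True, "delayed"
    if not set(antibiotics_on_culture_day) & set(in_vitro_active_agents):
        return True, "discordant"
    return False, "concordant"

# E. faecium BSI treated empirically with cefepime (no activity) -> discordant
print(classify_ddeat(["cefepime"], ["vancomycin", "linezolid"]))
# No antibiotics on the culture sampling day -> delayed
print(classify_ddeat([], ["vancomycin"]))
```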

