Timing of antibiotic therapy in the ICU

Critical Care ◽  
2021 ◽  
Vol 25 (1) ◽  
Author(s):  
Marin H. Kollef ◽  
Andrew F. Shorr ◽  
Matteo Bassetti ◽  
Jean-Francois Timsit ◽  
Scott T. Micek ◽  
...  

Abstract Severe or life-threatening infections are common among patients in the intensive care unit (ICU). Most infections in the ICU are bacterial or fungal in origin and require antimicrobial therapy for clinical resolution. Antibiotics are the cornerstone of therapy for infected critically ill patients. However, antibiotics are often not optimally administered, resulting in less favorable patient outcomes, including greater mortality. The timing of antibiotics in patients with life-threatening infections, including sepsis and septic shock, is now recognized as one of the most important determinants of survival for this population. Individuals who experience a delay in the administration of antibiotic therapy for serious infections can have a doubling or more of their mortality. Additionally, the timing of an appropriate antibiotic regimen, one that is active against the offending pathogens based on in vitro susceptibility, also influences survival. Thus, not only is early empiric antibiotic administration important, but the selection of those agents is crucial as well. The duration of antibiotic infusions, especially for β-lactams, can also influence antibiotic efficacy by increasing antimicrobial drug exposure for the offending pathogen. However, due to mounting antibiotic resistance, aggressive antimicrobial de-escalation based on microbiology results is necessary to counterbalance the pressures of early broad-spectrum antibiotic therapy. In this review, we examine time-related variables impacting antibiotic optimization as it relates to the treatment of life-threatening infections in the ICU. In addition to highlighting the importance of antibiotic timing in the ICU, we hope to provide an approach to antimicrobials that also minimizes their unnecessary use. Such approaches will increasingly be linked to advances in molecular microbiology testing and artificial intelligence/machine learning, which should help identify patients needing empiric antibiotic therapy at an earlier time point, as well as the specific antibiotics required, in order to avoid unnecessary administration of broad-spectrum antibiotics.
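The review's point about infusion duration is, at bottom, a pharmacokinetic/pharmacodynamic argument: for β-lactams, efficacy tracks the fraction of the dosing interval during which the drug concentration exceeds the pathogen's MIC (fT>MIC). The sketch below is not from the review; it uses a one-compartment model with assumed, piperacillin-like parameters (dose, clearance, volume, and MIC are all illustrative) simply to show how extending the infusion time of the same dose increases fT>MIC.

```python
import numpy as np

def conc_profile(dose_mg, t_inf_h, tau_h, cl_l_h, v_l, n_doses=6, dt=0.01):
    """Plasma concentration (mg/L) over repeated IV infusions for a one-compartment
    model with first-order elimination, built by linear superposition of doses."""
    k = cl_l_h / v_l                               # elimination rate constant (1/h)
    t = np.arange(0, n_doses * tau_h, dt)
    c = np.zeros_like(t)
    rate = dose_mg / t_inf_h                       # infusion rate (mg/h)
    for d in range(n_doses):
        t0 = d * tau_h
        during = (t >= t0) & (t < t0 + t_inf_h)
        c[during] += rate / cl_l_h * (1 - np.exp(-k * (t[during] - t0)))
        c_end = rate / cl_l_h * (1 - np.exp(-k * t_inf_h))
        after = t >= t0 + t_inf_h
        c[after] += c_end * np.exp(-k * (t[after] - (t0 + t_inf_h)))
    return t, c

def ft_above_mic(t, c, mic, tau_h, n_doses):
    """Fraction of the last dosing interval with concentration above the MIC."""
    last = t >= (n_doses - 1) * tau_h
    return float(np.mean(c[last] > mic))

# Illustrative piperacillin-like parameters: 4 g every 6 h, CL 14 L/h, V 25 L, MIC 16 mg/L
for t_inf, label in [(0.5, "30-min infusion"), (4.0, "4-h extended infusion")]:
    t, c = conc_profile(4000, t_inf, 6, 14, 25)
    print(f"{label}: fT>MIC = {100 * ft_above_mic(t, c, 16, 6, 6):.0f}% of the dosing interval")
```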

2017 ◽  
Vol 4 (suppl_1) ◽  
pp. S13-S14
Author(s):  
Sameer S Kadri ◽  
Yi Ling Lai ◽  
Emily Ricotta ◽  
Jeffrey Strich ◽  
Ahmed Babiker ◽  
...  

Abstract Background Discordance between in vitro susceptibility and empiric antibiotic therapy is inextricably linked to antibiotic resistance and decreased survival in bloodstream infections (BSI). However, its prevalence, patient- and hospital-level risk factors, and impact on outcome in a large cohort and across different pathogens remain unclear. Methods We examined in vitro susceptibility interpretations for bacterial BSI and corresponding antibiotic therapy among inpatient encounters across 156 hospitals from 2000 to 2014 in the Cerner Healthfacts database. Discordance was defined as nonsusceptibility to initial therapy administered from 2 days before pathogen isolation to 1 day before final susceptibility reporting. Discordance prevalence was compared across taxa; risk factors for discordance and its association with in-hospital mortality were evaluated by logistic regression. Adjusted odds ratios (aOR) were estimated for pathogen-, patient-, and facility-level factors. Results Of 33,161 unique encounters with BSIs, 4,219 (13%) at 123 hospitals met criteria for discordant antibiotic therapy, ranging from 3% for pneumococci to 55% for E. faecium. Discordance was higher in recent years (2010–2014 vs. 2005–2009) and was associated with older age, lower baseline SOFA score, urinary (vs. abdominal) source, and hospital-onset BSI, as well as ≥500-bed, Midwestern, non-teaching, and rural hospitals. Discordant antibiotic therapy increased the risk of death (aOR = 1.3 [95% CI 1.1–1.4]). Among Gram-negative taxa, discordant therapy increased the risk of mortality associated with Enterobacteriaceae (aOR = 1.3 [1.0–1.6]) and non-fermenters (aOR = 1.7 [1.1–2.5]). Among Gram-positive taxa, the risk of mortality from discordant therapy was significantly higher for S. aureus (aOR = 1.3 [1.1–1.6]) but unchanged for streptococcal or enterococcal BSIs. Conclusion The prevalence of discordant antibiotic therapy displayed extensive taxon-level variability and was associated with patient and institutional factors. Discordance detrimentally impacted survival in Gram-negative and S. aureus BSIs. Understanding the reasons behind observed differences in discordance risk and their impact on outcomes could inform stewardship efforts and guidelines for empiric therapy in sepsis. Disclosures All authors: No reported disclosures.
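As a rough illustration of the analytic approach described in the Methods (logistic regression yielding adjusted odds ratios for discordant therapy), the sketch below fits such a model on synthetic data. It is not the authors' Cerner Healthfacts pipeline; the covariates, coefficients, and cohort are invented solely to show the mechanics of estimating an aOR with a 95% CI.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000
enc = pd.DataFrame({
    "discordant": rng.integers(0, 2, n),   # 1 = initial therapy not active in vitro (assumed flag)
    "age": rng.normal(62, 15, n),
    "sofa": rng.poisson(4, n),
})
# Synthetic outcome: mortality odds rise with discordance, age, and SOFA (coefficients invented)
logit_p = -4.0 + 0.26 * enc["discordant"] + 0.02 * enc["age"] + 0.15 * enc["sofa"]
enc["died"] = rng.binomial(1, 1.0 / (1.0 + np.exp(-logit_p)))

fit = smf.logit("died ~ discordant + age + sofa", data=enc).fit(disp=False)
aor = np.exp(fit.params["discordant"])
lo, hi = np.exp(fit.conf_int().loc["discordant"])
print(f"aOR for discordant therapy: {aor:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```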


2019 ◽  
Vol 6 (Supplement_2) ◽  
pp. S115-S115
Author(s):  
Brandon J Smith ◽  
Abigail Kois ◽  
Nathan Gartland ◽  
Joseph Tholany ◽  
Ricardo Arbulu

Abstract Background Appropriate empiric antibiotic therapy is associated with decreased mortality and recurrence in patients with Enterobacteriaceae bacteremia (EB). Increasing bacterial resistance adds an additional layer to this complex clinical scenario. Swift utilization of appropriate antibiotics is crucial for improved patient outcomes. However, prolonged and excessively broad antibiotic coverage is not without its own complications. Our study aimed to review the appropriateness of empiric antibiotics for EB. Methods A retrospective chart review was conducted of all patients >18 years of age who were admitted to a single academic community hospital during 2018 with EB at any time during their hospitalization. The primary endpoint was the appropriateness of empiric antibiotic therapy, defined as receiving, prior to the return of antimicrobial sensitivities, active therapy to which the isolate proved susceptible. Appropriateness was further adjusted for standard-of-care (SOC) practices; specifically, carbapenem therapy is preferred for ESBL infections despite in vitro susceptibility to piperacillin/tazobactam and cefepime. Results Our study identified 178 patients with EB. The most common organisms were E. coli (64.6%), K. pneumoniae (11.8%), and P. mirabilis (7.3%). Resistance patterns included 1 CRE (0.57%) and 17 ESBL (9.7%) isolates. The most common sources of infection were urinary (63.5%) and intraabdominal (13.5%). Based on the sensitivity reports of tested isolates, 83.7% of patients received appropriate empiric antibiotics. After adjustment for SOC, 11.8% of ESBL patients (2/17) and 0% of CRE patients (0/1) received appropriate therapy. Comparatively, 89.0% of patients without ESBL or CRE (137/154) received appropriate care (P < 0.0001). Conclusion The results of this study demonstrate that, across our patient population, over 80% of patients received appropriate empiric antibiotics for EB; however, this percentage was dramatically lower for patients with ESBL or CRE infections. This highlights room for improved rapid diagnosis and identification of risk factors predisposing to resistant organisms, thereby decreasing the time to appropriate antibiotic therapy. Disclosures All authors: No reported disclosures.
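A minimal sketch of the appropriateness logic described in the Methods, including the standard-of-care adjustment for ESBL producers, followed by a Fisher's exact test on the counts reported above (2/18 resistant-organism patients vs. 137/154 others). The helper function and drug names are illustrative, not the authors' actual code, and the choice of Fisher's exact test is an assumption.

```python
from scipy.stats import fisher_exact

CARBAPENEMS = {"meropenem", "imipenem", "ertapenem"}

def empiric_appropriate(phenotype, empiric_agents, susceptible_to):
    """Appropriate = at least one empiric agent to which the isolate tested susceptible,
    with the standard-of-care override that ESBL producers count as appropriately
    treated only if that agent is a carbapenem (even if pip/tazo or cefepime test susceptible)."""
    active = [a for a in empiric_agents if a in susceptible_to]
    if phenotype == "ESBL":
        active = [a for a in active if a in CARBAPENEMS]
    return bool(active)

print(empiric_appropriate("ESBL", ["piperacillin/tazobactam"], {"piperacillin/tazobactam"}))  # False
print(empiric_appropriate("ESBL", ["meropenem"], {"meropenem"}))                              # True

# 2x2 table (appropriate vs. not) for ESBL/CRE isolates vs. all others, from the reported counts
table = [[2, 16],     # ESBL/CRE: 2 of 18 appropriate
         [137, 17]]   # non-ESBL/CRE: 137 of 154 appropriate
print(fisher_exact(table))  # odds ratio and p-value
```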


2012 ◽  
Vol 33 (4) ◽  
pp. 416-420 ◽  
Author(s):  
Megan E. Davis ◽  
Deverick J. Anderson ◽  
Michelle Sharpe ◽  
Luke F. Chen ◽  
Richard H. Drew

This study aimed to determine the feasibility of using the likelihood of inadequate therapy (LIT), a parameter calculated from pathogen frequency and in vitro susceptibility, for determining appropriate empiric antibiotic therapy for primary bloodstream infections. Our study demonstrates that LIT may reveal differences compared with traditional antibiograms.
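The abstract does not give the LIT formula; one plausible reading is that, for a candidate empiric regimen, LIT is the pathogen-frequency-weighted probability that no drug in the regimen is active. The sketch below implements that interpretation on an invented antibiogram purely for illustration; the frequencies, susceptibility rates, and drug choices are assumptions, not data from the study.

```python
# Illustrative unit-level inputs (NOT from the study): frequency of each pathogen among
# primary bloodstream infections and the proportion susceptible to each candidate agent.
frequency = {"E. coli": 0.30, "S. aureus": 0.25, "K. pneumoniae": 0.15,
             "P. aeruginosa": 0.10, "E. faecium": 0.20}
susceptibility = {
    "cefepime":   {"E. coli": 0.92, "S. aureus": 0.55, "K. pneumoniae": 0.90,
                   "P. aeruginosa": 0.85, "E. faecium": 0.00},
    "vancomycin": {"E. coli": 0.00, "S. aureus": 1.00, "K. pneumoniae": 0.00,
                   "P. aeruginosa": 0.00, "E. faecium": 0.70},
}

def lit(regimen):
    """One plausible reading of LIT: sum over pathogens of
    frequency x P(no drug in the regimen is active against that pathogen)."""
    total = 0.0
    for bug, freq in frequency.items():
        p_uncovered = 1.0
        for drug in regimen:
            p_uncovered *= 1.0 - susceptibility[drug][bug]
        total += freq * p_uncovered
    return total

for regimen in [("cefepime",), ("vancomycin",), ("cefepime", "vancomycin")]:
    print(regimen, round(lit(regimen), 3))
```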


2016 ◽  
Vol 17 (2) ◽  
pp. 210-216 ◽  
Author(s):  
Taku Oshima ◽  
Yoshiyuki Kodama ◽  
Waka Takahashi ◽  
Yosuke Hayashi ◽  
Shinya Iwase ◽  
...  

2012 ◽  
Vol 4 (1) ◽  
pp. 2 ◽  
Author(s):  
Desiree Caselli ◽  
Olivia Paolicchi

Improved outcome in the treatment of childhood cancer results not only from more aggressive and tailored cancer-directed therapy, but also from improved supportive therapy and treatment of life-threatening infectious complications. Prompt and aggressive intervention with empiric antibiotics has reduced mortality in this group of patients. Physical examination, blood tests, and blood cultures must be performed, and antibiotic therapy must be administered as soon as possible. Beta-lactam monotherapy, such as piperacillin-tazobactam or cefepime, may be an appropriate empiric therapy of choice for all clinically stable patients with neutropenic fever. An anti-pseudomonal beta-lactam antibiotic plus gentamicin is recommended for patients with systemic compromise.


2013 ◽  
Vol 7 (05) ◽  
pp. 424-431 ◽  
Author(s):  
Nazif Elaldi ◽  
Mustafa Gokhan Gozel ◽  
Fetiye Kolayli ◽  
Aynur Engin ◽  
Cem Celik ◽  
...  

In this report, a case of community-acquired acute bacterial meningitis (CA-ABM) caused by CTX-M-15-producing Escherichia coli in an elderly male patient is presented in light of the literature. Cultures of cerebrospinal fluid, blood, ear discharge, and stool samples yielded CTX-M-15-producing E. coli, which was resistant in vitro to the extended-spectrum cephalosporins and ciprofloxacin and susceptible to imipenem, meropenem, and amikacin. Meningitis was treated with parenteral meropenem plus parenteral and intraventricular amikacin. Since bacterial meningitis is a life-threatening infection, empiric antibiotic therapy with a carbapenem can be started before culture results are obtained, mainly in areas where the ESBL epidemiology is well known.


2020 ◽  
Vol 41 (S1) ◽  
pp. s44-s45
Author(s):  
Sameer Kadri ◽  
Yi Ling Lai ◽  
Sarah Warner ◽  
Jeffrey R. Strich ◽  
Ahmed Babiker ◽  
...  

Background: Delayed or in vitro inactive empiric antibiotic therapy may be detrimental to survival in patients with bloodstream infections (BSIs). Understanding the landscape of delayed or discordant empiric antibiotic therapy (DDEAT) across different patient, pathogen, and hospital types, as well as by their baseline resistance milieu, may enable providers, antimicrobial stewardship programs, and policy makers to optimize empiric prescribing. Methods: Inpatients with clinically suspected serious infection (based on sampling of blood cultures and receiving systemic antibiotic therapy on the same or next day) found to have BSI were identified in the Cerner Healthfacts EHR database. Patients were considered to have received DDEAT when, on the culture sampling day, they received either no antibiotic(s) or none that displayed in vitro activity against the pathogenic bloodstream isolate. Antibiotic-resistant phenotypes were defined by in vitro resistance to taxon-specific prototype antibiotics (e.g., methicillin/oxacillin resistance in S. aureus) and were used to estimate the baseline resistance prevalence encountered by the hospital. The probability of DDEAT was examined by bacterial taxon, by time of BSI onset, and by presence versus absence of antibiotic-resistance phenotypes, sepsis or septic shock, hospital type, and baseline resistance. Results: Of 26,036 assessable patients with a BSI at 131 US hospitals between 2005 and 2014, 14,658 (56%) had sepsis, 3,623 (14%) had septic shock, 5,084 (20%) had antibiotic-resistant phenotypes, and 8,593 (33%) received DDEAT. Also, 4,428 (52%) recipients of DDEAT received no antibiotics on the culture sampling day, whereas the remaining 4,165 (48%) received in vitro discordant therapy. DDEAT occurred most often in S. maltophilia (87%) and E. faecium (80%) BSIs; however, 75% of DDEAT cases and 76% of deaths among recipients of DDEAT collectively occurred among patients with S. aureus and Enterobacteriales BSIs. For every 8 bacteremic patients presenting with septic shock, 1 patient did not receive any antibiotics on culture day (Fig. 1A). Patients with BSIs of hospital (vs community) onset were twice as likely to receive no antibiotics on culture day, whereas those with bloodstream pathogens displaying antibiotic-resistant (vs susceptible) phenotypes were 3 times as likely to receive in vitro discordant therapy (Fig. 1B). The median proportion of DDEAT ranged between 25% (14%–37%) in eight <300-bed teaching hospitals in the lowest baseline resistance quartile and 40% (31%–50%) at five ≥300-bed teaching hospitals in the third baseline resistance quartile (Fig. 2). Conclusions: Delayed or in vitro discordant empiric antibiotic therapy is common among patients with BSI in US hospitals regardless of hospital size, teaching status, or local resistance patterns. Prompt empiric antibiotic therapy in septic shock and hospital-onset BSI needs more support. Reliable detection of S. aureus and Enterobacteriales bloodstream pathogens and their resistance patterns earlier with rapid point-of-care diagnostics may mitigate the population-level impact of DDEAT in BSI. Funding: This study was funded in part by the National Institutes of Health Clinical Center, the National Institute of Allergy and Infectious Diseases, the National Cancer Institute (NCI contract no. HHSN261200800001E), and the Agency for Healthcare Research and Quality. Disclosures: None
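The DDEAT definition in the Methods reduces to a simple per-encounter classification on the culture-sampling day. The helper below mirrors that definition as stated; the drug names and susceptibility calls in the examples are illustrative only.

```python
def classify_empiric_therapy(drugs_given_on_culture_day, drugs_isolate_susceptible_to):
    """Classify an encounter on the culture-sampling day, mirroring the DDEAT definition:
       'delayed'    - no systemic antibiotic given that day
       'discordant' - antibiotics given, but none with in vitro activity against the isolate
       'concordant' - at least one in vitro-active antibiotic given"""
    if not drugs_given_on_culture_day:
        return "delayed"
    if not set(drugs_given_on_culture_day) & set(drugs_isolate_susceptible_to):
        return "discordant"
    return "concordant"

# Toy encounters (drug names and susceptibility calls are illustrative)
print(classify_empiric_therapy([], ["cefepime"]))                          # delayed
print(classify_empiric_therapy(["cefazolin"], ["meropenem"]))              # discordant
print(classify_empiric_therapy(["vancomycin", "cefepime"], ["cefepime"]))  # concordant
```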


Blood ◽  
2010 ◽  
Vol 116 (21) ◽  
pp. 2773-2773
Author(s):  
Chiara Cattaneo ◽  
Erika Borlenghi ◽  
Francesca Bracchi ◽  
Liana Signorini ◽  
Alessandro Re ◽  
...  

Abstract 2773 Introduction. Infections during chemotherapy-induced aplasia are still a problem in the management of acute leukemia (AL) patients (pts), causing potentially life-threatening consequences and treatment delays. Adequate empiric antibiotic therapy is crucial for a favourable clinical evolution. In order to better define the best antimicrobial management for AL pts during different phases of treatment, we analyzed all infectious events occurring in consecutively treated AL pts at our Institute over a period of six years. Patients and Methods. Since June 2004 a program of active epidemiological surveillance has been ongoing at our Institute. Data concerning infections occurring during chemotherapy-induced cytopenia in AL pts were analysed. All pts showing fever or signs/symptoms of infection underwent thorax X-ray and culture of any other fluid/drainage obtained from a suspected infection site. CT scan of the thorax was performed when fever persisted >48 h. An infection was considered clinically documented (CDI) when pertinent symptoms, objective signs, or diagnostic radiological findings were present, and microbiologically documented (MDI) when microorganisms were isolated. Results. From June 2004 to May 2010, 210 cases of AL (154 acute myeloid leukemia [AML], 53 acute lymphoblastic leukemia [ALL], and 3 blastic plasmacytoid dendritic cell leukemia) were diagnosed and treated with at least one induction cycle followed by consolidation, and with reinduction cycles in relapsing/refractory pts. Overall, 1014 chemotherapy cycles were delivered, subdivided into induction (I) (210), consolidation (C) (708), and salvage (S) (96) treatment. Overall, 309 clinically documented infections (CDI) were observed (30.5%). The incidence of CDI was higher during S therapy in comparison with I or C (77.1% vs 41.9% and 20.7%, p<0.0001). The incidence of pneumonia was similar in the S and I phases (18.7% vs 17.6%) and significantly higher than in C (1.8%, p<0.0001). The incidence of bloodstream infections (BSI) was similar during the I and C phases (20% and 15%, p=0.09) and significantly lower than in S (54.2%, p<0.0001). MDI were diagnosed in 270/1014 cycles (26.6%). Isolates were Gram-negative (G-) in 54.8%, Gram-positive (G+) in 32.6%, and fungi (F, moulds only) in 2.6% of cases; in 27 cases (10%) a mixed infection was documented. The frequency of fungal infections was higher during I therapy (6.9%) than in C+S (1%, p=0.016). The epidemiological distribution of G+ and G- infections during the different phases was similar, with the exception of a lower frequency of G- during I (41.7%) vs C+S (59.6%, p=0.012). Mixed infections were more frequent during I (16.7%) than C+S (7.6%, p=0.038). Overall, 297 pathogens were isolated. S. aureus (9/270, 3.3%) and coagulase-negative staphylococci (43/270, 15.9%) were more frequent during I than C+S (respectively: 6.9% vs 2%, p=0.059 and 25% vs 12.6%, p=0.02); E. coli infections (92/270, 34.1%) were predominant during C (52.5%) in comparison with I+S (25.7%, p=0.004). Enterococci (30/270, 11.1%) and P. aeruginosa infections (52/270, 19.6%) were uniformly distributed across the different phases. Death occurred in 19 cases (6 and 13 during I and S, respectively). At univariate analysis, the S phase (p<0.0001) and P. aeruginosa and S. aureus, alone or in association with other pathogens, emerged as poor prognostic factors (p=0.002 and 0.016, respectively). Two of the 7 cases of probable aspergillosis died during I. Conclusions. The S phase has the highest infectious risk, particularly for BSI. Both prophylactic and empiric antibiotic therapy guided by epidemiological data seem warranted. In the I phase pneumonia, particularly of mycotic origin, is relatively more frequent, confirming the appropriateness of effective antifungal prophylaxis. The C phase carries a very limited risk of life-threatening infections and a relatively high incidence of E. coli. Therefore, the need for antimicrobial and antifungal prophylaxis during C may be reconsidered. Overall, the frequency of bacterial infections largely outweighs that of fungal infections and is responsible for 84% of infectious deaths. Among bacteria, P. aeruginosa ranks as the second most frequent microorganism after E. coli and carries the highest risk of death. Given its intrinsic ability to develop antibiotic resistance, it should presently be considered the most threatening infectious agent in AL, against which empiric antibiotic therapy should be tailored. Disclosures: No relevant conflicts of interest to declare.
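As an illustration of the kind of comparison reported in the Results (e.g., CDI incidence differing across induction, consolidation, and salvage cycles at p < 0.0001), the snippet below runs a chi-square test on counts back-calculated from the reported rates (41.9% of 210, 20.7% of 708, and 77.1% of 96 cycles). The choice of test is an assumption; the abstract does not state which test the authors used.

```python
from scipy.stats import chi2_contingency

# CDI vs. no-CDI counts per treatment phase, back-calculated from the reported rates
table = [
    [88, 210 - 88],    # induction: 41.9% of 210 cycles
    [147, 708 - 147],  # consolidation: 20.7% of 708 cycles
    [74, 96 - 74],     # salvage: 77.1% of 96 cycles
]
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, dof = {dof}, p = {p:.2e}")
```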


2020 ◽  
Vol 7 (Supplement_1) ◽  
pp. S778-S778
Author(s):  
Jessica L Seadler ◽  
Natalie Tucker ◽  
Beth Cady ◽  
Praveen Mullangi

Abstract Background Diabetic foot infections (DFI) are a potentially devastating complication for patients with diabetes. When treating these patients, there is a need for selection of highly effective antibiotics coupled with a need to avoid excessive use of broad-spectrum antimicrobial agents that could lead to adverse patient outcomes. At this institution, it has been observed that there is a lack of compliance with consensus guideline recommendations for the choice of empiric antibiotic therapy for DFI, leading to overuse of broad-spectrum antibiotics. Methods A retrospective chart review was performed for hospitalized patients over 18 years of age that received antibiotics for DFI during the period of August 1, 2018 to July 31, 2019. Patients were excluded if they were continuing outpatient antibiotics for an existing DFI, were being treated with antibiotics for a concurrent infection, or were pregnant. The primary objective was the rate of guideline-compliant empiric antibiotic regimens as broken down by infection severity. Secondary objectives included the duration of antibiotic therapy per patient, and rates of empiric methicillin-resistant Staphylococcus aureus (MRSA) and P. aeruginosa coverage. Results A total of 114 patients were included in the analysis. A majority of patients had an infection of moderate severity (65.8%), followed by 19.3% with severe infections, and 14.9% with mild infections. In the total population, only 26.3% of patients received empiric antibiotic regimens that were guideline-compliant. A large percentage of patients received empiric anti-MRSA antibiotics (95.6%) and empiric anti-pseudomonal agents (89.5%). Use of these broad-spectrum agents did not differ by infection severity. Ninety-nine (86.8%) patients had a site culture collected. S. aureus was the most commonly isolated organism and there was a low rate of P. aeruginosa (10.1%). Conclusion There is room for improvement in the management of DFI at this institution. A specific area that has been identified is the overuse of empiric anti-pseudomonal agents in patients without risk factors for P. aeruginosa. The results of this study will be evaluated alongside consensus guidelines and used to create institution-specific treatment guidance that providers can employ to optimize the management of DFI. Disclosures All Authors: No reported disclosures


2021 ◽  
Vol 15 (09) ◽  
pp. 1346-1350
Author(s):  
Anandhalakshmi Subramanian ◽  
Sandhya Bhat ◽  
Sudhagar Mookkappan ◽  
Patricia Anitha ◽  
Ravichandran Kandasamy ◽  
...  

Introduction: Urosepsis is life-threatening unless treated immediately. Empirical treatment with appropriate antibiotics lowers the risk of a poor outcome. However, with increasing resistance among common uropathogens, there is a need for continuous review of the existing protocol to determine whether there is a correlation between empirical antibiotic therapy and the in-vitro susceptibility patterns of the pathogens causing urosepsis. Methodology: A prospective study was carried out on 66 confirmed cases of urosepsis from January 2017 to December 2018 after obtaining ethical clearance. Demographic details, risk factors, length of hospital stay, bacteriological profile, empirical antibiotic given, change in antibiotic following the susceptibility report, and outcome were recorded. Results: Among the 66 urosepsis cases, 63 were started on an empiric antibiotic. The correlation between the empirical antibiotic given and the in-vitro antimicrobial susceptibility was found to be significant, with a p value < 0.0001. Among the 63 patients for whom empiric antibiotics were started, further escalation of antibiotics was done in 46; the remaining 20% of cases were changed over to a different antibiotic, in line with the susceptibility report. The mortality rate was 15.1% (CI = 15 ± 3.5). The association between the risk factors for urosepsis and their effect on the mortality rate was analyzed. Diabetes mellitus and chronic kidney disease were identified as important independent risk factors and had a direct influence on the mortality rate, with significant p values of 0.0281 and 0.0015, respectively. Conclusions: A significant correlation was identified between the empirical antibiotic given and the in-vitro antibiotic susceptibility pattern.
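To illustrate how a risk factor such as diabetes mellitus could be tested for association with mortality in a cohort of this size, the snippet below applies Fisher's exact test to a hypothetical 2x2 table. The marginal totals match the cohort (66 patients, 10 deaths ≈ 15.1%), but the cross-tabulation is invented; the study does not report these counts, nor the exact test used.

```python
from scipy.stats import fisher_exact

# Hypothetical 2x2 counts (NOT the study's data): diabetes status vs. in-hospital mortality
#            died  survived
table = [[8, 22],   # diabetes mellitus present
         [2, 34]]   # diabetes mellitus absent
print(fisher_exact(table))  # odds ratio and p-value
```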

