Impact of Initial Antimicrobial Therapy in Patients with Bloodstream Infections Caused by Stenotrophomonas maltophilia

2005 ◽  
Vol 49 (9) ◽  
pp. 3980-3981 ◽  
Author(s):  
Gokhan Metan ◽  
Omrum Uzun


2008 ◽
Vol 52 (9) ◽  
pp. 3244-3252 ◽  
Author(s):  
Mario Tumbarello ◽  
Michela Sali ◽  
Enrico Maria Trecarichi ◽  
Fiammetta Leone ◽  
Marianna Rossi ◽  
...  

ABSTRACT Extended-spectrum-β-lactamase (ESBL)-producing strains of Escherichia coli are a significant cause of bloodstream infections (BSI) in hospitalized and nonhospitalized patients. We previously showed that delaying effective antimicrobial therapy in BSI caused by ESBL producers significantly increases mortality. The aim of this retrospective 7-year analysis was to identify risk factors for inadequate initial antimicrobial therapy (IIAT) (i.e., empirical treatment based on a drug to which the isolate had displayed in vitro resistance) for inpatients with BSI caused by ESBL-producing E. coli. Of the 129 patients considered, 56 (43.4%) received IIAT for 48 to 120 h (mean, 72 h). Independent risk factors for IIAT included an unknown BSI source (odds ratio [OR], 4.86; 95% confidence interval [CI], 1.98 to 11.91; P = 0.001), isolate coresistance to ≥3 antimicrobials (OR, 3.73; 95% CI, 1.58 to 8.83; P = 0.003), hospitalization during the 12 months preceding BSI onset (OR, 3.33; 95% CI, 1.42 to 7.79; P = 0.005), and antimicrobial therapy during the 3 months preceding BSI onset (OR, 2.65; 95% CI, 1.11 to 6.29; P = 0.02). IIAT was the strongest risk factor for 21-day mortality and significantly increased the length of hospitalization after BSI onset. Our results underscore the need for a systematic approach to the management of patients with serious infections caused by ESBL-producing E. coli. Such an approach should be based on sound, updated knowledge of local infectious-disease epidemiology, detailed analysis of the patient's history with emphasis on recent contact with the health care system, and aggressive attempts to identify the infectious focus that has given rise to the BSI.
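The risk-factor estimates above are adjusted odds ratios with Wald confidence intervals. As a point of reference, here is a minimal Python sketch of the underlying arithmetic for a single unadjusted odds ratio from a 2x2 exposure/outcome table; the counts are hypothetical placeholders, not the study's data.

```python
# Odds ratio with a Wald 95% CI from a 2x2 table: a minimal sketch.
# Counts below are hypothetical, for illustration only.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a = exposed with outcome, b = exposed without outcome,
    c = unexposed with outcome, d = unexposed without outcome."""
    or_value = (a * d) / (b * c)
    se_log_or = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lower = math.exp(math.log(or_value) - z * se_log_or)
    upper = math.exp(math.log(or_value) + z * se_log_or)
    return or_value, lower, upper

# Hypothetical counts: IIAT vs. adequate therapy, unknown vs. known BSI source
or_value, lower, upper = odds_ratio_ci(30, 26, 20, 53)
print(f"OR = {or_value:.2f} (95% CI {lower:.2f} to {upper:.2f})")
```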


2007 ◽  
Vol 51 (6) ◽  
pp. 1987-1994 ◽  
Author(s):  
Mario Tumbarello ◽  
Maurizio Sanguinetti ◽  
Eva Montuori ◽  
Enrico M. Trecarichi ◽  
Brunella Posteraro ◽  
...  

ABSTRACT Bloodstream infections (BSI) caused by extended-spectrum β-lactamase (ESBL)-producing organisms markedly increase the rates of treatment failure and death. We conducted a retrospective cohort analysis to identify risk factors for mortality in adult inpatients with BSI caused by ESBL-producing Enterobacteriaceae (ESBL-BSI). Particular attention was focused on defining the impact of inadequate initial antimicrobial therapy (defined as the initiation of treatment with active antimicrobial agents >72 h after collection of the first positive blood culture) on mortality. A total of 186 patients with ESBL-BSI caused by Escherichia coli (n = 104), Klebsiella pneumoniae (n = 58), or Proteus mirabilis (n = 24) were identified by our microbiology laboratory from 1 January 1999 through 31 December 2004. The overall 21-day mortality rate was 38.2% (71 of 186). In multivariate analysis, significant predictors of mortality were inadequate initial antimicrobial therapy (odds ratio [OR] = 6.28; 95% confidence interval [CI] = 3.18 to 12.42; P < 0.001) and an unidentified primary infection site (OR = 2.69; 95% CI = 1.38 to 5.27; P = 0.004). The inadequately treated patients (89 of 186 [47.8%]) had a threefold increase in mortality compared to the adequately treated group (59.5% versus 18.5%; OR = 2.38; 95% CI = 1.76 to 3.22; P < 0.001). The regimens most commonly classified as inadequate were based on oxyimino cephalosporin or fluoroquinolone therapy. Prompt initiation of effective antimicrobial treatment is essential in patients with ESBL-BSI, and empirical decisions must be based on a sound knowledge of the local distribution of pathogens and their susceptibility patterns.
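The multivariate analysis here is a logistic regression whose exponentiated coefficients are the reported adjusted odds ratios. Below is a minimal sketch of that workflow using synthetic data and statsmodels; the predictor names, effect sizes, and data are assumptions for illustration, not the study's dataset.

```python
# Multivariate logistic regression for mortality predictors: a sketch
# on synthetic data (not the study's cohort).
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 186
X = np.column_stack([
    rng.integers(0, 2, n),   # inadequate initial therapy (0/1), hypothetical
    rng.integers(0, 2, n),   # unidentified primary infection site (0/1), hypothetical
])
# Simulate 21-day mortality from an assumed logistic model
logit = 0.5 * X[:, 0] + 0.3 * X[:, 1] - 1.0
y = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

model = sm.Logit(y, sm.add_constant(X)).fit(disp=0)
print(np.exp(model.params[1:]))       # adjusted odds ratios
print(np.exp(model.conf_int()[1:]))   # 95% confidence intervals
```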


2017 ◽  
Vol 61 (9) ◽  
Author(s):  
P. B. Bookstaver ◽  
E. B. Nimmich ◽  
T. J. Smith ◽  
J. A. Justo ◽  
J. Kohn ◽  
...  

ABSTRACT The use of rapid diagnostic tests (RDTs) enhances antimicrobial stewardship program (ASP) interventions in the optimization of antimicrobial therapy. This quasi-experimental cohort study evaluated the combined impact of an ASP/RDT bundle on the appropriateness of empirical antimicrobial therapy (EAT) and time to de-escalation of broad-spectrum antimicrobial agents (BSAA) in Gram-negative bloodstream infections (GNBSI). The ASP/RDT bundle consisted of system-wide GNBSI treatment guidelines, prospective stewardship monitoring, and sequential introduction of two RDTs, matrix-assisted laser desorption ionization–time of flight mass spectrometry (MALDI-TOF MS) and the FilmArray blood culture identification (BCID) panel. The preintervention period ran from January 2010 through December 2013, and the postintervention period from January 2014 through June 2015. The postintervention period comprised two phases: phase 1 followed the introduction of MALDI-TOF MS, and phase 2 followed the introduction of the FilmArray BCID panel. The interventions resulted in significantly improved appropriateness of EAT (95% versus 91%; P = 0.02). Significant reductions in median time to de-escalation from combination antimicrobial therapy (2.8 versus 1.5 days), antipseudomonal beta-lactams (4.0 versus 2.5 days), and carbapenems (4.0 versus 2.5 days) were observed in the postintervention period compared to the preintervention period (P < 0.001 for all). The reduction in median time to de-escalation from combination therapy (1.0 versus 2.0 days; P = 0.03) and antipseudomonal beta-lactams (2.2 versus 2.7 days; P = 0.04) was further augmented during phase 2 compared to phase 1 of the postintervention period. Implementation of an antimicrobial stewardship program and RDT intervention bundle in a multihospital health care system is associated with improved appropriateness of EAT for GNBSI and decreased utilization of BSAA through early de-escalation.
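Median time-to-de-escalation comparisons of this kind are typically tested nonparametrically, since the data are right-skewed. Here is a minimal sketch using a Mann-Whitney U test; the abstract does not state which test the authors used, and the sample values below are made up for illustration.

```python
# Comparing median time to de-escalation pre vs. post intervention
# with a Mann-Whitney U test: a sketch on hypothetical data.
from scipy.stats import mannwhitneyu
import statistics

pre = [4.0, 3.5, 2.8, 5.1, 4.4, 3.9, 2.6, 4.8]    # days, hypothetical
post = [1.5, 2.0, 1.2, 2.6, 1.8, 1.4, 2.2, 1.1]   # days, hypothetical

stat, p = mannwhitneyu(pre, post, alternative="two-sided")
print(f"median pre = {statistics.median(pre):.1f} d, "
      f"median post = {statistics.median(post):.1f} d, p = {p:.4f}")
```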


2020 ◽  
Vol 7 (Supplement_1) ◽  
pp. S47-S47 ◽
Author(s):  
Bryant M Froberg ◽  
Nicholas Torney

Abstract Background As many as 1 in 3 patients with bloodstream infections at community hospitals receive inappropriate empiric antimicrobial therapy. Studies have shown that coupling real-time intervention with rapid pathogen identification improves patient outcomes and decreases health-system costs at large, tertiary academic centers. The aim of this study was to assess whether similar outcomes could be obtained by adding real-time pharmacist intervention to rapid pathogen identification at two smaller, rural community hospitals. Methods This was a pre-post implementation study conducted from September 2019 to March 2020. The study included patients ≥18 years of age admitted with one positive blood culture. Patients were excluded if they were pregnant, had a polymicrobial blood culture, had a known culture prior to admission, had hospice consulted prior to admission, expired prior to the positive blood culture, or were transferred to another hospital within 24 hours of a positive blood culture. Endpoints for patients treated before implementation were compared with those for patients treated after implementation. The primary endpoint was time to optimal antimicrobial therapy. Secondary endpoints included time to effective antimicrobial therapy, in-hospital mortality, length of hospital stay, and overall cost of hospitalization. Results Of 212 patients screened, 88 were included, with 44 in each group. The groups were similar in terms of comorbidities, infection source, and causative organism. No significant difference was seen in mean time to optimal antimicrobial therapy (27.3±35.5 hr vs 19.4±30 hr, p=0.265). Patients in the post-implementation group had a significantly higher mean hospitalization cost ($24,638.87±$11,080.91 vs $32,722.07±$13,076.73, p=0.013). There was no significant difference in time to effective antimicrobial therapy, in-hospital mortality, or length of hospital stay. Conclusion There were no between-group differences in the primary outcome of time to optimal therapy, and mean hospitalization cost was higher after implementation. These results suggest that further antimicrobial stewardship interventions are needed, along with larger studies conducted in community hospital settings. Disclosures All Authors: No reported disclosures
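The primary-endpoint comparison of means with standard deviations reads like a two-sample t-test; the sketch below uses Welch's variant, which does not assume equal variances. The samples are synthetic stand-ins drawn from the reported means and SDs, not the study's measurements.

```python
# Welch's two-sample t-test on time to optimal therapy: a sketch
# using synthetic samples parameterized by the reported means/SDs.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(1)
pre = rng.normal(loc=27.3, scale=35.5, size=44).clip(min=0)   # hours, synthetic
post = rng.normal(loc=19.4, scale=30.0, size=44).clip(min=0)  # hours, synthetic

stat, p = ttest_ind(pre, post, equal_var=False)  # Welch's t-test
print(f"t = {stat:.2f}, p = {p:.3f}")
```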


2006 ◽  
Vol 50 (10) ◽  
pp. 3355-3360 ◽  
Author(s):  
Kimberly K. Scarsi ◽  
Joe M. Feinglass ◽  
Marc H. Scheetz ◽  
Michael J. Postelnick ◽  
Maureen K. Bolon ◽  
...  

ABSTRACT The consequences of inactive empiric antimicrobial therapy are not well described; such therapy may prolong hospitalization or contribute to infection-related mortality. In vitro susceptibility results for 884 patients hospitalized at an academic medical center with gram-negative bloodstream infections (GNBI) from 2001 to 2003 were matched to antimicrobial orders within 24 h of culture. Clinical characteristics, organism, inpatient mortality, and length of stay after culture were compared between patients receiving active versus inactive empiric antimicrobial therapy. A total of 14.1% of patients with GNBI received inactive empiric therapy, defined as receiving no antimicrobial agent active against the identified organism, based on in vitro microbiology reports, within 24 h of the culture. Patients who received inactive therapy were more likely to be younger, to be infected with Pseudomonas aeruginosa, to have a nosocomial infection, and to receive antimicrobial monotherapy but less likely to be bacteremic with Escherichia coli or to have sepsis (P < 0.05). There were no significant differences in mortality between patients receiving active versus inactive empiric therapy (16.1% versus 13.6%, respectively) or in length of stay after positive culture (11.5 days versus 12.6 days, respectively). Only 45 patients had greater than 2 days of exposure to inactive therapy; however, 8 of 30 patients (26.7%) who never received active antimicrobial therapy died while in the hospital. Inactive empiric therapy was more common in healthier patients. Inactive antimicrobial therapy in the first 24 h did not significantly affect average outcomes for GNBI among hospitalized patients but may have harmed specific individuals.
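The active-versus-inactive mortality comparison is a test of proportions on a 2x2 table. A minimal sketch with a chi-square test follows; the counts are back-calculated approximately from the percentages above and should be treated as illustrative, not the study's exact table.

```python
# Chi-square test of mortality by empiric-therapy activity: a sketch
# on counts approximated from the abstract's percentages.
from scipy.stats import chi2_contingency

inactive_n = round(884 * 0.141)        # ~125 patients on inactive therapy
active_n = 884 - inactive_n            # remainder on active therapy

# Rows: active vs. inactive therapy; columns: died vs. survived
table = [
    [round(active_n * 0.161), active_n - round(active_n * 0.161)],
    [round(inactive_n * 0.136), inactive_n - round(inactive_n * 0.136)],
]
chi2, p, dof, _ = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, p = {p:.3f}")
```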


2019 ◽  
Vol 6 (Supplement_2) ◽  
pp. S735-S736 ◽
Author(s):  
Kimberly C Claeys ◽  
Teri Hopkins ◽  
Zegbeh Kpadeh-Rogers ◽  
Yunyun Jiang ◽  
Scott R Evans ◽  
...  

Abstract Background Rapid diagnostic tests (RDTs) for bloodstream infections (BSIs) are increasingly common. Deciding which RDT to implement remains a clinical challenge given the diversity of organisms and resistance mechanisms detected by different platforms. The Desirability of Outcome Ranking Management of Antimicrobial Therapy (DOOR-MAT) has been proposed as a framework for comparing RDT platforms, but reports of its clinical application are lacking. This study compared potential antibiotic decisions based on the results of two different RDTs for BSI using DOOR-MAT. Methods Retrospective study at the University of Maryland Medical Center from August 2018 to April 2019 comparing Verigene® BC (VBC) to GenMark Dx ePlex® BCID for clinical blood cultures. VBC was part of standard of care; ePlex was run on discarded fresh or frozen blood samples. In this theoretical analysis, RDT results and local susceptibility data were used by two Infectious Diseases pharmacists, working in a blinded manner, to make antibiotic selection decisions. Cohen's Kappa statistic summarized overall agreement. DOOR-MAT, a partial credit scoring system, was applied to the decisions based on final organism/susceptibility results (Figure 1). Scores were averaged between reviewers, and mean scores were compared between RDT systems using the t-test. Additionally, a sensitivity analysis with varied point assignment among Gram-negatives (AmpC producers) was conducted. Results 110 clinical isolates were included: 41 Gram-negative and 69 Gram-positive organisms. Overall agreement was 82% for VBC and 83% for ePlex. The average score for VBC was 86.1 (SD 31.3) compared with 92.9 (SD 22.9) for ePlex, P = 0.004. Among Gram-negatives, the average score for VBC was 79.9 (SD 32.1) compared with 88.1 (SD 28.8) for ePlex, P = 0.032. Among Gram-positives, the average score for VBC was 89.9 (SD 30.4) compared with 95.8 (SD 18.3) for ePlex, P = 0.048. The sensitivity analysis yielded an average score of 89.9 (SD 30.4) for VBC compared with 95.8 (SD 18.3) for ePlex, P = 0.27. Conclusion The use of a partial credit scoring system such as DOOR-MAT allows comparisons between RDT systems that go beyond sensitivity and specificity, enhancing clinical interpretation. In this theoretical comparison, the GenMark ePlex BCID scored higher for both Gram-positive and Gram-negative organisms. Disclosures All authors: No reported disclosures.
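DOOR-MAT assigns partial credit to each therapy decision rather than a binary right/wrong grade, which is what lets it separate two platforms with similar raw agreement. Below is a minimal sketch of that idea with an invented scoring band (the study's actual scoring matrix, its Figure 1, is not reproduced here) and a paired t-test on hypothetical per-isolate scores.

```python
# DOOR-MAT-style partial credit scoring: a minimal sketch. The score
# bands and per-isolate grades below are invented placeholders.
from scipy.stats import ttest_rel

# Invented partial-credit bands, standing in for the study's matrix
BANDS = {
    "active_and_appropriately_narrow": 100,
    "active_but_unnecessarily_broad": 75,
    "active_but_suboptimal": 50,
    "inactive": 0,
}

# Hypothetical grades for the same isolates under each RDT platform
vbc_grades = ["active_and_appropriately_narrow", "active_but_unnecessarily_broad",
              "inactive", "active_and_appropriately_narrow"]
eplex_grades = ["active_and_appropriately_narrow", "active_and_appropriately_narrow",
                "active_but_unnecessarily_broad", "active_and_appropriately_narrow"]

vbc_scores = [BANDS[g] for g in vbc_grades]
eplex_scores = [BANDS[g] for g in eplex_grades]

stat, p = ttest_rel(vbc_scores, eplex_scores)  # paired t-test on scores
print(f"mean VBC = {sum(vbc_scores) / len(vbc_scores):.1f}, "
      f"mean ePlex = {sum(eplex_scores) / len(eplex_scores):.1f}, p = {p:.3f}")
```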

