Impact of an Enhanced Prevention Bundle on Central-Line–Associated Bloodstream Infection Incidence in Adult Oncology Units

2020 · Vol 41 (S1) · pp. s256–s258
Author(s): Mary Kukla, Shannon Hunger, Tacia Bullard, Kristen Van Scoyoc, Mary Beth Hovda-Davis, et al.

Background: Central-line–associated bloodstream infection (CLABSI) rates have steadily decreased as evidence-based prevention bundles were implemented. Bone marrow transplant (BMT) patients are at increased risk for CLABSI due to immunosuppression, prolonged central-line utilization, and frequent central-line accesses. We assessed the impact of an enhanced prevention bundle on BMT nonmucosal barrier injury CLABSI rates. Methods: The University of Iowa Hospitals & Clinics is an 811-bed academic medical center that houses the only BMT program in Iowa. During October 2018, we added 3 interventions to the ongoing CLABSI prevention bundle in our BMT inpatient unit: (1) a standardized 2-person dressing change team, (2) enhanced-quality daily chlorhexidine treatments, and (3) staff and patient line-care stewardship. The bundle included training of nurse champions to execute a team approach to changing central-line dressings. A standard process description and supplies are contained in a cart. In addition, 2 sets of sterile hands and a second person to monitor for breaches in sterile procedure are available. Site disinfection with chlorhexidine scrub and dry time are monitored. Training on quality chlorhexidine bathing includes evaluation of the preferred product, application per the product's instructions for use, and protection of the central-line site with a waterproof shoulder-length glove. In addition to routine BMT education, staff and patients are instructed on device stewardship during dressing changes. CLABSIs are monitored using NHSN definitions. We performed an interrupted time-series analysis to determine the impact of our enhanced prevention bundle on CLABSI rates in the BMT unit. We used monthly CLABSI rates from January 2017 until the intervention (October 2018) as the baseline. Because the BMT unit changed locations in December 2018, we included both time points in our analysis. 
For a sensitivity analysis, we assessed the impact of the enhanced prevention bundle in a hematology-oncology unit (March 2019) that did not change locations. Results: During the period preceding bundle implementation, the CLABSI rate was 2.2 per 1,000 central-line days. After the intervention, the rate decreased to 0.6 CLABSIs per 1,000 central-line days (P = .03). The move in unit location did not have a significant impact on CLABSI rates (P = .85). CLABSI rates also decreased from 1.6 per 1,000 central-line days to 0 per 1,000 central-line days (P < .01) in the hematology-oncology unit. Conclusions: An enhanced CLABSI prevention bundle was associated with significant decreases in CLABSI rates in 2 high-risk units. Novel infection prevention bundle elements should be considered for special populations when all other evidence-based recommendations have been implemented. Funding: None. Disclosures: None.
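The interrupted time-series analysis described above can be sketched as a segmented regression: fit an intercept, a baseline trend, a level change at the intervention, and a post-intervention trend change. The sketch below uses synthetic monthly rates shaped like the reported ones (the data and function names are hypothetical, not the study's):

```python
import numpy as np

def segmented_fit(rates, t0):
    """OLS fit of rate ~ intercept + time + level change + trend change.

    `t0` is the index of the first post-intervention month; returns the
    four coefficients (intercept, baseline slope, level change, slope change).
    """
    t = np.arange(len(rates), dtype=float)
    post = (t >= t0).astype(float)
    X = np.column_stack([np.ones_like(t), t, post, post * (t - t0)])
    beta, *_ = np.linalg.lstsq(X, np.asarray(rates, dtype=float), rcond=None)
    return beta

# Synthetic monthly CLABSI rates per 1,000 central-line days: ~2.2 at
# baseline (21 months), dropping to ~0.6 after the intervention (12 months).
rng = np.random.default_rng(42)
rates = np.concatenate([2.2 + rng.normal(0, 0.1, 21),
                        0.6 + rng.normal(0, 0.1, 12)])
beta = segmented_fit(rates, t0=21)
# beta[2] estimates the immediate level change at the intervention
```

A significant negative `beta[2]` with a flat post-intervention trend is what distinguishes a true level drop from a pre-existing downward drift.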

2019 · Vol 6 (Supplement_2) · pp. S769–S769
Author(s): Elisabeth Caulder, Elizabeth Palavecino, James Beardsley, James Johnson, Vera Luther, et al.

Abstract Background Vancomycin-resistant Enterococcus (VRE) bloodstream infection (BSI) is a significant cause of morbidity and mortality in immunocompromised patients. This study aimed to assess the impact of daptomycin (DAP) MIC on outcomes of treatment for VRE BSI in neutropenic oncology patients. Methods This was a retrospective, observational, single-center, cohort study at an academic medical center. Included: age ≥ 18, neutropenia, admitted to oncology unit, and DAP for VRE BSI. Excluded: death within 24 hours after initiation of DAP, polymicrobial BSI, and linezolid use for > 48 hours before DAP initiation. Patients with VRE BSI 2008–2018 were identified using a report from the micro lab. Data were collected by electronic medical record review. The primary outcome of the study was clinical success, defined as culture sterilization, hypotension resolution, defervescence, and no need to change DAP due to persistent signs/symptoms of infection. Patients were analyzed according to DAP MIC ≤ 2 vs. ≥ 4 mg/L. Multivariable logistic regression analysis was performed to identify factors associated with clinical success. Results 44 patients met study criteria (MIC ≤ 2, n = 26; MIC ≥ 4, n = 18). Mean age was 58 years, 59% were male, and median ANC was 0. Median Charlson Comorbidity Index Score and Pitt Bacteremia Score (Pitt) were 5 and 1, respectively. 34% required ICU admission. More patients achieved clinical success with MIC ≤ 2 (88% vs. 56%; P = 0.03). Time to success (2.4 vs. 4 days, P = 0.02) and time to culture sterilization (2.2 vs. 2.9 days, P = 0.24) were shorter with MIC ≤ 2. Mortality was similar between groups (31% vs. 33%). Time to culture sterilization (P = 0.008), neutropenia resolution (P = 0.02), MIC group (P = 0.096), and Pitt (P = 0.52) were included in the multivariable model. 
Conclusion DAP MIC should be considered when choosing therapy for VRE BSI among neutropenic oncology patients, particularly those expected to have prolonged neutropenia and those with persistently positive cultures. Disclosures All authors: No reported disclosures.
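The MIC-group comparison of clinical success (88% vs. 56%) can be expressed as an odds ratio from a 2×2 table. A minimal sketch, using success/failure counts inferred from the reported percentages (roughly 23/26 vs. 10/18; approximate and for illustration only):

```python
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """Odds ratio for a 2x2 table [[a, b], [c, d]] with a Woolf-method 95% CI.

    Rows are the two exposure groups; columns are success (a, c)
    and failure (b, d) counts.
    """
    or_ = (a * d) / (b * c)
    se = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se)
    hi = math.exp(math.log(or_) + z * se)
    return or_, lo, hi

# MIC <= 2 mg/L: ~23 successes / 3 failures; MIC >= 4 mg/L: ~10 / 8.
or_, lo, hi = odds_ratio_ci(23, 3, 10, 8)
# A confidence interval excluding 1.0 is consistent with the reported P = 0.03.
```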


2018 · Vol 39 (07) · pp. 878–880
Author(s): Sonali D. Advani, Rachael A. Lee, Martha Long, Mariann Schmitz, Bernard C. Camins

The 2015 changes in the catheter-associated urinary tract infection definition led to an increase in central-line–associated bloodstream infections (CLABSIs) and catheter-related candidemia in some health systems due to the change in CLABSI attribution. However, our rates remained unchanged in 2015 and declined further in 2016 with the implementation of new vascular-access guidelines. Infect Control Hosp Epidemiol 2018;39:878–880


2019 · Vol 40 (9) · pp. 1056–1058
Author(s): Jacob W. Pierce, Andrew Kirk, Kimberly B. Lee, John D. Markley, Amy Pakyz, et al.

Abstract Antipseudomonal carbapenems are an important target for antimicrobial stewardship programs. We evaluated the impact of formulary restriction and preauthorization on relative carbapenem use in the medical and surgical intensive care units of a large, urban academic medical center using interrupted time-series analysis.


2011 · Vol 32 (1) · pp. 50–58
Author(s): Dennis G. Maki, Victor D. Rosenthal, Reinaldo Salomao, Fabio Franzetti, Manuel Sigfrido Rangel-Frausto

Background. We report a meta-analysis of 4 identical time-series cohort studies of the impact of switching from open infusion containers (glass bottle, burette, or semirigid plastic bottle) to closed infusion containers (fully collapsible plastic containers) on central-line–associated bloodstream infection (CLABSI) rates and all-cause intensive care unit (ICU) mortality in 15 adult ICUs in Argentina, Brazil, Italy, and Mexico. Methods. All ICUs used open infusion containers for 6–12 months before switching to closed containers. Patient characteristics, adherence to infection control practices, CLABSI rates, and ICU mortality during the 2 periods were compared by χ2 test for each country, and the results were combined using meta-analysis. Results. Similar numbers of patients participated in the 2 periods (2,237 and 2,136). Patients in each period had comparable Average Severity of Illness Scores, risk factors for CLABSI, hand hygiene adherence, central-line care, and mean duration of central-line placement. CLABSI incidence dropped markedly in all 4 countries after the switch from open to closed infusion containers (pooled results, from 10.1 to 3.3 CLABSIs per 1,000 central-line days; relative risk [RR], 0.33 [95% confidence interval {CI}, 0.24–0.46]; P < .001). All-cause ICU mortality also decreased significantly, from 22.0 to 16.9 deaths per 100 patients (RR, 0.77 [95% CI, 0.68–0.87]; P < .001). Conclusions. Switching from open to closed infusion containers resulted in a striking reduction in both overall CLABSI incidence and all-cause ICU mortality. The data suggest that open infusion containers are associated with a greatly increased, previously unrecognized risk of infusion-related bloodstream infection and ICU mortality, and that CLABSIs carry significant attributable mortality.
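The pooled relative risk in a meta-analysis like this one is a fixed-effect combination of per-country log rate ratios, weighted by inverse variance. A minimal sketch with hypothetical event counts and central-line days (not the study's data):

```python
import math

def pooled_rate_ratio(studies, z=1.96):
    """Fixed-effect (inverse-variance) pooling of incidence rate ratios.

    Each study is (events_a, time_a, events_b, time_b); the variance of each
    log rate ratio uses the Poisson approximation 1/a + 1/b. Returns the
    pooled ratio with a 95% CI.
    """
    num = den = 0.0
    for a, t_a, b, t_b in studies:
        log_rr = math.log((a / t_a) / (b / t_b))
        var = 1 / a + 1 / b
        num += log_rr / var
        den += 1.0 / var
    centre = num / den
    se = math.sqrt(1 / den)
    return (math.exp(centre),
            math.exp(centre - z * se),
            math.exp(centre + z * se))

# Hypothetical (CLABSIs, central-line days) pairs for the closed vs. open
# container periods in two countries -- illustrative numbers only.
studies = [(33, 10000, 101, 10000), (20, 6000, 55, 5500)]
rr, lo, hi = pooled_rate_ratio(studies)
```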


2019 · Vol 41 (1) · pp. 59–66
Author(s): Shruti K. Gohil, Jennifer Yim, Kathleen Quan, Maurice Espinoza, Deborah J. Thompson, et al.

Abstract Objective: To assess the impact of a newly developed Central-Line Insertion Site Assessment (CLISA) score on the incidence of local inflammation or infection for CLABSI prevention. Design: A pre- and postintervention, quasi-experimental quality improvement study. Setting and participants: Adult inpatients with central venous catheters (CVCs) hospitalized in an intensive care unit or oncology ward at a large academic medical center. Methods: We evaluated the impact of the CLISA score on the incidence of insertion-site inflammation and infection (CLISA score of 2 or 3) in the baseline period (June 2014–January 2015) and the intervention period (April 2015–October 2017) using interrupted time-series and generalized linear mixed-effects multivariable analyses. These were run separately for days to line removal from identification of a CLISA score of 2 or 3. CLISA score interrater reliability and photo quiz results were also evaluated. Results: Among 6,957 CVCs assessed 40,846 times, the percentage of lines with a CLISA score of 2 or 3 decreased by 78.2% between the baseline and intervention periods (from 22.0% to 4.7%), with a significant immediate decrease in the time-series analysis (P < .001). In the multivariable regression, the intervention was associated with a lower percentage of lines with a CLISA score of 2 or 3, after adjusting for age, gender, CVC body location, and hospital unit (odds ratio, 0.15; 95% confidence interval, 0.06–0.34; P < .001). In the same model, lines with a CLISA score of 2 or 3 were removed 3.19 days faster after the intervention (P < .001). Line dwell time also decreased 37.1%, from a mean of 14 days (standard deviation [SD], 10.6) to 8.8 days (SD, 9.0) (P < .001), and device utilization ratios decreased 9%, from 0.64 (SD, 0.08) to 0.58 (SD, 0.06) (P = .039). Conclusions: The CLISA score creates a common language for assessing line infection risk and successfully promotes high compliance with best practices in timely line removal.


2020 · Vol 41 (10) · pp. 1142–1147
Author(s): Michelle E. Doll, Jinlei Zhao, Le Kang, Barry Rittmann, Michael Alvarez, et al.

Abstract Objective: To assess the impact of major interventions targeting infection control and diagnostic stewardship on Clostridioides difficile hospital-onset rates over a 6-year period. Design: Interrupted time series. Setting: The study was conducted in an 865-bed academic medical center. Methods: Monthly hospital-onset C. difficile infection (HO-CDI) rates from January 2013 through January 2019 were analyzed around 5 major interventions: (1) a 2-step cleaning process in which an initial quaternary ammonium product was followed by 10% bleach for daily and terminal cleaning of rooms of patients who had tested positive for C. difficile (February 2014); (2) a UV-C device for all terminal cleaning of rooms of C. difficile patients (August 2015); (3) "contact plus" isolation precautions (June 2016); (4) sporicidal peroxyacetic acid and hydrogen peroxide cleaning in all patient areas (June 2017); and (5) an electronic medical record (EMR) decision support tool to facilitate appropriate C. difficile test ordering (March 2018). Results: The environmental cleaning interventions and enhanced "contact plus" isolation did not affect HO-CDI rates. Diagnostic stewardship via EMR decision support decreased the HO-CDI rate by 6.7 cases per 10,000 patient days (P = .0079). When rates were adjusted for test volume, the EMR decision support effect was reduced to 5.1 case reductions per 10,000 patient days (P = .0470). Conclusion: Multiple aggressively implemented infection control interventions targeting CDI had a disappointing impact on endemic CDI rates over 6 years. This study adds to existing data showing that, outside of an outbreak situation, traditional infection control guidance for CDI prevention has little impact on endemic rates.


2018 · Vol 39 (6) · pp. 676–682
Author(s): Gonzalo Bearman, Salma Abbas, Nadia Masroor, Kakotan Sanogo, Ginger Vanhoozer, et al.

OBJECTIVE To investigate the impact of discontinuing contact precautions among patients infected or colonized with methicillin-resistant Staphylococcus aureus (MRSA) or vancomycin-resistant Enterococcus (VRE) on rates of healthcare-associated infection (HAI). DESIGN Single-center, quasi-experimental study conducted between 2011 and 2016. METHODS We employed an interrupted time-series design to evaluate the impact of 7 horizontal infection prevention interventions across intensive care units (ICUs) and hospital wards at an 865-bed urban, academic medical center. These interventions included (1) implementation of a urinary catheter bundle in January 2011, (2) chlorhexidine gluconate (CHG) perineal care outside ICUs in June 2011, (3) hospital-wide CHG bathing outside of ICUs in March 2012, (4) discontinuation of contact precautions for MRSA and VRE in April 2013, (5) assessments and feedback with bare below the elbows (BBE) and contact precautions in August 2014, (6) implementation of an ultraviolet-C disinfection robot in March 2015, and (7) 72-hour automatic urinary catheter discontinuation orders in March 2016. Segmented regression modeling was performed to assess the changes in infection rates attributable to the interventions. RESULTS The rate of HAI declined throughout the study period. Infection rates for MRSA and VRE decreased by 1.31 (P = .76) and 6.25 (P = .21) per 100,000 patient days, respectively, and the device-associated HAI rate decreased by 2.44 per 10,000 patient days (P = .23) following discontinuation of contact precautions. CONCLUSION The discontinuation of contact precautions for patients infected or colonized with MRSA or VRE, when combined with horizontal infection prevention measures, was not associated with an increased incidence of MRSA and VRE device-associated infections. This approach may represent a safe and cost-effective strategy for managing these patients. Infect Control Hosp Epidemiol 2018;39:676–682


2019 · Vol 6 (Supplement_2) · pp. S420–S421
Author(s): Isha Bhatt, Mohamed Nakeshbandi, Michael Augenbraun, Gwizdala Robert, Michael Lucchesi

Abstract Background Central line-associated bloodstream infections (CLABSI) are a major healthcare problem, contributing to increased morbidity, mortality, and costs. We sought to reduce rates of CLABSI and device utilization by implementing a multidisciplinary Central Line Stewardship Program (CLSP). Methods In July 2017, the CLSP, a multidisciplinary quality improvement project, was implemented at an academic medical center to ensure a proper indication for every CVC in the hospital and removal when no longer indicated. A CLSP team of executive leaders and infection preventionists performed daily rounds on all CVCs to review indications and maintenance. Nursing staff reported all CVCs daily. Information Technology modified the electronic health record to require daily physician documentation of CVC placement and indications and to suggest alternatives to CVCs when possible. In the event of a CLABSI, a root-cause analysis was conducted within 72 hours, and feedback was shared with the clinical staff. A retrospective review was conducted 18 months before and after CLSP implementation. As a facility in a state with mandatory reporting of hospital-acquired infections, institutional data were readily available through the National Healthcare Safety Network (NHSN). To compare rates of CLABSI and device utilization pre- and post-CLSP, we reviewed the incidence density rate (IDR), the standardized infection ratio (SIR), and the standardized utilization ratio (SUR). Data from the NHSN website were analyzed using statistical tools provided by the NHSN analysis module. Two-tailed significance tests were conducted with α set at 0.05. Results Post-CLSP, there was a statistically significant decrease in SIR from 1.99 to 0.885, a risk reduction of 44.3% (P = 0.013; 95% CI, 0.226–0.831). The CLABSI IDR per 1,000 CVC days declined from 1.84 to 0.886 (P = 0.0213). CVC utilization per 1,000 patient days decreased from 155.08 to 142.35 (P < 0.001). There was also a trend toward fewer PICC line infections post-intervention (17 to 5). Conclusion With this novel CLSP, we achieved a significant reduction in rates of CLABSI and device utilization, suggesting that a multidisciplinary approach can promote sustainable prevention of line-associated infections through dedicated surveillance of CVC indications and maintenance. Disclosures All authors: No reported disclosures.
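The NHSN metrics cited in abstracts like this one are simple ratios: the IDR is infections per 1,000 device days, the SIR is observed over risk-adjusted predicted infections, and the SUR is observed over predicted device days. A sketch with hypothetical counts, chosen only to mirror the order of magnitude of the reported values:

```python
def incidence_density_rate(infections, device_days, per=1000):
    """IDR: infections per 1,000 central-line (device) days."""
    return infections / device_days * per

def standardized_infection_ratio(observed, predicted):
    """SIR: observed infections over NHSN risk-adjusted predicted infections."""
    return observed / predicted

def standardized_utilization_ratio(observed_days, predicted_days):
    """SUR: observed device days over NHSN predicted device days."""
    return observed_days / predicted_days

# Hypothetical post-intervention counts: 11 CLABSIs over 12,414 line days.
idr = incidence_density_rate(11, 12414)        # ~0.886 per 1,000 line days
sir = standardized_infection_ratio(11, 12.43)  # hypothetical predicted count
sur = standardized_utilization_ratio(12414, 14000)
```

An SIR below 1.0 means fewer infections occurred than the risk-adjusted model predicted; an SUR below 1.0 means devices were used less than predicted.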


Author(s): Vishal P. Shah, Laura E. Breeher, Julie M. Alleckson, David G. Rivers, Zhen Wang, et al.

Abstract Objective: To assess the rate of and factors associated with healthcare personnel (HCP) testing positive for SARS-CoV-2 after an occupational exposure. Design: Retrospective cohort study. Setting: Academic medical center with sites in Minnesota, Wisconsin, Arizona, and Florida. Subjects: HCP with a high- or medium-risk occupational exposure to a patient or other HCP with SARS-CoV-2. Methods: We reviewed the records of HCP with significant occupational exposures from March 20, 2020, through December 31, 2020. We then performed regression analysis to assess the impact of demographic and occupational variables on the likelihood of testing positive for SARS-CoV-2. Results: A total of 2,253 confirmed occupational exposures occurred during the study period. Employees were the source for 57.1% of exposures. Overall, 101 HCP (4.5%) tested positive in the postexposure period. Of these, 80 had employee sources of exposure and 21 had patient sources. The postexposure infection rate was 6.2% when employees were the source, compared with 2.2% for patient sources. In a multivariate analysis, occupational exposure from an employee source carried a higher risk of testing positive than exposure from a patient source (OR, 3.22; 95% CI, 1.72–6.04). Gender, age, high-risk exposure, and HCP role were not associated with an increased risk of testing positive. Conclusions: The risk of acquiring COVID-19 following a significant occupational exposure is relatively low, even in the pre-vaccination era. Exposure to an infectious coworker carries a higher risk than exposure to a patient. Continued vigilance and precautions remain necessary in healthcare settings.

