1174. The Impact of Multidisciplinary Central Line Stewardship Program to Decrease CLABSI Rates and Central Line Utilization Rates in an Academic Urban Medical Center

2019 ◽  
Vol 6 (Supplement_2) ◽  
pp. S420-S421
Author(s):  
Isha Bhatt ◽  
Mohamed Nakeshbandi ◽  
Michael Augenbraun ◽  
Robert Gwizdala ◽
Michael Lucchesi

Abstract Background Central line-associated bloodstream infections (CLABSI) are a major healthcare problem, contributing to increased morbidity, mortality, and costs. We sought to reduce rates of CLABSI and device utilization by implementing a multidisciplinary Central Line Stewardship Program (CLSP). Methods In July 2017, the CLSP, a multidisciplinary quality improvement project, was implemented at an academic medical center to ensure a proper indication for all central venous catheters (CVCs) in the hospital and their removal when no longer indicated. A CLSP team of executive leaders and infection preventionists performed daily rounds on all CVCs to review indications and maintenance. Nursing staff reported all CVCs daily. Information Technology modified the electronic health record to require daily physician documentation of CVC placement and indications, and to suggest alternatives to CVC when possible. In the event of a CLABSI, a root cause analysis was conducted within 72 hours, and feedback was shared with the clinical staff. A retrospective review was conducted 18 months before and after CLSP implementation. Because our facility is in a state with mandatory reporting of hospital-acquired infections, institutional data were readily available through the National Healthcare Safety Network (NHSN). To compare rates of CLABSI and device utilization pre- and post-CLSP, we reviewed the incidence density rate (IDR), the standardized infection ratio (SIR), and the standardized utilization ratio (SUR). Data from the NHSN website were analyzed using statistical tools provided by the NHSN analysis module. Two-tailed significance tests were conducted with α set at 0.05. Results Post-CLSP, there was a statistically significant decrease in SIR from 1.99 to 0.885, a risk reduction of 44.3% (P = 0.013; 95% CI, 0.226–0.831). CLABSI IDR per 1,000 CVC days declined from 1.84 to 0.886 (P = 0.0213). CVC utilization per 1,000 patient-days decreased from 155.08 to 142.35 (P < 0.001). There was also a trend toward fewer peripherally inserted central catheter (PICC) line infections post-intervention (17 to 5). Conclusion With this novel CLSP, we achieved a significant reduction in rates of CLABSI and device utilization, suggesting that a multidisciplinary approach can promote sustainable prevention of line-associated infections through dedicated surveillance of CVC indications and maintenance. Disclosures All authors: No reported disclosures.
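For readers outside infection prevention, the NHSN metrics cited above reduce to simple arithmetic. A minimal sketch in Python, using hypothetical counts chosen to land near the reported values (the abstract gives only the resulting rates and ratios, not the underlying counts):

```python
# Hedged sketch of the NHSN-style metrics cited in the abstract.
# All counts below are hypothetical; only the resulting rates are reported.

def incidence_density_rate(infections: int, device_days: int) -> float:
    """CLABSI incidence density rate (IDR) per 1,000 central-line days."""
    return infections / device_days * 1000

def standardized_ratio(observed: float, predicted: float) -> float:
    """SIR (or SUR): observed events divided by the number predicted
    from NHSN national baseline data for a comparable facility mix."""
    return observed / predicted

# Hypothetical pre-CLSP period: 18 CLABSIs over 9,782 central-line days.
print(incidence_density_rate(18, 9_782))   # ~1.84 per 1,000 line days
# A SIR near the reported post-CLSP value: 9 observed vs. ~10.2 predicted.
print(standardized_ratio(9, 10.2))         # ~0.88
```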

2019 ◽  
Vol 6 (Supplement_2) ◽  
pp. S412-S412
Author(s):  
Bhagyashri D Navalkele ◽  
Nora Truhett ◽  
Miranda Ward ◽  
Sheila Fletcher

Abstract Background The high regulatory burden attached to hospital-onset (HO) infections has increased performance pressure on infection prevention programs. Despite the availability of comprehensive prevention guidelines, a major challenge has been communicating with frontline staff to integrate appropriate prevention measures into practice. The objective of our study was to evaluate the impact of an educational intervention on hospital-onset catheter-associated urinary tract infection (HO CAUTI) rates and urinary catheter days. Methods At the University of Mississippi Medical Center, infection prevention (IP) staff report unit-based monthly HO infections via email to the respective unit managers and ordering physician providers. Starting May 2018, IP assessed compliance with CAUTI prevention strategies per the SHEA/IDSA practice recommendations (2014). HO CAUTI cases with noncompliance were labeled "preventable" infections, and an educational justification was provided in the email report. No other interventions were introduced during the study period. CAUTI data were collected using ongoing surveillance per NHSN and used to calculate rates per 1,000 catheter days. One-way analysis of variance (ANOVA) was used to compare pre- and post-intervention data. Results Prior to the intervention (July 2017–March 2018), the HO CAUTI rate was 1.43 per 1,000 catheter days. In the post-intervention period (July 2018–March 2019), the HO CAUTI rate decreased to 0.62 per 1,000 catheter days. Comparison of pre- and post-intervention rates showed a statistically significant reduction in HO CAUTIs (P = 0.04). The total number of catheter days decreased, but the difference was not statistically significant (8,604 vs. 7,583; P = 0.06). Of the 14 HO CAUTIs in the post-intervention period, 64% (8/14) were reported as preventable. The preventable causes included inappropriate urine culturing in asymptomatic patients (5), culturing as part of a pan-culture without urinalysis (2), and lack of daily catheter assessment for necessity (1). Conclusion At our institution, regular educational feedback from IP to frontline staff resulted in a reduction of HO CAUTIs. The feedback measure improved the accountability, awareness, and engagement of frontline staff in practicing appropriate CAUTI prevention strategies. Disclosures All authors: No reported disclosures.
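The abstract compares nine monthly rates per period with one-way ANOVA. A minimal sketch with scipy, using made-up monthly values (the month-by-month rates are not reported; only the period averages of 1.43 and 0.62 are):

```python
from scipy import stats

# Hypothetical monthly HO CAUTI rates per 1,000 catheter days; the abstract
# reports only the period averages, not the month-by-month values.
pre  = [1.2, 1.9, 1.1, 1.6, 1.4, 1.3, 1.7, 1.2, 1.5]  # Jul 2017 - Mar 2018
post = [0.8, 0.4, 0.7, 0.5, 0.6, 0.9, 0.3, 0.7, 0.7]  # Jul 2018 - Mar 2019

# With exactly two groups, one-way ANOVA is equivalent to a two-sample t-test.
f_stat, p_value = stats.f_oneway(pre, post)
print(f"F = {f_stat:.2f}, P = {p_value:.4f}")
```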


2017 ◽  
Vol 4 (suppl_1) ◽  
pp. S89-S89 ◽  
Author(s):  
Gregory Cook ◽  
Shreena Advani ◽  
Saira Rab ◽  
Sheetal Kandiah ◽  
Manish Patel ◽  
...  

Abstract Background A candidemia treatment bundle (CTB) may increase adherence to guideline-recommended candidemia management and improve patient outcomes. The purpose of this study was to evaluate the impact of a best-practice alert (BPA) and order set on compliance with all CTB components and on patient outcomes. Methods A single-center, pre-/post-intervention study was completed at Grady Health System from August 2015 to August 2017. The post-CTB intervention period began August 2016. The CTB included a BPA that fires to treating clinicians upon opening the patient's electronic health record when a blood culture is positive for any Candida species. The BPA included a linked order set based on treatment recommendations including: infectious diseases (ID) and ophthalmology consultation, repeat blood cultures, empiric echinocandin therapy, early source control, antifungal de-escalation, intravenous-to-oral (IV to PO) switch, and duration of therapy. The primary outcome of the study was total adherence to the CTB. The secondary outcomes included adherence to the individual components of the CTB, 30-day mortality, and infection-related length of stay (iLOS). Results Forty-five patients with candidemia were identified in the pre-group and 24 in the CTB group. Twenty-seven patients in the pre-group and 19 patients in the CTB group met inclusion criteria. Total adherence to the CTB occurred in one patient in the pre-group and three patients in the CTB group (4% vs. 16%, P = 0.29). ID was consulted in 15 patients in the pre-group and 17 patients in the CTB group (56% vs. 89%, P = 0.02). Source control occurred in three and 11 patients, respectively (11% vs. 58%, P < 0.01). The bundle components of empiric echinocandin use (81% vs. 100%, P = 0.07), ophthalmology consultation (81% vs. 95%, P = 0.37), and IV to PO switch (22% vs. 32%, P = 0.5) also improved in the CTB group. Repeat cultures and antifungal de-escalation were similar between groups. Thirty-day mortality decreased in the CTB group by 10 percentage points (26% vs. 16%, P = 0.48). Median iLOS decreased from 30 days in the pre-group to 17 days in the CTB group (P = 0.05). Conclusion The CTB, with a BPA and linked order set, improved guideline-recommended management of candidemia, specifically increasing the rates of ID consultation and early source control. There were quantitative improvements in mortality and iLOS. Disclosures All authors: No reported disclosures.
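The abstract does not name the statistical test behind the adherence comparisons; with counts this small (1 of 27 vs. 3 of 19 fully adherent), Fisher's exact test is the conventional choice. A sketch under that assumption:

```python
from scipy import stats

# 2x2 table from the reported counts: rows are pre-group vs. CTB group,
# columns are adherent vs. non-adherent to the full bundle. Assumes
# Fisher's exact test, which the abstract does not state explicitly.
table = [[1, 26],   # pre-group: 1 of 27 fully adherent (4%)
         [3, 16]]   # CTB group: 3 of 19 fully adherent (16%)

odds_ratio, p_value = stats.fisher_exact(table, alternative="two-sided")
print(f"OR = {odds_ratio:.2f}, P = {p_value:.2f}")
```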


2020 ◽  
Vol 41 (S1) ◽  
pp. s256-s258
Author(s):  
Mary Kukla ◽  
Shannon Hunger ◽  
Tacia Bullard ◽  
Kristen Van Scoyoc ◽  
Mary Beth Hovda-Davis ◽  
...  

Background: Central-line–associated bloodstream infection (CLABSI) rates have steadily decreased as evidence-based prevention bundles were implemented. Bone marrow transplant (BMT) patients are at increased risk for CLABSI due to immunosuppression, prolonged central-line utilization, and frequent central-line accesses. We assessed the impact of an enhanced prevention bundle on BMT non-mucosal barrier injury CLABSI rates. Methods: The University of Iowa Hospitals & Clinics is an 811-bed academic medical center that houses the only BMT program in Iowa. During October 2018, we added 3 interventions to the ongoing CLABSI prevention bundle in our BMT inpatient unit: (1) a standardized 2-person dressing-change team, (2) enhanced-quality daily chlorhexidine treatments, and (3) staff and patient line-care stewardship. The bundle included training of nurse champions to execute a team approach to changing central-line dressings. A standard process description and supplies are contained in a cart. In addition, 2 sets of sterile hands and a second person to monitor for breaches in sterile procedure are available. Site disinfection with chlorhexidine scrub and dry time are monitored. Training on quality chlorhexidine bathing includes evaluation of the preferred product, application per the product's instructions for use, and protection of the central-line site with a waterproof shoulder-length glove. In addition to routine BMT education, staff and patients are instructed on device stewardship during dressing changes. CLABSIs are monitored using NHSN definitions. We performed an interrupted time-series analysis to determine the impact of our enhanced prevention bundle on CLABSI rates in the BMT unit. We used monthly CLABSI rates from January 2017 until the intervention (October 2018) as the baseline. Because the BMT unit changed locations in December 2018, we included both time points in our analysis. For a sensitivity analysis, we assessed the impact of the enhanced prevention bundle in a hematology-oncology unit (March 2019) that did not change locations. Results: During the period preceding bundle implementation, the CLABSI rate was 2.2 per 1,000 central-line days. After the intervention, the rate decreased to 0.6 CLABSI per 1,000 central-line days (P = .03). The move in unit location did not have a significant impact on CLABSI rates (P = .85). CLABSI rates also decreased from 1.6 per 1,000 central-line days to 0 per 1,000 central-line days (P < .01) in the hematology-oncology unit. Conclusions: An enhanced CLABSI prevention bundle was associated with significant decreases in CLABSI rates in 2 high-risk units. Novel infection prevention bundle elements should be considered for special populations when all other evidence-based recommendations have been implemented. Funding: None. Disclosures: None
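Interrupted time-series analyses of infection rates like the one described are commonly fit as a segmented Poisson regression on monthly counts with central-line days as an exposure offset. A minimal sketch under that assumption, with simulated data and illustrative variable names (the study's actual model specification is not given in the abstract):

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

# Simulated monthly data for a BMT unit; the real analysis used monthly
# CLABSI rates from January 2017, an intervention step in October 2018,
# and a term for the December 2018 unit relocation.
rng = np.random.default_rng(0)
months = pd.date_range("2017-01-01", periods=36, freq="MS")
df = pd.DataFrame({
    "month_index": np.arange(36),                    # secular trend
    "post": (months >= "2018-10-01").astype(int),    # intervention step
    "moved": (months >= "2018-12-01").astype(int),   # unit relocation
    "line_days": rng.integers(800, 1200, size=36),
})
base_rate = np.where(df["post"] == 1, 0.6, 2.2) / 1000  # per line day
df["clabsi"] = rng.poisson(base_rate * df["line_days"])

# Segmented Poisson regression: trend, level change at the intervention,
# and a relocation term, with central-line days as the exposure offset.
model = smf.glm(
    "clabsi ~ month_index + post + moved",
    data=df,
    family=sm.families.Poisson(),
    offset=np.log(df["line_days"]),
).fit()
print(model.summary())
```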


2021 ◽  
Vol 8 (Supplement_1) ◽  
pp. S656-S656
Author(s):  
Derek Evans ◽  
Mariana M Lanata Piazzon ◽  
Kaitlyn Schomburg

Abstract Background Hoops Family Children's Hospital is a 72-bed pediatric hospital nested within Cabell Huntington Hospital. There is an established adult antibiotic stewardship program (ASP); however, since 2014 there had been no pediatric infectious disease (ID) specialist and no pediatric ASP. With the hire of a pediatric ID specialist in October 2019 and the formation of a targeted pediatric ASP, we tracked the use of ceftriaxone (CRO) in our facility. Methods Starting January 2020, education was provided to pediatric providers regarding appropriate CRO dosing and clinical indications via email communication. The main goals were to limit 100 mg/kg/day dosing to severe infections and to reduce CRO use in community-acquired pneumonia. This was sustained through intermittent prospective audits and feedback. A retrospective chart review was done from 2019 to 2021 for the months of January, April, and December of each year. Patients ≤18 years of age who received CRO were included. Dosing, interval frequency, indication, and treatment duration were reviewed. Patients who received a single dose of CRO were excluded. Results From January 2019 to April 2021, 391 patient charts were reviewed (189 in the pre-intervention period and 202 in the post-intervention period). There were no significant differences in age, race/ethnicity, or gender between the two study groups. In the pre-intervention period, 86% of patients were prescribed CRO at severe-infection dosing vs. 33% in the post-intervention period (P < 0.0001) (Figure 1). When dosing was paired with indication, only 20% of patients in the pre-intervention period had the appropriate dosing for the clinical indication, compared with 83% in the post-intervention period (P < 0.0001) (Figure 2). We also saw that in the pre-intervention period the most common indication for CRO was pneumonia (66%), which decreased to 57% in 2020 and to 35% in 2021 (P < 0.0001) (Figure 3). Figure 1 shows the percentage of patients receiving ceftriaxone at severe-infection dosing, which changed from an average of 86% in the pre-intervention period to 33% in the post-intervention period. Figure 2 shows the percentage of patients receiving ceftriaxone at the appropriate dosing for the clinical indication provided, which changed from 20% in the pre-intervention period to closer to 90% in the post-intervention period. Conclusion Pediatric-specific ASP efforts and expertise proved crucial to appropriate CRO use in our institution. With a feasible education strategy and targeted prospective audit and feedback, there has been a sustained reduction in inappropriate CRO use. This underscores the importance of targeted pediatric ASP efforts in pediatric hospitals within larger adult hospitals. Disclosures All Authors: No reported disclosures
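The dosing comparisons are proportions across two sizeable groups, for which a chi-square test of independence is a standard choice (the abstract does not name the test used). A sketch using approximate counts reconstructed from the reported percentages:

```python
from scipy.stats import chi2_contingency

# Approximate counts reconstructed from the reported percentages:
# 86% of 189 pre-intervention patients vs. 33% of 202 post-intervention
# patients received severe-infection dosing. These are estimates, not the
# study's raw data.
table = [[163,  26],   # pre:  severe dosing, other dosing
         [ 67, 135]]   # post: severe dosing, other dosing

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.1f}, P = {p_value:.2g}")
```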


2019 ◽  
Vol 66 (1) ◽  
pp. 29-33
Author(s):  
Priyam Mithawala ◽  
Edo-abasi McGee

Objective The primary objectives were to evaluate the prescriber acceptance rate of Antimicrobial Stewardship Program (ASP) pharmacist recommendations to de-escalate or discontinue meropenem, and to estimate the difference in duration of meropenem therapy. The secondary objective was to determine the incidence of adverse events in the two groups. Methods This was a retrospective study. All patients admitted to Gwinnett Medical Center and receiving meropenem from January to November 2015 were included. Exclusion criteria were admission to the intensive care unit, receipt of a one-time dose, infectious disease consultation, and age <18 years. Electronic medical records were reviewed for data collection. The control group consisted of patients from January to July 2015, when there was no ASP pharmacist. The intervention group consisted of patients from August to November 2015, during which period the ASP pharmacist recommended de-escalation or discontinuation of meropenem based on culture and sensitivity results. Results A total of 41 patients were studied: 21 in the control group and 20 in the intervention group. There were no significant differences between the two groups in baseline characteristics, prior hospitalization or antibiotic use (within 90 days), or documented or suspected multidrug-resistant organism (MDRO) infection at the time of admission. De-escalation or discontinuation was suggested for 16 of 20 patients in the intervention group (80%), and the recommendation was accepted in 68% of cases. The mean duration of therapy was significantly decreased in the intervention group (5.6 days vs. 8.1 days, P = 0.0175). Two patients had thrombocytopenia (unrelated to meropenem), and none of the patients had a seizure. Conclusion Targeted antibiotic review is an effective ASP strategy that significantly decreases the duration of meropenem therapy.


2018 ◽  
Vol 39 (07) ◽  
pp. 878-880 ◽  
Author(s):  
Sonali D. Advani ◽  
Rachael A. Lee ◽  
Martha Long ◽  
Mariann Schmitz ◽  
Bernard C. Camins

The 2015 changes in the catheter-associated urinary tract infection definition led to an increase in central line-associated bloodstream infections (CLABSIs) and catheter-related candidemia in some health systems because of the change in CLABSI attribution. However, our rates remained unchanged in 2015 and declined further in 2016 with the implementation of new vascular-access guidelines.


2019 ◽  
Vol 41 (1) ◽  
pp. 59-66 ◽  
Author(s):  
Shruti K. Gohil ◽  
Jennifer Yim ◽  
Kathleen Quan ◽  
Maurice Espinoza ◽  
Deborah J. Thompson ◽  
...  

Abstract Objective: To assess the impact of a newly developed Central-Line Insertion Site Assessment (CLISA) score on the incidence of local inflammation or infection for CLABSI prevention. Design: A pre- and post-intervention, quasi-experimental quality improvement study. Setting and participants: Adult inpatients with central venous catheters (CVCs) hospitalized in an intensive care unit or oncology ward at a large academic medical center. Methods: We evaluated the impact of the CLISA score on the incidence of insertion-site inflammation or infection (CLISA score of 2 or 3) in the baseline period (June 2014–January 2015) and the intervention period (April 2015–October 2017) using interrupted time-series and generalized linear mixed-effects multivariable analyses. Separate models were run for days to line removal after identification of a CLISA score of 2 or 3. CLISA score interrater reliability and photo quiz results were evaluated. Results: Among 6,957 CVCs assessed 40,846 times, the percentage of lines with a CLISA score of 2 or 3 decreased by 78.2% between the baseline and intervention periods (from 22.0% to 4.7%), with a significant immediate decrease in the time-series analysis (P < .001). In the multivariable regression, the intervention was associated with a lower percentage of lines with a CLISA score of 2 or 3, after adjusting for age, gender, CVC body location, and hospital unit (odds ratio, 0.15; 95% confidence interval, 0.06–0.34; P < .001). In the same multivariable framework, removal of lines with a CLISA score of 2 or 3 was 3.19 days faster after the intervention (P < .001). Also, line dwell time decreased 37.1%, from a mean of 14 days (standard deviation [SD], 10.6) to 8.8 days (SD, 9.0) (P < .001). Device utilization ratios decreased 9%, from 0.64 (SD, 0.08) to 0.58 (SD, 0.06) (P = .039). Conclusions: The CLISA score creates a common language for assessing line infection risk and successfully promotes high compliance with best practices in timely line removal.
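The adjusted odds ratio above comes from a generalized linear mixed-effects model. The sketch below fits a plain logistic regression with the same fixed effects as a simplification; the random-effects structure for repeated assessments of the same line is omitted, and all column names and data are illustrative rather than taken from the study:

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical assessment-level data; each row is one CVC insertion-site
# assessment. Column names are illustrative, not the study's own.
rng = np.random.default_rng(1)
n = 2000
df = pd.DataFrame({
    "clisa_high": rng.integers(0, 2, n),               # CLISA score 2 or 3
    "post": rng.integers(0, 2, n),                     # intervention period
    "age": rng.normal(60, 15, n),
    "male": rng.integers(0, 2, n),
    "site": rng.choice(["ij", "subclavian", "femoral"], n),
    "unit": rng.choice(["icu", "oncology"], n),
})

# Plain logistic regression with the fixed effects named in the abstract;
# the study itself used a mixed-effects model to handle repeated
# assessments of the same line, which this sketch does not reproduce.
fit = smf.logit("clisa_high ~ post + age + male + C(site) + C(unit)",
                data=df).fit(disp=0)
print(np.exp(fit.params["post"]))   # odds ratio for the intervention term
```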


2017 ◽  
Vol 4 (suppl_1) ◽  
pp. S398-S398 ◽  
Author(s):  
Werner Bischoff ◽  
Andrey Bubnov ◽  
Elizabeth Palavecino ◽  
James Beardsley ◽  
John Williamson ◽  
...  

Abstract Background Clostridium difficile infections (CDI) pose a growing threat to hospitalized patients. This study assesses the impact of changing from a nucleic acid amplification test (NAAT) to a stepwise testing algorithm (STA) using an enzyme immunoassay (glutamate dehydrogenase [GDH] and toxin A/B) with NAAT confirmation in specific cases. Methods In an 885-bed academic medical center, a 24-month pre-/post-intervention design was used to assess the effect of the STA on the following parameters: rates of enterocolitis due to C. difficile (CDE); NHSN C. difficile LabID events; CDI complications; mortality; antimicrobial prescription patterns; cluster occurrences; and testing, treatment, and isolation costs. Inpatient data were extracted from ICD-9/10 diagnosis codes, infection prevention, and laboratory databases. Results The STA significantly decreased the number of CDE ICD-9/10 codes and the hospital-onset (HO), community-onset (CO), and community-onset healthcare facility-associated (CO-HCFA) C. difficile LabID event rates by 65%, 78%, 75%, and 75%, respectively. Similar reductions were noted for associated complications such as NHSN-defined colon surgeries (-61%), megacolon (-64%), and acute kidney failure (-55%). CDE-unrelated complication rates for colon surgeries and acute kidney failure remained constant, while the diagnosis of megacolon decreased but not significantly (-71%; P > 0.05). Inpatient mortality did not change with or without CDE. Significant reductions were observed in the use of oral metronidazole (total: -32%; CDE-specific: -70%) and vancomycin (total: -58%; CDE-specific: -61%). No clusters were detected pre- or post-STA introduction. The need for isolation decreased from 748 to 181 patients post-intervention (-76%; P < 0.05). Annual cost savings were over $175,000, due to decreases in laboratory testing, isolation, and antibiotic use. Conclusion The switch from NAAT to an STA did not affect the diagnosis, treatment, or control of clinically relevant CDI in our institution. Benefits included avoidance of unnecessary antibiotic treatment, reduction in isolation, achievement of publicly reported objectives, and cost savings. Selection of clinically relevant tests can help improve the hospitalization and treatment of patients and should be considered as part of diagnostic stewardship. Disclosures All authors: No reported disclosures.
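The STA itself is a two-stage decision rule: concordant EIA results are reported directly, and discordant results reflex to NAAT. A sketch of that logic, with illustrative function and result labels (the abstract does not specify the lab's exact reporting language):

```python
from typing import Optional

def stepwise_cdiff_test(gdh_positive: bool, toxin_positive: bool,
                        naat_positive: Optional[bool] = None) -> str:
    """Stepwise C. difficile testing algorithm as described above.

    Concordant EIA results (GDH and toxin A/B) are reported directly;
    discordant results reflex to NAAT. Result labels are illustrative.
    """
    if gdh_positive and toxin_positive:
        return "positive: likely CDI"
    if not gdh_positive and not toxin_positive:
        return "negative: CDI unlikely"
    # Discordant (typically GDH-positive / toxin-negative): reflex to NAAT.
    if naat_positive is None:
        return "indeterminate: reflex to NAAT"
    return ("NAAT positive: possible CDI or colonization"
            if naat_positive else "NAAT negative: CDI unlikely")

print(stepwise_cdiff_test(True, False, naat_positive=True))
```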


2016 ◽  
Vol 37 (11) ◽  
pp. 1361-1366 ◽  
Author(s):  
Elizabeth A. Neuner ◽  
Andrea M. Pallotta ◽  
Simon W. Lam ◽  
David Stowe ◽  
Steven M. Gordon ◽  
...  

OBJECTIVE To describe the impact of rapid diagnostic microarray technology and antimicrobial stewardship for patients with Gram-positive blood cultures. DESIGN Retrospective pre-intervention/post-intervention study. SETTING A 1,200-bed academic medical center. PATIENTS Inpatients with blood cultures positive for Staphylococcus aureus, Enterococcus faecalis, E. faecium, Streptococcus pneumoniae, S. pyogenes, S. agalactiae, S. anginosus, Streptococcus spp., and Listeria monocytogenes during the 6 months before and after implementation of the Verigene Gram-positive blood culture microarray (BC-GP) with an antimicrobial stewardship intervention. METHODS Before the intervention, no rapid diagnostic technology or antimicrobial stewardship intervention was used, except for peptide nucleic acid fluorescent in situ hybridization and MRSA agar to identify staphylococcal isolates. After the intervention, all Gram-positive blood cultures underwent BC-GP microarray testing, and the antimicrobial stewardship intervention consisted of real-time notification and pharmacist review. RESULTS In total, 513 patients with bacteremia were included in this study: 280 patients with S. aureus, 150 patients with enterococci, 82 patients with streptococci, and 1 patient with L. monocytogenes. The number of antimicrobial switches was similar in the pre–BC-GP (52%; 155 of 300) and post–BC-GP (50%; 107 of 213) periods. The time to antimicrobial switch was significantly shorter in the post–BC-GP group than in the pre–BC-GP group: 48±41 hours versus 75±46 hours, respectively (P<.001). The most common antimicrobial switch was de-escalation, and time to de-escalation was significantly shorter in the post–BC-GP group than in the pre–BC-GP group: 53±41 hours versus 82±48 hours, respectively (P<.001). There was no difference in mortality or hospital length of stay as a result of the intervention. CONCLUSIONS The combination of a rapid microarray diagnostic test with an antimicrobial stewardship intervention improved time to antimicrobial switch, especially time to de-escalation to optimal therapy, in patients with Gram-positive blood cultures.
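The time-to-switch comparison can be approximately reproduced from the summary statistics alone. A sketch using scipy's summary-statistics t-test, assuming (the abstract does not state this) that the reported means apply to the 155 pre-period and 107 post-period patients who switched and that a Welch-type test is appropriate:

```python
from scipy.stats import ttest_ind_from_stats

# Time to antimicrobial switch in hours (mean +/- SD) from the abstract.
# Group sizes taken as the reported switch counts: 155 pre, 107 post.
result = ttest_ind_from_stats(
    mean1=75, std1=46, nobs1=155,   # pre-BC-GP
    mean2=48, std2=41, nobs2=107,   # post-BC-GP
    equal_var=False,                # Welch's t-test
)
print(result)   # expect P well below .001, consistent with the abstract
```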

