Reducing Bloodstream Infection Risk in Central and Peripheral Intravenous Lines: Initial Data on Passive Intravenous Connector Disinfection

2014 ◽  
Vol 19 (2) ◽  
pp. 87-93 ◽  
Author(s):  
Michelle DeVries ◽  
Patricia S. Mancos ◽  
Mary J. Valentine

Abstract Background: Although few facilities focus on it, bloodstream infection (BSI) risk from peripheral intravenous catheters (PIVs) may exceed central line–related risk. Over a 6-year period, Methodist Hospitals substantially reduced BSIs in patients with central lines but not in patients with PIVs. A practice audit revealed deficiencies in manual disinfection of intravenous connectors that increased BSI risk. Methodist thus sought an engineered approach to hub disinfection that would compensate for variations in scrubbing technique. Methods: Our institution involved bedside nurses in choosing new hub disinfection technology. They selected 2 devices to trial: a disinfection cap that passively disinfects hubs with isopropyl alcohol and a device that friction-scrubs with isopropyl alcohol. After trying both, nurses selected the cap for use in the facility's 3 intensive care units. After no BSIs occurred during a 3-month span, we implemented the cap throughout the hospital for use on central venous catheters; peripherally inserted central catheters; and peripheral lines, including tubing and Y-sites. Results: Comparing the postintervention period (December 2011–August 2013) to the preintervention span (September 2009–May 2011), the BSI rate dropped 43% for PIVs, 50% for central lines, and 45% overall (PIVs + central lines). The central line and overall results are statistically significant. The PIV BSI rate drop is attributable to cap use alone because the cap was the only new intervention during the postimplementation period. The other infection reductions appear to be at least partly due to cap use. Conclusions: Our institution achieved substantial BSI reductions, some statistically significant, by applying a disinfection cap to both PIVs and central lines.
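To make the arithmetic behind these percentages concrete, here is a minimal Python sketch of how a BSI rate per 1,000 line days and a relative (percent) reduction are typically computed. The infection and line-day counts below are hypothetical placeholders, not the study's data.

```python
# Illustrative only: hypothetical counts, not the study's data.

def bsi_rate_per_1000(infections: int, line_days: int) -> float:
    """BSI rate expressed per 1,000 catheter (line) days."""
    return infections / line_days * 1000

def percent_reduction(pre_rate: float, post_rate: float) -> float:
    """Relative drop from the preintervention to the postintervention rate."""
    return (pre_rate - post_rate) / pre_rate * 100

# Hypothetical PIV counts for the pre- and post-intervention spans:
pre = bsi_rate_per_1000(infections=14, line_days=20_000)   # 0.70 per 1,000
post = bsi_rate_per_1000(infections=8, line_days=20_000)   # 0.40 per 1,000
print(f"PIV BSI rate reduction: {percent_reduction(pre, post):.0f}%")  # ~43%
```

A 43% drop, in other words, is a statement about relative rates; whether it reaches statistical significance depends on the underlying event counts, which is presumably why the central line and overall results were significant while the PIV result was not.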

Author(s):  
Jennifer Meddings ◽  
Vineet Chopra ◽  
Sanjay Saint

Prevention of central line–associated bloodstream infection (CLABSI) made great strides beginning in 2003, but progress has declined as use of peripherally inserted central catheters (PICCs) has grown tremendously over the past two decades. The convenience of a PICC has led to sicker patients being treated outside the intensive care unit, with little recognition of the trade-off between benefits and risks after PICC placement. For these reasons, CLABSI prevention has become more challenging. This chapter describes the contents of an infection prevention bundle for CLABSI. The intervention outlines appropriate and inappropriate uses of central lines, and several new tools are discussed that help doctors and nurses think through which device is most appropriate for a given patient.


2016 ◽  
Vol 29 (6) ◽  
pp. 373
Author(s):  
Jorge Rodrigues ◽  
Andrea Dias ◽  
Guiomar Oliveira ◽  
José Farela Neves

Introduction: To determine the central line–associated bloodstream infection rate after implementation of central venous catheter care practice bundles and guidelines and to compare it with the previous rate. Material and Methods: A prospective, longitudinal, observational descriptive study with an exploratory component was performed in a Pediatric Intensive Care Unit over five months. The universe comprised every child admitted to the Pediatric Intensive Care Unit who had a central venous catheter inserted. A comparative study with historical controls was performed to evaluate the result of the intervention (group 1 versus group 2). Results: Seventy-five children were included, with a median age of 23 months: 22 (29.3%) newborns, 28 (37.3%) with recent surgery, and 32 (43.8%) with underlying illness. A total of 105 central venous catheters were inserted, most patients receiving a single catheter (69.3%), with a mean duration of 6.8 ± 6.7 days. The most common type was the short-term, non-tunneled central venous catheter (45.7%), and the subclavian and brachial flexure veins were the most frequent insertion sites (25.7% each). No cases of central line–associated bloodstream infection were reported during this study. Compared with the historical controls (group 1), both groups were similar regarding age, gender, department of origin, and place of central venous catheter insertion. In the current study (group 2), the median length of stay was higher, while the mean duration of central venous catheter use (excluding peripherally inserted central lines) was similar in both groups. There were no statistical differences regarding catheter caliber or number of lumens. Fewer children admitted to the Pediatric Intensive Care Unit had a central venous catheter inserted in group 2, with no significant difference between single and multiple catheters. Discussion: After implementation of the multidimensional strategy, no central line–associated bloodstream infections were reported. Conclusions: Efforts must be made to maintain the same degree of multidimensional prevention, in order to confirm the effective reduction of the central line–associated bloodstream infection rate and to allow its maintenance.
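One way to see why the authors stress continued prevention despite observing zero infections: with zero events over n exposure units, the "rule of three" gives an approximate 95% upper confidence bound of 3/n on the underlying rate. The sketch below applies it to a rough catheter-day total derived from the abstract's figures (105 catheters with a mean duration of 6.8 days); the calculation is illustrative and not from the paper.

```python
# Rule-of-three sketch: zero observed CLABSIs still leaves a nontrivial
# upper bound on the plausible infection rate over a short study period.

catheters = 105
mean_duration_days = 6.8
catheter_days = catheters * mean_duration_days   # ~714 catheter-days (rough)

# With 0 events in n units of exposure, the 95% upper bound is ~3/n.
upper_bound_per_day = 3 / catheter_days
print(f"95% upper bound: {upper_bound_per_day * 1000:.1f} per 1,000 catheter-days")
# ~4.2 CLABSIs per 1,000 catheter-days: the data remain consistent with
# rates well above zero, hence the call to sustain the prevention bundle.
```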


2014 ◽  
Vol 36 (2) ◽  
pp. 214-216 ◽  
Author(s):  
Devin Callister ◽  
Pauline Limchaiyawat ◽  
Samantha J. Eells ◽  
Loren G. Miller

Little is known about central line–associated bloodstream infection risk factors in the bundle era. In our case-control investigation, we found that independent risk factors for central line–associated bloodstream infection at our center included the number of recent laboratory tests, catheter duration, and lack of hemodynamic monitoring as the insertion indication.
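For readers unfamiliar with the phrase, "independent risk factors" in a case-control design are usually estimated with multivariable logistic regression, where each adjusted odds ratio is the exponentiated model coefficient. Below is a hedged sketch of that procedure on simulated data; the variable names echo the abstract's factors, but the dataset, effect sizes, and column names are entirely hypothetical.

```python
# Sketch of multivariable logistic regression for adjusted odds ratios.
# Simulated data only; not the study's dataset or effect sizes.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 200
df = pd.DataFrame({
    "recent_lab_tests": rng.poisson(5, n),            # count of recent labs
    "catheter_days": rng.integers(1, 30, n),          # catheter duration
    "hemodynamic_indication": rng.integers(0, 2, n),  # 1 = inserted for monitoring
})

# Simulated outcome loosely tied to the predictors (illustration only);
# a negative coefficient on hemodynamic_indication mirrors "lack of
# hemodynamic monitoring as the insertion indication" being a risk factor.
logit = (-3 + 0.2 * df["recent_lab_tests"] + 0.1 * df["catheter_days"]
         - 0.8 * df["hemodynamic_indication"])
df["clabsi"] = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = sm.add_constant(df[["recent_lab_tests", "catheter_days", "hemodynamic_indication"]])
fit = sm.Logit(df["clabsi"], X).fit(disp=False)
print(np.exp(fit.params))  # adjusted odds ratios; >1 means higher CLABSI odds
```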


2019 ◽  
Vol 40 (9) ◽  
pp. 1019-1023 ◽  
Author(s):  
Jesse Couk ◽  
Sheri Chernetsky Tejedor ◽  
James P. Steinberg ◽  
Chad Robichaux ◽  
Jesse T. Jacob

Abstract Background: The current methodology for calculating central-line–associated bloodstream infection (CLABSI) rates, used for pay-for-performance measures, does not account for multiple concurrent central lines. Objective: To compare CLABSI rates using standard National Healthcare Safety Network (NHSN) denominators to rates accounting for multiple concurrent central lines. Design: Descriptive analysis and retrospective cohort analysis. Methods: We identified all adult patients with central lines at 2 academic medical centers over an 18-month period. CLABSI rates were calculated for intensive care units (ICUs) and non-ICUs using the standard NHSN methodology and denominator (a patient could only have 1 central-line day for a given patient day) and a modified denominator (the number of central lines in 1 patient on 1 day counts as that number of line days). We also compared characteristics of patients with and without multiple concurrent central lines. Results: Among 18,521 hospital admissions, there were 156,574 central-line days and 239 CLABSIs (ICU, 105; non-ICU, 134). Our modified denominator reduced CLABSI rates by 25% in ICUs (1.95 vs 1.47 per 1,000 line days) and by 6% in non-ICUs (1.30 vs 1.22 per 1,000 line days). Patients with multiple concurrent central lines were more likely to be in an ICU, to have a longer admission, to have a dialysis catheter, and to have a CLABSI. Conclusions: Using the number of central lines as the denominator decreased CLABSI rates in ICUs by 25%. The presence of multiple concurrent central lines may be a marker of severity of illness. The risk of CLABSI per lumen of a central line is similar in ICUs compared to wards.
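The denominator change is easiest to see on a toy example. This sketch contrasts the standard NHSN convention (a patient contributes at most one line day per patient day) with the modified convention (every concurrent line counts); the per-patient-day line counts and infection total are hypothetical.

```python
# Toy comparison of the two denominator conventions described above.
# Hypothetical data: number of central lines present on each patient day.
patient_day_line_counts = [1, 1, 2, 3, 1, 2, 1, 1]
clabsis = 2  # hypothetical infection count over these patient days

# Standard NHSN: any day with >=1 line counts as exactly 1 line day.
standard_days = sum(1 for c in patient_day_line_counts if c > 0)   # 8
# Modified: every concurrent line contributes its own line day.
modified_days = sum(patient_day_line_counts)                        # 12

for name, denom in [("standard NHSN", standard_days), ("modified", modified_days)]:
    print(f"{name}: {clabsis / denom * 1000:.0f} CLABSIs per 1,000 line days")
```

Because the modified denominator is never smaller than the standard one, the rate can only stay flat or fall, which is exactly the 25% ICU and 6% non-ICU drops reported above.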

