823. How to Compare Standardized Healthcare-associated Infection (HAI) Rates? Benchmark 2D and 3D

2020
Vol 7 (Supplement_1)
pp. S453-S454
Author(s):
Braulio Roberto Gonçalves Marinho Couto
Carlos Ernesto Ferreira Starling

Abstract Background: External benchmarking involves comparing standardized data on HAI rates in one hospital or healthcare facility against those of others. Here we present two epidemiological graphical tools, the 2D and 3D benchmarks, which summarize the efficiency of preventing the main infections in a Medical/Surgical Intensive Care Unit (MSICU). Methods: The 3D benchmark graph plots the incidence density rate of ventilator-associated pneumonia (VAP cases per 1,000 ventilator-days) on the X-axis, the incidence density rate of central line-associated primary bloodstream infections (CLABSI cases per 1,000 central line-days) on the Y-axis, and the incidence density rate of catheter-associated urinary tract infections (CAUTI cases per 1,000 urinary catheter-days) on the Z-axis. Efficiency in preventing infection (e) defines a rate of zero as 100% efficient (e=100%) and the highest available benchmark rate as "zero" efficiency (RMax: e=0%). From this definition, the efficiency of any MSICU (0% ≤ e ≤ 100%) is obtained by linear interpolation from the rate observed in the MSICU under evaluation (Rx): e = 100 × (RMax − Rx)/RMax. If Rx > RMax, RMax is set equal to Rx, so e = 0%. The 3D benchmark is built by calculating the efficiency in preventing infection (e) for each infection type (VAP, CLABSI, and CAUTI), for all benchmarks and for the MSICU under evaluation. In the 3D benchmark, three control volumes are created: an "Infection Control Urgency" volume, an "Infection Control Excellence" volume, and an "Infection Prevention Opportunity" volume. The 2D benchmark considers only the VAP density rate as the X-axis and the CLABSI density rate as the Y-axis. In this graph, five control regions are created: 1 = excellence in the control of VAP+CLABSI; 2 = excellence in VAP control and opportunity for CLABSI prevention; 3 = excellence in CLABSI control and opportunity to prevent VAP; 4 = opportunity to prevent VAP+CLABSI; 5 = urgency in infection control. Results: Graph parameters were based on NHSN data from the device-associated module, the NOIS Project, Anahp, CQH, GVIMS/GGTES/ANVISA (Brazilian benchmarks), and the benchmarks of El-Saed et al. We applied the 2D/3D benchmarks to several Brazilian ICUs. 2D benchmark for the MSICUs of Lifecenter Hospital, Brazil, Jan-Dec/2019: UCO & UTI 19 = excellence in CLABSI control and opportunity to prevent VAP; UTI 20 = excellence in VAP control and opportunity for CLABSI prevention; UTI 18 = opportunity to prevent VAP+CLABSI. 3D benchmark for the MSICUs of Lifecenter Hospital, Brazil, Jan-Dec/2019. 2D benchmark for the MSICUs of Vera Cruz Hospital, Brazil, Jan-Dec/2019: CTI 3.o Andar = excellence in the control of VAP+CLABSI; CTI 1.o Andar = excellence in VAP control and opportunity for CLABSI prevention. Conclusion: 2D and 3D benchmarks are easy to understand and summarize the efficiency in preventing the main infections of an MSICU. Disclosures: All Authors: No reported disclosures
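To make the interpolation concrete, the following Python sketch computes the efficiency e and a simplified 2D region classification. The RMax benchmark values and the 50% efficiency cut-off that separates "excellence" from "opportunity" are illustrative assumptions, not values given in the abstract.

```python
# Minimal sketch of the 2D/3D benchmark efficiency calculation.
# RMAX values and the 50% "excellence" cut-off are illustrative assumptions.

def efficiency(rate, rate_max):
    """Efficiency in preventing infection: 100% at rate 0, 0% at the highest benchmark rate."""
    if rate >= rate_max:          # observed rate at or above the worst benchmark
        return 0.0
    return 100.0 * (rate_max - rate) / rate_max

# Hypothetical benchmark maxima (cases per 1,000 device-days)
RMAX = {"VAP": 16.0, "CLABSI": 9.0, "CAUTI": 7.0}

def classify_2d(vap_rate, clabsi_rate, cutoff=50.0):
    """Simplified 2D classification using an assumed 50% efficiency cut-off."""
    e_vap = efficiency(vap_rate, RMAX["VAP"])
    e_clabsi = efficiency(clabsi_rate, RMAX["CLABSI"])
    if e_vap == 0.0 and e_clabsi == 0.0:
        return 5, "urgency in infection control"
    if e_vap >= cutoff and e_clabsi >= cutoff:
        return 1, "excellence in the control of VAP+CLABSI"
    if e_vap >= cutoff:
        return 2, "excellence in VAP control and opportunity for CLABSI prevention"
    if e_clabsi >= cutoff:
        return 3, "excellence in CLABSI control and opportunity to prevent VAP"
    return 4, "opportunity to prevent VAP+CLABSI"

print(classify_2d(vap_rate=2.1, clabsi_rate=6.5))
```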

2007
Vol 28 (11)
pp. 1247-1254
Author(s):
Lisa S. Young
Allison L. Sabel
Connie S. Price

Objectives. To determine risk factors for acquisition of multidrug-resistant (MDR) Acinetobacter baumannii infection during an outbreak, to describe the clinical manifestations of infection, and to ascertain the cost of infection. Design. Case-control study. Setting. Surgical intensive care unit in a 400-bed urban teaching hospital and level 1 trauma center. Patients. Case patients received a diagnosis of infection due to A. baumannii isolates with a unique pattern of drug resistance (ie, susceptible to imipenem, variably susceptible to aminoglycosides, and resistant to all other antibiotics) between December 1, 2004, and August 31, 2005. Case patients were matched 1:1 with concurrently hospitalized control patients. Isolates' genetic relatedness was established by pulsed-field gel electrophoresis. Results. Sixty-seven patients met the inclusion criteria. Case and control patients were similar with respect to age, duration of hospitalization, and Charlson comorbidity score. MDR A. baumannii infections included ventilator-associated pneumonia (in 56.7% of patients), bacteremia (in 25.4%), postoperative wound infections (in 25.4%), central venous catheter-associated infections (in 20.9%), and urinary tract infections (in 10.4%). Conditional multiple logistic regression was used to determine statistically significant risk factors on the basis of results from the bivariate analyses. The duration of hospitalization and healthcare charges were modeled by multiple linear regression. Significant risk factors included higher Acute Physiology and Chronic Health Evaluation II score (odds ratio [OR], 1.1 per point increase; P = .06), duration of intubation (OR, 1.4 per day intubated; P < .01), exposure to bronchoscopy (OR, 22.7; P = .03), presence of chronic pulmonary disease (OR, 77.7; P = .02), receipt of fluconazole (OR, 73.3; P < .01), and receipt of levofloxacin (OR, 11.5; P = .02). Case patients had a mean of $60,913 in attributable excess patient charges and a mean of 13 excess hospital days. Interventions. Infection control measures included the following: limitations on the performance of pulsatile lavage wound debridement, the removal of items with upholstered surfaces, and the implementation of contact isolation for patients with suspected MDR A. baumannii infection. Conclusions. This large outbreak of infection due to clonal MDR A. baumannii caused significant morbidity and expense. Aerosolization of MDR A. baumannii during pulsatile lavage debridement of infected wounds and during the management of respiratory secretions from colonized and infected patients may promote widespread environmental contamination. Multifaceted infection control interventions were associated with a decrease in the number of MDR A. baumannii isolates recovered from patients.
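As a simplified, hypothetical illustration of a 1:1 matched case-control analysis (not the conditional multiple logistic regression the authors used), the matched odds ratio for a single binary exposure such as bronchoscopy can be estimated from the discordant pairs. The pair counts below are made up.

```python
# Matched-pair odds ratio for one binary exposure in a 1:1 matched case-control
# design. Pair counts are hypothetical, for illustration only.
from math import exp, log, sqrt

case_exposed_control_not = 17   # discordant pairs: case exposed, control not
control_exposed_case_not = 4    # discordant pairs: control exposed, case not

or_matched = case_exposed_control_not / control_exposed_case_not

# Approximate 95% CI on the log odds ratio scale
se = sqrt(1 / case_exposed_control_not + 1 / control_exposed_case_not)
lo, hi = exp(log(or_matched) - 1.96 * se), exp(log(or_matched) + 1.96 * se)
print(f"matched OR = {or_matched:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```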


2013
Vol 34 (9)
pp. 893-899
Author(s):
Ryan P. Fagan
Jonathan R. Edwards
Benjamin J. Park
Scott K. Fridkin
Shelley S. Magill

Objective. To quantify historical trends in rates of central line-associated bloodstream infections (CLABSIs) in US intensive care units (ICUs) caused by major pathogen groups, including Candida spp., Enterococcus spp., specified gram-negative rods, and Staphylococcus aureus. Design. Active surveillance in a cohort of participating ICUs through the Centers for Disease Control and Prevention, the National Nosocomial Infections Surveillance system during 1990–2004, and the National Healthcare Safety Network during 2006–2010. Setting. ICUs. Participants. Patients who were admitted to participating ICUs. Results. The CLABSI incidence density rate for S. aureus decreased annually starting in 2002 and remained lower than for other pathogen groups. Since 2006, the annual decrease for S. aureus CLABSIs in nonpediatric ICU types was −18.3% (95% confidence interval [CI], −20.8% to −15.8%), whereas the incidence density rate for S. aureus among pediatric ICUs did not change. The annual decrease for all ICUs combined since 2006 was −17.8% (95% CI, −19.4% to −16.1%) for Enterococcus spp., −16.4% (95% CI, −18.2% to −14.7%) for gram-negative rods, and −13.5% (95% CI, −15.4% to −11.5%) for Candida spp. Conclusions. Patterns of ICU CLABSI incidence density rates among major pathogen groups have changed considerably during recent decades. CLABSI incidence has declined steeply since 2006, except for CLABSI due to S. aureus in pediatric ICUs. There is a need to better understand the CLABSIs that still occur, on the basis of microbiological and patient characteristics. New prevention approaches may be needed in addition to central line insertion and maintenance practices.
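Annual percentage changes of this kind are commonly obtained by exponentiating the calendar-year coefficient of a log-linear (Poisson) rate model. A minimal sketch of that conversion follows, with hypothetical coefficient values rather than the study's actual estimates.

```python
# Converting a log-linear (Poisson) regression coefficient for calendar year
# into an annual percent change with a 95% CI. Coefficient values are made up.
from math import exp

beta_year = -0.196   # hypothetical slope per year on the log-rate scale
se_beta   = 0.010    # hypothetical standard error

def annual_percent_change(beta, se, z=1.96):
    point = (exp(beta) - 1) * 100
    lower = (exp(beta - z * se) - 1) * 100
    upper = (exp(beta + z * se) - 1) * 100
    return point, lower, upper

pct, lo, hi = annual_percent_change(beta_year, se_beta)
print(f"annual change: {pct:.1f}% (95% CI {lo:.1f}% to {hi:.1f}%)")
```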


2006
Vol 27 (1)
pp. 54-59
Author(s):
Máxima Lizán-Garcia
Ramón Peyro
Manuel Cortiña
María Dolores Crespo
Aurelio Tobias

Objective. To establish the occurrence, distribution, and secular time trend of nosocomial infections (NIs) in a surgical intensive care unit (ICU). Design and Setting. Follow-up study in a teaching hospital in Spain. Methods. In May 1995 we established a nosocomial infection surveillance system in our surgical ICU. We collected information daily for all patients who were in the ICU for at least 48 hours (546 patients from 1996 through 2000). We used the Centers for Disease Control and Prevention definitions and criteria for infections. Monthly, we determined the site-specific incidence densities of NIs and the rates of medical device use, and we used the Poisson probability distribution to determine whether the observed case count was consistent with the number of expected cases (the mean number of cases during the previous year, with extreme values excluded). We compared yearly and monthly infection rates by Poisson regression, using site-specific NIs as the dependent variable and year and month as dummy variables. We tested annual trends with an alternative Poisson regression model fitting a single linear trend. Results. The average rate of catheter-associated urinary tract infections was 8.4 per 1,000 catheter-days; that of ventilator-associated pneumonia, 21 per 1,000 ventilator-days; and that of central line-associated bloodstream infections, 30 per 1,000 central line-days. The rate of urinary tract infections did not change over the study period, but there was a trend toward decreases in the rates of central line-associated bloodstream infections and ventilator-associated pneumonia. Conclusion. An NI surveillance and control program contributed to a progressive decrease in NI rates.
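One plausible way to implement the Poisson check described in the methods is to compute the probability of observing at least the current month's case count given the expected mean from the previous year. The sketch below assumes scipy and uses illustrative numbers, since the abstract does not give the exact decision rule.

```python
# Probability of observing >= the current month's case count if cases follow a
# Poisson distribution with the expected mean (previous year's mean, extremes
# excluded). Numbers are illustrative.
from scipy.stats import poisson

expected_cases = 2.4    # hypothetical mean monthly count from the previous year
observed_cases = 6      # hypothetical count in the month under surveillance

# P(X >= observed) = survival function evaluated at observed - 1
p_at_least = poisson.sf(observed_cases - 1, mu=expected_cases)
print(f"P(X >= {observed_cases} | mu = {expected_cases}) = {p_at_least:.3f}")
```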


Author(s):
Mohamad G. Fakih
Angelo Bufalino
Lisa Sturm
Ren-Huai Huang
Allison Ottenbacher
...

Abstract Background: The coronavirus disease 2019 (COVID-19) pandemic has had a considerable impact on US hospitalizations, affecting both processes of care and the patient population. Methods: We evaluated the impact of the COVID-19 pandemic on central line-associated bloodstream infection (CLABSI) and catheter-associated urinary tract infection (CAUTI) events in 78 US hospitals during the 12 months before and the first 6 months of the pandemic. Results: There were 795,022 central line-days and 817,267 urinary catheter-days over the two study periods. Compared with the pre-COVID-19 period, CLABSI rates increased during the pandemic period from 0.56 to 0.85 (51.0%) per 1,000 line-days (p<0.001) and from 1.00 to 1.64 (62.9%) per 10,000 patient-days (p<0.001). Hospitals in which COVID-19 patients represented >10% of monthly admissions had an NHSN device standardized infection ratio for CLABSI that was 2.38 times higher than that of hospitals with <5% prevalence during the pandemic period (p=0.004). Coagulase-negative staphylococcus CLABSI increased by 130%, from 0.07 to 0.17 events per 1,000 line-days (p<0.001), and Candida sp. CLABSI by 56.9%, from 0.14 to 0.21 per 1,000 line-days (p=0.01). In contrast, no significant changes were identified for CAUTI (0.86 vs. 0.77 per 1,000 catheter-days; p=0.19). Conclusions: The COVID-19 pandemic was associated with substantial increases in CLABSI but not CAUTI events. Our findings underscore the importance of hardwiring processes for optimal line care and providing regular feedback on performance to maintain a safe environment.
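The reported rates follow the standard device-associated formula (events per 1,000 device-days). The sketch below shows the rate and percent-change arithmetic, with hypothetical event and device-day counts chosen only to roughly reproduce the CLABSI rates quoted in the abstract.

```python
# Device-associated infection rate per 1,000 device-days and percent change.
# Event and device-day counts below are hypothetical; the resulting rates
# approximate the pre-pandemic vs. pandemic CLABSI rates in the abstract.

def rate_per_1000(events, device_days):
    return 1000.0 * events / device_days

pre_rate   = rate_per_1000(events=297, device_days=530_000)   # ~0.56 per 1,000 line-days
covid_rate = rate_per_1000(events=225, device_days=265_000)   # ~0.85 per 1,000 line-days

pct_change = 100.0 * (covid_rate - pre_rate) / pre_rate
print(f"{pre_rate:.2f} -> {covid_rate:.2f} per 1,000 line-days ({pct_change:+.1f}%)")
```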


2005
Vol 26 (1)
pp. 63-68
Author(s):
Abdul Qavi
Sorana Segal-Maurer
Noriel Mariano
Carl Urban
Carl Rosenberg
...

Abstract Objectives: To determine risk factors for ceftazidime-resistant Klebsiella pneumoniae infection and the effect of ceftazidime-resistant K. pneumoniae infection on mortality during an isolated outbreak. Design: Case-control investigation using clinical and molecular epidemiology and prospective analysis of infection control interventions. Setting: Surgical intensive care unit of a university-affiliated community hospital. Patients: Fourteen case-patients infected with ceftazidime-resistant K. pneumoniae and 14 control-patients. Results: Ten of 14 case-patients had identical strains by pulsed-field gel electrophoresis. Broad-spectrum antibiotic therapy before admission to the unit was strongly predictive of subsequent ceftazidime-resistant K. pneumoniae infection. In addition, patients with ceftazidime-resistant K. pneumoniae infection experienced increased mortality (odds ratio, 3.77). Conclusions: Cephalosporin restriction has been shown to decrease the incidence of nosocomial ceftazidime-resistant K. pneumoniae. However, isolated clonal outbreaks may occur due to lapses in infection control practices. Reinstatement of strict handwashing, thorough environmental cleaning, and repeat education led to termination of the outbreak. A distinct correlation between ceftazidime-resistant K. pneumoniae infection and mortality supports the important influence of antibiotic resistance on the outcome of serious bacterial infections.


2015
Vol 20 (1)
pp. 8-13
Author(s):
O. A. Orlova
V. G. Akimkin

Rationale: Ventilator-associated respiratory tract infections in severely injured patients (SIP) are important because of both the features of the causative pathogens and the initial severity of the patients' condition. Nosocomial flora dominates among the causative pathogens. Purpose: to analyze the microbiological monitoring of ventilator-associated respiratory tract infections in SIP. The analysis was based on a prospective epidemiological, clinical, and instrumental study of 100 SIP with ventilator-associated respiratory tract infections treated in the surgical intensive care unit. Ventilator-associated respiratory infections accounted for 90-95% of nosocomial infections in these patients, with nosocomial pneumonia predominating (61%). Ventilator-associated respiratory tract infection most commonly occurred during the first 10 days of mechanical ventilation. The prevailing flora was Gram-negative: Acinetobacter baumannii (40.3 ± 2.1%) and Pseudomonas aeruginosa (38.4 ± 3.2%). Isolated microorganisms possessed multiple antibiotic resistance, most markedly to aminoglycosides (69.5%), followed by fluoroquinolones (40.3%), penicillins (37.6%), and third-generation cephalosporins (33.8%). Microbial associations markedly predominated over monocultures (57.1 ± 5.3%).


2016
Vol 2016
pp. 1-7
Author(s):
Thomas R. Tucker
Sharif S. Aly
John Maas
Josh S. Davy
Janet E. Foley

Recent observations by stakeholders suggested that ecosystem changes may be driving an increased incidence of bovine erythrocytic anaplasmosis, resulting in a reemerging cattle disease in California. The objective of this prospective cohort study was to estimate the incidence of Anaplasma marginale infection using seroconversion in a northern California beef cattle herd. A total of 143 Black Angus cattle (106 prebreeding heifers and 37 cows) were enrolled in the study. Serum samples were collected to determine Anaplasma marginale seroprevalence using a commercially available competitive enzyme-linked immunosorbent assay test kit. Repeat sampling was performed in seronegative animals to determine the incidence density rate from March through September (2013). Seroprevalence of heifers was significantly lower than that of cows at the beginning of the study (P<0.001) but not at study completion (P=0.075). The incidence density rate of Anaplasma marginale infection was 8.17 (95% confidence interval: 6.04, 10.81) cases per 1,000 cow-days during the study period. Study cattle became Anaplasma marginale seropositive and likely carriers protected from severe clinical disease that might have occurred had they been first infected as mature adults. No evidence was found within this herd to suggest increased risk for clinical bovine erythrocytic anaplasmosis.
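The incidence density rate is the number of new infections divided by the animal-time at risk; an exact Poisson confidence interval is commonly computed from the chi-square distribution, as sketched below with hypothetical case and cow-day counts (the abstract does not report the underlying denominators).

```python
# Incidence density rate (cases per 1,000 cow-days at risk) with an exact
# Poisson 95% CI based on the chi-square distribution. Counts are hypothetical.
from scipy.stats import chi2

cases = 48          # hypothetical number of seroconversions
cow_days = 5_874    # hypothetical cow-days at risk among seronegative animals

rate  = 1000.0 * cases / cow_days
lower = 1000.0 * chi2.ppf(0.025, 2 * cases) / (2 * cow_days)
upper = 1000.0 * chi2.ppf(0.975, 2 * (cases + 1)) / (2 * cow_days)
print(f"{rate:.2f} cases per 1,000 cow-days (95% CI {lower:.2f}, {upper:.2f})")
```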

