Reassessing Pathogens Eligible for the Centers for Disease Control and Prevention’s (CDC’s) National Healthcare Safety Network (NHSN) “Mucosal Barrier Injury-Laboratory Confirmed Bloodstream Infection” Criteria

2020 ◽  
Vol 7 (Supplement_1) ◽  
pp. S488-S489
Author(s):  
Nora Chea ◽  
Shelley Magill ◽  
Andrea L Benin ◽  
Katherine Allen-Bridson ◽  
Margaret Dudeck ◽  
...  

Abstract: Background: The NHSN Mucosal Barrier Injury-Laboratory Confirmed Bloodstream Infection (MBI-LCBI) criteria include pathogens likely to cause bloodstream infections (BSIs) in some oncology patients. MBI-LCBIs are excluded from central line-associated BSI (CLABSI) reporting to the Centers for Medicare & Medicaid Services. NHSN users have requested that other pathogens be added to the MBI-LCBI pathogen list. To inform this decision, we compared CLABSI pathogen distributions across three NHSN patient location groups. Methods: We analyzed CLABSI data from hospitals conducting surveillance for ≥ 1 month from January 2014 through December 2018 in ≥ 1 MBI high-risk location (leukemia, lymphoma, and adult and pediatric hematopoietic stem cell transplant wards). We compared CLABSI pathogen distributions and rates in MBI high-risk locations with those in medium-risk locations (solid tumor and adult and pediatric general hematology-oncology wards) and low-risk locations (adult and pediatric medical, surgical, and medical-surgical wards), using χ2 tests to compare percentages with statistical significance at P ≤ 0.05. Results: Overall, 122 hospitals reported 23,578 CLABSIs and 12,961,921 central line (CL) days (1.81 CLABSIs per 1,000 CL-days) (Table). Percentages of CLABSIs due to three MBI-LCBI pathogens (E. coli, E. faecium, viridans streptococci) were significantly higher in high-risk than in low-risk locations, whereas percentages due to other MBI-LCBI pathogens (K. pneumoniae/oxytoca, E. faecalis, Candida spp., Enterobacter spp.) were significantly lower in high-risk locations (Figure). Among pathogens not currently in the MBI-LCBI list, coagulase-negative staphylococci caused similar percentages of CLABSIs across locations, S. aureus caused a significantly higher percentage of CLABSIs in low-risk locations, and P. aeruginosa (PA) caused a significantly higher percentage in high-risk locations.
Table: CLABSIs attributed to MBI high-risk, medium-risk, and low-risk locations, NHSN, 2014–2018. Figure: Percentages of the top 10 pathogen-specific CLABSIs in MBI high-risk, medium-risk, and low-risk locations, NHSN, 2014–2018. Conclusion: Differences in the percentages of CLABSIs due to selected pathogens between MBI high-risk and low-risk locations are evident in NHSN data. The lower percentages of Klebsiella and Candida spp. in high-risk locations might be partially attributable to antimicrobial prophylaxis in oncology patients. Although PA caused a significantly higher percentage of CLABSIs in high-risk locations, the absolute difference was modest. Additional analyses are needed. Disclosures: All authors: no reported disclosures.
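The pathogen comparisons above rest on χ2 tests of percentages with significance at P ≤ 0.05. As a minimal sketch of that test for a 2×2 table (the counts below are illustrative placeholders, not the study's data):

```python
def chi_square_2x2(a, b, c, d):
    """Pearson chi-square statistic for a 2x2 table:
              pathogen X   other pathogens
    group 1:      a              b
    group 2:      c              d
    """
    n = a + b + c + d
    # Shortcut form of sum((observed - expected)^2 / expected) for a 2x2 table
    numerator = n * (a * d - b * c) ** 2
    denominator = (a + b) * (c + d) * (a + c) * (b + d)
    return numerator / denominator

# Hypothetical counts: pathogen-specific CLABSIs in high-risk vs low-risk locations
chi2 = chi_square_2x2(120, 880, 60, 940)
print(round(chi2, 2))  # 21.98
# With 1 degree of freedom, chi2 > 3.84 corresponds to P < 0.05
```

In practice a library routine such as `scipy.stats.chi2_contingency` would be used; the hand-rolled version above just makes the comparison explicit.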

2020 ◽  
Vol 41 (S1) ◽  
pp. s261-s263
Author(s):  
Renata Fagnani ◽  
Luis Gustavo Oliveira Cardoso ◽  
Christian Cruz Höfling ◽  
Elisa Teixeira Mendes ◽  
Plínio Trabasso ◽  
...  

Background: Bloodstream infection (BSI) is among the most challenging conditions in patients who undergo hematopoietic stem cell transplantation (HSCT). These infections may be related to health care, as in central-line–associated bloodstream infection (CLABSI), or may result from translocation secondary to mucosal barrier injury (MBI). In 2013, MBI surveillance was incorporated into the CDC NHSN, with the aim of increasing CLABSI diagnostic accuracy and enabling more effective preventive care measures. The objective of this study was to evaluate the impact of MBI surveillance on CLABSI incidence density in a Brazilian university hospital. Methods: The CLABSI incidence densities from the period before MBI surveillance (2007–2012) and the period after MBI surveillance was implemented (2013–2018) were analyzed and compared. Infections during the preintervention period were reclassified according to the MBI criterion to obtain an accurate CLABSI rate for the first period. The average incidence densities for the 2 periods were compared using the Student t test after testing for no autocorrelation (P > .05). Results: After reclassification, the preintervention incidence density (10 infections per 1,000 patient-days) was significantly higher than the postintervention incidence density (6 infections per 1,000 patient-days; P = .011) (Table 1). Reclassifying nonpreventable infections (MBIs) in the surveillance system therefore made the diagnosis of CLABSI more specific, and the hospital infection control service was able to introduce specific preventive measures related to the insertion and management of central lines in HSCT patient care. Conclusions: The MBI classification improved CLABSI diagnosis, which strengthened central-line prevention measures and thereby contributed to the decrease in CLABSI rates in this high-risk population. Funding: None. Disclosures: None.
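The incidence densities compared above are simply infections per 1,000 patient-days. A minimal sketch of that calculation (the raw counts below are illustrative, chosen only to reproduce the reported 10 and 6 per 1,000 rates):

```python
def incidence_density(infections, patient_days, per=1000):
    """Infection incidence density: infections per `per` patient-days."""
    return infections / patient_days * per

# Illustrative counts, not the hospital's actual data
pre = incidence_density(40, 4000)   # preintervention period
post = incidence_density(24, 4000)  # postintervention period
print(pre, post)  # 10.0 6.0
```

The two periods' monthly densities would then be compared with a Student t test, valid only after confirming the series shows no autocorrelation, as the abstract notes.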


2015 ◽  
Vol 36 (12) ◽  
pp. 1401-1408 ◽  
Author(s):  
Mini Kamboj ◽  
Rachel Blair ◽  
Natalie Bell ◽  
Crystal Son ◽  
Yao-Ting Huang ◽  
...  

Objective: In this study, we examined the impact of routine use of a passive disinfection cap for catheter hub decontamination in hematology-oncology patients. Setting: A tertiary-care cancer center in New York City. Methods: In this multiphase prospective study, we used 2 preintervention phases (P1 and P2) to establish surveillance and baseline rates, followed by sequential introduction of disinfection caps on high-risk units (HRUs: hematologic malignancy wards, hematopoietic stem cell transplant units, and intensive care units) (P3) and on general oncology units (P4). Unit-specific and hospital-wide hospital-acquired central-line–associated bloodstream infection (HA-CLABSI) rates and blood culture contamination (BCC) with coagulase-negative staphylococci (CONS) were measured. Results: Implementation of the passive disinfection cap resulted in a 34% decrease in hospital-wide HA-CLABSI rates (from a combined P1 and P2 baseline rate of 2.66 to 1.75 per 1,000 catheter-days at the end of the study period). This reduction occurred only among high-risk patients and not among general oncology patients. In addition, use of the passive disinfection cap resulted in decreases of 63% (HRUs) and 51% (general oncology units) in blood culture contamination, with an estimated reduction of 242 BCCs with CONS. The reductions in HA-CLABSI and BCC correspond to an estimated annual savings of $3.2 million in direct medical costs. Conclusion: Routine use of disinfection caps is associated with decreased HA-CLABSI rates among high-risk hematology-oncology patients and a reduction in blood culture contamination among all oncology patients. Infect. Control Hosp. Epidemiol. 2015;36(12):1401–1408
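The 34% hospital-wide reduction reported above follows directly from the baseline and end-of-study rates. A quick check of that arithmetic:

```python
def percent_reduction(baseline, final):
    """Relative reduction between two rates, as a percentage."""
    return (baseline - final) / baseline * 100

# HA-CLABSI rates per 1,000 catheter-days, as reported in the abstract
print(round(percent_reduction(2.66, 1.75)))  # 34
```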


2020 ◽  
Vol 41 (S1) ◽  
pp. s499-s500
Author(s):  
William Dube ◽  
Jesse Jacob ◽  
Ziduo Zheng ◽  
Yijian Huang ◽  
Chad Robichaux ◽  
...  

Background: The NHSN methods for central-line–associated bloodstream infection (CLABSI) surveillance do not account for the additive CLABSI risk of concurrent central lines. Past studies were small and only modestly risk adjusted, but they quantified the risk at approximately 2-fold. If the attributable risk is this high, facilities that serve high-acuity patients with medically indicated concurrent central-line use may disproportionately incur CMS payment penalties for having high CLABSI rates. We aimed to build evidence, through analysis of a multihospital CLABSI experience with improved risk adjustment, to inform changes to NHSN CLABSI protocols that account for the risk attributable to concurrent central lines. Methods: In a retrospective cohort of adult patients at 4 hospitals (range, 110–733 beds) from 2012 to 2017, we linked central-line data to patient encounter data (age, comorbidities, total parenteral nutrition, chemotherapy, CLABSI). Analysis was limited to patients with >2 central-line days, with either a single central line or concurrence of no more than 2 central lines whose insertion and removal dates overlapped by >1 day. Propensity-score matching for likelihood of concurrence and conditional logistic regression modeling estimated the risk of CLABSI attributed to concurrence of >1 day. To evaluate time to CLABSI with Cox proportional hazards regression, we also analyzed patients as unique central-line episodes, classified as low risk (ie, ports, dialysis central lines, or PICCs) or high risk (ie, temporary or nontunneled lines) and as single versus concurrent. Results: In total, 64,575 central lines were used in 50,254 encounters. Among these patients, 517 developed a CLABSI: 438 (85%) with a single central line and 74 (15%) with concurrence. Moreover, 4,657 (9%) patients had concurrence (range, 6%–14% by hospital); of these, 74 (2%) had a CLABSI, compared to 71 of 7,864 propensity-matched controls (1%). Patients with concurrence had a median of 17 NHSN central-line days and 21 total central-line days. 
In multivariate modeling, patients with more concurrence (>2 of 3 concurrent central-line days) had a higher risk of CLABSI (adjusted risk ratio, 1.62; 95% CI, 1.1–2.3) compared with controls. In survival analysis, 14,610 concurrent central-line episodes were compared with 31,126 single low-risk central-line episodes; adjusting for comorbidity, total parenteral nutrition, and chemotherapy, the daily excess risk of CLABSI attributable to the concurrent central line was ~80% (hazard ratio, 1.78 for 2 high-risk or 2 low-risk central lines; 1.80 for a mix of high- and low-risk central lines) (Fig. 1). Notably, the hazard ratio for a single high-risk line compared with a low-risk line was 1.44 (95% CI, 1.13–1.84). Conclusions: Because a concurrent central line nearly doubles the risk of CLABSI compared with a single low-risk line, the CDC should modify NHSN methodology to better account for this risk. Funding: None. Disclosures: Scott Fridkin reports that his spouse receives consulting fees from the vaccine industry.
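A hazard ratio near 1.8 means the daily CLABSI risk roughly doubles while a second central line is in place. As a crude, unadjusted analogue of that comparison, an incidence rate ratio can be computed from event counts and line-days; the line-day denominators below are invented for illustration and are not the cohort's actual exposure time:

```python
def rate_ratio(events_a, days_a, events_b, days_b):
    """Crude incidence rate ratio: event rate in group A over rate in group B."""
    rate_a = events_a / days_a
    rate_b = events_b / days_b
    return rate_a / rate_b

# CLABSI counts from the abstract (74 concurrent, 438 single);
# line-day denominators are hypothetical placeholders
irr = rate_ratio(74, 90_000, 438, 950_000)
print(round(irr, 2))
```

Unlike this crude ratio, the study's Cox model adjusts for comorbidity, total parenteral nutrition, and chemotherapy, and the propensity-matched analysis further controls for which patients were likely to receive concurrent lines in the first place.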


BMJ Open ◽  
2021 ◽  
Vol 11 (2) ◽  
pp. e043837
Author(s):  
Usha Dutta ◽  
Anurag Sachan ◽  
Madhumita Premkumar ◽  
Tulika Gupta ◽  
Swapnajeet Sahoo ◽  
...  

Objectives: Healthcare personnel (HCP) are at increased risk of acquiring COVID-19, especially in resource-restricted healthcare settings, and many return to homes unfit for self-isolation, making them apprehensive about COVID-19 duty and the risk of transmission to their families. We aimed to implement a novel, multidimensional, HCP-centric, evidence-based, dynamic policy with the objectives of reducing the risk of HCP infection, ensuring the welfare and safety of HCP, and improving willingness to accept and return to duty. Setting: Our tertiary care university hospital, with 12 600 HCP, was divided into high-risk, medium-risk and low-risk zones. In the high-risk and medium-risk zones, we organised training, logistic support and postduty HCP welfare, collected feedback, and sent HCP home after they tested negative for COVID-19. We supervised the use of appropriate personal protective equipment (PPE) and kept communication paperless. Participants: We recruited willing low-risk HCP, aged <50 years, with no comorbidities, to work in COVID-19 zones. Social distancing, hand hygiene and universal masking were advocated in the low-risk zone. Results: Between 31 March and 20 July 2020, we clinically screened 5553 outpatients, of whom 3012 (54.2%) were COVID-19 suspects managed in the medium-risk zone. Among them, 346 (11.4%) tested COVID-19 positive (57.2% male) and were managed in the high-risk zone, with 19 (5.4%) deaths. One (0.08%) of the 1224 HCP in the high-risk zone, 6 (0.62%) of the 960 HCP in the medium-risk zone and 23 (0.18%) of the 12 600 HCP in the low-risk zone tested positive at the end of their shifts. All 30 COVID-19-positive HCP have since recovered. 
This HCP-centric policy resulted in low transmission rates (<1%), ensured satisfaction with training (92%), PPE (90.8%), and medical and psychosocial support (79%), and improved acceptance of COVID-19 duty, with 54.7% volunteering for redeployment. Conclusion: A multidimensional HCP-centric policy was effective in ensuring the safety, satisfaction and welfare of HCP in a resource-poor setting and resulted in a workforce willing to fight the pandemic.
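The per-zone positivity percentages above are each positives divided by the zone's HCP count. A minimal sketch reproducing that arithmetic from the abstract's counts:

```python
def positivity_pct(positives, total):
    """Percentage of HCP testing positive in a zone."""
    return positives / total * 100

# Counts taken from the abstract: (positives, HCP in zone)
zones = {"high-risk": (1, 1224), "medium-risk": (6, 960), "low-risk": (23, 12600)}
for name, (pos, total) in zones.items():
    print(f"{name}: {positivity_pct(pos, total):.2f}%")
```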


2018 ◽  
Vol 6 (5) ◽  
pp. 138-148
Author(s):  
Ine Fausayana ◽  
Weka Gusmiarty Abdullah ◽  
La Ode Dawid

The aim of this study was to analyze the risks of coconut product marketing in Kendari City. The results identify risks at three stages of coconut product marketing: (a) five risks at the storage stage: broken coconut fruit, unsold product, market fire, theft of coconut fruits, and market regulation; (b) three risks at the processing stage: broken coconuts, coconut shell waste, and damage to processing facilities; and (c) four risks at the selling stage: unsold product, non-strategic selling locations, substitute goods, and competitors. Overall, the risk of coconut product marketing was mapped as low. High risk was most prevalent at the processing stage, driven by the risk of coconut shell waste, while medium risk was most prevalent at the storage stage.

