1261 Geographical variations in the incidence of CIED infection and infection prevention strategies: Update from the global WRAP-IT study

EP Europace ◽  
2020 ◽  
Vol 22 (Supplement_1) ◽  
Author(s):  
C Kennergren ◽  
J E Poole ◽  
B L Wilkoff ◽  
S Mittal ◽  
G R Corey ◽  
...  

Abstract
Funding Acknowledgements: Medtronic, Inc.
Introduction: Cardiac implantable electronic device (CIED) infections lead to significant morbidity, mortality, and use of health care resources. Infection prevention strategies vary among centers, and it is not clear whether infection rates also vary across geographies. Recently, WRAP-IT, the largest global randomized trial to evaluate an infection reduction strategy, randomized 6,983 patients to receive an antibacterial envelope (treatment) vs. no envelope (control). The results demonstrated a significant reduction in major CIED infection with the TYRX antibiotic envelope (12-month infection rate 0.7% for envelope vs. 1.2% for control; HR, 0.60; 95% CI, 0.36 to 0.98; P = 0.04). The purpose of this analysis is to assess geographical variations in patient characteristics, procedural routines, and infection rates.
Methods: The WRAP-IT study enrolled patients undergoing a CIED pocket revision, generator replacement, or system upgrade, or an initial implantation of a cardiac resynchronization therapy defibrillator, and randomized them to receive the envelope or not, in addition to mandated pre-procedure intravenous antibiotic prophylaxis. To assess geographical variations in infection rates, baseline demographics and procedural characteristics of the control group (per protocol) were identified. Major infection was defined as CIED infection resulting in system extraction or revision, long-term antibiotic therapy with infection recurrence, or death.
Results: A total of 3,429 control patients were evaluated and followed for a mean of 20.9 ± 8.3 months: 2,530 patients from 123 centers in North America, 777 patients from 46 centers in Europe, and 122 patients from 11 centers in Asia/South America. The 24-month Kaplan-Meier major infection rates were 1.2% in North America (30 patients), 2.5% in Europe (16 patients), and 4.3% in Asia/South America (5 patients) (see Figure). These geographical variations in the incidence of major CIED infection were significant (overall P = 0.008, univariate). Baseline patient characteristics, including age, sex, medication use, NYHA class, and number of previous devices, differed across geographies, as did procedural characteristics such as device type, use of pocket wash, skin preparation, pre-operative antibiotic drug use, and procedure time.
Conclusion: Major CIED infection rates vary significantly across geographies. The effect of patient demographics and procedural characteristics on these findings will be assessed and presented at EHRA. Insight into the geographical variability of CIED infections is important to mitigate infection risk and reduce morbidity and cost.
Abstract Figure: Major CIED Infection Rate by Geography
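For readers who want a feel for the geography comparison, the sketch below uses only the patient and event counts quoted above to compute crude infection proportions and a simple chi-square test of homogeneity. This is an illustration, not the trial's Kaplan-Meier/univariate analysis, which accounts for censoring over follow-up.

```python
# Illustrative sketch: crude major-infection proportions by geography and a
# chi-square test of homogeneity. Counts are the patient numbers quoted in the
# abstract; this does NOT reproduce the Kaplan-Meier estimates reported above.
from scipy.stats import chi2_contingency

regions = {
    "North America":      (30, 2530),   # (major infections, control patients)
    "Europe":             (16, 777),
    "Asia/South America": (5, 122),
}

table = [[events, n - events] for events, n in regions.values()]

for name, (events, n) in regions.items():
    print(f"{name}: crude rate {100 * events / n:.1f}% ({events}/{n})")

chi2, p, dof, _ = chi2_contingency(table)
print(f"chi-square = {chi2:.2f}, dof = {dof}, p = {p:.3f}")
```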

2019 ◽  
Vol 47 (2) ◽  
pp. E3 ◽  
Author(s):  
Samuel L. Rubeli ◽  
Donato D’Alonzo ◽  
Beate Mueller ◽  
Nicole Bartlomé ◽  
Hans Fankhauser ◽  
...  

OBJECTIVE The objective of this study was to quantify surgical site infection (SSI) rates after cranial neurosurgery in a tertiary care hospital, identify risk factors for SSI, and evaluate the impact of standardized surveillance and an infection prevention bundle (IPB).
METHODS The authors compared SSI rates during 7 months before and after the intervention. The IPB included standardized patient preparation, perioperative antibiotic/antiseptic use, barrier precautions, coaching of surgeons, and the implementation of a specialized technical operation assistant team.
RESULTS Three hundred twenty-two unselected consecutive patients were evaluated before the IPB, and 296 were evaluated after implementation. Infection rates after 1 year decreased from 7.8% (25/322) to 3.7% (11/296, p = 0.03), with similar mortality rates (14.7% vs 13.8%, p = 0.8). The isolated bacteria included Staphylococcus aureus (42%), Cutibacterium acnes (22%), and coagulase-negative staphylococci (14%). Organ/space infections dominated (67%) and mostly consisted of subdural empyema and meningitis/ventriculitis. Among the 36 SSIs, 13 (36%) occurred during hospitalization, and 29 (81%) within the first 3 months of follow-up. In multivariable analysis including established risk factors described in the literature, non-CNS neoplasia (odds ratio [OR] 3.82, 95% confidence interval [CI] 1.39–10.53), postoperative bleeding (OR 4.09, 95% CI 1.44–11.62), operations performed by or under supervision of a senior faculty surgeon (OR 0.38, 95% CI 0.17–0.84), and operations performed after the implementation of standardized surveillance and an IPB (OR 0.38, 95% CI 0.17–0.85) significantly influenced the infection rate.
CONCLUSIONS The introduction of an IPB combined with routine surveillance and personal feedback was associated with a 53% reduction in the infection rate. The lower infection rates achieved by senior faculty and the strong association between postoperative bleeding and infection underline the importance of surgical experience and of thorough supervision and coaching of younger surgeons.
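The odds ratios above come from a multivariable logistic model. As a hedged illustration of the underlying quantity, the sketch below computes an unadjusted odds ratio with a Woolf (log-scale) 95% CI from a 2x2 table; the counts are hypothetical and do not reproduce the study's adjusted estimates.

```python
# Sketch: unadjusted odds ratio with a Woolf (log-scale) 95% CI from a 2x2
# table. Counts are hypothetical; the study's ORs came from a multivariable
# model, which this does not reproduce.
import math

def odds_ratio_ci(a, b, c, d, z=1.96):
    """a=exposed cases, b=exposed non-cases, c=unexposed cases, d=unexposed non-cases."""
    or_ = (a * d) / (b * c)
    se_log = math.sqrt(1 / a + 1 / b + 1 / c + 1 / d)
    lo = math.exp(math.log(or_) - z * se_log)
    hi = math.exp(math.log(or_) + z * se_log)
    return or_, lo, hi

# Hypothetical counts: SSI vs no SSI, with vs without postoperative bleeding.
or_, lo, hi = odds_ratio_ci(a=8, b=30, c=28, d=552)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```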


2018 ◽  
Vol 29 (1) ◽  
pp. 108-114 ◽  
Author(s):  
Nitin Agarwal ◽  
Prateek Agarwal ◽  
Ashley Querry ◽  
Anna Mazurkiewicz ◽  
Zachary J. Tempel ◽  
...  

OBJECTIVE Previous studies have demonstrated the efficacy of infection prevention protocols in reducing infection rates. This study investigated the effects of the development and implementation of an infection prevention protocol that was augmented by increased physician awareness of spinal fusion surgical site infection (SSI) rates and the resultant cost savings.
METHODS A cohort clinical investigation over a 10-year period was performed at a single tertiary spine care academic institution. Preoperative infection control measures (chlorhexidine gluconate bathing, Staphylococcus aureus nasal screening and decolonization) followed by postoperative infection control measures (surgical dressing care) were implemented. After the implementation of these infection control measures, an awareness intervention was instituted in which all attending and resident neurosurgeons were informed of their individual, independently adjudicated spinal fusion surgery infection rates and their rankings among their peers. During the course of these interventions, the overall infection rate was tracked, as were the rates for neurosurgeons who complied with the preoperative and postoperative infection control measures (protocol group) and those who did not (control group).
RESULTS With the implementation of postoperative surgical dressing infection control measures and physician awareness, the postoperative spine surgery infection rate decreased by 45%, from 3.8% to 2.1% (risk ratio 0.55; 95% CI 0.32–0.93; p = 0.03), for those in the protocol cohort, resulting in an estimated annual cost savings of $291,000. This reduction in infection rate was not observed for neurosurgeons in the control group, although the overall infection rate among all neurosurgeons decreased by 54%, from 3.3% to 1.5% (risk ratio 0.46; 95% CI 0.28–0.73; p = 0.0013).
CONCLUSIONS A novel paradigm for spine surgery infection control combined with physician awareness methods resulted in significantly decreased SSI rates and an associated cost reduction. Thus, information sharing and physician engagement as a supplement to formal infection control measures result in improvements in surgical outcomes and costs.
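A minimal sketch of the risk-ratio arithmetic behind figures such as "risk ratio 0.55; 95% CI 0.32–0.93" follows. The event and denominator counts are hypothetical values chosen only to be consistent with the quoted 3.8% and 2.1% rates; they are not the study's actual numbers.

```python
# Sketch: risk ratio with a log-scale Wald 95% CI. The counts below are
# hypothetical values consistent with the quoted 3.8% and 2.1% rates.
import math

def risk_ratio_ci(events_1, n_1, events_0, n_0, z=1.96):
    p1, p0 = events_1 / n_1, events_0 / n_0
    rr = p1 / p0
    se_log = math.sqrt((1 - p1) / events_1 + (1 - p0) / events_0)
    lo = math.exp(math.log(rr) - z * se_log)
    hi = math.exp(math.log(rr) + z * se_log)
    return rr, lo, hi

# After-protocol vs before-protocol (hypothetical counts).
rr, lo, hi = risk_ratio_ci(events_1=21, n_1=1000, events_0=38, n_0=1000)
print(f"RR = {rr:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```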


2019 ◽  
Vol 6 (Supplement_2) ◽  
pp. S834-S835
Author(s):  
Pierre-Jean Maziade ◽  
Daniel Lussier ◽  
Francoise Dubé

Abstract
Background: Hospitals use multiple concurrent prevention strategies to curb nosocomial C. difficile infection (CDI), but there are limited data on the long-term feasibility or safety of using a probiotic. Pierre-Le Gardeur Hospital, Québec, has been administering a probiotic composed of Lactobacillus acidophilus CL1285, L. casei LBC80R, and L. rhamnosus CLR2 since 2004, with documented results through March 31, 2014. Here we present an update for the past 5 years.
Methods: Several nosocomial infection prevention practices were running concurrently at the hospital. Adult inpatients treated with antibiotics from April 1, 2014 to March 31, 2019 were eligible to receive the probiotic. The hospital pharmacy ensured that each patient took the probiotic capsules (Bio-K+® 50 Billion) daily from the initiation of antibiotic use. Confirmed nosocomial cases of CDI were recorded and reported to the provincial public health agency. The hospital's nosocomial CDI rate was compared with that of other non-university-affiliated hospitals in the health region with more than 110 beds and fewer than 45% of patients aged 65 and older, and with all other hospitals in the health system.
Results: Cumulatively over the past 15 years, more than sixty thousand antibiotic-treated adult inpatients took the probiotic daily during antibiotic use. Among 13 comparable hospitals, Pierre-Le Gardeur Hospital had the lowest rate of nosocomial CDI in 2014–2015, 2015–2016, 2016–2017, and 2017–2018, and on average had the lowest rate for 2013–2018 (1.1 CDI cases per 10,000 patient-days). Compared with all hospitals in the Province of Quebec health system (N = 95), the hospital had the lowest nosocomial CDI rate on average for 2013–2018. No cases of Lactobacillus bacteremia were detected.
Conclusion: The overall infection prevention strategy has been highly effective, resulting in a consistently low rate of nosocomial CDI. We found that it is feasible to administer this probiotic to antibiotic-treated inpatients with few restrictions. No Lactobacillus infections were observed from any of the three strains in this probiotic when given to more than sixty thousand adult inpatients.
Disclosures: All authors: No reported disclosures.
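The headline figure of 1.1 CDI cases per 10,000 patient-days is an incidence density; a short sketch of that conversion follows, with purely hypothetical counts chosen only to illustrate the arithmetic.

```python
# Sketch: converting case counts and patient-days into a nosocomial CDI rate
# per 10,000 patient-days. Both numbers are hypothetical and illustrate only
# how a figure like "1.1 per 10,000 patient-days" is derived.
cases = 11                # hypothetical nosocomial CDI cases in a period
patient_days = 100_000    # hypothetical total inpatient days in that period

rate = cases / patient_days * 10_000
print(f"{rate:.1f} CDI cases per 10,000 patient-days")
```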


Blood ◽  
2004 ◽  
Vol 104 (11) ◽  
pp. 380-380 ◽  
Author(s):  
Lynn K. Boshkov ◽  
Anthony Furnary ◽  
Cynthia Morris ◽  
Grace Chien ◽  
Donna VanWinkle ◽  
...  

Abstract Between November 1999 and August 2002, consenting adult elective cardiac surgery patients at Oregon Health & Science University, Portland Veteran’s Administration Medical Center, and St. Vincent’s Hospital who were undergoing cardiopulmonary bypass (CPB) were randomized at admission to receive either prestorage leukoreduced red cells (PSL-RBCs) or standard red cells (S-RBCs) in a prospective double-blind fashion. Only data from those transfused were analyzed. Outcome measures included death at 60 days, 60-day infection rate, and length of hospital stay (LOS). Patients at all 3 institutions were operated on by the same group of cardiovascular surgeons. Given higher baseline infection rates for coronary artery bypass grafts (CABG), randomization was stratified by CABG vs valve replacement (VR). All RBCs were issued with blinding hoods. All platelet transfusions were prestorage leukoreduced. RBC transfusion rates were 30% for CABG, 38% for VR, and 63% for CABG + VR. Infections were determined by infection control nurses using standardized Centers for Disease Control criteria from hospital surveillance and records and follow-up phone calls. Deaths were determined from hospital records and follow-up calls and verified by National Death Index data. The PSL-RBC arm included 304 patients and the S-RBC arm 258 patients. The two groups were well matched demographically and by cardiovascular risk factors. Intent-to-treat analysis showed a 60-day mortality of 9.7% in the S-RBC arm and 4.9% in the PSL-RBC arm (p = 0.029). Heart failure as the sentinel cause of death accounted for most of the difference (45.5% of deaths in the S-RBC group vs 13.3% in the PSL-RBC group). Death rates were procedure specific: CABG alone > CABG + VR > VR alone. There was no significant difference between the S-RBC and PSL-RBC groups with regard to overall infection rate at 60 days. Most infections were superficial wound infections in the CABG patients; however, the groups did not differ in more serious infections such as bacteremia (p = 0.369) or pneumonia (p = 0.360). There was no significant difference between the groups with respect to LOS, exclusive of in-hospital deaths. Our results essentially replicate, in a North American context, those of a previous European trial (Van de Watering et al., Circulation 1998; 97:562) involving elective cardiac surgery patients undergoing CABG and/or VR surgery who were randomized to receive S-RBCs prepared by the European buffy coat method vs leukoreduced RBCs. Despite technical differences in RBC preparation, the excess deaths in the S-RBC group vs the leukoreduced group in both studies suggest that elective cardiac surgery patients undergoing CPB constitute an at-risk group, both in the US and in Europe, that may benefit from the use of PSL-RBCs. The significance of transfusion-related immunomodulation (TRIM) in man has been the subject of intense controversy. Interestingly, the cause of the increased mortality in the S-RBC group, in both this study and the European study, could not be explained by differences in infection rates. Given the preponderance of deaths in the CABG patients, it is tempting to speculate that this may reflect an interaction between residual passenger leukocytes and ischemia that is independent of the TH1/TH2 lymphocyte shift postulated to underlie TRIM.
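To illustrate how a mortality difference of this size reaches p ≈ 0.03, the sketch below runs Fisher's exact test on death counts back-calculated from the quoted percentages and arm sizes; these are approximations, not the study's raw data.

```python
# Sketch: comparing 60-day mortality between arms with Fisher's exact test.
# Death counts are back-calculated approximations from the quoted percentages
# (9.7% of 258, 4.9% of 304) and may differ slightly from the study's data.
from scipy.stats import fisher_exact

s_rbc_deaths, s_rbc_n = 25, 258      # ~9.7%
psl_rbc_deaths, psl_rbc_n = 15, 304  # ~4.9%

table = [
    [s_rbc_deaths, s_rbc_n - s_rbc_deaths],
    [psl_rbc_deaths, psl_rbc_n - psl_rbc_deaths],
]
odds_ratio, p_value = fisher_exact(table)
print(f"odds ratio = {odds_ratio:.2f}, p = {p_value:.3f}")
```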


1996 ◽  
Vol 24 (3) ◽  
pp. 330-333 ◽  
Author(s):  
P. V. van Heerden ◽  
S. A. R. Webb ◽  
S. Fong ◽  
C. L. Golledge ◽  
B. L. Roberts ◽  
...  

Sixty-one consecutive patients in the Intensive Care Unit requiring central venous lines (CVC) for five or more days were randomized to receive either a standard triple lumen CVC (STD/CVC) or a silver sulphadiazine and chlorhexidine impregnated CVC (SSD/CVC). Data from the 54 patients who completed the trial showed a reduced infection rate (positive tip culture) in the SSD/CVC group (4 out of 28) compared to the STD/CVC group (10 out of 26) (P<0.05). In addition, the new Fibrin Analysing System (FAS) brush was evaluated and used to determine the presence of infection in all the CVCs (STD/CVC and SSD/CVC combined, n=54) at day 3 (i.e. as an early warning of CVC colonization/infection) and at the time of removal of the CVC. At day 3, the FAS brush detected an infected CVC on only one occasion out of the 14 CVC tips that were later found to be colonized/infected at the time of removal. The sensitivity of the FAS brush in detecting colonized/infected CVCs at the time of CVC removal, compared with CVC tip culture, was 21%, with a specificity of 100%. These findings do not currently support the routine use of the FAS brush in determining CVC infection/colonization.
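Sensitivity and specificity here reduce to simple ratios over the 54 completed catheters; a minimal sketch follows. The true-positive count is back-calculated from the quoted 21% (3/14 ≈ 21%) and the culture-negative count assumes all 54 catheters were brushed, so both are assumptions rather than figures stated in the abstract.

```python
# Sketch: sensitivity and specificity of the FAS brush against tip culture at
# CVC removal. True-positive count is back-calculated (3/14 ~ 21%); the 40
# culture-negative lines assume all 54 completed CVCs were brushed.
true_pos, false_neg = 3, 11    # of 14 culture-positive tips (assumed split)
true_neg, false_pos = 40, 0    # of 40 culture-negative tips (assumed)

sensitivity = true_pos / (true_pos + false_neg)
specificity = true_neg / (true_neg + false_pos)
print(f"sensitivity = {sensitivity:.0%}, specificity = {specificity:.0%}")
```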


2020 ◽  
Author(s):  
CHARLES ROBERTO TELLES

Cumulative COVID-19 daily new-case data from January to April 2020 were used to search for evidence of SARS-CoV-2 spreading patterns (transmission forms) in geographical regions sampled from Asia, South America, North America, the Middle East, Africa, and Europe. To understand why some countries show constant infection rates while others (China and South Korea) present very low daily new cases, this research investigated possible aerosol-forming patterns in the atmosphere and their relation to the policy measures adopted by the selected countries.
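The basic preprocessing step implied here, turning a cumulative case series into daily new cases, can be sketched as follows; the column name and sample values are illustrative, not the study's dataset.

```python
# Sketch: deriving daily new cases from a cumulative case series, the usual
# first step before comparing infection-rate trajectories across regions.
# Values and dates are illustrative only.
import pandas as pd

cumulative = pd.Series(
    [10, 25, 60, 130, 210, 290],
    index=pd.date_range("2020-02-01", periods=6, freq="D"),
    name="cumulative_cases",
)

daily_new = cumulative.diff().fillna(cumulative.iloc[0]).astype(int)
print(daily_new)
```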


2017 ◽  
Vol 43 (2) ◽  
pp. 208 ◽  
Author(s):  
Daniele Cristine Hoffmann Schlesener ◽  
Jutiane Wollmann ◽  
Juliano De Bastos Pazini ◽  
Anderson Dionei Grützmacher ◽  
Flávio Roberto Mello Garcia

Drosophila suzukii (Diptera, Drosophilidae) is an exotic species, endemic to Asia and currently a pest of small and stone fruits in several countries of North America and Europe. It was detected for the first time in South America in 2013, in southern Brazil. Unlike most drosophilids, this species deserves special attention because the females are capable of ovipositing inside healthy fruits, rendering their sale and export prohibited. Despite the confirmed presence of this species in different states of Brazil, the insect has yet to be given pest status. Nevertheless, its mere presence is enough to cause concern among producers of small fruits and to justify further investigation of its control, especially chemical control, in view of a possible change in status. Therefore, the goal of this work was to evaluate, in the laboratory, the mortality of D. suzukii adults and the ovicidal effect of exposure to different insecticides registered for species of the families Tephritidae and Agromyzidae in different crops. The insecticides deltamethrin, dimethoate, spinosad, fenitrothion, phosmet, malathion, methidathion, and zeta-cypermethrin caused 100% mortality three days after treatment (DAT). Regarding the effects on eggs, the insecticides fenitrothion, malathion, and methidathion rendered 100% of the eggs unviable, followed by phosmet and diflubenzuron, which also caused a marked reduction in larval eclosion at two DAT.


2020 ◽  
Vol 16 (4) ◽  
pp. 327-333
Author(s):  
Shannon Armstrong-Kempter ◽  
Lucinda Beech ◽  
Sarah J. Melov ◽  
Adrienne Kirby ◽  
Roshini Nayyar

Background: The discovery of the benefits of antenatal corticosteroids (ACS) for preterm infants was one of the most significant developments in obstetric care. However, because preterm delivery is difficult to predict, optimal use of ACS is challenging.
Objective: To describe ACS prescribing practices at a tertiary hospital over five years; to determine whether ACS were given at optimal timing; to determine the characteristics of women who received ACS at optimal timing and of those who did not receive ACS when indicated; and to examine the trend in ACS prescribing over the study period.
Methods: We performed a retrospective study of all deliveries from January 2011 to December 2015. The rates of ACS prescription for each group of women (preterm, late preterm, and term) were recorded and analysed.
Results: A total of 65% of women who delivered before 34 weeks' gestation received ACS. Of these women, 63% delivered within 7 days of receiving ACS. Women most likely to receive ACS with optimal timing were primiparous (relative risk [RR], 1.25 [CI, 1.08-1.45]) or had been diagnosed with pre-eclampsia (RR, 1.34 [CI, 1.10-1.63]), preterm premature rupture of membranes (RR, 1.33 [CI, 1.15-1.54]), or threatened preterm labour (RR, 1.42 [CI, 1.22-1.65]).
Conclusion: A significant number of women and babies are exposed to ACS without commensurate benefit, and a significant number who deliver preterm do not receive ACS. The percentage of preterm and term infants receiving ACS should be determined to optimise service delivery.
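Combining the two headline percentages gives a rough sense of optimally timed coverage; the sketch below assumes the 63% figure applies to the 65% of preterm deliveries that received ACS at all, which the abstract implies but does not state explicitly.

```python
# Sketch: rough share of deliveries before 34 weeks that received optimally
# timed ACS, assuming the quoted 63% (delivery within 7 days of ACS) applies
# to the 65% who received ACS at all. Purely illustrative arithmetic.
received_acs = 0.65
delivered_within_7_days = 0.63

optimally_timed = received_acs * delivered_within_7_days
print(f"~{optimally_timed:.0%} of preterm (<34 wk) deliveries had optimally timed ACS")
```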


2021 ◽  
Vol 108 (Supplement_2) ◽  
Author(s):  
H Mistry ◽  
B Woolner ◽  
A John

Abstract
Introduction: Open abdominal surgery confers a potentially greater risk of surgical site infection, and local evidence suggests that the use of drains can reduce this. Our objectives were to assess local infection rates and risk factors, and to determine whether the use of drains can reduce infection rates.
Method: We retrospectively reviewed patients who underwent laparotomy or open cholecystectomy from 01/01/2018 to 31/12/2018. Data were collected on demographics, smoking/alcohol status, heart, respiratory, or renal disease, diabetes, steroid use, and CEPOD status, as well as the use of a drain and the outcome of infection, using inpatient and online patient records.
Results: 84 patients were included, of whom 25 had drains inserted. There were 13 documented cases of surgical site infection, all in patients without a drain post-op. Other parameters most prevalent among patients with a surgical site infection included being a current/ex-smoker (8/13), having heart disease (9/13), and undergoing elective procedures.
Conclusions: Reducing the risk of surgical site infection can improve morbidity and potentially mortality outcomes. Our audit data suggest a benefit of inserting intra-abdominal or subcutaneous drains. We will create a standard operating procedure for all patients to receive drains post-op and then re-audit to assess the impact on infection rates.
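The audit reports 13 SSIs, all among the 59 patients without drains. A hedged sketch of the formal comparison the audit itself does not report, using Fisher's exact test on the 2x2 table inferred from those counts, follows.

```python
# Sketch: Fisher's exact test on the audit counts as described (13 SSIs among
# 59 patients without a drain vs 0 among 25 with a drain). The abstract does
# not report a formal test; this is an illustration only.
from scipy.stats import fisher_exact

table = [
    [13, 59 - 13],  # no drain: SSI, no SSI
    [0, 25],        # drain:    SSI, no SSI
]
_, p_value = fisher_exact(table)
print(f"p = {p_value:.3f}")
```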


2021 ◽  
Vol 14 (1) ◽  
Author(s):  
Mojtaba Bahreh ◽  
Bahador Hajimohammadi ◽  
Gilda Eslami

Abstract
Objective: Toxoplasmosis is caused by Toxoplasma gondii, which infects humans through consumption of infected raw or undercooked meat and of foods harboring mature oocysts. In this study, we assessed the prevalence of T. gondii in sheep and goats from central Iran. After a questionnaire was completed, about one gram of liver or diaphragm tissue was taken from each of 90 sheep and 90 goats slaughtered in Yazd Province and stored at −20 °C. DNA was extracted, and T. gondii was then detected using nested PCR.
Results: This study indicated that the prevalence of T. gondii in all slaughtered animals was 11.6% (21 of 180), including 14.4% (13/90) in sheep and 8.8% (8/90) in goats. The infection rates in liver and diaphragm samples were 12.2% (11/90) and 11.1% (10/90), respectively (p = 0.8163). The infection rate was 16.3% (15/92) in animals older than one year and 6.8% (6/88) in animals under one year of age; this difference was not significant (p = 0.475). Infection rates were 19.5% (18/92) in males and 3.4% (3/88) in females (p = 0.0007). In conclusion, the infection rate of toxoplasmosis in livestock in this area is relatively high, and it is therefore necessary to design appropriate prevention programs to control the disease.
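A sketch of the male-versus-female comparison using the counts quoted above follows; the abstract does not state which test produced p = 0.0007, so the choice of Fisher's exact test here is illustrative.

```python
# Sketch: sex difference in T. gondii infection using the counts quoted in the
# abstract (18/92 males vs 3/88 females). Test choice is illustrative only.
from scipy.stats import fisher_exact

table = [
    [18, 92 - 18],  # males:   infected, not infected
    [3, 88 - 3],    # females: infected, not infected
]
odds_ratio, p_value = fisher_exact(table)
print(f"OR = {odds_ratio:.1f}, p = {p_value:.4f}")
```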

