Effect of Education on the Rate of and the Understanding of Risk Factors for Intravascular Catheter–Related Infections

2007 ◽  
Vol 28 (6) ◽  
pp. 689-694 ◽  
Author(s):  
G. Yilmaz ◽  
R. Caylan ◽  
K. Aydin ◽  
M. Topbas ◽  
I. Koksal

Objective. Intravascular catheters are indispensable tools in modern medical therapy. In spite of their great benefits, however, the widespread use of catheters leads to several complications, including infections that cause significant morbidity, mortality, and economic losses for hospitalized patients. Design. This study was conducted at Farabi Hospital, a 495-bed facility at Karadeniz Technical University Medical School in Trabzon, Turkey, and involved 3 separate periods: preeducation, education, and posteducation. Patients with intravascular catheters were monitored daily, as were the results of their physical examinations. The information acquired was recorded in a questionnaire. Results. During the preeducation period (October 2003 through March 2004), 405 intravascular catheters inserted into 241 patients were observed for 5,445 catheter-days. Seventy-one cases of intravascular catheter-related infection (CRI) were identified, giving a CRI rate of 13.04 infections per 1,000 catheter-days. The catheter-related bloodstream infection (CRBSI) rate was 8.3 infections per 1,000 catheter-days, and the exit-site infection (ESI) rate was 3.5 infections per 1,000 catheter-days. During the posteducation period (June through November 2004), 365 intravascular catheters inserted into 193 patients were observed for 5,940 catheter-days. Forty-five cases of CRI were identified, giving a rate of 7.6 infections per 1,000 catheter-days. The CRBSI rate was 4.7 infections per 1,000 catheter-days, and the ESI rate was 2.2 infections per 1,000 catheter-days. When findings from the 2 periods were compared, it was determined that education reduced CRI incidence by 41.7%. Conclusion. CRI can be prevented when hospital personnel are well informed about these infections. We compared the knowledge levels of the relevant personnel in our hospital before and after theoretical and practical training and identified a significant increase in knowledge after training (P < .0001).
In parallel, although still short of ideal levels, we identified a significant improvement in CRI incidence during the posteducation period (P = .004). The rate was low for the first 3 months of this period but increased 2.08-fold after the third month. In conclusion, regular training for the residents in charge of inserting intravascular catheters and for the nurses and interns who maintain the catheters is highly effective in reducing the rate of CRI in large teaching hospitals.
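The incidence figures above follow directly from the reported event counts and device-days. A minimal Python sketch of the per-1,000 catheter-day calculation, using the counts in the abstract (the published 41.7% reduction differs slightly from this raw recomputation, presumably because of intermediate rounding or adjustment in the original analysis):

```python
def rate_per_1000_device_days(events: int, device_days: int) -> float:
    """Incidence density: events per 1,000 device-days of exposure."""
    return events / device_days * 1000

# Counts reported in the abstract
pre_cri = rate_per_1000_device_days(71, 5445)   # preeducation CRI rate  -> ~13.04
post_cri = rate_per_1000_device_days(45, 5940)  # posteducation CRI rate -> ~7.58

# Relative reduction in incidence between the two periods
reduction_pct = (1 - post_cri / pre_cri) * 100

print(f"pre: {pre_cri:.2f}, post: {post_cri:.2f}, reduction: {reduction_pct:.1f}%")
```

The same helper reproduces the CRBSI and ESI rates from their respective event counts over the same denominators.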

Landslides ◽  
2021 ◽  
Author(s):  
Lorenzo Brezzi ◽  
Alberto Bisson ◽  
Davide Pasa ◽  
Simonetta Cola

Abstract. A large number of landslides occur in North-Eastern Italy during every rainy period owing to the particular hydrogeological conditions of this area. Even when there are no casualties, the economic losses are often significant, and municipalities frequently do not have sufficient financial resources to repair the damage and stabilize all the unstable slopes. In this regard, the search for more economically sustainable solutions is a crucial challenge. Floating composite anchors are an innovative, low-cost technique for slope stabilization: passive sub-horizontal reinforcements obtained by coupling a traditional self-drilling bar with tendons cemented inside it. This work concerns the application of this technique according to the observational method described in the Italian and European technical codes and recommended for the design of geotechnical works, especially those performed under highly uncertain site conditions. The observational method prescribes designing an intervention and, at the same time, using a monitoring system so that the project can be corrected and adapted during construction on the basis of new data acquired on site. The case study is the Cischele landslide, a medium-sized landslide that occurred in 2010 after an exceptionally heavy rainy period. In 2015, floating composite anchors were installed to slow the movement, although, because of a limited budget, they were not sufficient to ensure complete stabilization of the slope. Thanks to a monitoring system installed in the meantime, it is now possible to compare site conditions before and after the intervention. This allows evaluation of the benefits achieved with the reinforcements and, at the same time, assessment of possible additional improvements.
Two stabilization scenarios are studied through an FE model: the first includes the stabilization system built in 2015, while the second evaluates a new solution proposed to further increase the slope stability.


1990 ◽  
Vol 10 (1) ◽  
pp. 31-35 ◽  
Author(s):  
Maurice Levy ◽  
J. Williamson Balfe ◽  
Dennis Geary ◽  
Sue Fryer-Keene ◽  
Robert Bannatyne

A 10-year retrospective review of pediatric patients on peritoneal dialysis showed that 50 of 83 had 132 episodes of exit-site infection (ESI). Thirty-nine episodes were purulent. The most prevalent organism was Staphylococcus aureus. Staphylococcus epidermidis was also common, usually occurring in purulent infections. Gram-negative organisms were responsible for 23 ESIs, with Pseudomonas species being the most common. Age, sex, concomitant primary disease type, length of training, dressing techniques, quality of daily dialysis technique, use of diapers, and pyelostomies did not affect the incidence of ESI. However, 40% of children with a skin infection at other sites had an associated peritoneal catheter ESI. Thirty-eight episodes of ESI in 28 patients resulted in peritonitis; the main organisms involved were Staphylococcus and Pseudomonas species. Catheters were replaced in 13 patients with peritonitis, but there was no difference in the incidence of ESI before and after catheter replacement.


2017 ◽  
Vol 10 (1) ◽  
pp. 01-08
Author(s):  
Anderson Gonçalves da Silva ◽  
Arlindo Leal Boiça Junior ◽  
Bruno Henrique Sardinha de Souza ◽  
Eduardo Neves Costa ◽  
James da Silva Hoelhert ◽  
...  

Whitefly Bemisia tabaci (Genn.) (Hemiptera: Aleyrodidae) in common beans: general characteristics, bioecology, and methods of control. Abstract. Common bean plants are infested by insects that can affect crop production before and after harvest, with estimated yield losses caused by pests ranging from 33 to 86%. Among these pests, the whitefly Bemisia tabaci (Genn.) stands out. This species causes direct injury by feeding on the plants and indirect injury by excreting sugary honeydew, which is subsequently colonized by sooty mold. The most serious damage caused by B. tabaci, however, is the transmission of viral diseases, especially common bean golden mosaic, responsible for economic losses varying from 30 to 100%. This review provides information on important aspects of B. tabaci, including its history and geographical distribution, bioecology and population dynamics, host plants, and methods of control, and is intended as groundwork for future research on the whitefly in common beans.


CONVERTER ◽  
2021 ◽  
pp. 211-219
Author(s):  
Yongli Zou et al.

Objectives: To analyze the effect of personal protective equipment training on new hospital infection managers. Methods: Personnel were divided into two batches by region. A diversified training model was adopted to train all personnel, after which practical assessments were conducted and certificates issued. Information was collected electronically, and questionnaires were analyzed to assess trainees' circumstances before and after the training. Each batch had the same teachers and the same training methods. Results: After the training, trainees' proficiency in putting on and taking off protective equipment increased by 22.85%, and their ability to choose protective equipment appropriate to different working environments increased by 22.04%. Of the trainees, 78.23% believed that practical exercises should be emphasized. Taking off protective clothing was considered the most difficult step in practical training (91.13%), followed by putting on protective clothing (70.43%); 96.24% of trainees believed the training would be helpful for their future work. Conclusions: It is necessary to implement personal protective equipment training among new hospital infection managers; practical training, assessment, information-based questionnaire surveys, and expert theoretical teaching achieved good results. The training helps reduce hospital infections caused by occupational exposure and, at the same time, avoids improper use and waste of protective materials.


2008 ◽  
Vol 13 (4) ◽  
pp. 191-197 ◽  
Author(s):  
Liz Simcock

Abstract Background, Method and Purpose: The use of peripherally inserted central catheters (PICCs) in the UK has been steadily increasing since they were first introduced in 1995. Ultrasound-guided upper-arm placement - which has become prevalent in the USA over the last few years - is gradually attracting interest amongst PICC placers in the UK. The literature shows that upper-arm placement improves insertion success rates (Hockley, Hamilton, Young, Chapman, Taylor, Creed et al, 2007; Hunter, 2007; Krstenic, Brealey, Gaikwad & Maraveyas, 2008) and patient satisfaction (Polak, Anderson, Hagspiel, & Mungovan, 1998; Sansivero, 2000; McMahon, 2002). Following a switch to upper-arm placement at her institution, the author examined audit data from before and after the change in practice to see whether there were other measurable clinical improvements. Results: Comparison of data from a four-year period shows that upper-arm placement in our patient population increased the insertion success rate and line longevity while reducing exit-site infection, thrombosis, and catheter migration. Implications for Practice: These data show that ultrasound-guided upper-arm placement improves patient outcomes. PICC placers still using the more traditional antecubital approach should consider a change in practice.


2020 ◽  
Author(s):  
Yijia Zhang ◽  
Zhicong Yin ◽  
Huijun Wang

Abstract. North China experiences severe haze pollution in early winter, resulting in many premature deaths and considerable economic losses. The number of haze days in early winter in North China (HDNC) increased rapidly after 2010 but declined slowly before 2010, reflecting a trend reversal. Global warming and emissions were two fundamental drivers of the long-term increasing trend of haze, but no studies have focused on this trend reversal. The autumn SST in the Pacific and Atlantic, Eurasian snow cover, and central Siberian soil moisture, which exhibited completely opposite trends before and after 2010, were shown to stimulate matching trends in the meteorological conditions related to haze pollution in North China. Numerical experiments with a fixed emission level confirmed the physical relationships between the climate drivers and HDNC during both the decreasing and increasing periods. Before 2010, these external drivers alone would have induced a larger decrease in HDNC than was observed; combined with the persistently increasing anthropogenic emissions, they resulted in the realistic, slowly decreasing trend. After 2010, however, the increasing trends driven by these climate drivers and by human emissions jointly led to a rapid increase in HDNC.


2020 ◽  
Vol 27 (3) ◽  
pp. e100170
Author(s):  
Johanna I Westbrook ◽  
Neroli S Sunderland ◽  
Amanda Woods ◽  
Magda Z Raban ◽  
Peter Gates ◽  
...  

Background Electronic medication systems (EMS) have been highly effective in reducing prescribing errors, but little research has investigated their effects on medication administration errors (MAEs). Objective To assess changes in MAE rates and types associated with EMS implementation. Methods This was a controlled before and after study (three intervention and three control wards) at two adult teaching hospitals. Intervention wards used an EMS with no bar-coding. Independent, trained observers shadowed nurses and recorded medications administered and compliance with 10 safety procedures. Observational data were compared against medication charts to identify errors (eg, wrong dose). Potential error severity was classified on a 5-point scale, with those scoring ≥3 identified as serious. Changes in MAE rates preintervention and postintervention by study group, accounting for differences at baseline, were calculated. Results 7451 administrations were observed (4176 pre-EMS and 3275 post-EMS). At baseline, 30.2% of administrations contained ≥1 MAE, with wrong intravenous rate, timing, volume and dose the most frequent. Post-EMS, MAEs decreased on intervention wards relative to control wards by 4.2 errors per 100 administrations (95% CI 0.2 to 8.3; p=0.04). Wrong timing errors alone decreased by 3.4 per 100 administrations (95% CI 0.01 to 6.7; p<0.05). EMS use was associated with an absolute decline in potentially serious MAEs of 2.4% (95% CI 0.8 to 3.9; p=0.003), a 56% reduction in the proportion of potentially serious MAEs. At baseline, 74.1% of administrations were non-compliant with ≥1 of 10 procedures, and this rate did not significantly improve post-EMS. Conclusions Implementation of EMS was associated with a modest, but significant, reduction in the overall MAE rate and halved the proportion of MAEs rated as potentially serious.
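The headline effect in a controlled before-and-after design of this kind is a difference-in-differences estimate: the change on intervention wards net of the secular change on control wards. A minimal sketch of that calculation, with hypothetical ward-level rates (errors per 100 administrations; the numbers below are illustrative, not the study's raw data, chosen only so the result matches the reported 4.2-point relative reduction):

```python
def diff_in_diff(intervention_pre: float, intervention_post: float,
                 control_pre: float, control_post: float) -> float:
    """Change on intervention wards minus the change on control wards."""
    return (intervention_post - intervention_pre) - (control_post - control_pre)

# Hypothetical MAE rates per 100 administrations (illustrative values only)
effect = diff_in_diff(intervention_pre=30.0, intervention_post=24.0,
                      control_pre=30.5, control_post=28.7)
print(effect)  # negative: errors fell more on intervention wards than on controls
```

Subtracting the control-ward change is what separates the EMS effect from any hospital-wide drift in error rates over the study period.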


2017 ◽  
Vol 4 (suppl_1) ◽  
pp. S406-S406
Author(s):  
Lou Ann Bruno-Murtha ◽  
Rebecca Osgood ◽  
Casey Alexandre ◽  
Rumel Mahmood

Abstract Background Our goal was to reduce the rate of hospital-onset (HO) C. difficile (CD) by prompt testing in patients with diarrhea on hospital day (HD) 1–3 using a nurse-driven testing protocol (NTP) with PCR and improve identification of disease after HD 3 using a combined toxin/antigen assay (TAA). Methods An automated best practice advisory/NTP was developed in Epic, triggered by documentation of diarrhea during HD 1–3, to facilitate prompt stool collection, testing and initiation of contact precautions. Education was provided. The NTP was fully implemented at 2 community-teaching hospitals mid-February 2016. The TAA was adopted 7/27/16 for testing after HD 3. Results In 2016, the standardized infection ratio (SIR) at Cambridge and Everett was 0.43 (P = 0.009) and 0.5 (P = 0.017), respectively, reflecting a 48–61% decrease from 2015. There was a 14–28% improvement in identifying cases as community-onset. The TAA led to a further decline in HO-CD by 10–61%. Refer to the graph for quarterly SIRs before and after implementation. Despite a 26% increase in testing volume, costs are less with the current strategy. Conclusion Prompt identification of CD improves care and prevents inflation of HO-CD. This strategy has enhanced our efforts to reduce our SIR (observed/expected cases) and resulted in a substantial incentive payment for CHA. Disclosures All authors: No reported disclosures.
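The standardized infection ratio reported above is, as the abstract notes, simply observed cases divided by the number statistically expected from the baseline period. A minimal sketch (the case counts below are hypothetical, chosen only to illustrate an SIR near the reported 0.43):

```python
def standardized_infection_ratio(observed: int, expected: float) -> float:
    """SIR = observed / expected infections; values < 1.0 mean fewer cases than expected."""
    if expected <= 0:
        raise ValueError("expected count must be positive")
    return observed / expected

# Hypothetical counts (not from the study): 13 observed HO-CD cases vs 30 expected
sir = standardized_infection_ratio(13, 30.0)
print(round(sir, 2))  # 0.43
```

Because the denominator is risk-adjusted, an SIR well below 1.0 indicates genuinely fewer hospital-onset cases than the baseline predicts, not just a smaller patient population.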


2003 ◽  
Vol 23 (5) ◽  
pp. 456-459 ◽  
Author(s):  
Beth Piraino ◽  
Judith Bernardini ◽  
Tracey Florio ◽  
Linda Fried

Objective To examine gram-negative exit-site infection and peritonitis rates before and after the implementation of Staphylococcus aureus prophylaxis in peritoneal dialysis (PD) patients. Design Prospective data collection with periodic implementation of protocols to decrease infection rates in two PD programs. Patients 663 incident patients on PD. Interventions Implementation of S. aureus prophylaxis, beginning in 1990. Main Outcome Measures Rates of S. aureus, gram-negative, and Pseudomonas aeruginosa exit-site infections and peritonitis. Results Staphylococcus aureus exit-site infection and peritonitis rates fluctuated without significant trends during the first decade (without prophylaxis), then began to decline during the 1990s subsequent to implementation of prophylaxis, reaching levels of 0.02/year at risk and zero in the year 2000. Gram-negative infections fell toward the end of the 1980s, probably due to the implementation of better connectology. However, there have been no significant changes for the past 6 years. There was little change in P. aeruginosa infections over the entire time period. Pseudomonas aeruginosa is now the most common cause of catheter infection and catheter-related peritonitis. Conclusions Prophylaxis against S. aureus is highly effective in reducing the rate of S. aureus infections but has no effect on gram-negative infections. Pseudomonas aeruginosa is now the most serious cause of catheter-related peritonitis.

