Chlorhexidine versus Tincture of Iodine for Reduction of Blood Culture Contamination Rates: a Prospective Randomized Crossover Study

2016
Vol 54 (12)
pp. 3007-3009
Author(s):
Elizabeth Story-Roller
Melvin P. Weinstein

Blood cultures (BCs) are the standard method for diagnosis of bloodstream infections (BSIs). However, the average BC contamination rate (CR) in U.S. hospitals is 2.9%, potentially resulting in unnecessary antibiotic use and excessive therapy costs. Several studies have compared various skin antisepsis agents without a clear consensus as to which agent is most effective in reducing contamination. A prospective, randomized crossover study directly comparing blood culture contamination rates using chlorhexidine versus iodine tincture for skin antisepsis was performed at Robert Wood Johnson University Hospital (RWJUH). Eight nursing units at RWJUH were provided with blood culture kits containing either chlorhexidine (CH) or iodine tincture (IT) for skin antisepsis prior to all blood culture venipunctures, which were obtained by nurses or clinical care technicians. At quarterly intervals, the antiseptic agent used on each nursing unit was switched. Analyses of positive BCs were performed to distinguish true BSIs from contaminants. Of the 6,095 total BC sets obtained from the participating nursing units, 667 (10.94%) were positive and 238 (3.90%) were judged by the investigators to be contaminated. Of the 3,130 BCs obtained using IT, 340 (10.86%) were positive and 123 (3.93%) were contaminated. Of the 2,965 BCs obtained using CH, 327 (11.03%) were positive and 115 (3.88%) were contaminated. The difference in contamination rates between the two antiseptic agents was not statistically significant (P = 1.0). We conclude that CH and IT are equivalent agents for blood culture skin antisepsis.
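The abstract does not name the test behind P = 1.0. As an illustrative check only, a two-proportion z-test (a normal approximation, not necessarily the authors' method) applied to the reported counts also finds no significant difference:

```python
from math import sqrt, erf

def two_proportion_z_test(x1, n1, x2, n2):
    """Two-sided z-test for equality of two proportions, using a pooled SE."""
    p1, p2 = x1 / n1, x2 / n2
    pooled = (x1 + x2) / (n1 + n2)  # pooled proportion under H0
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    # two-sided p-value from the standard normal CDF via erf
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return p1, p2, z, p_value

# Contamination counts reported in the abstract: IT 123/3,130 vs. CH 115/2,965
p_it, p_ch, z, p = two_proportion_z_test(123, 3130, 115, 2965)
```

On these counts the z statistic is close to zero and the two-sided p-value is far above 0.05, consistent with the reported equivalence of the two agents.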

Author(s):  
Jessica L Seidelman
Nicholas A Turner
Rebekah H Wrenn
Christina Sarubbi
Deverick J Anderson
...  

Abstract Background Few groups have formally studied the effect of dedicated antibiotic stewardship rounds (ASRs) on antibiotic use (AU) in intensive care units (ICUs). Methods We implemented weekly ASRs using a two-arm, cluster-randomized, crossover study in 5 ICUs at Duke University Hospital from 11/2017 to 6/2018. We excluded patients without an active antibiotic order and those with a marker of high complexity, including an existing infectious disease consult, transplant, ventricular assist device, or ECMO. AU during and following the ICU stay of patients with ASRs was compared with that of controls. We recorded the number of reviews, recommendations delivered, and responses. We evaluated change in ICU-specific AU during and after the study. Results Our analysis included 4,683 patients: 2,330 intervention and 2,353 controls. Teams performed 761 reviews during ASRs; 1,569 patients were excluded, 60% because they were off antibiotics and 8% because of high complexity. Exclusions affected 88% of the cardiac surgery ICU (CTICU) patients. The AU rate ratio (RR) was 0.97 (0.91-1.04). When the CTICU was removed, the RR was 0.93 (0.89-0.98). AU in the post-study period decreased by 16% (95% CI 11-24%) compared with AU in the baseline period. The change in AU differed among units: it was largest in the neurology ICU (-28%) and smallest in the CTICU (-2%). Conclusion Weekly multidisciplinary ASRs were a high-resource intervention associated with a small reduction in AU. The noticeable decline in ICU AU over time is possibly due to indirect effects of the ASRs. Effects differed among specialty ICUs, emphasizing the importance of customizing ASRs to match unit-specific population, workflow, and culture.


2020
Vol 71 (Supplement_3)
pp. S285-S292
Author(s):
Krista Vaidya
Kristen Aiemjoy
Farah N Qamar
Samir K Saha
Dipesh Tamrakar
...  

Abstract Background Antibiotic use prior to seeking care at a hospital may reduce the sensitivity of blood culture for enteric fever, with implications for both clinical care and surveillance. The Surveillance for Enteric Fever in Asia Project (SEAP) is a prospective study of enteric fever incidence in Nepal, Bangladesh, and Pakistan. Nested within SEAP, we evaluated the accuracy of self-reported antibiotic use and investigated the association between antibiotic use and blood culture positivity. Methods Between November 2016 and April 2019, we collected urine samples from a subset of SEAP participants to test for antibiotic use prior to the hospital visit using an antibacterial activity assay. All participants were asked about recent antibiotic use and had a blood culture performed. We used mixed-effect logit models to evaluate the effect of antimicrobial use on blood culture positivity, adjusted for markers of disease severity. Results We enrolled 2,939 patients with suspected enteric fever. Antibiotics were detected in 39% (1,145/2,939) of urine samples. Agreement between measured and reported antibiotic use was modest (κ = 0.72). After adjusting for disease severity, patients with antibiotics in their urine were slightly more likely to be blood culture positive for enteric fever; however, the effect was not statistically significant (prevalence ratio, 1.22 [95% confidence interval, .99–1.50]). Conclusions The reliability of self-reported prior antibiotic use was modest among individuals presenting with fever to tertiary hospitals. While antibiotics are likely to reduce the sensitivity of blood culture, our findings indicate that there is still considerable value in performing blood culture for individuals reporting antibiotic use.
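The κ = 0.72 above is presumably Cohen's kappa, which corrects observed agreement between two measurements for the agreement expected by chance. A minimal sketch using hypothetical counts (the paper's 2 × 2 table is not given in the abstract):

```python
def cohens_kappa(table):
    """Cohen's kappa for a 2x2 agreement table [[a, b], [c, d]]."""
    (a, b), (c, d) = table
    n = a + b + c + d
    p_observed = (a + d) / n  # fraction of cases where both measures agree
    # agreement expected by chance, from the marginal totals
    p_expected = ((a + b) * (a + c) + (c + d) * (b + d)) / n ** 2
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical counts (not from the paper): rows = self-reported use yes/no,
# columns = urine assay positive/negative
kappa = cohens_kappa([[45, 5], [5, 45]])
```

With these illustrative counts, 90% raw agreement against 50% chance agreement yields κ = 0.8; the study's κ = 0.72 would arise from a somewhat less concordant table.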


Pharmaceutics
2022
Vol 14 (1)
pp. 198
Author(s):
Nao Mitsui
Noriko Hida
Taro Kamiya
Taigi Yamazaki
Kazuki Miyazaki
...  

Minitablets have garnered interest as a new paediatric formulation that is easier to swallow than liquid formulations. In Japan, fine granules are also frequently used for children, in addition to liquid formulations. We examined the swallowability of multiple drug-free minitablets and compared it with that of fine granules and liquid formulations in 40 children in two age groups (n = 20 each, aged 6–11 and 12–23 months). We compared the percentage of children who could swallow the minitablets without chewing with the percentages who could swallow the fine granules or liquid formulations without leftover. Children who visited the paediatric department of Showa University Hospital were enrolled, and their caregivers were allowed to choose the administration method. In total, 37 of the 40 caregivers dispersed the fine granules in water. Among children aged 6–11 months, significantly more could swallow the minitablets (80%, 95% CI: 56–94%) than could swallow all the dispersed fine granules (22%, 95% CI: 6–47%) or the liquid formulations (35%, 95% CI: 15–59%). No significant differences were observed in children aged 12–23 months. Hence, minitablets may be easier to swallow than dispersed fine granules and liquid formulations for children aged 6–11 months.


2003
Vol 24 (12)
pp. 936-941
Author(s):
William E. Scheckler
James A. Bobula
Mark B. Beamsley
Scott T. Hadden

Abstract Objective: To examine the current status of bloodstream infections (BSIs) in a community hospital as part of a 25-year longitudinal study. Design: Retrospective descriptive epidemiologic study. Setting: Community teaching hospital. Patients: All inpatients in 1998 with a positive blood culture who met the CDC NNIS System case definition of BSI. Methods: Cases were stratified by underlying illness category using case mix adjustment categories (after McCabe) and reviewed for associations among mortality, underlying illness severity, and multiple clinical and laboratory parameters. Results: Of 19,289 patients discharged in 1998, 185 had an episode of infection documented by blood culture (96 cases per 10,000 inpatients). BSI was twice as frequent in patients 65 years and older compared with younger patients. BSIs caused or contributed to the deaths of 22 patients, for an overall case-fatality rate of 11.9%, compared with 20.7% in 1982 (P = .02). Striking decreases were noted for in-hospital patient mortality in 1998 for BSIs with ultimately and rapidly fatal underlying illnesses (P = .02 and P < .10, respectively). Primary bacteremia decreased compared with 1982. Antibiotic use was vigorous, but resistance was modest in both nosocomial and community-acquired organisms and had changed little from 1982 and 1987. Conclusions: Compared with previous studies, case-fatality rates in patients with BSI were substantially lower in the rapidly fatal and ultimately fatal underlying illness categories. Antibiotic use was extensive but prompt and appropriate. Microorganism resistance to antibiotics changed little from the 1980s.


2002
Vol 23 (7)
pp. 397-401
Author(s):
Barbara W. Trautner
Jill E. Clarridge
Rabih O. Darouiche

Objective: Skin preparation is an important factor in reducing the rate of blood culture contamination. We assessed blood culture contamination rates associated with the use of skin antisepsis kits containing either 2% alcoholic chlorhexidine gluconate or 2% alcoholic tincture of iodine. Design: Prospective, blinded clinical trial. Setting: Tertiary-care teaching hospital. Patients: Adult patients in medical wards, the medical intensive care unit, and the cardiac intensive care unit who needed paired, percutaneous blood cultures. Interventions: House officers, medical students, and healthcare technicians drew the blood for cultures. We prepared sacks containing all of the necessary supplies, including two different types of antiseptic kits. In each sack, one kit contained 2% chlorhexidine in 70% isopropyl alcohol and the other contained 2% tincture of iodine in ethyl alcohol and 70% isopropyl alcohol. Each patient received chlorhexidine at one site and tincture of iodine at the other. Results: Four (0.9%) of 430 blood culture sets from 215 patients were contaminated. The contamination rate when using alcohol and chlorhexidine (1 of 215, 0.5%) did not differ significantly from the contamination rate when using tincture of iodine (3 of 215, 1.4%; P = .62, McNemar test). There was an 87% probability that the two interventions differed by less than 2% in their rate of contamination. Conclusions: Both of these antiseptic kits were highly effective for skin preparation prior to drawing blood for cultures. The use of these kits may have contributed to the low contamination rate observed in this study.
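The McNemar test used here considers only discordant pairs in a paired design. Assuming the four contaminated sets came from four distinct patients (so the discordant counts are 1 chlorhexidine-only and 3 iodine-only, an assumption the abstract does not state explicitly), an exact binomial McNemar sketch reproduces the reported P = .62:

```python
from math import comb

def mcnemar_exact(b, c):
    """Two-sided exact McNemar test on the discordant pair counts b and c."""
    n, k = b + c, min(b, c)
    # under H0 the discordant pairs split 50/50, so sum the binomial tail
    p = 2 * sum(comb(n, i) for i in range(k + 1)) / 2 ** n
    return min(1.0, p)

# Assumed discordant counts: b = 1 (chlorhexidine site only),
# c = 3 (tincture of iodine site only)
p_value = mcnemar_exact(1, 3)
```

With b = 1 and c = 3 the exact two-sided p-value is 0.625, matching the reported P = .62.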


2021
Vol 73 (6)
pp. 406-412
Author(s):
Tharntip Sangsuwan
Rungtip Darayon
Silom Jamulitrat

Objective: To determine blood culture contamination rates and display them with a g-chart. Materials and Methods: The medical records of patients from whom blood cultures were obtained in a university hospital between January and December 2019 were retrieved and reviewed for contamination. The Centers for Disease Control and Prevention (CDC) criteria were used to classify the blood culture results. The contamination rates were illustrated with a g-chart. Results: We identified 331 false-positive blood cultures among 32,961 cultured specimens, yielding a contamination rate of 1.0% (95% CI = 0.9%–1.1%). The units with the most contamination events were the Emergency department (49.2%), the Pediatric ICU (5.2%), and the Neonatal ICU (4.8%). The most common contaminants were coagulase-negative staphylococci (67.1%), Bacillus spp. (10.2%), and Corynebacterium spp. (7.6%). The g-charts identified 14 abnormal variations among 41 locations. Conclusion: The contamination rates found were within the ranges of other reports. G-charts are simple to construct, easy to interpret, and sensitive for real-time detection of epidemics.
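The reported 95% CI of 0.9%–1.1% is consistent with a simple normal-approximation (Wald) interval for a proportion; the authors' exact method is not stated, but a sketch on the reported counts reproduces the interval:

```python
from math import sqrt

def wald_ci(events, n, z=1.96):
    """Normal-approximation (Wald) confidence interval for a proportion."""
    p = events / n
    half = z * sqrt(p * (1 - p) / n)  # half-width of the interval
    return p, p - half, p + half

# 331 false-positive cultures among 32,961 cultured specimens (from the abstract)
rate, lo, hi = wald_ci(331, 32961)
```

Rounded to one decimal place in percent, this gives 1.0% (0.9%–1.1%), matching the abstract.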


2021
Author(s):
Koshi Ota
Daisuke Nishioka
Yuri Ito
Emi Hamada
Naomi Mori
...  

Abstract Background: Blood cultures are indispensable for detecting life-threatening bacteremia. Little is known about associations between contamination rates and the topical disinfectants used for blood collection in adults. Objective: We sought to determine whether a change in topical disinfectants was associated with the rate of contaminated blood cultures in the emergency department of a single institution. Methods: This single-center, retrospective observational study of consecutive patients aged 20 years or older was conducted in the emergency department (ED) of a university hospital in Japan between August 1, 2018 and September 30, 2020. Pairs of blood samples were collected for aerobic and anaerobic culture from patients in the ED. Physicians selected topical disinfectants according to their personal preference before September 1, 2019; alcohol/chlorhexidine gluconate (ACHX) was mandatory thereafter, unless the patient was allergic to alcohol. Regression discontinuity analysis was used to detect the effect of mandatory ACHX usage on the rate of contaminated blood cultures. Results: We collected 2,141 blood culture samples from 1,097 patients. Of these, 164 (7.7%) were potentially contaminated, 445 (20.8%) were true bacteremia, and 1,532 (71.6%) were true negatives. Puncture site disinfection was performed with ACHX in 1,345 (62.8%) cases and with povidone-iodine (PVI) in 767 (35.8%) cases. The regression discontinuity analysis showed that mandatory ACHX usage significantly reduced the blood culture contamination rate, by 9.6% (95% confidence interval (CI): 5.0%–14.2%, P < 0.001). Conclusion: Rates of contaminated blood cultures were significantly lower when ACHX was used as the topical disinfectant.
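Regression discontinuity estimates the jump in an outcome at a cutoff (here, the date ACHX became mandatory) by fitting separate trends on either side and comparing the fitted values at the cutoff. A minimal sketch on synthetic data (the study's data are not available; the step size merely mimics the reported 9.6-point drop):

```python
def ols_line(xs, ys):
    """Closed-form simple linear regression: returns (intercept, slope)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return my - slope * mx, slope

def rd_jump(times, outcomes, cutoff):
    """Difference between the post- and pre-cutoff fitted lines at the cutoff."""
    pre = [(t, y) for t, y in zip(times, outcomes) if t < cutoff]
    post = [(t, y) for t, y in zip(times, outcomes) if t >= cutoff]
    a0, b0 = ols_line(*zip(*pre))
    a1, b1 = ols_line(*zip(*post))
    return (a1 + b1 * cutoff) - (a0 + b0 * cutoff)

# Synthetic monthly contamination rates with a -0.096 step at month 13,
# illustrative only (not the study's data)
times = list(range(1, 27))
rates = [0.15 if t < 13 else 0.054 for t in times]
jump = rd_jump(times, rates, 13)
```

A full analysis would also model covariates and compute a confidence interval for the jump; this sketch only illustrates the discontinuity estimate itself.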


2010
Vol 31 (05)
pp. 516-521
Author(s):
Ianick Souto Martins
Flávia Lúcia Piffano Costa Pellegrino
Andrea d'Avila Freitas
Marisa da Silva Santos
Giovanna Ianini d'Almeida Ferraiuoli
...  

Objective. To investigate an outbreak of healthcare-associated Burkholderia cepacia complex (BCC) primary bloodstream infections (BCC-BSI). Design and Setting. Case-crossover study in a public hospital, a university hospital, and a private hospital in Rio de Janeiro, Brazil, from March 2006 to May 2006. Patients. Twenty-five patients with BCC-BSI. Methods. After determining the date BCC-BSI symptoms started for each patient, 3 intervals of data collection were defined, each lasting 3 days: the case period, starting just before BCC-BSI symptom onset; the control period, starting 6 days before BCC-BSI symptom onset; and the washout period, comprising the 3 days between the case period and the control period. The exposures evaluated were intravascular solutions and invasive devices and procedures. Potential risk factors were identified using the adjusted McNemar χ2 test. Cultures of samples of potentially contaminated solutions were performed. BCC strain typing was performed by pulsed-field gel electrophoresis using SpeI. Results. Statistical analysis revealed that the use of bromopride and dipyrone was associated with BCC-BSI. A total of 21 clinical isolates from 17 (68%) of the 25 patients, plus an isolate obtained from a bromopride vial, were available for strain typing. Six pulsotypes were detected. A predominant pulsotype (A) accounted for 11 isolates obtained from 11 patients (65%) across the 3 study hospitals. Conclusion. Our case-crossover investigation concluded that the outbreak of BCC-BSI was polyclonal but likely caused by infusion of contaminated bromopride. The epidemiological finding was validated by microbiological analysis. After recall of the contaminated bromopride vials by the manufacturer, the outbreak was controlled.


F1000Research
2018
Vol 7
pp. 1770
Author(s):
Francis Maina Kiroro
Majid Twahir

Background: This study focused on the survival rates of patients admitted to acute care units who used medical devices known as central venous catheters (CVCs). CVCs are useful devices in clinical care; however, infections such as central line-associated bloodstream infections (CLABSI) may occur and are associated with increased lengths of stay and costs as well as higher morbidity and mortality rates. The overall objective of the present study was to determine survival probabilities and hazard rates for patients who used CVC devices and to compare subgroups by infection status. Methods: This retrospective study included all patients who were admitted to Critical Care Units between 8th December 2012 and 31st March 2016 and used CVC devices. Survival analysis techniques, the test of equality of proportions, the Mann-Whitney test, and the Chi-square test of independence were used. Results: A total of 363 of the 1,089 patients included in the study died during hospitalization, and 47 patients developed nosocomial CLABSI. The average duration of hospitalization was 18.19 days (median, 12 days) for patients who did not develop nosocomial CLABSI, compared with an average of 56.79 days (median, 51 days) for those who did. The proportion of deaths was significantly higher among patients who developed nosocomial CLABSI than among those who did not (p-value = 0.01379). The results indicate a significant association between infection status and discharge status, and a significant difference in patient survival rates by infection status. Conclusions: Nosocomial CLABSI has a significant impact on patient mortality and morbidity. The duration of hospitalization for patients who developed CLABSI was significantly longer than for patients who did not, and increased length of stay leads to higher hospitalization costs.


2010
Vol 31 (2)
pp. 171-176
Author(s):
Lauren Marlowe
Rakesh D. Mistry
Susan Coffin
Kateri H. Leckerman
Karin L. McGowan
...  

Objective. To determine blood culture contamination rates after skin antisepsis with chlorhexidine, compared with povidone-iodine. Design. Retrospective, quasi-experimental study. Setting. Emergency department of a tertiary care children's hospital. Patients. Children aged 2-36 months with peripheral blood culture results from February 2004 to June 2008. Control patients were children younger than 2 months with peripheral blood culture results. Methods. Blood culture contamination rates were compared using segmented regression analysis of time-series data among 3 patient groups: (1) patients aged 2-36 months during the 26-month preintervention period, in which 10% povidone-iodine was used for skin antisepsis before blood culture; (2) patients aged 2-36 months during the 26-month postintervention period, in which 3% chlorhexidine gluconate was used; and (3) patients younger than 2 months not exposed to the chlorhexidine intervention (ie, the control group). Results. Results from 11,595 eligible blood cultures were reviewed (4,942 from the preintervention group, 4,274 from the postintervention group, and 2,379 from the control group). For children aged 2-36 months, the blood culture contamination rate decreased from 24.81 to 17.19 contaminated cultures per 1,000 cultures (P < .05) after implementation of chlorhexidine. This decrease of 7.62 contaminated cultures per 1,000 cultures (95% confidence interval, -0.781 to -15.16) represented a 30% relative decrease from the preintervention period and was sustained over the entire postintervention period. No change in contamination rate was observed in the control group (P = .337). Conclusion. Skin antisepsis with chlorhexidine significantly reduces the blood culture contamination rate among young children, as compared with povidone-iodine.

