Evaluation of a Disinfectant Wipe Intervention on Fomite-to-Finger Microbial Transfer

2014 ◽  
Vol 80 (10) ◽  
pp. 3113-3118 ◽  
Author(s):  
Gerardo U. Lopez ◽  
Masaaki Kitajima ◽  
Aaron Havas ◽  
Charles P. Gerba ◽  
Kelly A. Reynolds

ABSTRACT Inanimate surfaces, or fomites, can serve as routes of transmission of enteric and respiratory pathogens. No previous studies have evaluated the impact of surface disinfection on the level of pathogen transfer from fomites to fingers. Thus, the present study investigated the change in microbial transfer from contaminated fomites to fingers following disinfectant wipe use. Escherichia coli (10⁸ to 10⁹ CFU/ml), Staphylococcus aureus (10⁹ CFU/ml), Bacillus thuringiensis spores (10⁷ to 10⁸ CFU/ml), and poliovirus 1 (10⁸ PFU/ml) were seeded on ceramic tile, laminate, and granite in 10-μl drops and allowed to dry for 30 min at a relative humidity of 15 to 32%. The seeded fomites were treated with a disinfectant wipe and allowed to dry for an additional 10 min. Fomite-to-finger transfer trials were conducted to measure concentrations of transferred microorganisms on the fingers after the disinfectant wipe intervention. The mean log₁₀ reduction of the test microorganisms on fomites by the disinfectant wipe treatment varied from 1.9 to 5.0, depending on the microorganism and the fomite. Microbial transfer from disinfectant-wipe-treated fomites was lower (up to <0.1% on average) than from nontreated surfaces (up to 36.3% on average, reported in our previous study) for all types of microorganisms and fomites. This is the first study to quantify microbial transfer from contaminated fomites to fingers after use of a disinfectant wipe intervention. The data generated in the present study can be used in quantitative microbial risk assessment models to predict the effect of disinfectant wipes in reducing microbial exposure.
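
As a hedged illustration of the quantities reported above, the sketch below computes a log₁₀ reduction and a percent transfer from recovered colony counts; the function names and example numbers are invented for illustration and are not the authors' code.

```python
# Minimal sketch (not the authors' code): log10 reduction achieved by a wipe
# treatment and percent fomite-to-finger transfer, assuming recovered CFU
# counts on the fomite and on the finger are available.
import math

def log10_reduction(cfu_before: float, cfu_after: float) -> float:
    """Log10 reduction on the fomite achieved by the disinfectant wipe."""
    return math.log10(cfu_before) - math.log10(cfu_after)

def percent_transfer(cfu_on_finger: float, cfu_on_fomite: float) -> float:
    """Fraction of organisms on the fomite that transfer to the finger, as a percentage."""
    return 100.0 * cfu_on_finger / cfu_on_fomite

# Example (illustrative numbers): 1e7 CFU seeded, 1e2 CFU remaining after wiping
print(log10_reduction(1e7, 1e2))   # 5.0 log10 reduction
# 10 CFU recovered from the finger after touching the wiped surface
print(percent_transfer(10, 1e2))   # 10.0 percent of surviving organisms transferred
```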

2013 ◽  
Vol 79 (18) ◽  
pp. 5728-5734 ◽  
Author(s):  
Gerardo U. Lopez ◽  
Charles P. Gerba ◽  
Akrum H. Tamimi ◽  
Masaaki Kitajima ◽  
Sheri L. Maxwell ◽  
...  

ABSTRACT Fomites can serve as routes of transmission for both enteric and respiratory pathogens. The present study examined the effect of low and high relative humidity on the fomite-to-finger transfer efficiency of five model organisms from several common inanimate surfaces (fomites). Nine fomites representing porous and nonporous surfaces of different compositions were studied. Escherichia coli, Staphylococcus aureus, Bacillus thuringiensis, MS2 coliphage, and poliovirus 1 were placed on fomites in 10-μl drops and allowed to dry for 30 min under low (15% to 32%) or high (40% to 65%) relative humidity. Fomite-to-finger transfers were performed using 1.0 kg/cm² of pressure for 10 s. Transfer efficiencies were greater under high relative humidity for both porous and nonporous surfaces. Most organisms on average had greater transfer efficiencies under high relative humidity than under low relative humidity. Nonporous surfaces had a greater transfer efficiency (up to 57%) than porous surfaces (<6.8%) under low relative humidity, as well as under high relative humidity (nonporous, up to 79.5%; porous, <13.4%). Transfer efficiency also varied with fomite material and organism type. The data generated can be used in quantitative microbial risk assessment models to assess the risk of infection from fomite-transmitted human pathogens and the relative levels of exposure to different types of fomites and microorganisms.
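
A minimal sketch of how transfer-efficiency results like those above could be tabulated by surface porosity and humidity condition; the values simply echo the ranges quoted in the abstract, and the column names are assumptions rather than the study's data set.

```python
# Illustrative only: summarising fomite-to-finger transfer efficiencies by
# surface porosity and relative-humidity condition. Values and column names
# are placeholders echoing the abstract, not the original data.
import pandas as pd

trials = pd.DataFrame({
    "organism":     ["E. coli", "E. coli", "MS2", "MS2"],
    "surface":      ["nonporous", "porous", "nonporous", "porous"],
    "humidity":     ["high", "high", "low", "low"],
    "transfer_pct": [79.5, 13.4, 57.0, 6.8],
})

# Mean transfer efficiency for each surface type and humidity condition
summary = trials.groupby(["surface", "humidity"])["transfer_pct"].mean()
print(summary)
```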


Author(s):  
Annalaura Carducci ◽  
Gabriele Donzelli ◽  
Lorenzo Cioni ◽  
Ileana Federigi ◽  
Roberto Lombardi ◽  
...  

Biological risk assessment in occupational settings is currently based on either qualitative or semiquantitative analysis. In this study, a quantitative microbial risk assessment (QMRA) was applied to estimate the human adenovirus (HAdV) health risk due to bioaerosol exposure in a wastewater treatment plant (WWTP). A stochastic QMRA model was developed considering HAdV as the index pathogen, using its concentrations in different areas and a published dose–response relationship for inhalation. A sensitivity analysis was employed to examine the impact of input parameters on health risk. The QMRA estimated the highest average risks at the sewage influent and the biological oxidation tank (15.64% and 12.73%, respectively, for an exposure of 3 min). Sensitivity analysis indicated HAdV concentration as the predominant factor in the estimated risk. QMRA results were used to calculate exposure limits for four different risk levels (one illness case per 100, 1,000, 10,000, and 100,000 workers): for 3-min exposures, we obtained 565, 170, 54, and 6 GC/m³ of HAdV, respectively. We also calculated the maximum exposure time at each risk level for the different areas. Our findings can be used to better define the effectiveness of control measures that reduce the virus concentration or the exposure time.
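
To make the structure of such a stochastic QMRA concrete, here is a minimal Monte Carlo sketch using an exponential inhalation dose-response, P = 1 − exp(−k·dose); the concentration distribution, inhalation rate, and k value are placeholders for illustration, not the parameters used in the study.

```python
# Minimal stochastic QMRA sketch (not the authors' model): per-exposure risk of
# HAdV infection from a short bioaerosol exposure. All parameter values below
# are assumptions for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

conc_gc_m3 = rng.lognormal(mean=np.log(500), sigma=1.0, size=n)  # HAdV genome copies per m3 (assumed distribution)
inhal_m3_min = 0.015                                             # inhalation rate, m3 per minute (assumed)
exposure_min = 3                                                 # exposure duration, matching the 3-min scenario
k = 0.4172                                                       # exponential dose-response parameter (assumed)

dose = conc_gc_m3 * inhal_m3_min * exposure_min                  # inhaled dose per exposure
risk = 1.0 - np.exp(-k * dose)                                   # exponential dose-response
print(f"mean per-exposure infection risk: {risk.mean():.3%}")
```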


Meat Science ◽  
2006 ◽  
Vol 74 (1) ◽  
pp. 76-88 ◽  
Author(s):  
Geraldine Duffy ◽  
Enda Cummins ◽  
Pádraig Nally ◽  
Stephen O'Brien ◽  
Francis Butler

2018 ◽  
Vol 84 (6) ◽  
pp. e02093-17 ◽  
Author(s):  
Miguel F. Varela ◽  
Imen Ouardani ◽  
Tsuyoshi Kato ◽  
Syunsuke Kadoya ◽  
Mahjoub Aouni ◽  
...  

ABSTRACT Sapovirus (SaV), of the family Caliciviridae, is a genus of enteric viruses that cause acute gastroenteritis. SaV is shed at high concentrations with feces into wastewater, which is usually discharged into aquatic environments or reused for irrigation without efficient treatment. This study analyzed the incidence of human SaV in four wastewater treatment plants in Tunisia over a period of 13 months (December 2009 to December 2010). Detection and quantification were carried out using reverse transcription-quantitative PCR (RT-qPCR) methods, yielding a prevalence of 39.9% (87/218). Sixty-one positive samples were detected in untreated water and 26 positive samples in processed water. The Dekhila plant presented the highest contamination levels, with a 63.0% prevalence. Genotype I.2 was dominant, observed in 15 of the 24 positive samples that were genetically characterized. SaV density in wastewater was estimated with a Bayesian algorithm applied to left-censored data sets. The mean log SaV concentration in untreated wastewater ranged between 2.7 and 4.5 logs. A virus removal efficiency of 0.2 log was calculated for the Dekhila plant from the posterior distributions of the log ratio between untreated and treated wastewater. The quantitative values obtained in this study can serve as parameter values reflecting local conditions in quantitative microbial risk assessment for Tunisia. IMPORTANCE Human sapovirus (SaV) is becoming more prevalent worldwide, and organisms in this genus are recognized as emerging pathogens associated with human gastroenteritis. The present study describes novel findings on the prevalence, seasonality, and genotype distribution of SaV in Tunisia and Northern Africa. In addition, a statistical approximation using Bayesian estimation of the posterior predictive distribution ("left-censored" data) was employed to solve methodological problems related to the limit of quantification of the quantitative PCR (qPCR). This approach should be helpful for the future development of quantitative microbial risk assessment procedures for wastewater.
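
The censored-data idea can be illustrated with a simplified maximum-likelihood analogue of the Bayesian approach described above: quantified samples contribute a normal density term, and below-LOQ samples contribute the normal CDF at the LOQ. The numbers and the frequentist formulation are illustrative assumptions, not the authors' method.

```python
# Simplified sketch of handling left-censored qPCR data via maximum likelihood.
# Samples below the limit of quantification (LOQ) are not discarded or
# substituted; they contribute the probability of falling below the LOQ.
# All numbers are illustrative.
import numpy as np
from scipy import stats, optimize

log_conc = np.array([3.1, 3.8, 4.2, 2.9, 3.5])   # quantified samples, log10 GC/liter (assumed)
n_censored = 3                                    # samples below the LOQ
loq = 2.5                                         # log10 limit of quantification (assumed)

def neg_log_lik(params):
    mu, sigma = params
    if sigma <= 0:
        return np.inf
    ll = stats.norm.logpdf(log_conc, mu, sigma).sum()        # quantified observations
    ll += n_censored * stats.norm.logcdf(loq, mu, sigma)      # censored observations
    return -ll

res = optimize.minimize(neg_log_lik, x0=[3.0, 1.0], method="Nelder-Mead")
print("estimated mean and SD of log10 concentration:", res.x)
```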


2006 ◽  
Vol 72 (5) ◽  
pp. 3284-3290 ◽  
Author(s):  
Andrew J. Hamilton ◽  
Frank Stagnitti ◽  
Robert Premier ◽  
Anne-Maree Boland ◽  
Glenn Hale

ABSTRACT Quantitative microbial risk assessment models for estimating the annual risk of enteric virus infection associated with consuming raw vegetables that have been overhead irrigated with nondisinfected secondary treated reclaimed water were constructed. We ran models for several different scenarios of crop type, viral concentration in effluent, and time since the last irrigation event. The mean annual risk of infection was always lower for cucumber than for broccoli, cabbage, or lettuce. Across the various crops, effluent qualities, and viral decay rates considered, the annual risk of infection ranged from 10⁻³ to 10⁻¹ when reclaimed-water irrigation ceased 1 day before harvest and from 10⁻⁹ to 10⁻³ when it ceased 2 weeks before harvest. Two previously published decay coefficients were used to describe the die-off of viruses in the environment. For all combinations of crop type and effluent quality, application of the more aggressive decay coefficient led to annual risks of infection that satisfied the commonly propounded benchmark of ≤10⁻⁴, i.e., one infection or less per 10,000 people per year, provided that 14 days had elapsed since irrigation with reclaimed water. Conversely, this benchmark was not attained for any combination of crop and water quality when the withholding period was 1 day. The lower decay rate conferred markedly less protection, with broccoli and cucumber being the only crops satisfying the 10⁻⁴ standard for all water qualities after a 14-day withholding period. Sensitivity analyses on the models revealed that in nearly all cases, variation in the amount of produce consumed had the most significant effect on the total uncertainty surrounding the estimate of annual infection risk. The models presented cover what would generally be considered worst-case scenarios: overhead irrigation and consumption of vegetables raw. Practices such as subsurface, furrow, or drip irrigation and postharvest washing/disinfection and food preparation could substantially lower risks and need to be considered in future models, particularly for developed nations where these extra risk reduction measures are more common.
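
The risk calculation described above follows a standard chain that a short sketch can make explicit: first-order viral decay over the withholding period, a dose-response model for a single exposure, and compounding over a year. The decay coefficient, dose, dose-response parameters, and exposure frequency below are assumptions for illustration, not the values used in the paper.

```python
# Illustrative structure of an annual-risk QMRA for irrigated produce
# (not the authors' model). All parameter values are assumptions.
import numpy as np

dose_at_irrigation = 1.0        # viruses ingested per serving if eaten at irrigation time (assumed)
decay_k = 0.69                  # first-order viral decay coefficient, per day (assumed)
withholding_days = 14           # days between last irrigation and harvest
alpha, beta = 0.253, 0.422      # approximate beta-Poisson parameters often cited for rotavirus (assumed)
exposures_per_year = 365        # servings consumed per year (assumed)

dose = dose_at_irrigation * np.exp(-decay_k * withholding_days)   # dose after die-off
p_single = 1.0 - (1.0 + dose / beta) ** (-alpha)                  # per-exposure infection risk
p_annual = 1.0 - (1.0 - p_single) ** exposures_per_year           # compounded annual risk
print(f"annual infection risk: {p_annual:.2e}")
```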


2010 ◽  
Vol 73 (2) ◽  
pp. 274-285 ◽  
Author(s):  
E. FRANZ ◽  
S. O. TROMP ◽  
H. RIJGERSBERG ◽  
H. J. van der FELS-KLERX

Fresh vegetables are increasingly recognized as a source of foodborne outbreaks in many parts of the world. The purpose of this study was to conduct a quantitative microbial risk assessment for Escherichia coli O157:H7, Salmonella, and Listeria monocytogenes infection from consumption of leafy green vegetables in salads from salad bars in The Netherlands. Pathogen growth was modeled in Aladin (Agro Logistics Analysis and Design Instrument) using time-temperature profiles in the chilled supply chain and one particular restaurant with a salad bar. A second-order Monte Carlo risk assessment model was constructed (using @Risk) to estimate the public health effects. The temperature in the studied cold chain was well controlled, remaining below 5°C. Growth of E. coli O157:H7 and Salmonella was minimal (17 and 15%, respectively), whereas growth of L. monocytogenes was considerably greater (194%). Based on first-order Monte Carlo simulations, the average number of cases per year in The Netherlands associated with the consumption of leafy greens in salads from salad bars was 166, 187, and 0.3 for E. coli O157:H7, Salmonella, and L. monocytogenes, respectively. The ranges of the average number of annual cases as estimated by second-order Monte Carlo simulation (with prevalence and number of visitors as uncertain variables) were 42 to 551 for E. coli O157:H7, 81 to 281 for Salmonella, and 0.1 to 0.9 for L. monocytogenes. This study integrated modeling of pathogen growth in the supply chain of fresh leafy vegetables destined for restaurant salad bars, using software designed to model and design logistics, with modeling of the public health effects using probabilistic risk assessment software.
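
A minimal sketch of the two-dimensional ("second-order") Monte Carlo structure described above: an outer loop samples the uncertain parameters (prevalence, number of visitors) and an inner loop propagates variability in the ingested dose. The distributions and the dose-response parameter are illustrative assumptions, not the study's inputs.

```python
# Sketch of a second-order (two-dimensional) Monte Carlo risk model.
# Outer loop: uncertainty (prevalence, annual servings); inner loop: variability
# (pathogen dose per contaminated serving). All values are illustrative.
import numpy as np

rng = np.random.default_rng(1)
n_outer, n_inner = 200, 5_000
annual_cases = []

for _ in range(n_outer):
    prevalence = rng.uniform(0.001, 0.01)              # uncertain: fraction of contaminated servings (assumed)
    servings = rng.uniform(1e6, 5e6)                    # uncertain: salad-bar servings per year (assumed)
    dose = rng.lognormal(np.log(10), 1.0, n_inner)      # variable: CFU ingested per contaminated serving (assumed)
    r = 0.00156                                          # exponential dose-response parameter (assumed)
    p_ill = 1.0 - np.exp(-r * dose)                      # per-serving illness probability
    annual_cases.append(servings * prevalence * p_ill.mean())

print(f"cases/year: mean {np.mean(annual_cases):.0f}, "
      f"range {np.min(annual_cases):.0f}-{np.max(annual_cases):.0f}")
```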


2016 ◽  
Vol 74 (3) ◽  
pp. 749-755 ◽  
Author(s):  
P. Makkaew ◽  
M. Miller ◽  
H. J. Fallowfield ◽  
N. J. Cromar

This study assessed Escherichia coli contamination of lettuce grown with treated domestic wastewater under four different irrigation configurations: open spray, spray under plastic sheet cover, open drip, and drip under plastic sheet cover. Samples of lettuce from each irrigation configuration and of the irrigating wastewater were collected during the growing season. No E. coli was detected in lettuce from drip-irrigated beds. All lettuce samples from spray-irrigated beds were positive for E. coli; however, no statistical difference (p > 0.05) was detected between lettuce grown in open and covered spray beds. The results from the field experiment were also compared with a laboratory experiment that used submersion of lettuce in wastewater of known E. coli concentration as a surrogate method to assess contamination following irrigation. The microbial quality of spray-bed lettuce was not significantly different from that of submersed lettuce when irrigated with wastewater containing 1,299.7 E. coli MPN/100 mL (p > 0.05). This study is significant because it is the first to validate that the microbial contamination of lettuce irrigated with wastewater in the field is comparable to that obtained with a laboratory technique frequently applied in quantitative microbial risk assessment of the consumption of wastewater-irrigated salad crops.
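
The abstract reports a nonsignificant difference (p > 0.05) without naming the test; purely as an illustration, the sketch below compares two sets of hypothetical E. coli counts with a Mann-Whitney U test, one common choice for such skewed count data.

```python
# Illustrative only: comparing E. coli counts from field-irrigated and
# laboratory-submersed lettuce with a nonparametric test. The test choice and
# all counts are assumptions, not the study's data or method.
from scipy.stats import mannwhitneyu

field_mpn = [120, 85, 230, 60, 150]   # E. coli MPN/100 g, spray-irrigated lettuce (assumed)
lab_mpn = [100, 140, 90, 210, 75]     # E. coli MPN/100 g, submersed lettuce (assumed)

stat, p = mannwhitneyu(field_mpn, lab_mpn, alternative="two-sided")
print(f"U = {stat}, p = {p:.3f}")     # p > 0.05 would indicate no detectable difference
```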


2016 ◽  
Vol 82 (15) ◽  
pp. 4743-4756 ◽  
Author(s):  
Graham S. Banting ◽  
Shannon Braithwaite ◽  
Candis Scott ◽  
Jinyong Kim ◽  
Byeonghwa Jeon ◽  
...  

ABSTRACT Campylobacter spp. are the leading cause of bacterial gastroenteritis worldwide, and water is increasingly seen as a risk factor in transmission. Here we describe a most-probable-number (MPN)–quantitative PCR (qPCR) assay in which water samples are centrifuged and aliquoted into microtiter plates and the bacteria are enumerated by qPCR. We observed that commonly used Campylobacter molecular assays produced vastly different detection rates. In irrigation water samples, detection rates varied depending upon the PCR assay and culture method used, as follows: 0% by the de Boer Lv1-16S qPCR assay, 2.5% by the Van Dyke 16S and Jensen glyA qPCR assays, and 75% by the Linton 16S endpoint PCR when cultured at 37°C. Primer/probe specificity was the major confounder, with Arcobacter spp. routinely yielding false-positive results. The primers and PCR conditions described by Van Dyke et al. (M. I. Van Dyke, V. K. Morton, N. L. McLellan, and P. M. Huck, J Appl Microbiol 109:1053–1066, 2010, http://dx.doi.org/10.1111/j.1365-2672.2010.04730.x) proved to be the most sensitive and specific for Campylobacter detection in water. Campylobacter occurrence in irrigation water was found to be very low (<2 MPN/300 ml) when this Campylobacter-specific qPCR was used, with the most commonly detected species being C. jejuni, C. coli, and C. lari. Campylobacters in raw sewage were present at ∼10²/100 ml, with incubation at 42°C required to reduce microbial growth competition from arcobacters. Overall, when Campylobacter prevalence and/or concentration in water is reported using molecular methods, considerable validation is recommended when adapting methods largely developed for clinical applications. Furthermore, combining MPN methods with molecular biology-based detection algorithms allows for the detection and quantification of Campylobacter spp. in environmental samples and is potentially suited to quantitative microbial risk assessment for improved public health disease prevention related to food and water exposures. IMPORTANCE The results of this study demonstrate the importance of assay validation for the interpretation of environmental monitoring data for Campylobacter when using molecular biology-based assays. Previous studies describing Campylobacter prevalence in Canada utilized primers that we have determined to be nonspecific due to their cross-amplification of Arcobacter spp. As such, Campylobacter prevalence may have been vastly overestimated in other studies. Additionally, the quantitative assay developed in this study will allow accurate determination of Campylobacter concentrations in environmental water samples, allowing more informed decisions to be made about water usage based on quantitative microbial risk assessment.
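
The MPN side of an MPN-qPCR assay can be illustrated with the single-dilution case: under a Poisson assumption, the most likely concentration follows from the fraction of qPCR-negative wells. The function and numbers below are a generic sketch, not the assay's published calculation.

```python
# Sketch of a single-dilution most-probable-number (MPN) estimate. Assuming
# cells are Poisson-distributed across wells, the MPN per ml is
# -ln(fraction of negative wells) / volume per well. Numbers are illustrative.
import math

def mpn_single_dilution(positive_wells: int, total_wells: int, ml_per_well: float) -> float:
    """MPN per ml from one dilution level, assuming Poisson-distributed cells."""
    if positive_wells >= total_wells:
        raise ValueError("all wells positive: MPN is unbounded at a single dilution")
    neg_fraction = 1.0 - positive_wells / total_wells
    return -math.log(neg_fraction) / ml_per_well

# Example: 4 of 12 wells positive, 25 ml of concentrated sample per well
print(f"{mpn_single_dilution(4, 12, 25.0):.4f} MPN/ml")
```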

