Modelling the Potential Risk of Infection Associated with Listeria monocytogenes in Irrigation Water and Agricultural Soil in Two District Municipalities in South Africa

2022 ◽  
Vol 10 (1) ◽  
pp. 181
Author(s):  
Chidozie Declan Iwu ◽  
Chinwe Juliana Iwu-Jaja ◽  
Rami Elhadi ◽  
Lucy Semerjian ◽  
Anthony Ifeanyin Okoh

Listeria monocytogenes (L. monocytogenes) is the etiologic agent of listeriosis, which disproportionately affects immunocompromised individuals. The potential risk of infection attributed to L. monocytogenes in irrigation water and agricultural soil, which are key transmission pathways of microbial hazards to the human population, was evaluated using quantitative microbial risk assessment (QMRA) modelling. A Monte Carlo simulation with 10,000 iterations was used to characterize the risks. High counts of L. monocytogenes were documented in irrigation water (mean: 11.96 × 10² CFU/100 mL; range: 0.00 to 56.67 × 10² CFU/100 mL) and agricultural soil samples (mean: 19.64 × 10² CFU/g; range: 1.33 × 10² to 62.33 × 10² CFU/g). Consequently, high annual infection risks of 5.50 × 10⁻² (0.00 to 48.30 × 10⁻²), 54.50 × 10⁻² (9.10 × 10⁻³ to 1.00) and 70.50 × 10⁻² (3.60 × 10⁻² to 1.00) were observed for adults exposed to contaminated irrigation water, adults exposed to contaminated agricultural soil, and children exposed to agricultural soil, respectively. This study therefore documents a substantial public health threat attributed to the high probability of infection in humans exposed to L. monocytogenes in irrigation water and agricultural soil in the Amathole and Chris Hani District Municipalities in the Eastern Cape province of South Africa.
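For readers unfamiliar with the QMRA workflow summarized above, the following is a minimal Monte Carlo risk-characterization sketch in Python. Only the count range and the 10,000 iterations come from the abstract; the distribution shape, ingestion volume, exponential dose-response parameter r, and exposure frequency are illustrative assumptions, not the study's actual inputs.

```python
import numpy as np

rng = np.random.default_rng(42)
N = 10_000  # Monte Carlo iterations, as in the study

# Illustrative inputs: counts span the range reported in the abstract,
# but everything else below is an assumption for the sketch.
conc = rng.uniform(0.0, 56.67e2, N)      # L. monocytogenes, CFU per 100 mL
ingested_ml = rng.uniform(1.0, 5.0, N)   # accidental ingestion per event, mL (assumed)
r = 1.0e-6                               # exponential dose-response parameter (illustrative)
events_per_year = 70                     # irrigation exposure events per year (assumed)

dose = conc / 100.0 * ingested_ml                    # CFU ingested per event
p_event = 1.0 - np.exp(-r * dose)                    # P(infection | dose)
p_annual = 1.0 - (1.0 - p_event) ** events_per_year  # aggregate over the year

print(f"annual infection risk, median: {np.median(p_annual):.2e}, "
      f"95th percentile: {np.percentile(p_annual, 95):.2e}")
```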

2020 ◽  
Vol 8 (8) ◽  
pp. 1206 ◽  
Author(s):  
Chidozie Declan Iwu ◽  
Erika M du Plessis ◽  
Lise Korsten ◽  
Nolonwabo Nontongana ◽  
Anthony Ifeanyi Okoh

This study was undertaken to evaluate the antibiogram fingerprints of some Enterobacteria recovered from irrigation water and agricultural soil in two District Municipalities of the Eastern Cape Province, South Africa, using standard culture-based and molecular methods. The prevalent resistance patterns in the isolates followed the order: Salmonella enterica serovar Typhimurium [tetracycline (92.3%), ampicillin (69.2%)]; Enterobacter cloacae [amoxicillin/clavulanic acid (77.6%), ampicillin (84.5%), cefuroxime (81.0%), nitrofurantoin (81.0%), and tetracycline (80.3%)]; Klebsiella pneumoniae [amoxicillin/clavulanic acid (80.6%), ampicillin (88.9%), and cefuroxime (61.1%)]; and Klebsiella oxytoca [chloramphenicol (52.4%), amoxicillin/clavulanic acid (61.9%), ampicillin (61.9%), and nitrofurantoin (61.9%)]. Antibiotic resistance genes detected include tetC (86%), sulII (86%), and blaAmpC (29%) in Salmonella enterica serovar Typhimurium; tetA (23%), tetB (23%), tetC (12%), sulI (54%), sulII (54%), catII (71%), blaAmpC (86%), blaTEM (43%), and blaPER (17%) in Enterobacter cloacae; tetA (20%), tetC (20%), tetD (10%), sulI (9%), sulII (18%), FOX (11%)- and CIT (11%)-type plasmid-mediated AmpC, blaTEM (11%), and blaSHV (5%) in Klebsiella pneumoniae; and blaAmpC (18%) in Klebsiella oxytoca. Our findings document the occurrence of antibiotic-resistant Enterobacteria in irrigation water and agricultural soil in the Amathole and Chris Hani District Municipalities, Eastern Cape Province of South Africa, and thus a potential threat to food safety.


2006 ◽  
Vol 5 (1) ◽  
pp. 117-128 ◽  
Author(s):  
Caroline Schönning ◽  
Therese Westrell ◽  
Thor Axel Stenström ◽  
Karsten Arnbjerg-Nielsen ◽  
Arne Bernt Hasling ◽  
...  

Dry urine-diverting toilets may be used to collect excreta for the utilisation of nutrients. A quantitative microbial risk assessment was conducted to evaluate the risks of transmission of infectious disease related to the local use of faeces as a fertiliser. The human exposures evaluated included accidental ingestion of small amounts of faeces, or a mixture of faeces and soil, while emptying the storage container and applying the material in the garden, during recreational stays in the garden, and during gardening. A range of pathogens representing various groups of microorganisms was considered. Results showed that 12 months' storage before use was sufficient for the inactivation of most pathogens to acceptable levels. When working or spending time in the garden, the annual risk of infection by Ascaris was still slightly above 10⁻⁴ in these scenarios, although the incidence rate for Ascaris is very low in the population in question. Measures to further reduce the hygienic risks include longer storage, or treatment, of the faeces. The results can easily be extended to other regions with different incidence rates.
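The 10⁻⁴ figure above is an annual probability aggregated over repeated exposures. A small sketch of that aggregation step, with placeholder per-event probabilities and exposure frequency rather than the paper's scenario-specific values:

```python
def annual_risk(p_event: float, n_events: int) -> float:
    """Annual infection risk from n independent exposure events,
    each with per-event infection probability p_event."""
    return 1.0 - (1.0 - p_event) ** n_events

# Placeholder values: roughly weekly garden exposure over a year.
for p_event in (1e-7, 1e-6, 1e-5):
    print(f"p_event={p_event:.0e} -> annual={annual_risk(p_event, 52):.2e} "
          f"(benchmark 1e-4)")
```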


Pathogens ◽  
2020 ◽  
Vol 9 (10) ◽  
pp. 778
Author(s):  
Umesh Adhikari ◽  
Elaheh Esfahanian ◽  
Jade Mitchell ◽  
Duane Charbonneau ◽  
Xiangyu Song ◽  
...  

Handwashing with soap is an effective and economical means to reduce the likelihood of Escherichia coli infection from indirect contact with contaminated surfaces during food preparation. The purpose of this study was to conduct a quantitative microbial risk assessment (QMRA) to evaluate the risk of infection from indirect contact with fomites contaminated with E. coli after handwashing with antimicrobial hand soaps. A Monte Carlo simulation with a total of 10,000 iterations was performed to compare the effectiveness of two antimicrobial bar soaps and one control (non-antimicrobial) bar soap in reducing exposure and infection risk relative to no handwashing. The numbers of E. coli on several fomites commonly found in household kitchens, as well as the transfer rates between fomites and onto fingertips, were collected from the literature and experimental data. Data on E. coli survival on hands after washing with the antimicrobial and control soaps were provided by the sponsor company. A number of scenarios were evaluated at two different exposure doses (high and low). Exposure scenarios included transfer of E. coli from meat to cutting-board surface to hands, from meat to knife surface to hands, and from countertop surface, kitchen sponge, hand towel, or dishcloth to hands. Results showed that the risk of illness after washing with the control soap was reduced approximately 5-fold compared to no handwashing. Washing with antimicrobial soap reduced the risk of E. coli infection by an average of about 40-fold compared with no handwashing. The antimicrobial soaps were 3 to 32 times more effective than the non-antimicrobial soap, depending on the specific exposure scenario. Importance: The Centers for Disease Control and Prevention indicate the yearly incidence rate of Shiga toxin-producing E. coli infections is about 1.7/100,000, with about 10% of cases leading to life-threatening hemolytic uremic syndrome and 3–5% leading to death. Our findings confirm that handwashing with soap reduces the risks associated with indirect transmission of E. coli infection from contact with fomites during food preparation. Further, in these exposure scenarios, antimicrobial soaps were more effective overall than the non-antimicrobial soap in reducing exposure to E. coli and risk of infection.
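The fold-reduction comparisons above follow mechanically once a log-reduction on hands is inserted into the fomite-to-mouth exposure chain. A minimal sketch of that comparison; the contamination levels, transfer fractions, dose-response parameter, and assumed log removals are placeholders, not the study's literature or sponsor-provided data:

```python
import numpy as np

rng = np.random.default_rng(1)
N = 10_000

# Hypothetical contamination and transfer parameters (placeholders).
surface_cfu = rng.lognormal(mean=3.0, sigma=1.0, size=N)  # E. coli on the fomite
transfer_frac = rng.uniform(0.01, 0.3, N)                 # fomite -> fingertip
hand_to_mouth = 0.34                                      # fingertip -> mouth (assumed)
k = 2.18e-4  # illustrative exponential dose-response parameter

def infection_risk(log10_reduction_on_hands: float) -> np.ndarray:
    dose = (surface_cfu * transfer_frac * hand_to_mouth
            * 10.0 ** (-log10_reduction_on_hands))
    return 1.0 - np.exp(-k * dose)  # exponential dose-response

no_wash = infection_risk(0.0)
control = infection_risk(1.0)        # assume ~1-log removal, plain soap
antimicrobial = infection_risk(2.0)  # assume ~2-log removal

print("control soap fold reduction:      ", np.mean(no_wash) / np.mean(control))
print("antimicrobial soap fold reduction:", np.mean(no_wash) / np.mean(antimicrobial))
```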


2008 ◽  
Vol 6 (4) ◽  
pp. 461-471 ◽  
Author(s):  
Razak Seidu ◽  
Arve Heistad ◽  
Philip Amoah ◽  
Pay Drechsel ◽  
Petter D. Jenssen ◽  
...  

Quantitative Microbial Risk Assessment (QMRA) models with 10,000 Monte Carlo simulations were applied to ascertain the risks of rotavirus and Ascaris infections for farmers using different irrigation water qualities and for consumers of lettuce irrigated with those water qualities, after allowing for post-harvest handling. Tolerable risks (TR) of infection of 7.7 × 10⁻⁴ and 1 × 10⁻² per person per year were used for rotavirus and Ascaris, respectively. The risk of Ascaris infection was of the order of 10⁻² for farmers accidentally ingesting drain or stream irrigation water, ∼10⁰ for farmers accidentally ingesting farm soil, and 10⁰ for farmers ingesting any of the irrigation waters together with contaminated soil. There was a very low risk (10⁻⁵) of Ascaris infection for farmers using pipe water. For consumers, the annual risks of Ascaris and rotavirus infections were 10⁰ and 10⁻³ for drain- and stream-irrigated lettuce, respectively, with slight increases in rotavirus infection risk along the post-harvest handling chain. Pipe-irrigated lettuce recorded a rotavirus infection risk of 10⁻⁴, with no changes due to post-harvest handling. The assessment identified on-farm soil contamination as the most significant health hazard.
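The rotavirus risks above come from a dose-response step; a common choice in QMRA is the approximate beta-Poisson model. A sketch follows, using alpha and N50 values commonly cited for rotavirus in the literature; whether this study used exactly these parameters is an assumption:

```python
def beta_poisson(dose: float, alpha: float = 0.253, n50: float = 6.17) -> float:
    """Approximate beta-Poisson dose-response. alpha and N50 are widely
    cited rotavirus literature values (assumed, not necessarily this
    study's exact parameters)."""
    return 1.0 - (1.0 + dose * (2.0 ** (1.0 / alpha) - 1.0) / n50) ** (-alpha)

def annual_risk(p_event: float, n_events: int) -> float:
    return 1.0 - (1.0 - p_event) ** n_events

# Illustrative consumer scenario: 1 virion per lettuce serving, 100 servings/year.
p = beta_poisson(1.0)
print(f"per-serving risk: {p:.3f}, annual risk: {annual_risk(p, 100):.3f}")
```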


2005 ◽  
Vol 68 (5) ◽  
pp. 913-918 ◽  
Author(s):  
SCOTT W. STINE ◽  
INHONG SONG ◽  
CHRISTOPHER Y. CHOI ◽  
CHARLES P. GERBA

Microbial contamination of the surfaces of cantaloupe, iceberg lettuce, and bell peppers via contact with irrigation water was investigated to aid in the development of irrigation water quality standards for enteric bacteria and viruses. Furrow and subsurface drip irrigation methods were evaluated with the use of the nonpathogenic surrogates coliphage PRD1 and Escherichia coli ATCC 25922. The concentrations of hepatitis A virus (HAV) and Salmonella in irrigation water necessary to achieve a 1:10,000 annual risk of infection, the acceptable level of risk used for drinking water by the U.S. Environmental Protection Agency, were calculated with a quantitative microbial risk assessment approach. These calculations were based on the transfer of the selected nonpathogenic surrogates to fresh produce via irrigation water, as well as previously determined preharvest inactivation rates of pathogenic microorganisms on the surfaces of fresh produce. The risk of infection was found to vary with the type of crop, the irrigation method, and the number of days between the last irrigation event and harvest. The worst-case scenario, in which produce is harvested and consumed the day after the last irrigation event and maximum exposure is assumed, indicated that concentrations of 2.5 CFU/100 ml of Salmonella and 2.5 × 10⁻⁵ most probable number per 100 ml of HAV in irrigation water would result in an annual risk of 1:10,000 when the crop was consumed. If 14 days elapsed before harvest, allowing for die-off of the pathogens, the allowable concentrations increased to 5.7 × 10³ Salmonella per 100 ml and 9.9 × 10⁻³ HAV per 100 ml.
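The paper's key move is inverting the dose-response relationship to back-calculate the water concentration that meets the 1:10,000 annual risk target. A sketch of that inversion under an exponential dose-response model; the parameter r, the daily exposure frequency, and the volume of irrigation water retained on produce are illustrative assumptions, not the study's fitted values:

```python
import math

target_annual = 1e-4        # US EPA drinking-water benchmark used in the study
events_per_year = 365       # daily consumption (assumed)
r = 0.00752                 # exponential dose-response parameter (illustrative)
water_retained_ml = 0.108   # irrigation water retained per serving, mL (assumed)

# Per-event risk consistent with the annual target, then invert
# P = 1 - exp(-r * dose) to get the allowable dose and concentration.
p_event = 1.0 - (1.0 - target_annual) ** (1.0 / events_per_year)
dose = -math.log(1.0 - p_event) / r
conc_per_100ml = dose / water_retained_ml * 100.0

print(f"allowable concentration: {conc_per_100ml:.4f} organisms per 100 mL")
```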


2006 ◽  
Vol 72 (5) ◽  
pp. 3284-3290 ◽  
Author(s):  
Andrew J. Hamilton ◽  
Frank Stagnitti ◽  
Robert Premier ◽  
Anne-Maree Boland ◽  
Glenn Hale

Quantitative microbial risk assessment models were constructed for estimating the annual risk of enteric virus infection associated with consuming raw vegetables that have been overhead-irrigated with nondisinfected secondary treated reclaimed water. We ran models for several different scenarios of crop type, viral concentration in effluent, and time since the last irrigation event. The mean annual risk of infection was always lower for cucumber than for broccoli, cabbage, or lettuce. Across the various crops, effluent qualities, and viral decay rates considered, the annual risk of infection ranged from 10⁻³ to 10⁻¹ when reclaimed-water irrigation ceased 1 day before harvest and from 10⁻⁹ to 10⁻³ when it ceased 2 weeks before harvest. Two previously published decay coefficients were used to describe the die-off of viruses in the environment. For all combinations of crop type and effluent quality, application of the more aggressive decay coefficient led to annual risks of infection that satisfied the commonly propounded benchmark of ≤10⁻⁴, i.e., one infection or less per 10,000 people per year, provided that 14 days had elapsed since irrigation with reclaimed water. Conversely, this benchmark was not attained for any combination of crop and water quality when the withholding period was 1 day. The lower decay rate conferred markedly less protection, with broccoli and cucumber being the only crops satisfying the 10⁻⁴ standard for all water qualities after a 14-day withholding period. Sensitivity analyses on the models revealed that in nearly all cases, variation in the amount of produce consumed had the most significant effect on the total uncertainty surrounding the estimate of annual infection risk. The models presented cover what would generally be considered worst-case scenarios: overhead irrigation and consumption of vegetables raw. Practices such as subsurface, furrow, or drip irrigation and postharvest washing/disinfection and food preparation could substantially lower risks and need to be considered in future models, particularly for developed nations where these extra risk-reduction measures are more common.
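The effect of the withholding period comes from first-order viral die-off between the last irrigation and harvest. A sketch of that step; the decay coefficients below are placeholders, not the two published values the paper actually compared:

```python
import math

def surviving_fraction(k_per_day: float, days: float) -> float:
    """First-order die-off between last irrigation and harvest:
    C(t)/C0 = exp(-k * t)."""
    return math.exp(-k_per_day * days)

# Placeholder decay coefficients (per day), for illustration only.
for k in (0.69, 1.07):
    for days in (1, 14):
        print(f"k={k}/day, withholding {days:2d} d: "
              f"{surviving_fraction(k, days):.2e} of viruses remain")
```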


2016 ◽  
Vol 82 (15) ◽  
pp. 4743-4756 ◽  
Author(s):  
Graham S. Banting ◽  
Shannon Braithwaite ◽  
Candis Scott ◽  
Jinyong Kim ◽  
Byeonghwa Jeon ◽  
...  

Campylobacter spp. are the leading cause of bacterial gastroenteritis worldwide, and water is increasingly seen as a risk factor in transmission. Here we describe a most-probable-number (MPN)–quantitative PCR (qPCR) assay in which water samples are centrifuged and aliquoted into microtiter plates and the bacteria are enumerated by qPCR. We observed that commonly used Campylobacter molecular assays produced vastly different detection rates. In irrigation water samples, detection rates varied depending upon the PCR assay and culture method used, as follows: 0% by the de Boer Lv1-16S qPCR assay, 2.5% by the Van Dyke 16S and Jensen glyA qPCR assays, and 75% by the Linton 16S endpoint PCR when cultured at 37°C. Primer/probe specificity was the major confounder, with Arcobacter spp. routinely yielding false-positive results. The primers and PCR conditions described by Van Dyke et al. (M. I. Van Dyke, V. K. Morton, N. L. McLellan, and P. M. Huck, J Appl Microbiol 109:1053–1066, 2010, http://dx.doi.org/10.1111/j.1365-2672.2010.04730.x) proved to be the most sensitive and specific for Campylobacter detection in water. Campylobacter occurrence in irrigation water was found to be very low (<2 MPN/300 ml) when this Campylobacter-specific qPCR was used, with the most commonly detected species being C. jejuni, C. coli, and C. lari. Campylobacters in raw sewage were present at ∼10²/100 ml, with incubation at 42°C required for reducing microbial growth competition from arcobacters. Overall, when Campylobacter prevalence and/or concentration in water is reported using molecular methods, considerable validation is recommended when adapting methods largely developed for clinical applications. Furthermore, combining MPN methods with molecular biology-based detection algorithms allows for the detection and quantification of Campylobacter spp. in environmental samples and is potentially suited to quantitative microbial risk assessment for improved public health disease prevention related to food and water exposures.

Importance: The results of this study demonstrate the importance of assay validation upon data interpretation of environmental monitoring for Campylobacter when using molecular biology-based assays. Previous studies describing Campylobacter prevalence in Canada utilized primers that we have determined to be nonspecific due to their cross-amplification of Arcobacter spp. As such, Campylobacter prevalence may have been vastly overestimated in other studies. Additionally, the development of a quantitative assay described in this study will allow accurate determination of Campylobacter concentrations in environmental water samples, allowing more informed decisions to be made about water usage based on quantitative microbial risk assessment.
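The MPN-qPCR enumeration rests on the standard most-probable-number calculation: replicate wells are scored positive or negative by qPCR, and the MPN follows from a Poisson assumption on how organisms distribute across wells. A single-dilution sketch; the well layout and volumes below are illustrative, not the paper's actual plate design:

```python
import math

def mpn(positive_wells: int, total_wells: int, volume_per_well: float) -> float:
    """Single-dilution MPN per unit volume: with n wells of volume v and
    p positives, MPN = -ln((n - p) / n) / v (Poisson assumption)."""
    if positive_wells >= total_wells:
        raise ValueError("all wells positive: MPN is unbounded at one dilution")
    return -math.log((total_wells - positive_wells) / total_wells) / volume_per_well

# e.g. 3 of 24 wells positive, each well representing 12.5 mL of a 300-mL sample
print(f"{mpn(3, 24, 12.5) * 300:.2f} MPN per 300 mL")
```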

