Thermal and Nonthermal Factors Affecting Survival of Salmonella and Listeria monocytogenes in Animal Manure–Based Compost Mixtures

2014 · Vol. 77 (9) · pp. 1512–1518
Author(s): M. C. ERICKSON, J. LIAO, L. MA, X. JIANG, M. P. DOYLE

Reduction of enteric pathogens in animal manures before field application is essential for mitigating the risk of foodborne illness associated with produce. Aerobic composting of manures has been advocated as an effective treatment for reducing pathogen populations, and heat is a major factor contributing to pathogen inactivation. This study was initiated to determine the potential contribution of both thermal and nonthermal (pH, volatile acids, and ammonia) factors to pathogen inactivation during aerobic composting in bioreactors for mixtures containing manure from various sources (dairy, chicken, and swine). The test mixtures were formulated with an initial moisture content of 60% and a C:N ratio of 20:1, using straw and cottonseed meal as amendments. Mixtures were then inoculated with Salmonella and Listeria monocytogenes labeled with green fluorescent protein at initial populations of ca. 10⁷ CFU/g. Three replicate trials of each treatment were conducted. Temperatures within the bioreactors were recorded at 30-min intervals, and duplicate samples were withdrawn daily from two sampling locations within the bioreactor. Significant regression models were derived relating decreases in pathogen populations to the degree of heat generated in the mixture (cumulative heat) and the pH of the mixture on the day before the pathogen losses were calculated (P < 0.0002). Although pathogens in swine manure compost mixtures were inactivated by the third day of composting, very little heat was generated in these mixtures, which were characterized by significantly higher levels of volatile acids than the other two compost mixtures. Therefore, volatile acids could help achieve pathogen inactivation when temperatures are too low, such as when heat is lost too quickly at the surface of static compost piles or during winter composting.
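As a rough illustration of the "cumulative heat" covariate described above, the sketch below accumulates degree-hours above a reference temperature from readings logged at 30-min intervals. This is a hypothetical reconstruction, not the study's actual metric: the reference temperature and the degree-hour definition are assumptions.

```python
# Hypothetical sketch: accumulate "cumulative heat" as degree-hours above an
# assumed reference temperature, from bioreactor readings at 30-min intervals.
# Neither the reference value nor the metric definition is from the abstract.

def cumulative_heat(temps_c, interval_h=0.5, t_ref=20.0):
    """Degree-hours accumulated above t_ref over a series of readings."""
    return sum(max(t - t_ref, 0.0) * interval_h for t in temps_c)

# Example: 24 h of readings (48 samples) ramping linearly from 25 to 60 °C
readings = [25 + (60 - 25) * i / 47 for i in range(48)]
print(round(cumulative_heat(readings), 1))  # 540.0 degree-hours
```

A regression of daily log-count losses against a running total like this (plus the previous day's pH) would mirror the model structure the abstract describes.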

2015 · Vol. 78 (2) · pp. 302–310
Author(s): MARILYN C. ERICKSON, CHRIS SMITH, XIUPING JIANG, IAN D. FLITCROFT, MICHAEL P. DOYLE

Heat is the primary mechanism by which aerobic composting inactivates zoonotic bacterial pathogens residing within animal manures, but at sublethal temperatures, the time necessary to hold the compost materials to ensure pathogen inactivation is uncertain. To determine the influence of the type of nitrogen amendment on inactivation of Salmonella, Listeria monocytogenes, and Escherichia coli O157:H7 in compost mixtures stored at sublethal temperatures, the specific variables investigated in these studies included the animal source of the manure, the initial carbon-to-nitrogen (C:N) ratio of the compost mixture, and the age of the manure. Salmonella and L. monocytogenes were both inactivated more rapidly in chicken and swine compost mixtures stored at 20°C when formulated to an initial C:N ratio of 20:1 rather than 40:1, whereas the C:N ratio had no effect on inactivation of these pathogens in cow compost mixtures. Pathogen inactivation was related to the elevated pH of the samples, which likely arises from ammonia produced by the indigenous microflora in the compost mixtures. Indigenous microbial activity was reduced when compost mixtures were stored at 30°C and drier conditions (<10% moisture) prevailed. Furthermore, under these drier conditions, Salmonella persisted to a greater extent than L. monocytogenes, and the desiccation resistance of Salmonella appeared to confer cross-protection against ammonia. Salmonella also persisted longer in compost mixtures prepared with aged chicken litter than with fresh chicken litter, whereas E. coli O157:H7 survived to similar extents in compost mixtures prepared with either fresh or aged cow manure. The different responses observed when different sources of manure were used reveal that guidelines specifying the times required for pathogen inactivation in compost mixtures stored at sublethal temperatures should depend on the nitrogen source, i.e., the type of animal manure present.
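The initial C:N ratios compared above (20:1 vs. 40:1) are set by blending manure with a carbon-rich amendment. A minimal sketch of the mass-balance arithmetic, using illustrative ingredient compositions rather than values from the study:

```python
# Illustrative sketch: the C:N ratio of a compost mixture is the mass-weighted
# total carbon divided by the mass-weighted total nitrogen of its ingredients.
# The compositions below are typical textbook figures, not study data.

def mix_cn_ratio(ingredients):
    """ingredients: list of (dry_mass_kg, carbon_fraction, nitrogen_fraction)."""
    carbon = sum(m * cf for m, cf, nf in ingredients)
    nitrogen = sum(m * nf for m, cf, nf in ingredients)
    return carbon / nitrogen

# e.g. manure (C:N ~ 10:1) blended with straw (C:N ~ 80:1)
mix = [(10.0, 0.35, 0.035),   # manure: 35% C, 3.5% N by dry mass
       (5.0, 0.40, 0.005)]    # straw:  40% C, 0.5% N by dry mass
print(round(mix_cn_ratio(mix), 1))  # 14.7, i.e. roughly 15:1
```

Raising the straw fraction pushes the blend toward the 40:1 end of the range studied.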


Author(s): D. C. Preethu, S. M. Savita, M. S. Dinesha, B. S. Rajendra Prasad, Lata R. Kulkarni

Aim: The aim of this study was to evaluate the effectiveness of various microbial compost cultures for aerobic composting of farm wastes. Place of Study: Three trials were conducted on farmers' fields and one at Krishi Vigyana Kendra (KVK), Ramanagara district. Methodology: During the composting process, days to compost maturity (in terms of changes in temperature and pH) and composting dynamics were studied. Compost quality parameters such as macro- and micronutrients, C:N ratio, and stability of the compost were recorded at different intervals. Results: The compost cultures from IIHR and UASB took 90 and 105 days, respectively, for complete stabilization; they also showed relatively higher temperature and pH during the initial phase and reached ambient conditions at the maturity stage. The C:N ratio showed a gradual reduction from 39.65 to 15.98 and from 39.75 to 13.66 in the IIHR and UASB cultures, respectively. Both cultures also had high macro-, secondary, and micronutrient contents (IIHR: 1.55% N, 0.93% P, 0.95% K, 4.39% Ca, 0.69% Mg, 0.19% S, 930 ppm Fe, 10 ppm Cu, 305 ppm Mn, 82 ppm Zn, 26 ppm B; UASB: 1.59% N, 0.91% P, 0.97% K, 4.25% Ca, 0.88% Mg, 0.21% S, 948 ppm Fe, 9 ppm Cu, 325 ppm Mn, 93 ppm Zn, 28 ppm B) and resulted in greater compost production (3.3 and 2.8 t/year, respectively), with B:C ratios of 6.67 and 7.25, respectively, compared with NCOF (T3) and farmers' practice (T4). Conclusion: Aerobic composting of farm waste using the UASB and IIHR microbial cultures proved to be an effective technology for converting organic farm waste into valuable organic manure, with the advantage of minimizing the environmental contamination associated with burning of residues.


Electronics · 2018 · Vol. 7 (12) · pp. 347
Author(s): Maria Chiriacò, Ilaria Parlangeli, Fausto Sirsi, Palmiro Poltronieri, Elisabetta Primiceri

A great improvement in food safety and quality controls worldwide has been achieved through the development of biosensing platforms. Foodborne pathogens continue to cause serious outbreaks due to the ingestion of contaminated food. The development of new, sensitive, portable, high-throughput, and automated platforms is a primary objective to allow detection of pathogens and their toxins in foods. Listeria monocytogenes is a common foodborne pathogen. Major outbreaks of listeriosis have been caused by a variety of foods, including milk, soft cheeses, meat, fermented sausages, poultry, seafood, and vegetable products. Owing to its high sensitivity and easy setup, electrochemical impedance spectroscopy (EIS) has been extensively applied for biosensor fabrication, and in particular in the field of microbiology as a means to detect and quantify foodborne bacteria. Here we describe a miniaturized, portable EIS platform consisting of a microfluidic device with EIS sensors for the detection of L. monocytogenes in milk samples, connected to a portable impedance analyzer for on-field application in clinical and food diagnostics, as well as for biosecurity purposes. To achieve this goal, microelectrodes were functionalized with antibodies specific for L. monocytogenes. Binding and detection of L. monocytogenes were achieved in the range from 2.2 × 10³ to 1 × 10² CFU/mL, with a limit of detection (LoD) of 5.5 CFU/mL.
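The abstract does not state how the limit of detection was derived; for context, a common convention for calibration-based sensors is the 3σ/slope rule. A sketch under assumed numbers (blank replicates and a calibration slope that are purely illustrative, not data from the paper):

```python
# Hypothetical sketch of the common 3-sigma LoD rule for a calibrated sensor:
# LoD = 3 * sd(blank signal) / calibration slope. With EIS, the calibration is
# often fit against log10(concentration), so the result here is a resolvable
# step in log10(CFU/mL). All numbers are illustrative assumptions.
import statistics

def lod_3sigma(blank_signals, slope):
    """3 * sample std dev of blanks divided by slope, in x-axis units."""
    return 3 * statistics.stdev(blank_signals) / slope

blanks = [1.00, 1.02, 0.98, 1.01, 0.99]  # normalized impedance change of blanks
slope = 0.20                             # assumed signal per log10(CFU/mL)
print(round(lod_3sigma(blanks, slope), 3))  # ≈ 0.237
```

The actual LoD reported above (5.5 CFU/mL) would come from applying whatever rule the authors used to their own calibration data.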


2006 · Vol. 69 (4) · pp. 794–800
Author(s): K. K. NIGHTINGALE, H. THIPPAREDDI, R. K. PHEBUS, J. L. MARSDEN, A. L. NUTSCH

Italian-style salami batter (formulated with pork shoulder) was inoculated with ca. 7.0 log CFU/g of either Salmonella or Listeria monocytogenes. Salami links (55-mm cellulose casings) were fermented at 30°C for 24, 40, or 72 h and then dried to target moisture/protein ratios (MPRs) of 1.9:1 or 1.4:1. Links were sampled after fermentation (24, 40, and 72 h) and after combined fermentation-drying treatments (MPRs of 1.9:1 and 1.4:1 for all fermentation periods), and microbiological and proximate analyses were performed at each sampling. Pathogen populations were enumerated by direct plating on selective agar and by an injured-cell recovery method. When enumerated by the injured-cell recovery method, Salmonella populations were reduced by 1.2 to 2.1 log CFU/g after fermentation alone (24 to 72 h) and by 2.4 to 3.4 log CFU/g when fermentation was followed by drying. Drying to an MPR of 1.4:1 was no more effective than drying to an MPR of 1.9:1 (P > 0.05). When enumerated directly on selective media, Salmonella populations were reduced by 1.6 to 2.4 log CFU/g after fermentation alone and by 3.6 to 4.5 log CFU/g when fermentation was followed by drying. L. monocytogenes populations were reduced by <1.0 log CFU/g following all fermentation and combined fermentation-drying treatments, regardless of the enumeration method. These results suggest that the Italian-style salami manufacturing process evaluated does not adequately reduce high pathogen loads. Processors may thus need to consider supplemental measures, such as raw material specifications and a final heating step, to enhance the lethality of the overall manufacturing process.
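The reductions reported above are differences of log-transformed counts. A minimal sketch of that arithmetic, with illustrative counts:

```python
# Minimal sketch of log-reduction arithmetic: reduction in "log CFU/g" is
# log10(initial count) - log10(final count). Counts here are illustrative.
import math

def log_reduction(n0_cfu_per_g, n_cfu_per_g):
    """Log10 reduction between an initial and a final plate count."""
    return math.log10(n0_cfu_per_g) - math.log10(n_cfu_per_g)

# e.g. 10^7 CFU/g falling to 10^4.6 CFU/g is a 2.4-log reduction
print(round(log_reduction(1e7, 10**4.6), 1))  # 2.4
```

Note that a "<1.0 log" outcome, as seen for L. monocytogenes, means the final count stayed within a factor of 10 of the inoculum.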


Plant Disease · 2000 · Vol. 84 (2) · pp. 177–181
Author(s): William S. Conway, Britta Leverentz, Robert A. Saftner, Wojciech J. Janisiewicz, Carl E. Sams, ...

The foodborne human pathogen Listeria monocytogenes survived, and its populations increased, on cv. Delicious apple slices at 10 or 20°C in air or in a controlled atmosphere of 0.5% O₂ and 15% CO₂, but the pathogen did not grow at 5°C. The controlled atmosphere had no significant effect on the survival or growth of L. monocytogenes. Pathogen populations declined over time in various concentrations of apple juice, and the decline was greater as the juice concentration decreased. Populations of L. monocytogenes inoculated into decayed apple tissue continually increased on fruit decayed by Glomerella cingulata but did not survive beyond 5 days on fruit decayed by Penicillium expansum. The pH of the decayed area declined from 4.7 to 3.7 in the case of P. expansum but increased from 4.7 to 7.0 in the case of G. cingulata. This pH modification may account for the differing growth of the foodborne pathogen. Storage temperature, as well as the absence of postharvest pathogens such as G. cingulata, is important for maintaining the safety of fresh-cut apples.


2014 · Vol. 81 (1) · pp. 113–119
Author(s): Bjørn C. T. Schirmer, Even Heir, Bjørn-Arne Lindstedt, Trond Møretrø, Solveig Langsrud

The aim of the study was to investigate how the use of fresh cheese brines, compared with used brines and various combinations of pH and NaCl concentration, affected the survival of Listeria monocytogenes. Cheese brines from five Norwegian small-scale cheese producers were analysed and showed great variation in pH (4·54–6·01) and NaCl concentration (14·1–26·9%). The survival of five strains of List. monocytogenes (two clinical isolates, two food isolates and one animal isolate) in four different cheese brines (three used and one fresh) was investigated. Results showed significant differences in survival depending on both the strain and the brine. Strains from human listeriosis outbreak cases showed a greater ability to survive in the brines than the food isolates and a List. monocytogenes reference strain (1–2 log10 difference after 200 d). All strains showed the highest survival in the freshly prepared brine compared with the used brines. Molecular typing by multiple-locus variable-number tandem-repeat analysis (MLVA) showed no detectable alterations in the examined variable-number tandem repeats of the genome in the five strains after 200 d of storage in any of the salt brines. The combined effects of pH (4·5, 5·25 and 6·0) and NaCl (15, 20 and 25%) in fresh, filter-sterilised brines on the survival of List. monocytogenes were examined, and results showed that pathogen populations decreased over time in all brines. Death rates at any given NaCl concentration were highest at low pH (4·5), and death rates at any given pH were highest at low NaCl concentrations (15%). In conclusion, reuse of brines reduced the survival of List. monocytogenes, and a combination of low pH (4·5) and low salt concentration (15%) decreased the risk of List. monocytogenes survival compared with higher pH (5·25 or 6·0) and higher NaCl concentrations (20 or 25%).


2007 · Vol. 70 (2) · pp. 378–385
Author(s): ALEXANDRA LIANOU, IFIGENIA GEORNARAS, PATRICIA A. KENDALL, KEITH E. BELK, JOHN A. SCANGA, ...

Commercial cured ham formulated with or without potassium lactate and sodium diacetate was inoculated with Listeria monocytogenes and stored to simulate conditions of processing, retail, and home storage. The ham was sliced, inoculated with a 10-strain composite of L. monocytogenes (1 to 2 log CFU/cm²), vacuum packaged, and stored at 4°C to simulate contamination following the lethality treatment at processing (first shelf life). After 10, 20, 35, and 60 days of storage, packages were opened, samples were tested, and bags with remaining slices were reclosed with rubber bands. At the same times, portions of original product (stored at 4°C in original processing bags) were sliced, inoculated, and packaged in delicatessen bags to simulate contamination during slicing at retail (second shelf life). Aerobic storage of both sets of packages at 7°C for 12 days was used to reflect domestic storage conditions (home storage). L. monocytogenes populations were lower (P < 0.05) during storage in ham formulated with lactate-diacetate than in product without antimicrobials under both contamination scenarios. Inoculation of ham without lactate-diacetate allowed prolific growth of L. monocytogenes in vacuum packages during the first shelf life and was the worst-case contamination scenario with respect to pathogen numbers encountered during home storage. Under the second shelf life contamination scenario, mean growth rates of the organism during home storage ranged from 0.32 to 0.45 and from 0.18 to 0.25 log CFU/cm²/day for ham without and with lactate-diacetate, respectively, and significant increases in pathogen numbers (P < 0.05) were generally observed after 4 and 8 days of storage, respectively. Regardless of contamination scenario, 12-day home storage of product without lactate-diacetate resulted in similar pathogen populations (6.0 to 6.9 log CFU/cm²) (P ≥ 0.05). In ham containing lactate-diacetate, similar counts were found during the home storage experiment under both contamination scenarios, and only in 60-day-old product did samples from the first shelf life have higher (P < 0.05) pathogen numbers than those found in samples from the second shelf life. These results should be useful in risk assessments and for the establishment of “sell by” and “consume by” date labels for refrigerated ready-to-eat meat products.
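Because the growth rates above are expressed in log CFU/cm² per day, pathogen numbers during home storage can be projected linearly in log space. A sketch with illustrative values, not study data:

```python
# Illustrative sketch: with a constant growth rate r in log CFU/cm^2 per day,
# the projected count after t days is N(t) = N0 + r * t (all in log units).
# The starting count and rate below are examples, not figures from the study.

def project_log_count(n0_log, rate_log_per_day, days):
    """Linear projection of a log-scale count under a constant growth rate."""
    return n0_log + rate_log_per_day * days

# e.g. 1.5 log CFU/cm^2 growing at 0.45 log/day for 12 days of home storage
print(round(project_log_count(1.5, 0.45, 12), 1))  # 6.9
```

This kind of projection is the arithmetic that risk assessments and "consume by" dating build on, before layering in variability and lag phases.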


2001 · Vol. 64 (11) · pp. 1679–1689
Author(s): PEGGY P. MAK, BARBARA H. INGHAM, STEVEN C. INGHAM

Time and temperature pasteurization conditions common in the Wisconsin cider industry were validated using a six-strain cocktail of Escherichia coli O157:H7 and acid-adapted E. coli O157:H7 in pH- and °Brix-adjusted apple cider. Strains employed were either linked to outbreaks (ATCC 43894 and 43895, C7927, and USDA-FSIS-380–94) or engineered to contain the gene for green fluorescent protein (pGFP ATCC 43894 and pGFP ATCC 43889) for differential enumeration. Survival of Salmonella spp. (CDC 0778, CDC F2833, and CDC HO662) and Listeria monocytogenes (H0222, F8027, and F8369) was also evaluated. Inoculated cider of pH 3.3 or 4.1 and 11 or 14°Brix was heated under conditions ranging from 60°C for 14 s to 71.1°C for 14 s. A 5-log reduction of nonadapted and acid-adapted E. coli O157:H7 was obtained at 68.1°C for 14 s. Lower temperatures, or less time at 68.1°C, did not ensure a 5-log reduction of E. coli O157:H7. A 5-log reduction of Salmonella spp. was obtained at 65.6°C for 14 s. L. monocytogenes survived 68.1°C for 14 s, but survivors died in cider within 24 h at 4°C. Laboratory results were validated with a surrogate E. coli using a bench-top plate heat-exchange pasteurizer and further validated using fresh unpasteurized commercial ciders. Consumer acceptance of cider pasteurized at 68.1°C for 14 s (Wisconsin recommendation) and at 71.1°C for 6 s (New York recommendation) was not significantly different. Hence, we conclude that 68.1°C for 14 s is a validated treatment for ensuring adequate destruction of E. coli O157:H7, Salmonella spp., and L. monocytogenes in apple cider.
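One way to compare the Wisconsin (68.1°C/14 s) and New York (71.1°C/6 s) treatments above is the standard z-value lethality model, F = t · 10^((T − T_ref)/z). The abstract does not report a z-value, so the figure used below is purely illustrative:

```python
# Hypothetical sketch: equivalent lethality of a hold time t at temperature T,
# expressed as seconds at a reference temperature, via the z-value model
# F_ref = t * 10**((T - T_ref) / z). The z-value here is an assumed
# illustrative figure, not one derived in the study.

def lethality_s(t_s, temp_c, t_ref_c=68.1, z_c=5.0):
    """Equivalent seconds at t_ref_c delivered by t_s seconds at temp_c."""
    return t_s * 10 ** ((temp_c - t_ref_c) / z_c)

print(round(lethality_s(14, 68.1), 1))  # 14.0 (the reference treatment itself)
print(round(lethality_s(6, 71.1), 1))   # NY treatment in 68.1 °C-equivalent seconds
```

Under this assumed z-value the shorter, hotter New York process delivers more equivalent lethality than the Wisconsin one, which is consistent with both being validated.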


2012 · Vol. 92 (2) · pp. 315–327
Author(s): Kumudinie A. Kariyapperuma, Adriana Furon, Claudia Wagner-Riddle

Kariyapperuma, K. A., Furon, A. and Wagner-Riddle, C. 2012. Non-growing season nitrous oxide fluxes from an agricultural soil as affected by application of liquid and composted swine manure. Can. J. Soil Sci. 92: 315–327. Agricultural soils have been recognized as a significant source of anthropogenic nitrous oxide (N₂O) emissions, an important greenhouse gas and contributor to stratospheric ozone destruction. Application of liquid swine manure (LSM) has been reported to increase direct N₂O emissions from agricultural soils. Composting of LSM with straw under forced aeration has been suggested as a practice for mitigating N₂O emissions. In cold climates, up to 70% of total annual soil N₂O emissions have been observed during winter and spring thaw. Non-growing season soil N₂O emissions after field application of composted swine manure (CSM) versus LSM have not been directly compared in past studies. A 2-yr field experiment was conducted at the Arkell Research Station, Ontario, Canada, as part of a larger study to evaluate composting as a mitigation strategy for greenhouse gases (GHGs). The objectives were to quantify and compare non-growing season N₂O fluxes from agricultural soils after fall application of LSM and CSM. Nitrous oxide fluxes were measured using the flux-gradient method. Compared with LSM, CSM resulted in a 57% reduction of soil N₂O emissions during February to April in 2005, but emissions during the same period in 2006 were not affected by treatment. This effect was related to fall and winter weather conditions, with the significant reduction occurring in the year when soil freezing was more pronounced. Compared with LSM, CSM resulted in a 37% reduction (CO₂-eq) in estimated N₂O emissions per liter of treated manure and a 50% reduction in the emission factor for the non-growing season.
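The flux-gradient method named above estimates trace-gas flux from a turbulent diffusivity and a vertical concentration gradient, F = −K·ΔC/Δz. A schematic sketch with illustrative values, not the study's implementation:

```python
# Schematic sketch of the flux-gradient principle: flux is proportional to the
# vertical concentration gradient, F = -K * (dC/dz), where K is an eddy
# diffusivity. K, the concentrations, and dz below are illustrative only.

def flux_gradient(k_m2_s, c_upper, c_lower, dz_m):
    """Flux in (concentration units * m/s); positive upward when the
    lower measurement height has the higher concentration (emission)."""
    return -k_m2_s * (c_upper - c_lower) / dz_m

# Higher concentration near the soil surface implies an upward (positive) flux
f = flux_gradient(0.1, 0.30, 0.34, 1.0)
print(f > 0)  # True
```

In practice K is derived from micrometeorological measurements (wind and temperature profiles), and the gas concentrations come from gas analyzers sampling at two heights.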


1997 · Vol. 60 (5) · pp. 476–484
Author(s): RANDALL K. PHEBUS, ABBEY L. NUTSCH, DAVID E. SCHAFER, R. CRAIG WILSON, M. JAMES RIEMANN, ...

The effectiveness of a recently invented “steam pasteurization” (S) process in reducing pathogenic bacterial populations on surfaces of freshly slaughtered beef was determined and compared with that of other standard commercial methods, including knife trimming (T), water washing (35°C; W), hot water/steam vacuum spot cleaning (V), and spraying with 2% vol/vol lactic acid (54°C, pH 2.25; L). These decontamination treatments were tested individually and in combinations. Cutaneus trunci muscles from freshly slaughtered steers were inoculated with feces containing Listeria monocytogenes Scott A, Escherichia coli O157:H7, and Salmonella typhimurium over a predesignated meat surface area, resulting in initial populations of ca. 5 log CFU/cm² of each pathogen. Tissue samples were excised from each portion before and after decontamination treatments, and mean population reductions were determined. Treatment combinations evaluated were the following (treatment designations within the abbreviations indicate the order of application): TW, TWS, WS, VW, VWS, TWLS, and VWLS. These combinations resulted in reductions ranging from 3.5 to 5.3 log CFU/cm² in all three pathogen populations. The TW, TWS, WS, TWLS, and VWLS combinations were equally effective (P > 0.05), resulting in reductions ranging from 4.2 to 5.3 log CFU/cm². When used individually, T, V, and S resulted in pathogen reductions ranging from 2.5 to 3.7 log CFU/cm². Steam pasteurization consistently provided numerically greater pathogen reductions than T or V. Treatments T, V, and S were all more effective than W (which gave a reduction on the order of 1.0 log CFU/cm²). Steam pasteurization is an effective method for reducing pathogenic bacterial populations on surfaces of freshly slaughtered beef, with multiple decontamination procedures providing the greatest overall reductions.

