Wet vs. dry inoculation methods have a significant effect on Listeria monocytogenes growth on many types of whole intact fresh produce

Author(s):  
Donald Schaffner ◽  
Marina Girbal ◽  
Laura K. Strawn ◽  
Claire M. Murphy

L. monocytogenes causes relatively few outbreaks linked to whole fresh produce but triggers recalls each year in the US. There are limited data on the influence of wet vs. dry inoculation methods on pathogen growth on whole produce. A cocktail of five L. monocytogenes strains that included clinical, food, and environmental isolates associated with foodborne outbreaks and recalls was used. Cultures were combined to target a final wet inoculum concentration of 4-5 log CFU/mL. The dry inoculum was prepared by mixing the wet inoculum with 100 g of sterile sand and drying for 24 h. The produce investigated belonged to major commodity families: Rosaceae (blackberry, raspberry, and sweet cherry), Ericaceae (blueberry), Rutaceae (lemon and mandarin orange), Solanaceae (tomato), Brassicaceae (cauliflower and broccoli), and Apiaceae (carrot). Intact, whole inoculated fruit and vegetable commodities were incubated at 2, 12, 22, and 35±2°C. Commodities were sampled for up to 28 days, and the experiment was replicated 6 times. The average maximum growth increase was obtained by measuring the maximum absolute increase for each replicate within a specific commodity, temperature, and inoculation method. Data for each commodity, replicate, and temperature were used to create primary growth or survival models, describing the lag phase and growth, or the shoulder and decline, as a function of time. Use of a wet (liquid) inoculum rather than a dry inoculum resulted in markedly greater L. monocytogenes growth rates and growth magnitudes on whole produce surfaces. This difference was strongly influenced by temperature, with the effect observed for more commodities at the higher temperatures (22 and 35°C) than at the lower temperatures (2 and 12°C). These findings should be explored for other commodities and pathogens. The degree to which wet or dry inoculation techniques more realistically mimic contamination conditions throughout the supply chain (e.g., production, harvest, post-harvest, transportation, or retail) should also be investigated.
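The abstract does not name the primary model used; the sketch below assumes a Baranyi-Roberts-type model, one common choice for describing a lag phase followed by growth to a maximum density. The function name, sampling days, and counts are illustrative, not the study's data.

```python
# Hedged sketch: fit a Baranyi-Roberts-type primary growth model to one
# commodity/temperature/inoculation-method series. Model choice, parameter
# names, and the data below are illustrative assumptions, not the study's.
import numpy as np
from scipy.optimize import curve_fit

def baranyi_log10(t, y0, ymax, mu_max, lag):
    """Baranyi-Roberts model written in log10 CFU units.
    mu_max: maximum growth rate (log10 CFU/day); lag: lag time (days)."""
    h0 = mu_max * lag
    a = t + (1.0 / mu_max) * np.log10(
        10.0 ** (-mu_max * t) + 10.0 ** (-h0) - 10.0 ** (-mu_max * t - h0))
    return y0 + mu_max * a - np.log10(
        1.0 + (10.0 ** (mu_max * a) - 1.0) / 10.0 ** (ymax - y0))

# Hypothetical sampling days and log10 CFU counts for one wet-inoculated commodity
t_obs = np.array([0.0, 1.0, 3.0, 7.0, 14.0, 21.0, 28.0])
y_obs = np.array([4.0, 4.1, 4.5, 6.0, 7.6, 7.7, 7.7])

popt, _ = curve_fit(baranyi_log10, t_obs, y_obs,
                    p0=[4.0, 7.5, 0.3, 1.0],
                    bounds=([0, 0, 0.01, 0], [10, 12, 2, 28]))
y0, ymax, mu_max, lag = popt
print(f"mu_max = {mu_max:.2f} log10 CFU/day, lag = {lag:.1f} d, "
      f"max increase = {ymax - y0:.2f} log10 CFU")
```

The fitted mu_max and the difference ymax − y0 correspond to the growth rate and maximum growth increase that would be compared across inoculation methods and temperatures.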

2007 ◽  
Vol 25 (18_suppl) ◽  
pp. 6001-6001 ◽  
Author(s):  
A. Chaturvedi ◽  
E. Engels ◽  
W. Anderson ◽  
M. Gillison

6001 Background: Head and neck squamous cell carcinomas (HNSCC) are etiologically heterogeneous, with one subset primarily attributable to HPV and another to tobacco and alcohol. Methods: Data from SEER9 program registries were used to investigate the potential influence of HPV on the incidence and survival of HNSCC in the US from 1973–2003. HNSCCs (N=58,158) were classified by anatomic site as potentially HPV-related (HPV-R: base of tongue; tonsil; oropharynx; N=16,712) or HPV-unrelated (HPV-U: lip; tongue; gum; floor of mouth; palate; other mouth; hypopharynx; ill-defined sites of lip, oral cavity, and pharynx; N=41,446). Joinpoint regression was used to assess incidence trends, and life-table methods were used to compare survival for HPV-R and HPV-U HNSCCs. Results: For HPV-R HNSCCs, age-adjusted incidence increased significantly from 1973–2003 (annual percent change [APC] = 0.65), particularly among males (APC=1.02), whites (APC=0.89), and younger ages (APCs for ages 30–39 = 1.46; 40–49 = 1.92; 50–59 = 0.61; and ≥60 = −0.66). By contrast, HPV-U HNSCC incidence was stable from 1973–1983 and then decreased significantly from 1983–2003 (APC= −2.42). Mean age at diagnosis was younger for HPV-R HNSCC than for HPV-U (61.1 vs. 64.5 years; p<0.001) and, from 1973–2003, decreased significantly for HPV-R but increased for HPV-U. Improvements in overall survival (OS) were observed for HPV-R (all stages) and HPV-U (regional and distant) HNSCC treated by radiotherapy (RT) from 1973–2003, but were more marked for HPV-R HNSCC, e.g., an absolute increase in two-year OS for regional disease of 24.4% (vs. 5.8% for HPV-U). OS for HPV-R (local and regional) HNSCC was significantly better than for HPV-U HNSCC if treated by RT, but worse if not so treated. Conclusions: The proportion of HNSCC that is potentially HPV-R increased in the US from 1973–2003, particularly among recent birth cohorts, perhaps due to changing sexual and smoking behaviors. Recent improvements in locoregional control with RT-based therapy may be due in part to a gradual shift in the etiology of the underlying disease. No significant financial relationships to disclose.
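For readers unfamiliar with the trend statistic, the annual percent change (APC) reported above comes from a log-linear fit of age-adjusted rates against calendar year; joinpoint regression additionally searches for the years at which the slope changes. A minimal single-segment sketch, using hypothetical rates rather than SEER data:

```python
# Minimal sketch of the annual percent change (APC) calculation behind the
# reported trends. Full joinpoint regression also searches for breakpoints
# between segments; here only one segment is fit. Rates are hypothetical.
import numpy as np

years = np.arange(1973, 1984)                      # one segment, e.g. 1973-1983
rates = np.array([3.1, 3.1, 3.2, 3.3, 3.3, 3.4,    # age-adjusted rate per 100,000
                  3.4, 3.5, 3.5, 3.6, 3.7])

# log-linear model: ln(rate) = a + b*year, so APC = (e^b - 1) * 100
b, a = np.polyfit(years, np.log(rates), 1)
apc = (np.exp(b) - 1.0) * 100.0
print(f"APC = {apc:.2f}% per year")
```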


1984 ◽  
Vol 62 (5) ◽  
pp. 1062-1068 ◽  
Author(s):  
M. Krol ◽  
M. Griffith ◽  
N. P. A. Huner

The accurate interpretation of physiological and biochemical alterations observed in plants grown under contrasting environmental conditions requires knowledge of their relative physiological ages. For this purpose, we compared the growth kinetics of winter rye (Secale cereale L. cv. Puma) at nonhardening and cold-hardening temperatures. Growth at nonhardening temperatures was characterized by a 10-day lag phase, with maximum growth attained after about 28 days. Growth at cold-hardening temperatures extended the lag phase to about 21 days, with maximum growth attained after 56 days. The calculated growth coefficient at cold-hardening temperatures was 35–40% of that at nonhardening temperatures. This relationship was consistent with growth parameters such as leaf dry weight, fresh weight, and area, but not with plant height. Although total leaf dry weight and total number of leaves per plant did not differ between nonhardened and cold-hardened plants at maximum growth, total leaf area per plant and stretched plant height were 3 to 4 times greater in nonhardened than in cold-hardened plants. This resulted in a fourfold increase in leaf dry weight per unit leaf area during growth at low temperature, in contrast to the maintenance of a constant ratio during growth under nonhardening conditions. The increase in this ratio during low-temperature growth was accounted for, in part, by a decrease in water content and an increase in cytoplasmic content. These results were confirmed by examining growth on an individual-leaf basis. However, the growth response of leaves 1 and 2 differed from that of leaves 3 and 4 when the leaf dry weight : leaf area ratio was measured as a function of time at cold-hardening temperatures, indicating that the stage of leaf development influences the growth response to an altered environment. Measurements of leaf freezing tolerance indicated that maximum vegetative growth appeared to coincide with maximum freezing tolerance in leaves from cold-hardened plants (−22 °C) but not in leaves from unhardened plants (−11 °C).
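The abstract does not define the growth coefficient exactly; the sketch below assumes a relative growth rate taken as the slope of ln(leaf dry weight) versus time over the near-exponential interval, with hypothetical dry-weight series chosen only to reproduce roughly the reported 35–40% ratio.

```python
# Hedged sketch of one way to compare growth coefficients between regimes:
# a relative growth rate equal to the slope of ln(leaf dry weight) vs. time
# during the post-lag, pre-plateau interval. The paper's exact definition may
# differ; the dry-weight values below are hypothetical.
import numpy as np

def growth_coefficient(days, dry_weight_mg):
    slope, _ = np.polyfit(days, np.log(dry_weight_mg), 1)
    return slope  # per day

# Hypothetical exponential-phase samplings for nonhardened and cold-hardened rye
k_nonhard = growth_coefficient(np.array([10, 14, 18, 22, 26]),
                               np.array([12, 25, 52, 105, 210]))
k_hard = growth_coefficient(np.array([21, 28, 35, 42, 49]),
                            np.array([10, 16, 26, 41, 66]))
print(f"cold-hardened / nonhardened = {k_hard / k_nonhard:.2f}")
```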


2021 ◽  
Author(s):  
Rebecca Chung

Globalization has enabled the year-round availability of imported fresh produce in Toronto, supplementing the variety of locally grown produce in Ontario. Increased consumption of produce has led to more foodborne outbreaks, with E. coli O157:H7 as the second most frequent cause of illness. In this study, the levels of heterotrophic bacteria, coliforms, and generic E. coli were compared between imported and local samples of three produce types. Significantly higher levels (p<0.04) of heterotrophic bacteria were found in imported basil. Local romaine (p<0.01) and local spinach (p<0.001) contained significantly higher levels of coliforms, and local spinach also had a significantly higher (p<0.005) number of samples with coliform levels above 100 CFU/g. Although no statistically significant association was found between the presence of E. coli and the origin of the produce, the five imported samples positive for E. coli (versus zero local samples) support the hypothesis that imported produce is more susceptible to microbial contamination.
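The abstract does not state which statistical test produced the p-values; the sketch below assumes a nonparametric two-group comparison (Mann-Whitney U) of log-transformed heterotrophic plate counts, with hypothetical CFU/g values for illustration.

```python
# Hedged sketch of the kind of two-group comparison behind the reported
# p-values: imported vs. local heterotrophic plate counts for one produce type.
# The test choice and the CFU/g values below are assumptions, not study data.
import numpy as np
from scipy.stats import mannwhitneyu

imported_basil = np.log10([2.1e6, 8.4e5, 5.6e6, 3.0e6, 1.2e7, 9.1e5])  # CFU/g
local_basil    = np.log10([3.5e5, 1.1e5, 7.8e5, 2.2e5, 4.0e5, 6.3e5])

stat, p = mannwhitneyu(imported_basil, local_basil, alternative="two-sided")
print(f"U = {stat:.1f}, p = {p:.3f}")
```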


2019 ◽  
Vol 19 (19) ◽  
pp. 12587-12605 ◽  
Author(s):  
David D. Parrish ◽  
Christine A. Ennis

Abstract. US ambient ozone concentrations have two components: US background ozone and enhancements produced from the country's anthropogenic precursor emissions. Only the enhancements effectively respond to national emission controls. We investigate the temporal evolution and spatial variability in the largest ozone concentrations, i.e., those that define the ozone design value (ODV) upon which the National Ambient Air Quality Standard (NAAQS) is based, within the northern tier of US states. We focus on two regions: rural western states, with only small anthropogenic precursor emissions, and the urbanized northeastern states, which include the New York City urban area, the nation's most populous. The US background ODV (i.e., the ODV remaining if US anthropogenic precursor emissions were reduced to zero) is estimated to vary from 54 to 63 ppb in the rural western states and to be smaller and nearly constant (45.8±3.0 ppb) throughout the northeastern states. These US background ODVs correspond to 65 % to 90 % of the 2015 NAAQS of 70 ppb. Over the past 2 to 3 decades US emission control efforts have decreased the US anthropogenic ODV enhancements at an approximately exponential rate, with an e-folding time constant of ∼22 years. These ODV enhancements are relatively large in the northeastern US, with state maximum ODV enhancements of ∼35–64 ppb in 2000, but are not discernible in the rural western states. The US background ODV contribution is significantly larger than the present-day ODV enhancements due to photochemical production from US anthropogenic precursor emissions in the urban as well as the rural regions investigated. Forward projections of past trends suggest that average maximum ODVs in the northeastern US will drop below the NAAQS of 70 ppb by about 2021, assuming that the exponential decrease in the ODV enhancements can be maintained and the US background ODV remains constant. This estimate is much more optimistic than in the Los Angeles urban area, where a similar approach estimates the maximum ODV to reach 70 ppb in ∼2050 (Parrish et al., 2017a). The primary reason for this large difference is the significantly higher US background ODV (62.0±2.0 ppb) estimated for the Los Angeles urban area. The approach used in this work has some unquantified uncertainties that are discussed. Models can also estimate US background ODVs; some of those results are shown to correlate with the observationally based estimates derived here (r2 values for different models are ∼0.31 to 0.90), but they are on average systematically lower by 4 to 13 ppb. Further model improvement is required until their output can accurately reproduce the time series and spatial variability in observed ODVs. Ideally, the uncertainties in the model-based and observationally based approaches can then be reduced through additional comparisons.
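The ∼2021 projection for the northeastern US follows directly from the stated assumptions: a constant US background ODV plus an anthropogenic enhancement decaying exponentially with a ∼22-year e-folding time. A worked sketch using the abstract's numbers (45.8 ppb background, ∼64 ppb maximum state enhancement in 2000); the simple functional form is the assumption:

```python
# Worked sketch of the ODV projection: constant background plus an
# exponentially decaying anthropogenic enhancement. Values from the abstract.
import numpy as np

background = 45.8   # ppb, US background ODV, northeastern states
enh_2000 = 64.0     # ppb, largest state maximum ODV enhancement in 2000
tau = 22.0          # years, e-folding time of the enhancement
naaqs = 70.0        # ppb, 2015 NAAQS

def odv(year):
    return background + enh_2000 * np.exp(-(year - 2000.0) / tau)

# Year when the maximum ODV crosses the NAAQS
year_cross = 2000.0 + tau * np.log(enh_2000 / (naaqs - background))
print(f"ODV(2020) = {odv(2020):.1f} ppb; crosses {naaqs:.0f} ppb around {year_cross:.0f}")
```

With these inputs the crossing year comes out near 2021, consistent with the abstract's projection.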


2008 ◽  
Vol 71 (1) ◽  
pp. 200-204 ◽  
Author(s):  
CINDY LOUI ◽  
GRIGOR GRIGORYAN ◽  
HAOHAO HUANG ◽  
LEE W. RILEY ◽  
SANGWEI LU

Fresh produce, including salad, is increasingly implicated in foodborne outbreaks. Although studies have been carried out to detect specific human pathogens from fresh produce, the total bacterial community associated with fresh produce is poorly understood. In this study, we characterized the bacterial community associated with alfalfa sprouts, using a culture-independent method. Four retail-purchased alfalfa sprout samples were obtained from different producers, and the bacterial community associated with each sample was determined by 16S rDNA profiling. Our results indicate that alfalfa sprouts sampled in our study shared significant similarities in their bacterial communities. Proteobacteria was the dominant phylum detected from all alfalfa sprout samples, with Enterobacteriaceae, Oxalobacteraceae, Moraxellaceae, and Sphingomonadaceae as the most frequently detected families. These results indicate that growth conditions of alfalfa sprouts should be taken into consideration to prevent the proliferation of pathogenic proteobacteria such as Escherichia coli O157 and Salmonella.


Foods ◽  
2020 ◽  
Vol 9 (11) ◽  
pp. 1703
Author(s):  
Agni Hadjilouka ◽  
Dimitris Tsaltas

Cyclospora cayetanensis is a coccidian protozoan that causes cyclosporiasis, a gastroenteric disease that can be severe, especially in immunocompromised patients, children, and the elderly. The parasite is considered an emerging organism and a major contributor to gastroenteritis worldwide. Although the global prevalence of cyclosporiasis morbidity and mortality has not been assessed, global concern has arisen because diarrheal illness and gastroenteritis significantly affect both developing countries and industrialized nations. In the last two decades, an increasing number of foodborne outbreaks have been associated with the consumption of fresh produce that is difficult to clean thoroughly and is consumed without processing. Investigations of these outbreaks have revealed the need to increase clinicians' awareness of this infection, since the protozoan is often overlooked by surveillance systems, and to establish control measures that reduce contamination of fresh produce. In this review, the major cyclosporiasis outbreaks linked to the consumption of ready-to-eat fresh fruits and vegetables are presented.


2008 ◽  
Vol 71 (9) ◽  
pp. 1806-1816 ◽  
Author(s):  
AMIT PAL ◽  
THEODORE P. LABUZA ◽  
FRANCISCO DIEZ-GONZALEZ

This research was conducted to study the growth of Listeria monocytogenes inoculated onto frankfurters stored under different conditions, as a basis for a safety-based consume-by shelf life date label. Three L. monocytogenes strains were separately inoculated at 10 to 20 CFU/cm2 onto frankfurters that had been formulated with or without high-pressure treatment and with or without added 2% potassium lactate (PL) and 0.2% sodium diacetate (SD). Inoculated frankfurters were air or vacuum packaged and stored at 4, 8, or 12°C, and L. monocytogenes and psychrotrophic plate counts were determined for 90, 60, and 45 days, respectively, or until the stationary phase was reached. The data (log CFU per square centimeter versus time) were fitted with the Baranyi-Roberts model to determine maximum growth rates and lag times. The maximum growth rate and lag time under each growth condition were then used to calculate the time to reach 100-fold (a 2-log increase over) the initial Listeria population. In frankfurters lacking PL and SD, counts of all strains increased by 2 log after 18 to 50 days at 4°C and 4 to 13 days at 8°C. Growth was inhibited at 4 and 8°C in frankfurters containing PL and SD, but one ribotype was capable of growing, with the time to reach 100-fold the initial population ranging from 19 to 35 days at 12°C. In most cases, the time to reach 100-fold the initial population was significantly longer in vacuum-packaged than in air-packaged frankfurters. Inclusion of PL and SD also inhibited the growth of psychrotrophs, but at all temperatures the psychrotrophic plate counts exceeded 4 log CFU/cm2 at the end of the experiments. These results indicated that, despite the use of antimicrobials, certain L. monocytogenes strains could be capable of growing under storage-abuse conditions. The growth kinetics data could be useful for establishing a shelf life date label protocol under different handling scenarios.
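The time-to-100-fold calculation described above can be sketched with a tangent-line approximation: once the fitted lag has elapsed, log counts rise at roughly mu_max, so the time to a 2-log (100-fold) increase is about lag + 2/mu_max. The parameter values below are hypothetical, not the study's fitted estimates.

```python
# Hedged sketch of the time-to-100-fold calculation (tangent-line
# approximation); parameter values are hypothetical, not fitted results.
def time_to_100_fold(mu_max_log10_per_day, lag_days, delta_log=2.0):
    return lag_days + delta_log / mu_max_log10_per_day

# e.g. a slow-growing condition vs. a faster one (hypothetical parameters)
print(f"{time_to_100_fold(0.05, 10.0):.0f} days (slow growth)")
print(f"{time_to_100_fold(0.20, 3.0):.0f} days (faster growth)")
```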

