Quantitative Risk Assessment of Listeriosis-Associated Deaths Due to Listeria monocytogenes Contamination of Deli Meats Originating from Manufacture and Retail

2010 ◽  
Vol 73 (4) ◽  
pp. 620-630 ◽  
Author(s):  
ABANI K. PRADHAN ◽  
RENATA IVANEK ◽  
YRJÖ T. GRÖHN ◽  
ROBERT BUKOWSKI ◽  
IFIGENIA GEORNARAS ◽  
...  

The objective of this study was to estimate the relative risk of listeriosis-associated deaths attributable to Listeria monocytogenes contamination in ham and turkey formulated without and with growth inhibitors (GIs). Two contamination scenarios were investigated: (i) prepackaged deli meats with contamination originating solely from manufacture at a frequency of 0.4% (based on reported data) and (ii) retail-sliced deli meats with contamination originating solely from retail at a frequency of 2.3% (based on reported data). Using a manufacture-to-consumption risk assessment with product-specific growth kinetic parameters (i.e., lag phase and exponential growth rate), reformulation with GIs was estimated to reduce human listeriosis deaths linked to ham and turkey by 2.8- and 9-fold, respectively, when contamination originated at manufacture and by 1.9- and 2.8-fold, respectively, for products contaminated at retail. Contamination originating at retail was estimated to account for 76 and 63% of listeriosis deaths caused by ham and turkey, respectively, when all products were formulated without GIs and for 83 and 84% of listeriosis deaths caused by ham and turkey, respectively, when all products were formulated with GIs. Sensitivity analyses indicated that storage temperature was the most important factor affecting the estimation of per annum relative risk. Scenario analyses suggested that reducing storage temperature in home refrigerators to consistently below 7°C would greatly reduce the risk of human listeriosis deaths, whereas reducing storage time appeared to be less effective. Overall, our data indicate a critical need for further development and implementation of effective control strategies to reduce L. monocytogenes contamination at the retail level.
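The manufacture-to-consumption model rests on product-specific growth kinetics: a lag phase followed by exponential growth. A minimal sketch of that two-phase growth logic, with purely illustrative parameter values (not the study's estimates):

```python
def lm_log_count(n0_log10, egr_log10_per_day, lag_days, t_days, nmax_log10=8.0):
    """Two-phase growth model: no net growth during the lag phase, then
    log-linear growth at the exponential growth rate (EGR), capped at a
    maximum population density. All counts are log10 CFU/g."""
    if t_days <= lag_days:
        return n0_log10
    return min(n0_log10 + egr_log10_per_day * (t_days - lag_days), nmax_log10)

# Hypothetical comparison: a growth-inhibitor (GI) formulation is modeled
# here as a longer lag phase and a slower EGR than the no-GI product.
no_gi = lm_log_count(0.0, 0.25, 5.0, 30.0)     # without growth inhibitors
with_gi = lm_log_count(0.0, 0.10, 15.0, 30.0)  # with lactate/diacetate
```

Because predicted risk scales steeply with the count at consumption, a formulation that lengthens the lag and slows growth translates into the multi-fold reductions in estimated deaths reported above.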

2009 ◽  
Vol 72 (5) ◽  
pp. 978-989 ◽  
Author(s):  
ABANI K. PRADHAN ◽  
RENATA IVANEK ◽  
YRJÖ T. GRÖHN ◽  
IFIGENIA GEORNARAS ◽  
JOHN N. SOFOS ◽  
...  

Foodborne disease associated with consumption of ready-to-eat foods contaminated with Listeria monocytogenes represents a considerable public health concern. In a risk assessment published in 2003, the U.S. Food and Drug Administration and the U.S. Department of Agriculture Food Safety and Inspection Service estimated that about 90% of human listeriosis cases in the United States are caused by consumption of contaminated deli meats. In that risk assessment, all deli meats were grouped together as one of 23 categories of ready-to-eat foods, and only the postretail growth of L. monocytogenes was considered. To provide an improved risk assessment for L. monocytogenes in deli meats, we developed a revised risk assessment that (i) models risk for three subcategories of deli meats (i.e., ham, turkey, and roast beef) and (ii) models L. monocytogenes contamination and growth from production to consumption while considering subcategory-specific growth kinetics parameters (i.e., lag phase and exponential growth rate). This model also was used to assess how reformulation of the chosen deli meat subcategories with L. monocytogenes growth inhibitors (i.e., lactate and diacetate) would impact the number of human listeriosis cases. Use of product-specific growth parameters demonstrated how certain deli meat categories differ in the relative risk of causing listeriosis; products that support more rapid growth and have reduced lag phases (e.g., turkey) represent a higher risk. Although reformulation of deli meats with growth inhibitors was estimated to reduce by about 2.5- to 7.8-fold the number of human listeriosis cases linked to a given deli meat subcategory and thus would reduce the overall risk of human listeriosis, even with reformulation deli meats would still cause a considerable number of human listeriosis cases. A combination of strategies is thus needed to provide continued reduction of these cases.
Risk assessment models such as the one described here will be critical for evaluating different control approaches and for defining the combinations of control strategies that will have the greatest impact on public health.


2016 ◽  
Vol 79 (7) ◽  
pp. 1076-1088 ◽  
Author(s):  
DANIEL GALLAGHER ◽  
RÉGIS POUILLOT ◽  
KARIN HOELZER ◽  
JIA TANG ◽  
SHERRI B. DENNIS ◽  
...  

ABSTRACT Cross-contamination, improper holding temperatures, and insufficient sanitary practices are known retail practices that may lead to product contamination and growth of Listeria monocytogenes. However, the relative importance of control options to mitigate the risk of invasive listeriosis from ready-to-eat (RTE) products sliced or prepared at retail is not well understood. This study illustrates the utility of a quantitative risk assessment model described in the first article of this series (Pouillot, R., D. Gallagher, J. Tang, K. Hoelzer, J. Kause, and S. B. Dennis, J. Food Prot. 78:134–145, 2015) to evaluate the public health impact associated with changes in retail deli practices and interventions. Twenty-two mitigation scenarios were modeled and evaluated under six different baseline conditions. These scenarios were related to sanitation, worker behavior, use of growth inhibitors, cross-contamination, storage temperature control, and reduction of the level of L. monocytogenes on incoming RTE food products. The mean risk per serving of RTE products obtained under these scenarios was then compared with the risk estimated in the baseline condition. Some risk mitigations had a consistent impact on the predicted listeriosis risk under all baseline conditions (e.g., presence or absence of growth inhibitor), whereas others were greatly dependent on the initial baseline conditions or practices in the deli (e.g., preslicing of products). Overall, control of bacterial growth and control of contamination at its source were major factors in listeriosis risk in these settings. Although control of cross-contamination and continued sanitation were also important, the associated decrease in predicted risk was not amenable to a simple solution. Findings from these predictive scenario analyses are intended to encourage improvements to retail food safety practices and mitigation strategies to control L. monocytogenes in RTE foods more effectively and to demonstrate the utility of quantitative risk assessment models to inform risk management decisions.


2021 ◽  
Vol 36 (2) ◽  
pp. 124-134
Author(s):  
Ki Young Song ◽  
So Young Yang ◽  
Eun Woo Lee ◽  
Ki Sun Yoon

2017 ◽  
Vol 80 (3) ◽  
pp. 447-453 ◽  
Author(s):  
Ai Kataoka ◽  
Hua Wang ◽  
Philip H. Elliott ◽  
Richard C. Whiting ◽  
Melinda M. Hayman

ABSTRACT The growth characteristics of Listeria monocytogenes inoculated onto frozen foods (corn, green peas, crabmeat, and shrimp) and thawed by being stored at 4, 8, 12, and 20°C were investigated. The growth parameters, lag-phase duration (LPD) and exponential growth rate (EGR), were determined from the growth data by using a two-phase linear growth model as the primary model and, as secondary models, a square root model for EGR and a quadratic model for LPD. The EGR model predictions were compared with growth rates obtained from the U.S. Department of Agriculture (USDA) Pathogen Modeling Program, calculated with similar pH, salt percentage, and NaNO2 parameters, at all storage temperatures. The results showed that L. monocytogenes grew well in all food types, with the growth rate increasing with storage temperature. Predicted EGRs for all food types demonstrated the significance of storage temperature and were similar among the four food types. The predicted EGRs were slightly slower than the values from the USDA Pathogen Modeling Program. LPD could not be accurately predicted, possibly because there were not enough sampling points. These data, established by using real food samples, demonstrated that L. monocytogenes can initiate growth without a prolonged lag phase even at refrigeration temperature (4°C), and the predictive models derived from this study can be useful for developing proper handling guidelines for thawed frozen foods during production and storage.
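The square root secondary model referred to here is conventionally written as √EGR = b(T − Tmin). A minimal sketch, with illustrative values for b and Tmin rather than the study's fitted parameters:

```python
def sqrt_model_egr(temp_c, b=0.03, t_min_c=-1.5):
    """Ratkowsky-type square root model: sqrt(EGR) = b * (T - Tmin).
    Returns the predicted EGR; b and t_min_c are illustrative, not the
    study's fitted values."""
    if temp_c <= t_min_c:
        return 0.0  # no growth at or below the notional minimum temperature
    return (b * (temp_c - t_min_c)) ** 2
```

Because EGR rises roughly quadratically with temperature above Tmin, small increases in storage temperature produce disproportionately faster growth, consistent with the temperature effect reported above.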


2020 ◽  
Vol 2020 ◽  
pp. 1-9
Author(s):  
Seoungsoon Yeo ◽  
Misook Kim

This study aimed to investigate the growth of Listeria monocytogenes in rice balls and to conduct a microbial risk assessment based on the Korean dietary pattern. Each tuna or ham rice ball was mixed with mayonnaise, soy sauce, or gochujang (a traditional Korean fermented red pepper paste), artificially contaminated with L. monocytogenes, and then stored at 7 to 25°C to assess bacterial growth. Growth data were analyzed using three primary models (the Huang, Baranyi, and Gompertz models), and the growth pattern was found to fit the Baranyi model well based on the following five statistical criteria: root mean square error (0.38 to 0.56), Akaike's information criterion (−51.55 to −26.99), coefficient of determination (0.72 to 0.97), bias factor (0.97 to 1.01), and accuracy factor (1.06 to 1.18). The effects of temperature on bacterial growth rate and lag time were evaluated using the square root model. The estimated minimum growth temperature for L. monocytogenes in tuna or ham rice balls was lowest when they were mixed with mayonnaise (−9.44°C or −15.37°C, respectively). Risk assessment using FDA-iRISK showed that tuna or ham rice balls mixed with gochujang exhibited the highest microbial risk among all the rice balls tested, regardless of the storage temperature. Tuna or ham rice balls mixed with gochujang had the highest disability-adjusted life years per year (0.015), followed by ham rice balls mixed with soy sauce (0.011 to 0.015) or mayonnaise (0.006 to 0.015) and then tuna rice balls mixed with soy sauce (0.006 to 0.008) or mayonnaise (<0.001). In conclusion, our results, determined using predictive growth models, allow ranking of the potential risk associated with the consumption of rice balls contaminated with L. monocytogenes based on the number of illnesses per serving and the disease burden.
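The Baranyi primary model that fit best here can be sketched with a generic implementation of the Baranyi–Roberts equations; the parameter values used in the example below are illustrative, not the study's fitted estimates:

```python
import math

def baranyi(t, y0, ymax, mu_max, lag):
    """Baranyi-Roberts primary growth model on the natural-log scale.
    t: time; y0/ymax: initial and maximum ln cell density; mu_max: maximum
    specific growth rate; lag: lag time (h0 = mu_max * lag encodes the
    initial physiological state of the cells)."""
    h0 = mu_max * lag
    a = t + (1.0 / mu_max) * math.log(
        math.exp(-mu_max * t) + math.exp(-h0) - math.exp(-mu_max * t - h0)
    )
    return y0 + mu_max * a - math.log(
        1.0 + (math.exp(mu_max * a) - 1.0) / math.exp(ymax - y0)
    )
```

In practice the model is fitted to log-count data by nonlinear least squares, and goodness of fit is compared across candidate models with criteria such as RMSE and AIC, as done above.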


2009 ◽  
Vol 131 (2-3) ◽  
pp. 128-137 ◽  
Author(s):  
Tom Ross ◽  
Sven Rasmussen ◽  
Aamir Fazil ◽  
Greg Paoli ◽  
John Sumner

2006 ◽  
Vol 69 (11) ◽  
pp. 2648-2663 ◽  
Author(s):  
ELEFTHERIOS H. DROSINOS ◽  
MARIOS MATARAGAS ◽  
SLAVICA VESKOVIĆ-MORAČANIN ◽  
JUDIT GASPARIK-REICHARDT ◽  
MIRZA HADŽIOSMANOVIĆ ◽  
...  

Listeria monocytogenes NCTC10527 was examined with respect to its nonthermal inactivation kinetics in fermented sausages from four European countries: Serbia-Montenegro, Hungary, Croatia, and Bosnia-Herzegovina. The goal was to quantify the effect of fermentation and ripening conditions on L. monocytogenes with the simultaneous presence or absence of bacteriocin-producing lactic acid bacteria (i.e., Lactobacillus sakei). Different models were used to fit the experimental data and to calculate the kinetic parameters. The best model was chosen based on statistical comparisons. The Baranyi model was selected because it fitted the data better in most (73%) of the cases. The results from the challenge experiments and the subsequent statistical analysis indicated that relative to the control condition the addition of L. sakei strains reduced the time required for a 4-log reduction of L. monocytogenes (t4D). In contrast, the addition of the bacteriocins mesenterocin Y and sakacin P decreased the t4D values for only the Serbian product. A case study for risk assessment also was conducted. The data of initial population and t4D collected from all countries were described by a single distribution function. Storage temperature, packaging method, pH, and water activity of the final products were used to calculate the inactivation of L. monocytogenes that might occur during storage of the final product (U.S. Department of Agriculture Pathogen Modeling Program version 7.0). Simulation results indicated that the addition of L. sakei strains significantly decreased the simulated L. monocytogenes concentration of ready-to-eat fermented sausages at the time of consumption.
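The t4D metric follows from log-linear inactivation kinetics: if counts fall by one log every D days, a 4-log reduction takes 4D days. A minimal sketch with hypothetical D-values (the study itself fitted nonlinear inactivation models, so this is a simplification):

```python
def surviving_log10(n0_log10, d_value_days, t_days):
    """Log10 count after t days of log-linear inactivation, where the
    decimal reduction time D is the number of days per 1-log decline."""
    return n0_log10 - t_days / d_value_days

def t4d_days(d_value_days):
    """Time required for a 4-log (t4D) reduction under log-linear kinetics."""
    return 4.0 * d_value_days
```

Under this view, a bacteriocin-producing adjunct culture that shortens D (hypothetically, from 10 to 6 days) shortens t4D proportionally, from 40 to 24 days.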


Risk Analysis ◽  
2007 ◽  
Vol 27 (3) ◽  
pp. 683-700 ◽  
Author(s):  
Régis Pouillot ◽  
Nicolas Miconnet ◽  
Anne-Laure Afchain ◽  
Marie Laure Delignette-Muller ◽  
Annie Beaufort ◽  
...  

2021 ◽  
Vol 99 ◽  
pp. 103800
Author(s):  
Sofia Tsaloumi ◽  
Zafiro Aspridou ◽  
Eirini Tsigarida ◽  
Fragiskos Gaitis ◽  
Gorgias Garofalakis ◽  
...  

2009 ◽  
Vol 72 (9) ◽  
pp. 1878-1884 ◽  
Author(s):  
AMIT PAL ◽  
THEODORE P. LABUZA ◽  
FRANCISCO DIEZ-GONZALEZ

The growth of Listeria monocytogenes inoculated on frankfurters at four inoculum levels (0.1, 0.04, 0.01, and 0.007 CFU/g) was examined at 4, 8, and 12°C until L. monocytogenes populations reached a detectable limit of at least 2 CFU/g. A scaled-down assumption was made to simulate a 25-g sample from a 100-lb batch in a factory setting by using a 0.55-g sample from a 1,000-g batch in the laboratory. Samples of 0.55 g were enriched in PDX-LIB selective medium, and presumptive results were confirmed on modified Oxford agar. Based on the time to detect (TTD) from each inoculum level and at each temperature, a shelf life model was constructed to predict the detection or risk levels reached by L. monocytogenes on frankfurters. The TTD increased with reductions in inoculum size and storage temperature. At 4°C the TTDs (±standard error) observed were 42.0 ± 1.0, 43.5 ± 0.5, 50.7 ± 1.5, and 55.0 ± 3.0 days when the inoculum sizes were 0.1, 0.04, 0.01, and 0.007 CFU/g, respectively. For the same corresponding inoculum sizes, the TTDs at 8°C were 4.5 ± 0.5, 6.5 ± 0.5, 7.0 ± 1.0, and 8.5 ± 0.5 days. Significant differences (P < 0.05) between TTDs were observed only when the inoculum sizes differed by at least 2 log. On a shelf life plot of ln(TTD) versus temperature, the Q10 (factor change in TTD per 10°C change in temperature) values ranged from 24.5 to 44.7, with no significant influence from the inoculum densities. When the observed TTDs were compared with the expected detection times based on data obtained from a study with an inoculum size of 10 to 20 CFU/g, significant deviations were noted at lower inoculum levels. These results can be valuable in designing a safety-based shelf life model for frankfurters and in performing quantitative risk assessment of listeriosis at low, practically relevant contamination levels.
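The Q10 values quoted above come from the slope of the shelf-life plot of ln(TTD) versus temperature. A sketch of that calculation using made-up TTD data (not the study's observations):

```python
import math

def q10_from_ttd(temps_c, ttds_days):
    """Estimate Q10 from a shelf-life plot of ln(TTD) vs. temperature.
    The least-squares slope b of ln(TTD) on T gives Q10 = exp(-10 * b),
    since TTD shrinks as temperature rises (b is negative)."""
    n = len(temps_c)
    ln_ttd = [math.log(ttd) for ttd in ttds_days]
    mean_t = sum(temps_c) / n
    mean_y = sum(ln_ttd) / n
    slope = sum((t - mean_t) * (y - mean_y) for t, y in zip(temps_c, ln_ttd))
    slope /= sum((t - mean_t) ** 2 for t in temps_c)
    return math.exp(-10.0 * slope)

# Hypothetical data: a TTD dropping 10-fold per 10C rise implies Q10 = 10.
example_q10 = q10_from_ttd([4.0, 14.0], [40.0, 4.0])
```

A large Q10 means detection time collapses quickly as storage temperature rises, which is why small refrigeration abuses matter so much for shelf-life-based risk estimates.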

