Cultural, Chemical, and Alternative Control Strategies for Rhizopus Soft Rot of Sweetpotato

Plant Disease ◽  
2016 ◽  
Vol 100 (8) ◽  
pp. 1532-1540 ◽  
Author(s):  
A. C. Scruggs ◽  
L. M. Quesada-Ocampo

Rhizopus soft rot, caused primarily by Rhizopus stolonifer, is one of the most common postharvest diseases of sweetpotato and is often considered the most devastating. Traditionally, Rhizopus soft rot has been effectively controlled using postharvest dips in dicloran fungicides; however, due to changes in market preferences, use of these fungicides is now limited. This, along with the lack of labeled and effective fungicides for control of Rhizopus soft rot in sweetpotato, creates the need for integrated strategies to control the disease. The effects of storage temperature (13, 23, and 29°C), relative humidity (80, 90, and 100%), and initial inoculum level (3-, 5-, and 7-mm-diameter mycelial plugs) on progression of Rhizopus soft rot in ‘Covington’ sweetpotato were examined. Percent decay due to Rhizopus soft rot infection was significantly reduced (P < 0.0001) at a low temperature (13°C) but was not significantly affected by changes in relative humidity or initial inoculum level (P > 0.05). Sporulation of R. stolonifer was also significantly reduced at the lowest temperature, 13°C. High relative humidity (>95%) significantly increased sporulation of R. stolonifer, and sporulation also increased with initial inoculum level. The efficacy of chlorine dioxide (ClO2) fumigation, UV-C irradiation, and postharvest dips in alternative control products was also investigated for control of Rhizopus soft rot. Static ClO2 treatments were effective in reducing sporulation on treated roots but had no significant impact on incidence of Rhizopus soft rot. UV-C irradiation at 3.24 kJ/m² 1 h after inoculation, as well as dips in aqueous ClO2 and StorOx 2.0, significantly (P < 0.05) reduced disease incidence. Understanding the epidemiological factors favoring Rhizopus soft rot and identifying alternative control strategies allows for improved recommendations to limit postharvest losses in sweetpotato.

2016 ◽  
Vol 106 (8) ◽  
pp. 909-919 ◽  
Author(s):  
A. C. Scruggs ◽  
L. M. Quesada-Ocampo

Sweetpotato production in the United States is limited by several postharvest diseases, and one of the most common is Fusarium root rot. Although Fusarium solani is believed to be the primary causal agent of disease, numerous other Fusarium spp. have been reported to infect sweetpotato. However, the diversity of Fusarium spp. infecting sweetpotato in North Carolina is unknown. In addition, the lack of labeled and effective fungicides for control of Fusarium root rot in sweetpotato creates the need for integrated strategies to control disease. Moreover, the epidemiological factors that promote Fusarium root rot in sweetpotato remain unexplored. A survey of Fusarium spp. infecting sweetpotato in North Carolina identified six species contributing to disease, with F. solani as the primary causal agent. The effects of storage temperature (13, 18, 23, 29, and 35°C), relative humidity (80, 90, and 100%), and initial inoculum level (3-, 5-, and 7-mm-diameter mycelial plugs) were examined for progression of Fusarium root rot caused by F. solani and F. proliferatum on ‘Covington’ sweetpotato. Fusarium root rot was significantly reduced (P < 0.05) at the lowest temperature (13°C), low relative humidity (80%), and low initial inoculum levels for both pathogens. Sporulation of F. proliferatum was also reduced under the same conditions. Qualitative mycotoxin analysis of roots infected with one of five Fusarium spp. revealed the production of fumonisin B1 by F. proliferatum when infecting sweetpotato. This study is a step toward characterizing the etiology and epidemiology of Fusarium root rot in sweetpotato, which allows for improved disease management recommendations to limit postharvest losses to this disease.


2009 ◽  
Vol 72 (9) ◽  
pp. 1878-1884 ◽  
Author(s):  
AMIT PAL ◽  
THEODORE P. LABUZA ◽  
FRANCISCO DIEZ-GONZALEZ

The growth of Listeria monocytogenes inoculated onto frankfurters at four inoculum levels (0.1, 0.04, 0.01, and 0.007 CFU/g) was examined at 4, 8, and 12°C until L. monocytogenes populations reached a detection limit of at least 2 CFU/g. A scaled-down assumption was made to simulate a 25-g sample from a 100-lb batch in a factory setting by using a 0.55-g sample from a 1,000-g batch in the laboratory. Samples of 0.55 g were enriched in PDX-LIB selective medium, and presumptive results were confirmed on modified Oxford agar. Based on the time to detect (TTD) from each inoculum level at each temperature, a shelf life model was constructed to predict the detection or risk levels reached by L. monocytogenes on frankfurters. The TTD increased with reductions in inoculum size and storage temperature. At 4°C the TTDs (±standard error) observed were 42.0 ± 1.0, 43.5 ± 0.5, 50.7 ± 1.5, and 55.0 ± 3.0 days when the inoculum sizes were 0.1, 0.04, 0.01, and 0.007 CFU/g, respectively. For the same inoculum sizes, the TTDs at 8°C were 4.5 ± 0.5, 6.5 ± 0.5, 7.0 ± 1.0, and 8.5 ± 0.5 days. Significant differences (P < 0.05) between TTDs were observed only when the inoculum sizes differed by at least 2 log. On a shelf life plot of ln(TTD) versus temperature, the Q10 values (the factor by which TTD increases for a 10°C decrease in temperature) ranged from 24.5 to 44.7, with no significant influence from the inoculum densities. When the observed TTDs were compared with the expected detection times based on data obtained from a study with an inoculum size of 10 to 20 CFU/g, significant deviations were noted at lower inoculum levels. These results can be valuable in designing a safety-based shelf life model for frankfurters and in performing quantitative risk assessment of listeriosis at low, practical contamination levels.
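The shelf life plot described above relates ln(TTD) linearly to temperature, so a Q10 value lets one extrapolate detection times between storage temperatures. A minimal Python sketch of that extrapolation, using the 42.0-day TTD at 4°C reported for the 0.1-CFU/g inoculum and an assumed Q10 of 24.5 (the low end of the reported range); the predicted value is illustrative, not the study's fitted one:

```python
def predict_ttd(ttd_ref: float, t_ref: float, t: float, q10: float) -> float:
    """Predict time to detect (days) at temperature t (°C) from a reference TTD,
    using the shelf-life relation TTD = TTD_ref * Q10**((t_ref - t) / 10)."""
    return ttd_ref * q10 ** ((t_ref - t) / 10.0)

# Reference point from the abstract: 42.0 days at 4°C (0.1 CFU/g inoculum).
# Q10 = 24.5 is an assumption taken from the low end of the reported range.
ttd_14c = predict_ttd(42.0, t_ref=4.0, t=14.0, q10=24.5)
print(f"Predicted TTD at 14°C: {ttd_14c:.2f} days")  # ≈ 42.0 / 24.5 days
```

A 10°C warmer shelf therefore shortens the predicted detection time by the full Q10 factor, which is why small differences in refrigeration temperature dominate the model.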


Plant Disease ◽  
2003 ◽  
Vol 87 (9) ◽  
pp. 1139-1143 ◽  
Author(s):  
R. Montes-Belmont ◽  
I. Méndez-Ramírez ◽  
H. E. Flores-Moctezuma ◽  
R. A. Nava-Juárez

It is difficult to develop control strategies for grain mold of sorghum because of the limited information on the epidemiology of grain mold in Mexico. The objectives of this study were to identify the fungi associated with grain mold in Morelos, Mexico, and to explore the relationship among planting dates, disease development, and relative humidity and temperature. Fusarium thapsinum was isolated from 97% of the grains from field samples of infested sorghum grains in Morelos, Mexico. The influence of planting dates on the development of sorghum grain mold was determined at Tlayca, Morelos, Mexico, during the rainy seasons of 1998, 1999, and 2000. Incidence of grain mold varied annually, but disease incidence and severity were highest in 1998. Planting dates from 1 June to 13 July had the highest incidence of grain mold during the 3 years. Throughout the study, disease severity was generally low, and yield was not affected. The late planting dates in 1999 and 2000 had reduced yields due to terminal drought of the crop. Increase of disease was predicted by mean temperature, but not by mean relative humidity.


2019 ◽  
Vol 49 (8) ◽  
Author(s):  
Ivan Herman Fischer ◽  
Matheus Froes de Moraes ◽  
Ana Carolina Firmino ◽  
Lilian Amorim

ABSTRACT: One of the major problems in the commercialization of avocados is the incidence of postharvest diseases, especially anthracnose (Colletotrichum spp.) and stem-end rot (Lasiodiplodia theobromae, Fusicoccum aesculi, and Neofusicoccum spp.). As there is a lack of epidemiological information on these pathosystems, the objective of this study was to establish a method to detect quiescent infections and to characterize their temporal progression and spatial pattern in a commercial orchard. Detection of quiescent infections was evaluated in flowers and in fruits at the immature and commercial harvest stages, treated with paraquat, Ethrel, or water. Treatment of flowers and immature fruits with paraquat led to rapid detection of Colletotrichum spp. In two seasons of a ‘Hass’ avocado orchard, the incidence of diseases was evaluated from open flowers to fruit harvest, totaling 11 evaluations at biweekly intervals. When fruits reached the harvest stage, the spatial distribution of diseased fruits in the trees was evaluated by means of the dispersion index and the modified Taylor's law. Regarding temporal disease progression, anthracnose was the most important disease, with a high initial incidence of 60 and 86% diseased flowers in the two seasons, respectively, while fruits showed average disease incidences of 70 and 87%, respectively. Stem-end rot was observed only in fruits, from the beginning of their development, and presented a low incidence (<8% of fruits), significantly lower than that of anthracnose. Both diseases showed random dispersion within the trees, indicating that their initial inoculum is evenly distributed in the plants.
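The dispersion index used here is the variance-to-mean ratio of diseased-fruit counts per sampling unit: values near 1 indicate a random (Poisson-like) pattern, consistent with the evenly distributed inoculum the authors infer, while values well above 1 indicate aggregation. A minimal sketch with made-up per-tree counts (the actual orchard counts are not given in the abstract):

```python
from statistics import mean, variance

def dispersion_index(counts: list[int]) -> float:
    """Variance-to-mean ratio of counts; ~1 suggests a random (Poisson) spatial
    pattern, >1 aggregation, <1 regularity."""
    return variance(counts) / mean(counts)

# Hypothetical counts of diseased fruits on ten sampled trees.
counts = [1, 4, 2, 6, 3, 5, 4, 2, 7, 6]
print(f"Dispersion index: {dispersion_index(counts):.2f}")  # 1.00 → random pattern
```

In practice the index is usually tested against a chi-square distribution before declaring the pattern random, but the ratio itself is the core statistic.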


2009 ◽  
Vol 75 (23) ◽  
pp. 7409-7416 ◽  
Author(s):  
Ana Cláudia N. F. Spinelli ◽  
Anderson S. Sant'Ana ◽  
Salatir Rodrigues-Junior ◽  
Pilar R. Massaguer

ABSTRACT The prevention of spoilage by Alicyclobacillus acidoterrestris is a current challenge for fruit juice and beverage industries worldwide due to the bacterium's acidothermophilic growth capability, heat resistance, and spoilage potential. This study examined the effect of storage temperature on A. acidoterrestris growth in hot-filled orange juice. The evolution of the A. acidoterrestris population was monitored under six different storage conditions after pasteurization (at 92°C for 10 s), maintenance at 85°C for 150 s, and cooling with water spray to 35°C in about 30 min, using two inoculum levels: <10¹ and 10¹ spores/ml. Final cooling and storage conditions were as follows: treatment 1, 30°C for the bottle cold point and storage at 35°C; treatment 2, 30°C for 48 h and storage at 35°C; treatment 3, 25°C for the bottle cold point and storage at 35°C; treatment 4, 25°C for 48 h and storage at 35°C; treatment 5, storage at 20°C (control); and treatment 6, filling and storage at 25°C. It was found that only in treatment 5 did the population remain inhibited during the 6 months of orange juice shelf life. By examining treatments 1 to 4, it was observed that the predicted growth parameters of A. acidoterrestris were significantly influenced (P < 0.05) either by inoculum level or by cooling and storage conditions. The time required to reach a population of 10⁴ CFU/ml was considered an adequate parameter to indicate orange juice spoilage by A. acidoterrestris. Therefore, hot-filled orange juice should be stored at or below 20°C to avoid spoilage by this microorganism. This procedure can be considered a safe and inexpensive alternative to other treatments proposed earlier.
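The spoilage criterion above (time for the population to reach 10⁴ CFU/ml) can be sketched with a simple lag-plus-exponential growth model. The lag time, specific growth rate, and initial level below are assumptions chosen for illustration, not the study's fitted parameters:

```python
import math

def time_to_threshold(n0: float, mu: float, lag_h: float, n_target: float = 1e4) -> float:
    """Hours for a population to grow from n0 to n_target (CFU/ml), assuming
    no growth during the lag phase and exponential growth at rate mu (1/h) after it:
    t = lag + ln(n_target / n0) / mu."""
    return lag_h + math.log(n_target / n0) / mu

# Hypothetical parameters: 10 spores/ml initially, mu = 0.05/h, 48-h lag.
t_spoil = time_to_threshold(10, 0.05, 48)
print(f"Time to reach 1e4 CFU/ml: {t_spoil:.1f} h")
```

Lower storage temperatures act on both parameters at once, lengthening the lag and shrinking mu, which is why the 20°C control never reached the spoilage threshold within the 6-month shelf life.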


2005 ◽  
Vol 95 (12) ◽  
pp. 1462-1471 ◽  
Author(s):  
D. W. Cullen ◽  
I. K. Toth ◽  
Y. Pitkin ◽  
N. Boonham ◽  
K. Walsh ◽  
...  

Specific and sensitive quantitative diagnostics, based on real-time (TaqMan) polymerase chain reaction (PCR) and PCR enzyme-linked immunosorbent assay, were developed to detect dry-rot-causing Fusarium spp. (F. avenaceum, F. coeruleum, F. culmorum, and F. sulphureum). Each assay detected Fusarium spp. on potato seed stocks with equal efficiency. Four potato stocks, sampled over two seed generations from Scottish stores, were contaminated with F. avenaceum, F. sulphureum, F. culmorum, F. coeruleum, or a combination of species, and there was a general trend toward increased Fusarium spp. contamination in the second generation of seed sampled. F. sulphureum and F. coeruleum caused significantly (P < 0.05) more disease in storage than the other species when disease-free tubers of potato cvs. Spunta and Morene were inoculated at a range of inoculum concentrations (0, 10⁴, 10⁵, and 10⁶ conidia/ml). Increased DNA levels were correlated with increased disease severity between 8 and 12 weeks of storage. The threshold inoculum levels resulting in significant disease development on both cultivars were estimated to be 10⁴ conidia/ml for F. sulphureum and 10⁵ conidia/ml for F. coeruleum. To study the effect of soil infestation and harvest date on disease incidence, seed tubers of cvs. Morene and Spunta were planted in a field plot artificially infested with the four Fusarium spp. F. culmorum and F. sulphureum were detected in soil taken from these plots at harvest, and F. sulphureum DNA levels increased significantly (P < 0.05) at the final harvest. All four Fusarium spp. were detected in progeny tubers. There was a trend toward higher levels of F. culmorum detected in progeny tubers at the earliest harvest date, and higher levels of F. sulphureum at the final harvest. The use of diagnostic assays to detect fungal storage rot pathogens and implications for disease control strategies are discussed.
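Real-time (TaqMan) assays like these are quantified against a standard curve, Ct = slope · log10(N) + intercept, which is inverted to estimate template amount from a measured Ct. A minimal sketch with assumed curve parameters (a slope of −3.32 corresponds to ~100% amplification efficiency; the intercept and Ct value are hypothetical):

```python
def quantify(ct: float, slope: float = -3.32, intercept: float = 38.0) -> float:
    """Estimate starting template quantity from a Ct value by inverting the
    standard-curve relation Ct = slope * log10(N) + intercept."""
    return 10 ** ((ct - intercept) / slope)

# On this assumed curve, a Ct of 24.72 corresponds to ~10^4 template copies.
print(f"Estimated quantity: {quantify(24.72):.0f} copies")
```

Each cycle earlier a sample crosses the threshold implies roughly twice as much starting template, which is how the study could tie rising Fusarium DNA levels to rising disease severity.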


2021 ◽  
Vol 13 (2) ◽  
pp. 7-11
Author(s):  
Salma Kassebi ◽  
Péter Korzenszky

Apples, like other fruits, are exposed to stress during growth and development in the field, as well as during harvest and in the postharvest environment (processing, storage, and transportation). Refrigeration allows bulk handling of food products from harvest to market, maintaining their freshness and integrity for an extended period through careful management of storage temperature and humidity. This study investigated the effects of storage on the weight loss of apples (Golden Delicious fruits harvested at maturity) under refrigerated conditions at 5 ± 0.5°C and 82% relative humidity and under ambient storage at 25 ± 0.5°C and 60% relative humidity, over 3 months. The findings revealed that the two groups of apples lost weight at different rates: apples in cold storage lost between 3.31 g and 4.49 g, whereas apples stored at ambient temperature showed a significant weight loss of between 21.9 g and 31.76 g.
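The absolute losses above are easier to compare when expressed as percentages of initial fruit mass. A minimal sketch assuming a typical ~150 g Golden Delicious apple (the abstract does not give initial weights, so the 150 g figure is an assumption):

```python
def percent_weight_loss(initial_g: float, loss_g: float) -> float:
    """Weight loss as a percentage of the initial fruit mass."""
    return loss_g / initial_g * 100.0

# Assumed 150 g initial mass; the loss figures are from the abstract.
print(f"Cold storage (max loss):    {percent_weight_loss(150.0, 4.49):.1f}%")
print(f"Ambient storage (max loss): {percent_weight_loss(150.0, 31.76):.1f}%")
```

Under that assumption, refrigeration holds the loss to roughly 3% of fruit mass versus over 20% at ambient conditions, which is the practical argument for the cold chain.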


2003 ◽  
Vol 66 (12) ◽  
pp. 2231-2236 ◽  
Author(s):  
CHRISTINA M. MOORE ◽  
BRIAN W. SHELDON ◽  
LEE-ANN JAYKUS

The degree of transfer of Campylobacter jejuni and Salmonella enterica serovar Typhimurium from a stainless steel contact surface to a ready-to-eat food (lettuce) was evaluated. Stainless steel coupons (25 cm²) were inoculated with a 20-μl drop of either C. jejuni or Salmonella Typhimurium to provide an inoculum level of ~10⁶ CFU/28 mm². Wet and dry lettuce (Lactuca sativa var. longifolia) pieces (9 cm²) were placed onto the inoculated stainless steel surface for 10 s after the designated inoculum drying time (0 to 80 min for C. jejuni; 0 to 120 min for Salmonella Typhimurium), followed by recovery and enumeration of transferred pathogens (lettuce) and residual surface pathogens (stainless steel coupons). For transfers of Salmonella Typhimurium to dry lettuce, the percent transfer of the initial inoculum load increased from 36 to 66% during the first 60 min of sampling and then dropped precipitously from 66 to 6%. The transfer of Salmonella Typhimurium to wet lettuce ranged from 23 to 31%, with no statistically significant difference between recoveries over the entire 120-min sampling period. For C. jejuni, the mean percent transfer ranged from 16 to 38% for dry lettuce and from 15 to 27% for wet lettuce during the 80-min sampling period. The results of this study indicate that relatively high numbers of bacteria may be transferred to a food even 1 to 2 h after surface contamination. These findings can be used to support future projects aimed at estimating the degree of risk associated with poor handling practices of ready-to-eat foods.
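The percent-transfer figures above are simply the fraction of the initial surface inoculum recovered from the lettuce. A minimal sketch of that calculation with illustrative counts (the 36% result matches the abstract's time-zero dry-lettuce value for Salmonella Typhimurium, but the raw CFU counts themselves are assumed):

```python
def percent_transfer(recovered_cfu: float, inoculum_cfu: float) -> float:
    """Percentage of the surface inoculum transferred to the food."""
    return recovered_cfu / inoculum_cfu * 100.0

# Assumed counts: ~10^6 CFU inoculated on the coupon, 3.6 x 10^5 CFU
# recovered from the lettuce piece after a 10-s contact.
print(f"Percent transfer: {percent_transfer(3.6e5, 1e6):.0f}%")
```

In the study the same arithmetic was repeated at each drying time, which is how the rise-then-drop transfer curve for dry lettuce was obtained.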


2010 ◽  
Vol 73 (4) ◽  
pp. 620-630 ◽  
Author(s):  
ABANI K. PRADHAN ◽  
RENATA IVANEK ◽  
YRJÖ T. GRÖHN ◽  
ROBERT BUKOWSKI ◽  
IFIGENIA GEORNARAS ◽  
...  

The objective of this study was to estimate the relative risk of listeriosis-associated deaths attributable to Listeria monocytogenes contamination in ham and turkey formulated without and with growth inhibitors (GIs). Two contamination scenarios were investigated: (i) prepackaged deli meats with contamination originating solely from manufacture at a frequency of 0.4% (based on reported data) and (ii) retail-sliced deli meats with contamination originating solely from retail at a frequency of 2.3% (based on reported data). Using a manufacture-to-consumption risk assessment with product-specific growth kinetic parameters (i.e., lag phase and exponential growth rate), reformulation with GIs was estimated to reduce human listeriosis deaths linked to ham and turkey by 2.8- and 9-fold, respectively, when contamination originated at manufacture and by 1.9- and 2.8-fold, respectively, for products contaminated at retail. Contamination originating at retail was estimated to account for 76 and 63% of listeriosis deaths caused by ham and turkey, respectively, when all products were formulated without GIs and for 83 and 84% of listeriosis deaths caused by ham and turkey, respectively, when all products were formulated with GIs. Sensitivity analyses indicated that storage temperature was the most important factor affecting the estimation of per annum relative risk. Scenario analyses suggested that reducing storage temperature in home refrigerators to consistently below 7°C would greatly reduce the risk of human listeriosis deaths, whereas reducing storage time appeared to be less effective. Overall, our data indicate a critical need for further development and implementation of effective control strategies to reduce L. monocytogenes contamination at the retail level.


2021 ◽  
Author(s):  
Renata Lebecka ◽  
Jadwiga Śliwka ◽  
Anna Grupa-Urbańska ◽  
Katarzyna Szajko ◽  
Waldemar Marczewski

Abstract: Soft rot is a bacterial disease that causes heavy losses in potato production worldwide. The goal of this study was to identify quantitative trait loci (QTLs) for potato tuber resistance to the bacterium Dickeya solani and for tuber starch content, and to study the relationship between these traits. A highly resistant diploid hybrid of potato was crossed with a susceptible hybrid to generate the F1 mapping population. Tubers that were wound-inoculated with bacteria were evaluated for disease severity, expressed as the mean weight of rotted tubers, and disease incidence, measured as the proportion of rotten tubers. Diversity array technology (DArTseq™) was used for genetic map construction and QTL analysis. The most prominent QTLs for disease severity and incidence were identified in overlapping regions on potato chromosome IV and explained 22.4% and 22.9% of the phenotypic variance, respectively. The second QTL for disease severity was mapped to chromosome II and explained 16.5% of the variance. QTLs for starch content were detected on chromosomes III, V, VI, VII, VIII, IX, XI, and XII, in regions different from the QTLs for soft rot resistance. Two strong and reproducible QTLs for resistance to Dickeya solani on potato chromosomes IV and II might be useful for further study of candidate genes and marker development in potato breeding programs. The relationship between tuber resistance to bacteria and starch content in potato tubers was not confirmed by QTL mapping, which makes the selection of genotypes highly resistant to soft rot with a desirable starch content feasible.

