Water and Temperature Parameters Associated with Winter Wheat Diseases Caused by Soilborne Pathogens

Plant Disease ◽  
2009 ◽  
Vol 93 (1) ◽  
pp. 73-80 ◽  
Author(s):  
Richard W. Smiley

Wheat in eastern Oregon is produced mostly in a 2-year rotation of winter wheat and summer fallow. Maximum agronomic yield potential is expected with early September planting dates, but actual yields are generally highest for plantings made in mid-October. Field experiments with sequential planting dates from early September to December were performed over 4 years. Associations among yield, disease incidence, and 19 moisture and temperature parameters were evaluated. Incidence of Cephalosporium stripe, crown rot, eyespot, and take-all decreased as planting was delayed. Crown rot and eyespot were negatively correlated more significantly and more frequently with temperature than with moisture parameters, and take-all was more closely associated with moisture than with temperature. Rhizoctonia root rot was unrelated to planting date and climatic parameters. Crown rot was identified most frequently (4 of 5 tests) as an important contributor to yield suppression, but yield was most closely associated (R2 > 0.96) with effects from a single disease in only two of five location–year tests. Yield was most related to combinations of diseases in three of five tests, complicating the development of disease modules for wheat growth-simulation models.

Plant Disease ◽  
2011 ◽  
Vol 95 (3) ◽  
pp. 263-268 ◽  
Author(s):  
S. K. Gremillion ◽  
A. K. Culbreath ◽  
D. W. Gorbet ◽  
B. G. Mullinix ◽  
R. N. Pittman ◽  
...  

Field experiments were conducted from 2002 to 2006 to characterize yield potential and disease resistance in the Bolivian landrace peanut (Arachis hypogaea) cv. Bayo Grande, and breeding lines developed from crosses of Bayo Grande and U.S. cv. Florida MDR-98. Diseases of interest included early leaf spot, caused by the fungus Cercospora arachidicola, and late leaf spot, caused by the fungus Cercosporidium personatum. Bayo Grande, MDR-98, and three breeding lines, along with U.S. cvs. C-99R and Georgia Green, were included in split-plot field experiments at six locations across the United States and Bolivia. Whole-plot treatments consisted of two tebuconazole applications and a nontreated control. Genotypes were the subplot treatments. Area under the disease progress curve (AUDPC) for percent defoliation due to leaf spot was lower for Bayo Grande and all breeding lines than for Georgia Green at all U.S. locations across years. AUDPC for disease incidence from one U.S. location indicated similar results. Severity of leaf spot epidemics and relative effects of the genotypes were less consistent in the Bolivian experiments. In Bolivia, there were no indications of greater levels of disease resistance in any of the breeding lines than in Bayo Grande. In the United States, yields of Bayo Grande and the breeding lines were greater than those of the other genotypes in 1 of 2 years. In Bolivia, low disease intensity resulted in the highest yields in Georgia Green, while high disease intensity resulted in comparable yields among the breeding lines, MDR-98, and C-99R. Leaf spot suppression by tebuconazole was greater in Bolivia than in the United States, indicating a possible higher level of fungicide resistance in the U.S. population of leaf spot pathogens. Overall, data from this study suggest that Bayo Grande and the breeding lines may be desirable germplasm for U.S. and Bolivian breeding programs or production.
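AUDPC values like those compared above are conventionally computed with the trapezoidal rule over repeated disease assessments; the abstract does not give the exact formula used, so the following Python sketch shows the standard calculation (the assessment dates and defoliation ratings are invented for illustration):

```python
def audpc(days, severity):
    """Area under the disease progress curve via the trapezoidal rule.

    days: assessment times (e.g. days after planting), strictly increasing
    severity: disease measurement (e.g. percent defoliation) at each time
    """
    if len(days) != len(severity) or len(days) < 2:
        raise ValueError("need matching sequences with at least two assessments")
    # Sum the trapezoid areas between consecutive assessment dates
    return sum((severity[i] + severity[i + 1]) / 2 * (days[i + 1] - days[i])
               for i in range(len(days) - 1))

# Hypothetical example: four weekly ratings of percent defoliation
print(audpc([70, 77, 84, 91], [5, 15, 40, 80]))  # 682.5
```

A genotype with slower epidemic progress accumulates a smaller area, which is why lower AUDPC for Bayo Grande and the breeding lines indicates better leaf spot resistance.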


2020 ◽  
Vol 21 (15) ◽  
pp. 5260 ◽  
Author(s):  
Samir Alahmad ◽  
Yichen Kang ◽  
Eric Dinglasan ◽  
Elisabetta Mazzucotelli ◽  
Kai P. Voss-Fels ◽  
...  

Durum wheat (Triticum turgidum L. ssp. durum) production can experience significant yield losses due to crown rot (CR) disease. Losses are usually exacerbated when disease infection coincides with terminal drought. Durum wheat is very susceptible to CR, and resistant germplasm is not currently available in elite breeding pools. We hypothesize that deploying physiological traits for drought adaptation, such as optimal root system architecture to reduce water stress, might minimize losses due to CR infection. This study evaluated a subset of lines from a nested association mapping population for stay-green traits, CR incidence and yield in field experiments, as well as root traits under controlled conditions. Weekly measurements of normalized difference vegetative index (NDVI) in the field were used to model canopy senescence and to determine stay-green traits for each genotype. Genome-wide association studies using DArTseq molecular markers identified a quantitative trait locus (QTL) on chromosome 6B (qCR-6B) associated with CR tolerance and stay-green. We explored the value of qCR-6B and a major root angle QTL, qSRA-6A, using yield datasets from six rainfed environments, including two environments with high CR disease pressure. In the absence of CR, the favorable allele for qSRA-6A provided an average yield advantage of 0.57 t·ha−1, whereas in the presence of CR, the combination of favorable alleles for both qSRA-6A and qCR-6B resulted in a yield advantage of 0.90 t·ha−1. Results of this study highlight the value of combining above- and belowground physiological traits to enhance yield potential. We anticipate that these insights will help breeders design improved durum varieties that mitigate production losses due to water deficit and CR.
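The stay-green traits above were derived by modelling canopy senescence from weekly NDVI; the abstract does not specify the senescence model fitted, so as a hypothetical minimal proxy one could take the ordinary least-squares slope of the post-peak NDVI decline (a less-negative slope indicating a stronger stay-green phenotype):

```python
def senescence_rate(days, ndvi):
    """Ordinary least-squares slope of NDVI vs. time after peak canopy.

    A simple stay-green proxy: slopes closer to zero mean slower
    canopy senescence. This is an illustrative stand-in for the
    (unspecified) curve-fitting approach used in the study.
    """
    n = len(days)
    mean_x = sum(days) / n
    mean_y = sum(ndvi) / n
    # Classic OLS slope: covariance over variance of time
    return (sum((x - mean_x) * (y - mean_y) for x, y in zip(days, ndvi))
            / sum((x - mean_x) ** 2 for x in days))

# Hypothetical weekly NDVI readings after peak canopy for one genotype
rate = senescence_rate([0, 7, 14, 21], [0.80, 0.74, 0.66, 0.57])
print(f"NDVI loss per day: {rate:.4f}")
```

Comparing such rates across genotypes under terminal drought is one way the stay-green differences mapped to qCR-6B could be expressed quantitatively.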


2018 ◽  
Author(s):  
A.A Adnan ◽  
J. Diels ◽  
J.M. Jibrin ◽  
A.Y. Kamara ◽  
P. Craufurd ◽  
...  

Most crop simulation models require Genotype-Specific Parameters (GSPs), which provide the genotype component of G×E×M interactions. Estimating GSPs is the most difficult aspect of most modelling exercises because it requires expensive and time-consuming field experiments. GSPs could also be estimated using multi-year, multi-locational data from breeder evaluation experiments. This research was set up with the following objectives: i) to determine GSPs of 10 newly released maize varieties for the Nigerian Savannas using data from both calibration experiments and existing breeder varietal evaluation trials; ii) to compare the accuracy of the GSPs generated using experimental and breeder data; and iii) to evaluate the ability of the CERES-Maize model to simulate grain and tissue nitrogen contents. For experimental evaluation, 8 different experiments were conducted during the rainy and dry seasons of 2016 across the Nigerian Savanna. Breeder evaluation data were also collected over 2 years and 7 locations. The calibrated GSPs were evaluated using data from a 4-year experiment conducted under varying nitrogen rates (0, 60 and 120 kg N ha−1). For the model calibration using experimental data, calculated model efficiency (EF) values ranged between 0.86 and 0.92 and the index of agreement (d-index) between 0.92 and 0.98. Calibration of time-series data produced nRMSE below 7%, while all prediction deviations were below 10% of the mean. For breeder experiments, EF (0.52-0.81) and d-index (0.46-0.83) ranges were lower. Prediction deviations were below 17% of the means for all measured variables. Model evaluation using both experimental and breeder trials resulted in good agreement (low RMSE, high EF and d-index values) between observed and simulated grain yields and tissue and grain nitrogen contents. We conclude that higher calibration accuracy of the CERES-Maize model is achieved from detailed experiments. If such experiments are unavailable, data from breeder trials collected across many locations and planting dates can be used with lower but acceptable accuracy.
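The fit statistics quoted above (EF, d-index, nRMSE) are standard model-evaluation measures. Assuming the usual definitions (Nash-Sutcliffe modelling efficiency, Willmott's index of agreement, and RMSE normalised by the observed mean), they can be computed as:

```python
import math

def model_fit_stats(obs, sim):
    """Goodness-of-fit statistics commonly used for crop-model evaluation.

    Returns (EF, d, nRMSE):
      EF    - Nash-Sutcliffe modelling efficiency (1 = perfect)
      d     - Willmott's index of agreement (1 = perfect)
      nRMSE - RMSE as a percentage of the observed mean
    """
    n = len(obs)
    mean_obs = sum(obs) / n
    sq_err = sum((o - s) ** 2 for o, s in zip(obs, sim))
    # EF: 1 minus squared error relative to variance of observations
    ef = 1 - sq_err / sum((o - mean_obs) ** 2 for o in obs)
    # d: 1 minus squared error relative to potential error about the mean
    d = 1 - sq_err / sum((abs(s - mean_obs) + abs(o - mean_obs)) ** 2
                         for o, s in zip(obs, sim))
    nrmse = math.sqrt(sq_err / n) / mean_obs * 100
    return ef, d, nrmse

# Hypothetical observed vs. simulated grain yields (t/ha)
ef, d, nrmse = model_fit_stats([3.2, 4.1, 5.0, 4.4], [3.0, 4.3, 4.8, 4.6])
```

With these conventions, EF and d near 1 and nRMSE below about 10% correspond to the "good agreement" the study reports for the detailed calibration experiments.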


1995 ◽  
Vol 124 (2) ◽  
pp. 173-194 ◽  
Author(s):  
R. D. Prew ◽  
J. E. Ashby ◽  
E. T. G. Bacon ◽  
D. G. Christian ◽  
R. J. Gutteridge ◽  
...  

Disposal methods for straw from continuous winter wheat were tested on two soil types, a flinty silty clay loam and a sandy loam, over 7 years (1985–91). The methods tested were burnt or chopped straw in full factorial combination with four cultivation methods (tined to 10 cm; tined to 10 cm then to 20 cm; ploughed to 20 cm; tined to 10 cm then ploughed to 20 cm). Measurements were taken to determine the effects on crop establishment and growth, pest and disease incidence, and the consequent effects on yield. Another experiment (1985–91) on the flinty silty clay loam site investigated the interactions between straw treatments (burnt, baled or chopped, in plots that were all shallow cultivated to 10 cm) and five other factors, namely time of cultivation, insecticides, molluscicides, fungicides and autumn nitrogen. All the straw × cultivation systems allowed satisfactory crops to be established, but repeated incorporation of straw using shallow, non-inversion cultivations resulted in very severe grass-weed problems. Early crop growth, as measured by above-ground dry matter production, was frequently decreased by straw residues, but the effect rarely persisted beyond anthesis. Pests were not a problem and their numbers were not greatly affected by either straw or cultivation treatments, apart from yellow cereal fly which, especially on the heavier soil, was decreased by treatments that left much straw debris on the soil surface. Incorporating straw also caused no serious increases in the incidence of diseases. Indeed, averaged over all sites and years, eyespot and sharp eyespot were both slightly but significantly less severe where straw was incorporated than where it was burnt. Eyespot, and even more consistently sharp eyespot, were often more severe after ploughing than after shallow, non-inversion cultivations. Effects on take-all were complex, but straw residues had much smaller effects than cultivations. Initially the disease increased most rapidly in the shallow cultivated plots, but these also tended to enter the decline phase more quickly, so that in the fourth year (fifth cereal crop) take-all was greater in the ploughed than in the shallow cultivated plots. On average, yields did not differ greatly with straw or cultivation systems, although there were clear effects of take-all in those years when the disease was most severe. In the last 2 years, yields were limited by the presence of grass weeds in the plots testing chopped straw incorporated by tining to 10 cm.


1998 ◽  
Vol 49 (8) ◽  
pp. 1225 ◽  
Author(s):  
P. A. Gardner ◽  
J. F. Angus ◽  
P. T. W. Wong ◽  
G. D. Pitson

Take-all is a root disease of wheat caused by the fungus Gaeumannomyces graminis var. tritici (Ggt). The most common method of control, growing wheat after a break crop, is not always feasible. This study compared the use of a break crop with 5 alternative control methods in a series of field experiments in south-eastern Australia. The methods of control tested were: (1) fungicide added to fertiliser; (2) soil fumigation with methyl bromide; (3) applied chloride; (4) seed treatment with microbial antagonists; (5) a prior brassica break crop; and (6) a 12-month-long fallow. Eight experiments were conducted over 2 years, but not all treatments were included in each experiment. The most successful control methods were growing wheat after a brassica break crop or a long fallow; both gave 72% yield increases over wheat grown after wheat. None of the other methods gave consistent, significant, or profitable yield increases or disease control. The mean yield increases in the year of application were 8% for the fungicide, 6% for microbial antagonists, 4% for chloride, and 7% for fumigation. The probable reason that the fungicide and microbial antagonists were ineffective was that they were localised in the furrow where they were applied, whereas roots became infected in the inter-row space. Probable reasons that chloride was ineffective were that background soil chloride levels were generally above the responsive range, and that roots became infected with take-all after the chloride was leached from the topsoil. The limitation of fumigation was that it suppressed natural antagonists of Ggt, apparently leading to re-infection at higher levels than before. There was also evidence of Ggt re-infection in the second year after break crops, leading to an apparent ‘boomerang’ effect. Take-all inocula at the sites were measured in pre-sowing soil bioassays, whereas disease incidence was determined in seedlings and as ‘whiteheads’ as crops approached maturity. The only consistent pattern among the measurements was low disease incidence after break crops and the long fallow. Otherwise, correlations among the 3 sets of measurements were low, suggesting that environmental changes after the soil bioassay and seedling assessment played critical roles in the progress of the disease.


1957 ◽  
Vol 48 (3) ◽  
pp. 326-335 ◽  
Author(s):  
G. A. Salt

A field experiment to test the effects of cultural treatments on eyespot (Cercosporella herpotrichoides Fron.), lodging and yield of winter wheat, begun in 1952 (Salt, 1955), was continued on the same site in 1953. In 1952 only eyespot and lodging were severe, but in 1953 take-all (Ophiobolus graminis Sacc.) and weeds were severe also. Squareheads Master 13/4 and Cappelle, each sown at 1½ and 3 bushels/acre, were top-dressed at four different dates with ammonium sulphate at 0, 2 and 4 cwt./acre. Sulphuric acid (12½% b.o.v. at 100 gal./acre) was sprayed on four of the eight blocks of ten plots in March to control eyespot. Halving the seed rate decreased the percentage of severe eyespot from 63 to 52%, decreased the area stunted by take-all from 36 to 14%, and increased yield by amounts ranging from 8·3 cwt./acre in nitrogen-deficient plots to 2·6 cwt./acre in plots well supplied with ammonium sulphate. The fertilizer applied to Squareheads Master at 0, 2 and 4 cwt./acre had little effect on the incidence of eyespot lesions at harvest, but increased the area lodged from 23 to 53 and 60% respectively; it decreased the area stunted by take-all from 47 to 19 and 10% respectively, and increased yield from 13 to 17 and 18 cwt./acre. Cappelle did not lodge, and the fertilizer decreased take-all patches from 51 to 28 and 18% respectively, and increased grain yield from 15 to 20 and 21 cwt./acre. The time when nitrogen was applied to either variety had no important effect on disease incidence or yield. Sulphuric acid sprayed in 1953 on blocks unsprayed in 1952, which therefore had a higher initial infection of eyespot and more weeds, decreased the area lodged and the area covered by weeds, but did not decrease the percentage of straws with eyespot below that in unsprayed plots.


Agronomy ◽  
2020 ◽  
Vol 10 (4) ◽  
pp. 596
Author(s):  
Nick R. Bateman ◽  
Angus L. Catchot ◽  
Jeff Gore ◽  
Don R. Cook ◽  
Fred R. Musser ◽  
...  

As fluctuating commodity prices change the agricultural landscape from year to year, soybean (Glycine max (L.) Merr.) has become the predominant crop in the southern USA, accounting for 65 percent of Mississippi's total row crop production. To accommodate increased soybean production, planting dates have expanded, spanning from late March through July. To determine the impact of this expanded planting window on soybean development and yield, field experiments were conducted at Starkville and Stoneville, MS, in 2013 and 2014. Treatments included seven planting dates ranging from 25 March to 15 July and two soybean cultivars (one Maturity Group IV and one Maturity Group V cultivar). These studies were conducted in irrigated, high-yielding environments. Experimental units were sampled weekly for insect pests, and insecticides were applied when populations exceeded recommended treatment thresholds. Planting date had a significant impact on crop development, plant height, canopy closure, and yield. As planting date was delayed, the time required for crop development decreased from 122 total days for plantings on 25 March to 83 days for plantings on 15 July. For plantings after 2 June, plant height decreased by 1.1 cm per day. Canopy closure decreased by 1.01% per day after 27 May. Soybean yield decreased by 26.7 kg/ha per day when soybean was planted after 20 April. This research demonstrates the importance of early planting dates for soybean producers in the southern USA to ensure profitability by maximizing yield potential.
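The reported penalty of 26.7 kg/ha per day of delay after 20 April implies a simple linear yield projection. A minimal sketch of that arithmetic follows; the base yield figure and the flat-before-cutoff handling are assumptions for illustration, not values from the study:

```python
from datetime import date

PENALTY_KG_HA_PER_DAY = 26.7  # reported yield loss per day of delay
CUTOFF = date(2014, 4, 20)    # reported start of the yield-decline window

def projected_yield(planting_date, base_yield_kg_ha):
    """Project yield after applying the per-day planting-delay penalty.

    Plantings on or before the cutoff keep the full base yield
    (an assumed simplification); later plantings lose a fixed
    amount per day of delay.
    """
    delay_days = max(0, (planting_date - CUTOFF).days)
    return base_yield_kg_ha - PENALTY_KG_HA_PER_DAY * delay_days

# Hypothetical example: a 2 June planting is 43 days past the cutoff
print(projected_yield(date(2014, 6, 2), 5000.0))
```

Under these assumptions, a six-week delay costs over a tonne per hectare, which is the scale of loss motivating the paper's emphasis on early planting.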


Plant Disease ◽  
1999 ◽  
Vol 83 (12) ◽  
pp. 1125-1128 ◽  
Author(s):  
D. A. Fritts ◽  
G. J. Michels ◽  
C. M. Rush

Incidence of High Plains Disease (HPD) in a susceptible corn cultivar was examined in relation to planting dates, insecticide treatments, and wheat heading dates from 1994 to 1996. In the High Plains of Texas, this disease of susceptible corn was related to corn planting date and winter wheat maturity. The incidence of HPD varied greatly from year to year; however, corn planted between 16 and 20 May had the highest disease incidence, and corn planted 10 to 30 days after wheat heading had the highest incidence of the disease. Chemical control of the vector, the wheat curl mite (Aceria tosichella), was largely ineffective; only granular insecticides applied at planting had some beneficial effect. Results of this study suggest that producers can reduce the incidence of HPD by planting corn before or after the peak migration of wheat curl mites from wheat.


Plant Disease ◽  
2016 ◽  
Vol 100 (8) ◽  
pp. 1692-1708 ◽  
Author(s):  
Richard W. Smiley ◽  
Stephen Machado ◽  
Karl E. L. Rhinhart ◽  
Catherine L. Reardon ◽  
Stewart B. Wuest

Rainfed experiments operated continuously for up to 84 years in semiarid eastern Oregon are among the oldest agronomic trials in North America. Disease incidence and severity had been quantified visually, but quantification of inoculum density had not been attempted. Natural inoculum of 17 fungal and nematode pathogens was quantified in each of 2 years on eight trials using DNA extracts from soil. Crop type, tillage, rotation, soil fertility, year, and their interactions had large effects on the pathogens. Fusarium culmorum and Pratylenchus thornei were more dominant than F. pseudograminearum and P. neglectus where spring crops were grown, and the opposite species dominance occurred where winter wheat was the only crop. Bipolaris sorokiniana and Phoma pinodella were restricted to the presence of spring cereals and pulse crops, respectively. Helgardia spp. occurred in winter wheat-fallow rotations but not in annual winter wheat. Gaeumannomyces graminis var. tritici was more prevalent in cultivated than in noncultivated soils, and the opposite generally occurred for Rhizoctonia solani AG-8. Densities of Pythium spp. clade F were high but were also influenced by treatments. Significant treatment effects and interactions were more prevalent in two long-standing (>50-year) annually cropped experiments (29%) than in two long-standing 2-year wheat-fallow rotations (14%). Associations among pathogens occurred mostly in an 84-year-old annual cereals experiment. This survey provides guidance for research on the dynamics of root-infecting pathogens of rainfed field crops and identified two pathogens (Drechslera tritici-repentis and P. pinodella) not previously reported at the location.

