Virulence of Rhizoctonia oryzae on Wheat and Barley Cultivars from the Pacific Northwest

Plant Disease ◽  
2003 ◽  
Vol 87 (1) ◽  
pp. 51-55 ◽  
Author(s):  
T. C. Paulitz ◽  
J. D. Smith ◽  
K. K. Kidwell

Rhizoctonia oryzae (teleomorph = Waitea circinata) causes sheath spot of rice and root rot of wheat and barley. R. oryzae commonly is isolated from barley, wheat, and pea plants in eastern Washington and Idaho. Eight representative isolates were tested for virulence on spring barley (Hordeum vulgare cv. Baronesse), soft white winter wheat (Triticum aestivum cv. Madsen), and hard red spring wheat (cv. Scarlet) planted in natural soil in the greenhouse and maintained at 16°C. All isolates caused significant reduction of emergence in barley, whereas seven of the eight isolates reduced emergence of winter wheat and only one reduced emergence of spring wheat. All isolates caused significant stunting and reduction in the number of seminal roots, root length, and number of root tips on wheat and barley. Some isolates also reduced the frequency of fine secondary roots, resulting in a reduction of the average root diameter. Spring barley was more susceptible to R. oryzae than winter or spring wheat. The main effects of both cultivar and isolate were significant, and there was a significant isolate-cultivar interaction. R. oryzae isolate 80042 was the most virulent on barley, whereas R. oryzae isolate 801387 was the most virulent on wheat. The two isolates from pea were intermediate in virulence on wheat and barley. When screening germ plasm for potential resistance, isolates exhibiting the maximum virulence for each host should be used.

Plant Disease ◽  
2002 ◽  
Vol 86 (4) ◽  
pp. 442-442 ◽  
Author(s):  
T. C. Paulitz

In May 2001, severe stunting, lateral rot, and brown discoloration of taproots were observed in a field of direct-seed (no-till) pea cv. Columbia southeast of Lewiston, ID. The field had been previously cropped with direct-seeded spring barley. Roots were washed, plated on water agar containing benomyl at 1 μg/ml and chloramphenicol at 100 μg/ml, and incubated at 22°C. Fungal colonies were identified as Rhizoctonia oryzae (teleomorph Waitea circinata Warcup & Talbot) based on hyphal and colony morphology (3) and anastomosis reaction with known tester isolates. Two isolates were grown on autoclaved oat seeds for 3 weeks to produce inoculum for pathogenicity testing. One colonized oat seed was placed below a seed of Pisum sativum ‘Little Marvel’ planted in pasteurized sandy loam soil. There were five pea seeds per 10-cm-diameter pot and three replicate pots per isolate. Both isolates caused severe damping-off and stunting. Both isolates were also tested in nonpasteurized (natural) sandy loam in 4 cm × 20 cm plastic pine seedling tubes. Eight colonized oat seeds were placed in a band 1 cm below a single pea seed planted in each tube. Tubes were watered with metalaxyl (0.1 g/liter, technical grade) to inhibit Pythium. Control treatments consisted of soil amended with either autoclaved oat seeds or nothing. Two isolates of R. oryzae were tested with two pea cultivars (B160 and Marjorette), with five replicates per treatment. R. oryzae did not significantly reduce emergence but did cause necrosis and browning of root tips and reduction in lateral root formation. R. oryzae was reisolated from infected roots. To our knowledge, this is the first report of R. oryzae causing disease on a dicot in North America. In Australia, a Waitea sp. was weakly virulent to subterranean clover, producing constrictions of the taproot, but did not affect plant survival and growth (4). W. circinata also caused damping-off of tobacco seedlings in India (2).
In the Pacific Northwest, peas are often grown in rotation with wheat and barley, and R. oryzae can be virulent on these cereal crops (1). This finding may have important implications for disease management in wheat and legumes in crop rotation systems. References: (1) M. Mazzola et al. Phytopathology 86:354, 1996. (2) C. A. Raju. Tob. Res. 19:92, 1993. (3) B. Sneh et al. Identification of Rhizoctonia Species. The American Phytopathological Society, St. Paul, MN, 1991. (4) D. H. Wong et al. Trans. Br. Mycol. Soc. 85:156, 1985.


Plant Disease ◽  
2013 ◽  
Vol 97 (4) ◽  
pp. 537-546 ◽  
Author(s):  
Richard W. Smiley ◽  
Stephen Machado ◽  
Jennifer A. Gourlie ◽  
Larry C. Pritchett ◽  
Guiping Yan ◽  
...  

There is interest in converting rainfed cropping systems in the Pacific Northwest from a 2-year rotation of winter wheat and cultivated fallow to direct-seed (no-till) systems that include chemical fallow, spring cereals, and food legume and brassica crops. Little information is available regarding effects of these changes on plant-parasitic nematodes. Eight cropping systems in a low-precipitation region (<330 mm) were compared over 9 years. Each phase of each rotation occurred each year. The density of Pratylenchus spp. was greater in cultivated than chemical fallow, became greater with increasing frequency of host crops, and was inversely associated with precipitation (R2 = 0.92, α < 0.01). Densities after harvesting mustard, spring wheat, winter wheat, and winter pea were greater (α < 0.01) than after harvesting spring barley or spring pea. Camelina also produced low densities. Winter wheat led to a greater density of Pratylenchus neglectus and spring wheat led to a greater density of P. thornei. Densities of Pratylenchus spp. detected by real-time polymerase chain reaction on DNA extracts from soil were correlated with those detected by a traditional method (R2 = 0.88, α < 0.01) but were generally higher. Selection of different Pratylenchus spp. by different wheat cultivars or growth habit must be addressed to minimize the level of nematode risk to future plantings of intolerant crops.
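The inverse linear association reported above (R2 = 0.92) can be illustrated with a simple least-squares fit. The data below are hypothetical and purely illustrative, not the study's measurements:

```python
# Sketch of an inverse linear density-precipitation association.
# All numbers here are invented for illustration only.
import numpy as np

precip = np.array([150.0, 200.0, 250.0, 300.0, 330.0])   # mm (hypothetical)
density = np.array([900.0, 720.0, 500.0, 310.0, 180.0])  # nematodes/kg soil (hypothetical)

# Least-squares line: density = intercept + slope * precip
slope, intercept = np.polyfit(precip, density, 1)
pred = intercept + slope * precip

# Coefficient of determination (R^2) for the fitted line
ss_res = ((density - pred) ** 2).sum()
ss_tot = ((density - density.mean()) ** 2).sum()
r_squared = 1.0 - ss_res / ss_tot
```

A negative slope with a high R2 corresponds to the reported pattern: drier sites carried higher Pratylenchus densities.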


1980 ◽  
Vol 95 (3) ◽  
pp. 583-595 ◽  
Author(s):  
A. Penny ◽  
F. V. Widdowson

SUMMARY
An experiment at Rothamsted during 1958–67 measured effects on yield, on K uptake and on soil K of applying all combinations of 38, 75 and 113 kg N and 0, 31 and 62 kg K/ha per cut to grass leys, which were cut and removed. Soil K was depleted most where most N and least K were given. Annual applications of 0, 33 and 66 kg P/ha were also tested; soil P was not depleted. The grass was then ploughed.

In 1968, residual effects were measured by spring wheat. In 1969 and in 1970, 104 kg/ha of fresh K was applied on half of each plot; potatoes (1969) and spring wheat (1970) valued residual and fresh effects of K.

In 1971, potatoes tested 0, 104 and 208 kg/ha of fresh K, cumulatively with the three amounts given to the grass, and also extra K (104 kg/ha) on half-plots, cumulatively with that given in 1969 and 1970. In 1972 winter wheat, and in 1974 and 1975 spring barley, measured residues of all treatments previously applied (the site was fallowed in 1973).

Finally, in 1976, potatoes tested 0, 156 and 312 kg/ha of fresh K on whole plots, cumulatively with the previous dressings of K, and also 156 kg/ha of extra K on half-plots, again cumulatively. All these test crops were given basal N.

Yields and K contents of wheat at ear emergence and yields of wheat grain were largest after grass given 38 kg N and 62 kg K/ha per cut, because here soil K depletion was least. Wheat grain yields benefited consistently from fresh K. K content of the wheat at ear emergence was a good indicator of the need for K, but K content of grain was not, because it was unaltered by K fertilizer. Barley was a poor test crop for K, because yields of grain were little affected by previous treatments.

Percentage K in potato leaves (in July in 1969 and 1971, in August in 1976) and yield of tubers were well correlated. Largest yields in 1969, 1971 and 1976 came where the leaves contained 3.43, 3.76 and 2.82% K, respectively, i.e. from soil containing most exchangeable K, plus most fresh K. There was no indication that maximum yields had been obtained, so the largest amounts (kg/ha) of fresh K tested (104 in 1969, 312 in 1971 and 468 in 1976) were insufficient to counteract depletion of soil K by the grass. Because the grass did not deplete soil P, the test crops benefited only little from either residual or fresh P.


2010 ◽  
Vol 56 (No. 1) ◽  
pp. 28-36 ◽  
Author(s):  
J. Černý ◽  
J. Balík ◽  
M. Kulhánek ◽  
K. Čásová ◽  
V. Nedvěd

In long-term stationary experiments under different soil-climatic conditions, the influence of mineral and organic fertilization on the yield of winter wheat, spring barley and potato tubers was evaluated. Grain yields of winter wheat (4.00 t/ha) and spring barley (2.81 t/ha) were statistically significantly lowest in non-fertilized plots at all experimental sites. In the case of potatoes, the lowest yield of dry matter (5.71 t/ha) was recorded in the control plot, but the difference was not statistically significant. Manure-fertilized plots increased average wheat yield by 30% and barley yield by 22%. Application of sewage sludge resulted in wheat yield higher by 41% and barley yield higher by 26% over control. On average, application of sewage sludge and manure increased the yield of potatoes by 30% over control. The highest yield was obtained after application of mineral fertilizers; average yield increased by 59, 50 and 36% in winter wheat, spring barley and potatoes, respectively. No statistically significant differences among the plots with mineral fertilizers were observed. The yield of the studied crops varied among sites; however, the effect of fertilization on yield increments was similar at all experimental sites except for Lukavec. It is the site with the lowest natural soil fertility, and it showed the highest effect of the applied fertilizers.


1980 ◽  
Vol 3 ◽  
pp. 25-31
Author(s):  
J. F. D. Greenhalgh

The most widely quoted estimates of straw supplies and usage in England and Wales are those of a working party of the National Farmers Union (1973). They assumed the yield of straw to be 2.8 t/ha, and hence 9.3 Mt from 3.4 M ha of cereals in 1972. (The same yield from 3.7 M ha of cereals in the UK would give 10.4 Mt.) Of the 9.3 Mt, 37% was estimated to be burned in the field or ploughed in, 36% used for bedding, 15% used for feed, and 12% used for other purposes. The figure of 2.4 t/ha (1 t/acre) may well be too low. Short (1974) found straw yields at four Experimental Husbandry Farms over several years to be as follows (t/ha): winter wheat 3.71, spring wheat 4.68, spring barley 2.71, and spring oats 4.54. Wood (1974) surveyed wheat crops in Oxfordshire in 1973 and found yields of 3.7 t/ha. The total quantity of straw available is therefore likely to be considerably in excess of 9.3 Mt and could, if necessary, be increased further by cutting at a lower level. The accuracy of the National Farmers Union estimate of 0.15 × 10.4 = 1.6 Mt used for animal feeding is also questionable, but this amount, if it contained 6.5 MJ metabolizable energy (ME)/kg dry matter (DM), would be sufficient to provide only about 7% of the maintenance requirements of all cattle in Britain. On a larger scale, Balch (1977) has calculated that if all the straw grown in Europe were improved by chemical treatment it could provide 80 to 90% of the maintenance requirements of Europe's ruminant livestock. World estimates for the production of straw and other fibrous wastes are given by Owen (1976).
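The supply figures quoted above follow from simple arithmetic on the rounded inputs given in the text (yield × area, then the feed fraction):

```python
# Reproduction of the straw-supply arithmetic quoted in the text,
# using only the rounded figures given there.
straw_yield = 2.8              # t/ha, NFU working-party assumption
area_uk = 3.7e6                # ha of cereals in the UK
total_uk_mt = straw_yield * area_uk / 1e6   # total straw, Mt

feed_fraction = 0.15           # share estimated to be used for feed
feed_uk_mt = feed_fraction * 10.4           # Mt, the 0.15 x 10.4 estimate
```

Rounded to one decimal place, these give the 10.4 Mt UK total and the 1.6 Mt feed estimate cited above.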


Author(s):  
Uta McKelvy ◽  
Monica Brelsford ◽  
Jamie Sherman ◽  
Mary Burrows

Wheat streak mosaic virus (WSMV) causes sporadic epidemics in Montana, which can threaten the profitability of the state’s small grains production. One challenge for WSMV management in Montana is that most commercially available wheat and barley cultivars are susceptible to WSMV or their performance under WSMV pressure is unknown. In a three-year field study from 2017 to 2019, winter wheat, spring wheat, and barley cultivars were evaluated for their susceptibility to WSMV and yield performance under WSMV pressure. Plants were mechanically inoculated and WSMV incidence was assessed using DAS-ELISA. There was effective resistance to WSMV in breeding line CO12D922, which had consistently low WSMV incidence, highlighting promising efforts in the development of WSMV-resistant winter wheat cultivars. Moderate WSMV incidence and minor yield losses were observed from WSMV infection of commercial winter wheat ‘Brawl CL Plus’ and MSU breeding line MTV1681. Spring wheat cultivars in this study had high WSMV incidence of up to 100% in ‘Duclair,’ ‘Egan,’ and ‘McNeal.’ High WSMV incidence was associated with severe yield losses as high as 85% for Duclair and ‘WB9879CL’ in 2019, demonstrating a high degree of susceptibility to WSMV inoculation. Barley cultivars had considerably lower WSMV incidence compared to spring and winter wheat. Grain yield response to WSMV inoculation was variable between barley cultivars. The study provided an experimental basis for cultivar recommendations for high WSMV pressure environments and identified breeding lines and cultivars with potential resistance traits of interest to breeding programs that aim to develop WSMV-resistant cultivars.


2006 ◽  
Vol 20 (3) ◽  
pp. 658-669 ◽  
Author(s):  
Frank L. Young ◽  
Mark E. Thorne ◽  
Douglas L. Young

No-till cropping is an option for growers needing to reduce soil erosion in the Palouse annual-cropped region of the Pacific Northwest, which is well suited for wheat production. A 6-yr field study was conducted to determine optimum levels of fertilizer and herbicide inputs in a no-till continuous wheat crop production system. Three levels of nitrogen (N) and two weed management levels (WML) were compared in a spring wheat (SW)–winter wheat (WW)–WW rotation through two rotation cycles. The high WML reduced weed densities about 50% compared with the low WML. In general, herbicide treatments were more effective on broadleaf weeds and may have facilitated a shift toward grass weeds. The high WML reduced grass weed biomass only at the reduced N levels, whereas the high WML reduced broadleaf weed density at all N levels. Variable environmental conditions affected wheat yield; however, yield tended to be highest where winter wheat immediately followed spring wheat. Nitrogen had little effect on weed density but increased crop yield about 13% with each increased N level. Crop yield was greater at the high versus low WML at each N level, even though weed density and biomass were reduced least between WMLs at the highest N level. The highest crop yield and net returns were obtained with the highest N and WML; however, none of the N and WML combinations were profitable.


2019 ◽  
Vol 10 (1) ◽  
pp. 107-121 ◽  
Author(s):  
J. Salonen ◽  
E. Ketoja

Adoption of reduced tillage in organic cropping has been slow, partly due to concerns about increasing weed infestation. Undersown cover crops (CCs) are considered to be a feasible option for weed management, but their potential for weed suppression is insufficiently investigated in low-till organic cropping. The possibilities to reduce primary tillage by introducing CCs to maintain weed infestation at a level that does not substantially jeopardize crop yield were studied in a field experiment in southern Finland during 2015–2017. Eight different CC mixtures were undersown in cereals and the response in weed occurrence was consecutively assessed in spring barley, winter wheat, and finally, as a subsequent effect, in spring wheat. Growth of CCs was too slow to prevent the flush of early emerging weeds in spring barley, whereas in winter wheat, CCs succeeded in hindering the growth of weeds. However, CCs could not prevent the increase of perennial weeds in a reduced tillage system in which the early growth of spring wheat was retarded in the cool year 2017. Consequently, after 2 years of reduced tillage, weed biomass was about 2.6 times higher and spring wheat yield was 30% lower than in plowed plots. No major differences in weed control efficacy among CC treatments were evident. A grain yield benefit was recorded after repeated use of leguminous CCs. The need for long-term field studies remains of particular interest regarding post-harvest performance and influence of CCs on perennial weeds before the inversion tillage.


2011 ◽  
Vol 101 (5) ◽  
pp. 544-554 ◽  
Author(s):  
D. Sharma-Poudyal ◽  
X. M. Chen

Climatic variation in the U.S. Pacific Northwest (PNW) affects epidemics of wheat stripe rust caused by Puccinia striiformis f. sp. tritici. Previous models only estimated disease severity at the flowering stage, which may not predict the actual yield loss. To identify weather factors correlated with stripe rust epidemics and develop models for predicting potential yield loss, correlation and regression analyses were conducted using weather parameters and historical yield loss data from 1993 to 2007 for winter wheat and 1995 to 2007 for spring wheat. Among 1,376 weather variables, 54 were correlated with yield loss of winter wheat and 18 with yield loss of spring wheat. Winter temperature variables were more highly correlated with wheat yield loss than variables from the other seasons. The sum of daily temperatures and accumulated negative degree days of February were more highly correlated with winter wheat yield loss than the other monthly winter variables. In addition, the number of winter rainfall days was found correlated with yield loss. Six yield loss models were selected for each of winter and spring wheat based on their higher correlation coefficients, time of weather data availability during the crop season, and better performance in validation tests. Compared with previous models, the new system of using a series of the selected models has advantages that should make it more suitable for forecasting and managing stripe rust in the major wheat growing areas in the U.S. PNW, where the weather conditions have become more favorable to stripe rust.
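The variable-screening step described above (correlating many candidate weather variables with historical yield loss and retaining the strongest) can be sketched as follows. The variable names, data, and threshold are hypothetical illustrations, not the study's:

```python
# Hypothetical sketch of screening weather variables by their Pearson
# correlation with historical yield loss; data and threshold are invented.
import math

yield_loss = [5.0, 10.0, 15.0, 20.0, 25.0]  # % loss over 5 seasons (hypothetical)
weather = {
    "feb_negative_degree_days": [10.0, 22.0, 29.0, 41.0, 50.0],  # tracks loss
    "july_max_temp": [30.0, 28.0, 31.0, 29.0, 30.0],             # unrelated
}

def pearson_r(x, y):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Retain only variables whose absolute correlation exceeds a chosen threshold
selected = {name: pearson_r(vals, yield_loss)
            for name, vals in weather.items()
            if abs(pearson_r(vals, yield_loss)) > 0.5}
```

In this toy example the February variable survives the screen while the unrelated summer variable does not, mirroring the finding that winter variables were the strongest predictors.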

