Effect of spot-type net blotch (Drechslera teres (Sacc.) Shoem) infection on barley yield in short season environment of northern cereal belt of Western Australia

1989 ◽  
Vol 40 (4) ◽  
pp. 745 ◽  
Author(s):  
TN Khan

The effect of spot-type net blotch (Drechslera teres (Sacc.) Shoem.) on yield was studied in fourteen field experiments located at two sites over seven years in the area where the disease occurs: the northern cereal belt of south-west Western Australia. An overall reduction of 26% in grain yield was associated with spot-type net blotch infection. Yield losses varied with season, date of sowing and cultivar. The disease reduced 100-grain weight and the number of ears/m2, but the number of grains/ear was not affected. Regression analysis generally supported this negative effect of disease on yield, although in a few cases disease and yield were positively correlated. The Area Under Curve (AUC) model was considered most appropriate, and percentage yield loss (L) was estimated as L = 0.0233 AUC. Using this relationship, potential losses in yields of cvv. Beecher and O'Connor were estimated to be 34% and 29%, respectively. The application of this relationship is suggested to be limited to short season environments similar to the northern cereal belt of Western Australia. A need to understand the factors that modify the yield response to this disease is highlighted.
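The abstract gives the loss relationship L = 0.0233 AUC but not how AUC was computed. The sketch below assumes a trapezoidal area under the disease-progress curve (percent leaf area diseased over days after sowing) with hypothetical assessment data; it is illustrative only, not the author's procedure.

```python
# Illustrative sketch only: a trapezoidal area under the disease-progress curve
# (% leaf area diseased x days) is assumed, with made-up assessment data.

def area_under_curve(days, severity):
    """Trapezoidal area under a disease-progress curve."""
    auc = 0.0
    for i in range(1, len(days)):
        auc += (severity[i] + severity[i - 1]) / 2.0 * (days[i] - days[i - 1])
    return auc


days = [60, 75, 90, 105]           # days after sowing (hypothetical)
severity = [2.0, 8.0, 15.0, 25.0]  # % leaf area diseased at each assessment

auc = area_under_curve(days, severity)
loss_percent = 0.0233 * auc        # L = 0.0233 AUC, from the abstract
print(f"AUC = {auc:.0f}, estimated yield loss = {loss_percent:.1f}%")
```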

1987 ◽  
Vol 38 (4) ◽  
pp. 671 ◽  
Author(s):  
TN Khan

Losses in the yield of cv. Dampier due to net blotch (Drechslera teres (Sacc.) Shoem.) were examined in six environments in Western Australia. Based on a comparison between the least diseased and most diseased treatments, there was an overall yield reduction of 21% (P < 0.05). Three models (Critical Point, Area Under Curve and Multiple Point) were used to study the relationship between net blotch infection and percentage yield loss. All models gave similar results. Because of its simplicity, the Critical Point Model, based on mean net blotch infection on the top three leaves at GS 75, was chosen. The percentage yield loss in cv. Dampier was defined to be 37% of the mean diseased area on leaves 1 (flag), 2 and 3 at GS 75. This relationship is very similar to that developed earlier for scald, and a common equation for both scald and net blotch was suggested.
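As a worked illustration of the Critical Point Model stated above (yield loss = 37% of the mean diseased leaf area on the top three leaves at GS 75), the sketch below uses hypothetical leaf scores; it is not the author's code.

```python
# Critical Point Model from the abstract: % yield loss in cv. Dampier
# = 0.37 x mean % diseased area on leaves 1 (flag), 2 and 3 at GS 75.
# The leaf scores below are hypothetical.

def critical_point_loss(leaf_disease_percent, coefficient=0.37):
    """Estimate percent yield loss from mean disease on the top three leaves."""
    mean_disease = sum(leaf_disease_percent) / len(leaf_disease_percent)
    return coefficient * mean_disease


top_three_leaves = [30.0, 20.0, 10.0]  # % area diseased: flag, leaf 2, leaf 3
print(f"Estimated yield loss: {critical_point_loss(top_three_leaves):.1f}%")
```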


1985 ◽  
Vol 36 (5) ◽  
pp. 655 ◽  
Author(s):  
TN Khan ◽  
MF D'Antuono

The three commonly used techniques, viz. the critical point model, area under the curve and multiple linear regression, were applied to study the relationship between scald infection and grain yield in field experiments conducted during 1979-1983 in Western Australia. In the preliminary analysis, leaf 3 from the top and the mean of the top three leaves were found to be best correlated with yield. The three models did not differ greatly, presumably owing to the high correlations between scald at the milky ripe stage and at the earlier growth stages. The critical point model was chosen because of its simplicity. Percentage yield loss in the combined data from all experiments showed a significant correlation (P < 0.001) with scald at the milky ripe stage and defined percentage yield loss in cultivars Clipper and Stirling to be about one-third of the mean scald damage on leaves 1 (flag), 2 and 3 at g.s. 75. Given the range of trials in this analysis, it was suggested that this relationship may be applied to estimate yield loss from survey data in other parts of southern Australia, where scald is endemic.
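A critical point coefficient such as the one-third figure above is, in effect, a regression slope through the origin relating percentage yield loss to scald severity at the milky ripe stage. The sketch below shows that calculation on hypothetical (severity, loss) pairs; the data are not from these trials.

```python
# Sketch of estimating a critical point coefficient by least-squares regression
# through the origin of % yield loss on mean scald severity at g.s. 75.
# The (severity, loss) pairs are hypothetical, not trial data.

def slope_through_origin(x, y):
    """Least-squares slope b for the model y = b * x."""
    return sum(xi * yi for xi, yi in zip(x, y)) / sum(xi * xi for xi in x)


mean_scald_gs75 = [5.0, 12.0, 20.0, 35.0, 50.0]  # % leaf area scalded
yield_loss_pct = [2.0, 4.5, 6.0, 12.0, 16.0]     # % yield loss

b = slope_through_origin(mean_scald_gs75, yield_loss_pct)
print(f"Fitted coefficient: {b:.2f} (abstract reports about one-third)")
```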


2002 ◽  
Vol 42 (2) ◽  
pp. 149 ◽  
Author(s):  
M. D. A. Bolland ◽  
W. J. Cox ◽  
B. J. Codling

Dairy and beef pastures in the high (>800 mm annual average) rainfall areas of south-western Australia, based on subterranean clover (Trifolium subterraneum) and annual ryegrass (Lolium rigidum), grow on acidic to neutral deep (>40 cm) sands, on up to 40 cm of sand over loam or clay, or on soils where loam or clay occur at the surface. Potassium deficiency is common, particularly on the sandy soils, requiring regular applications of fertiliser potassium for profitable pasture production. A large study was undertaken to assess 6 soil-test procedures, and tissue testing of dried herbage, as predictors of when fertiliser potassium was required for these pastures. The 100 field experiments, each conducted for 1 year, measured dried-herbage production separately for clover and ryegrass in response to applied fertiliser potassium (potassium chloride). Significant (P<0.05) increases in yield to applied potassium (yield response) were obtained in 42 experiments for clover and 6 experiments for ryegrass, indicating that grass roots were better able to access potassium from the soil than clover roots. When percentage of the maximum (relative) yield was related to soil-test potassium values for the top 10 cm of soil, the best relationships were obtained for the exchangeable (1 mol/L NH4Cl) and Colwell (0.5 mol/L NaHCO3-extracted) soil-test procedures for potassium. Both procedures accounted for about 42% of the variation for clover, 15% for ryegrass, and 32% for clover + grass. The Colwell procedure for the top 10 cm of soil is now the standard soil-test method for potassium used in Western Australia. No increases in clover yield to applied potassium were obtained for Colwell potassium at >100 mg/kg soil. There was always a clover-yield increase to applied potassium for Colwell potassium at <30 mg/kg soil. The corresponding potassium concentrations for ryegrass were >50 and <30 mg/kg soil. At potassium concentrations of 30–100 mg/kg soil for clover and 30–50 mg/kg soil for ryegrass, the Colwell procedure did not reliably predict yield response, because anything from nil to large yield responses to applied potassium occurred. The Colwell procedure appears to extract the most labile potassium in the soil, including soluble potassium in the soil solution and potassium balancing negative charge sites on soil constituents. In some soils, Colwell potassium was low, indicating deficiency, yet plant roots may have accessed potassium deeper in the soil profile. Where the Colwell procedure does not reliably predict soil potassium status, tissue testing may help. The relationship between relative yield and tissue-test potassium varied markedly for different harvests in each year of the experiments, and for different experiments. For clover, the concentration of potassium in dried herbage related to 90% of the maximum, potassium non-limiting yield (critical potassium) was about 15 g/kg dried herbage for plants up to 8 weeks old, and <10 g/kg dried herbage for plants older than 10–12 weeks. For ryegrass, there were insufficient data to provide reliable estimates of critical potassium.
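The Colwell thresholds reported above can be summarised as a simple decision rule, sketched below. The function and its wording are illustrative, not the authors' recommendation procedure.

```python
# Sketch of interpreting a Colwell soil-test K value (top 10 cm of soil) using
# the thresholds reported in the abstract. Illustrative only.

def interpret_colwell_k(colwell_k_mg_per_kg, species="clover"):
    """Classify likely pasture response to applied fertiliser potassium."""
    upper = 100 if species == "clover" else 50  # no-response thresholds from the abstract
    if colwell_k_mg_per_kg < 30:
        return "yield response to applied K expected"
    if colwell_k_mg_per_kg > upper:
        return "no yield response to applied K expected"
    return "unreliable range (nil to large responses); consider tissue testing"


for k in (20, 60, 120):
    print(f"clover, Colwell K = {k} mg/kg: {interpret_colwell_k(k)}")
```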


2001 ◽  
Vol 52 (2) ◽  
pp. 295 ◽  
Author(s):  
R. A. Latta ◽  
L. J. Blacklow ◽  
P. S. Cocks

Two field experiments in the Great Southern region of Western Australia compared the soil water content under lucerne (Medicago sativa) with that under subterranean clover (Trifolium subterraneum) and annual medic (Medicago polymorpha) over a 2-year period. Compared with annual pasture, lucerne depleted soil water (10–150 cm) by 40–100 mm at Borden and by 20–60 mm at Pingrup. There was also less stored soil water after the wheat (Triticum aestivum) and canola (Brassica napus) phases that followed lucerne than after those that followed annual pasture: 30 and 48 mm after wheat, and 49 and 29 mm after canola, at Borden and Pingrup respectively. Lucerne plant densities declined over 2 seasons from 35 to 25 plants/m2 (Borden) and from 56 to 42 plants/m2 (Pingrup), although lucerne produced herbage quantities similar to or greater than the clover/medic pastures. The lucerne pasture also had a reduced weed component. Wheat yield at Borden was higher after lucerne (4.7 t/ha) than after annual pasture (4.0 t/ha), whereas at Pingrup yields were similar (2 t/ha) but grain protein was higher after lucerne (13.7% compared with 12.6%). There was no yield response to applied nitrogen after either lucerne or annual pasture at either site, but it increased grain protein at both sites. There was no pasture treatment effect on canola yield or oil content at Borden (2 t/ha, 46% oil). However, at Pingrup canola yield was higher following lucerne–wheat (1.5 t/ha compared with 1.3 t/ha) and oil content was similar (41%). The results show that lucerne provides an opportunity to develop farming systems with greater water use in the wheatbelt of Western Australia, and that at least 2 crops can be grown after 3 years of lucerne before soil water returns to the level found after annual pasture.


2002 ◽  
Vol 53 (10) ◽  
pp. 1155 ◽  
Author(s):  
I. Farré ◽  
M. J. Robertson ◽  
G. H. Walton ◽  
S. Asseng

Canola is a relatively new crop in the Mediterranean environment of Western Australia and growers need information on crop management to maximise profitability. However, local information from field experiments is limited to a few seasons and its interpretation is hampered by seasonal rainfall variability. Under these circumstances, a simulation model can be a useful tool. The APSIM-Canola model was tested using data from Western Australian field experiments. These experiments included different locations, cultivars, and sowing dates. Flowering date was predicted by the model with a root mean squared deviation (RMSD) of 4.7 days. The reduction in the period from sowing to flowering with delay in sowing date was accurately reproduced by the model. Observed yields ranged from 0.1 to 3.2 t/ha and simulated yields from 0.4 to 3.0 t/ha. Yields were predicted with a RMSD of 0.3–0.4 t/ha. The observed yield reduction with delayed sowing date in the high, medium, and low rainfall regions (3.2, 6.1, and 8.6% per week, respectively) was accurately simulated by the model (1.1, 6.7, and 10.3% per week, respectively). It is concluded that the APSIM-Canola model, together with long-term weather data, can be reliably used to quantify yield expectation for different cultivars, sowing dates, and locations in the grainbelt of Western Australia.
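RMSD, the fit statistic quoted above, is straightforward to compute; a minimal sketch with hypothetical observed and simulated yields follows (it does not reproduce the paper's data).

```python
# Minimal sketch of the root mean squared deviation (RMSD) used to compare
# APSIM-Canola predictions with observations. All values are hypothetical.

import math

def rmsd(observed, simulated):
    """Root mean squared deviation between paired observed and simulated values."""
    return math.sqrt(
        sum((o - s) ** 2 for o, s in zip(observed, simulated)) / len(observed)
    )


observed_yield = [0.8, 1.5, 2.4, 3.1]   # t/ha, hypothetical
simulated_yield = [1.0, 1.3, 2.7, 2.9]  # t/ha, hypothetical
print(f"RMSD = {rmsd(observed_yield, simulated_yield):.2f} t/ha")
```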


2009 ◽  
Vol 23 (4) ◽  
pp. 503-506 ◽  
Author(s):  
John D. Everitt ◽  
J. Wayne Keeling

Field experiments were conducted in Hale Co., TX, in 2005 and 2006 to determine the effects of 2,4-D amine and dicamba applied at varying rates and growth stages on cotton growth and yield, and to correlate cotton injury levels with lint yield reductions. Dicamba or 2,4-D amine was applied at four growth stages: cotyledon to two-leaf, four- to five-leaf, pinhead square, and early bloom. Dicamba and 2,4-D amine were applied at 1/2, 1/20, 1/200, and 1/2000 of the recommended use rate. Crop injury was recorded 14 days after treatment and late in the season, and cotton lint yields were determined. Across all growth stages, 2,4-D caused more crop injury and yield loss than dicamba. Cotton lint yield was reduced more by later applications (especially at pinhead square), and visual injury underestimated yield loss with 2,4-D. Visual estimates of injury overestimated yield loss when 2,4-D or dicamba was applied early (cotyledon to two-leaf) and were not a good predictor of yield loss.


1995 ◽  
Vol 9 (1) ◽  
pp. 91-98 ◽  
Author(s):  
K. Neil Harker ◽  
Robert E. Blackshaw ◽  
Ken J. Kirkland

Field experiments were conducted from 1986 to 1988 at Lacombe and Lethbridge, Alberta, and Scott, Saskatchewan, to determine the growth and yield response of canola to mixtures of ethametsulfuron with specific grass herbicides. Ethametsulfuron did not usually cause canola injury when mixed with sethoxydim. However, ethametsulfuron mixtures with the following grass herbicides, listed in decreasing order of injury potential, often caused canola injury and yield loss: haloxyfop > fluazifop > fluazifop-P > quizalofop > quizalofop-P. Canola yield losses were severe in some experiments, ranging from 59% with quizalofop mixtures to 97% with haloxyfop mixtures; in other experiments, the same mixtures did not cause significant yield losses. 'Tobin', a Brassica rapa cultivar, tended to be more susceptible to injury than the B. napus cultivars 'Pivot' and 'Westar'. Canola injury symptoms were consistent with those expected from sulfonylurea herbicides. Therefore, we suggest that specific grass herbicides differentially impair the ability of canola to metabolize ethametsulfuron to inactive forms.


1997 ◽  
Vol 48 (5) ◽  
pp. 595 ◽  
Author(s):  
K. L. Regan ◽  
K. H. M. Siddique ◽  
D. Tennant ◽  
D. G. Abrecht

Wheat cultivars with very early maturity appropriate for late sowing in low-rainfall (<325 mm) short-season environments are currently unavailable to wheat growers in the eastern margin of the cropping region of Western Australia. A demonstration that very early-maturing genotypes can out-perform current commercial cultivars would open new opportunities for breeding programs to select very early-maturing, high- and stable-yielding cultivars for these environments. Six field experiments were conducted over 4 seasons at 2 low-rainfall sites in Western Australia to investigate the crop growth, grain yield, and water use efficiency of very early-maturing genotypes compared with current commercial cultivars when sown after 1 June. Very early-maturing genotypes reached anthesis up to 24 days (328 degree-days) earlier than the current cultivars, produced fewer leaves, had similar yields and dry matter, and maintained high water use efficiencies. On average across seasons and locations, the very early-maturing genotypes (W87–022–511, W87–114–549, W87–410–509) yielded more than the later maturing cultivars Gamenya and Spear (190 v. 160 g/m2) but were similar to the early-maturing commercial cultivars Kulin and Wilgoyne (191 g/m2). Very early-maturing genotypes generally had a higher harvest index and produced fewer spikelets, but more and heavier grains, than Kulin and Wilgoyne. There were only small differences in total water use between very early-maturing genotypes and commercial cultivars; however, very early-maturing genotypes used less water before anthesis and more water after anthesis than the later maturing genotypes, and hence experienced less water deficit during the grain-filling period. This study indicates that there is a role for very early-maturing genotypes in low-rainfall short-season environments when the first autumn rains arrive late (after 1 June).
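The abstract expresses earlier anthesis in thermal time (328 degree-days); a common way to accumulate degree-days is sketched below, assuming a simple daily-mean method and a base temperature of 0 degrees C (both are assumptions; the paper's method is not stated in the abstract).

```python
# Sketch of thermal-time (degree-day) accumulation. The base temperature
# (0 C) and the daily temperatures are assumptions for illustration only.

def degree_days(daily_max, daily_min, base_temp=0.0):
    """Sum of daily mean temperature above a base temperature."""
    total = 0.0
    for tmax, tmin in zip(daily_max, daily_min):
        mean_temp = (tmax + tmin) / 2.0
        total += max(mean_temp - base_temp, 0.0)
    return total


tmax = [22.0, 20.0, 18.0, 24.0]  # hypothetical daily maxima (C)
tmin = [10.0, 8.0, 6.0, 11.0]    # hypothetical daily minima (C)
print(f"Accumulated thermal time: {degree_days(tmax, tmin):.0f} degree-days")
```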


2018 ◽  
Vol 32 (4) ◽  
pp. 431-438 ◽  
Author(s):  
Xiao Li ◽  
Timothy Grey ◽  
William Vencill ◽  
James Freeman ◽  
Katilyn Price ◽  
...  

Fomesafen provides effective control of glyphosate-resistant Palmer amaranth in cotton. However, cotton seedlings can be injured when fomesafen is applied PRE. Therefore, greenhouse and field experiments were conducted at Athens, GA, and at six locations in Alabama and Georgia in 2013 and 2016 to evaluate cotton growth and yield response to fomesafen applied PRE at 70, 140, 280, 560, 1,120, or 2,240 g ai ha−1, and in combination with pendimethalin, diuron, acetochlor, and fluridone at 1× label rates. Greenhouse bioassays indicated that fomesafen reduced cotton height and dry weight with increasing rate in Cecil sandy loam and Tifton loamy sand but not in Greenville sandy clay loam, possibly as a result of that soil's higher organic matter (OM) and clay content. Fomesafen applied at 2,240 g ai ha−1 reduced cotton stand by as much as 83% compared with the nontreated check (NTC) at all field locations except Macon and Baldwin counties, AL, and 1,120 g ai ha−1 reduced cotton stand only at Pulaski County, GA, by 52%. Cotton height was reduced by the two highest rates of fomesafen at all locations except Clarke County, GA, and Baldwin County, AL. Visual injury increased with fomesafen rate, and high-rate treatments produced more injury in sandier soils. Cotton yield was unaffected by herbicide treatments at any location, except for the 1,120 g ai ha−1 rate at Pulaski County (49% yield loss compared with the NTC), and the 2,240 g ai ha−1 rate at Pulaski County (72% yield loss) and Tift County (29% yield loss). These data indicate that cotton yield should not be negatively affected by fomesafen applied PRE alone within label rates or in combination with pendimethalin, diuron, acetochlor, and fluridone at 1× label rates, although some visual injury, or stand or height reduction, may occur early in the growing season.


Genome ◽  
2006 ◽  
Vol 49 (7) ◽  
pp. 855-859 ◽  
Author(s):  
T L Friesen ◽  
J D Faris ◽  
Z Lai ◽  
B J Steffenson

Net blotch, caused by Pyrenophora teres, is one of the most economically important diseases of barley worldwide. Here, we used a barley doubled-haploid population derived from the lines SM89010 and Q21861 to identify major quantitative trait loci (QTLs) associated with seedling resistance to P. teres f. teres (net-type net blotch (NTNB)) and P. teres f. maculata (spot-type net blotch (STNB)). A map consisting of simple sequence repeat (SSR) and amplified fragment length polymorphism (AFLP) markers was used to identify chromosome locations of resistance loci. Major QTLs for NTNB and STNB resistance were located on chromosomes 6H and 4H, respectively. The 6H locus (NTNB) accounted for as much as 89% of the disease variation, whereas the 4H locus (STNB resistance) accounted for 64%. The markers closely linked to the resistance gene loci will be useful for marker-assisted selection.

Key words: disease resistance, Drechslera teres, molecular markers.

