Effect of timing of pasture grass removal on subsequent take-all incidence and yield in wheat in southern New South Wales

2002, Vol 42 (8), pp. 1087
Author(s): C. R. Kidd, G. M. Murray, J. E. Pratley, A. R. Leys

Winter cleaning is the removal of grasses from pasture using selective herbicides applied during winter. We compared the effectiveness of an early (June) and a late (July) winter cleaning, an early spring herbicide fallow (September), a spring herbicide application (October), and no disturbance of the pasture on the development of the root disease take-all in the subsequent wheat crop. Experiments were done at 5 sites in the eastern Riverina of New South Wales in 1990 and 1991. The winter clean treatments reduced soil inoculum of Gaeumannomyces graminis var. tritici (Ggt) compared with the other treatments at all sites, as measured by a bioassay, with reductions of 52–79% relative to the undisturbed treatment over the 5 sites. The winter clean treatments also significantly reduced the amount of take-all that developed in the subsequent wheat crop, by between 52 and 83%. The early and late winter clean treatments increased the number of heads/m2 at 3 sites and 1 site, respectively. Dry matter at anthesis was increased by the winter clean treatments at 3 sites. Grain yield was increased by the winter cleaning treatments over the other treatments at the 4 sites harvested, with yield increases of the early winter clean over the undisturbed treatment of 13–56%. The autumn bioassay of Ggt was positively correlated with spring take-all and negatively correlated with grain yield of the subsequent wheat crop at each site. However, there were significant site and site × bioassay interactions, so the autumn bioassay could not be used to predict the amount of take-all that would develop.
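Why a site × bioassay interaction blocks prediction can be illustrated with a minimal sketch in Python (all numbers below are hypothetical, not the trial data): fitting a separate linear regression of spring take-all on the autumn bioassay at each site yields different slopes, so no single pooled calibration would serve.

    # Minimal sketch: per-site regressions of spring take-all on the autumn
    # bioassay. All numbers are hypothetical illustrations, not trial data.
    import numpy as np

    sites = {
        "site_A": (np.array([5.0, 10.0, 20.0, 40.0]),   # autumn bioassay (%)
                   np.array([8.0, 15.0, 28.0, 55.0])),  # spring take-all (%)
        "site_B": (np.array([5.0, 10.0, 20.0, 40.0]),
                   np.array([4.0, 6.0, 11.0, 20.0])),
    }

    for name, (bioassay, take_all) in sites.items():
        slope, intercept = np.polyfit(bioassay, take_all, 1)
        print(f"{name}: take-all = {intercept:.1f} + {slope:.2f} x bioassay")

    # Differing per-site slopes are the signature of a site x bioassay
    # interaction: one pooled regression would mis-predict at both sites.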


1987, Vol 27 (3), pp. 411
Author(s): GM Murray, BJ Scott, Z Hochman, BJ Butler

Lime was applied at rates from 0 to 5.0 t/ha at 4 sites in southern and central New South Wales. A root and crown disease characterised by basal stem blackening affected up to 60% of wheat plants and 80% of triticale plants at all 4 sites when the soil pH (in 0.01 mol/L CaCl2) was above 5.0. Below pH 4.8, incidence was less than 5%. The take-all fungus, Gaeumannomyces graminis var. tritici, was consistently associated with this symptom. Losses in grain yield from the disease ranged from 26 to 77%, depending on site. Regression analysis indicated that each 10% increase in plants with basal stem blackening decreased yield by 0.76%. These results demonstrate that the disease can reverse the expected increase in yield after liming, and that progressive acidification of soils in the region may account for the presently reduced incidence of take-all.
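As a worked illustration of the reported slope (a minimal sketch, assuming the relation is linear through the origin and taking the 0.76%-per-10% coefficient exactly as stated above):

    # Minimal sketch of the reported regression: 0.76% grain yield loss per
    # 10% of plants with basal stem blackening (linearity assumed).
    def predicted_yield_loss(incidence_pct):
        """Percent grain yield loss for a given % of affected plants."""
        return 0.76 * incidence_pct / 10.0

    for incidence in (5, 40, 80):
        print(f"{incidence}% affected -> {predicted_yield_loss(incidence):.2f}% yield loss")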



1986, Vol 26 (6), pp. 709
Author(s): AC Taylor, WJ Lill

Regular hand-weeding was undertaken in experiments located in 167 wheat crops in southern New South Wales from 1967 to 1970 to quantify the effect of weeds on 10 wheat attributes at flowering or maturity. Short annual grasses, skeleton weed, wild oats and annual legumes were the most widespread weeds, all of which tended to occur in mixed stands. At wheat flowering, over all sites, weeding increased wheat DM, nitrogen concentration, nitrogen uptake, phosphorus uptake and number of ears (P < 0.05) by 11.2, 3.3, 14.4, 13.6 and 7.8%, respectively; wheat phosphorus concentration did not respond to weeding. At maturity, grain yield and nitrogen yield increased after weeding (P < 0.05) by 17.3 and 17.0%, respectively, but grain protein and kernel weight did not respond. Regression procedures were used to relate wheat responses to total weed DM and to the DM of 8 weed classes. At flowering, for every 100 g/m2 of weed DM removed, wheat DM, nitrogen uptake, phosphorus uptake and ear number increased by 52.3 g/m2, 958 mg/m2, 92.6 mg/m2 and 18.7 ears/m2, respectively. At maturity, grain yield and grain nitrogen yield increased by 31.9 g/m2 and 665 mg/m2, respectively, for every 100 g/m2 of weed DM present at flowering. The regressions also showed that, at both flowering and maturity, fumitory, annual grasses and sundry weeds (a group made up of weeds not sufficiently widespread to consider separately) appeared to be the most aggressive weeds. Standardised responses of the wheat attributes that were increased by weeding showed that all responded similarly once corrected for scale of measurement.
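A short sketch shows how these regression coefficients translate a weed burden into predicted wheat responses (the dictionary layout and the 150 g/m2 example are illustrative assumptions; the coefficients themselves are those reported above):

    # Reported responses per 100 g/m2 of weed DM removed; units are per m2.
    RESPONSE_PER_100G_WEED_DM = {
        "wheat DM at flowering (g)": 52.3,
        "N uptake at flowering (mg)": 958.0,
        "P uptake at flowering (mg)": 92.6,
        "ears at flowering (count)": 18.7,
        "grain yield at maturity (g)": 31.9,
        "grain N at maturity (mg)": 665.0,
    }

    def predicted_response(weed_dm_g_per_m2):
        """Scale each coefficient by the weed DM present (g/m2)."""
        scale = weed_dm_g_per_m2 / 100.0
        return {k: v * scale for k, v in RESPONSE_PER_100G_WEED_DM.items()}

    # e.g. a hypothetical stand carrying 150 g/m2 of weed DM at flowering:
    for attribute, gain in predicted_response(150.0).items():
        print(f"{attribute}: +{gain:.1f}")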



1995, Vol 35 (7), pp. 915
Author(s): WL Felton, H Marcellos, RJ Martin

Four experiments were commenced after a 1980 wheat crop, and a fifth after the 1981 crop, at different sites representing the major soil types of northern New South Wales in the 550-700 mm rainfall zone, to examine the influence of 3 fallow management practices [no tillage (NT); stubble retention after harvest with cultivation (SM); stubble burning after harvest with cultivation (SB)] on wheat production. Data considered in this paper cover the continuous wheat subtreatments of the 5 experiments (1981-90). Nitrogen applied at 50 kg N/ha in addition to the basal treatment was included as a treatment from 1986 to 1988. Across all sites and seasons, grain yields were in the order SB > SM = NT, stubble retention having a greater effect than tillage. In some years at some sites, differences in grain yield and grain N yield were not significant; where significant yield differences did occur, they were highly correlated with differences in the soil N available to the crop. The data show that the influence of fallow management interacted with season and crop nutrition, and required long-term study for proper assessment.



1983, Vol 23 (120), pp. 103
Author(s): JE Pratley

The control by herbicides of an infestation of Amsinckia hispida and toadrush (Juncus bufonius) in wheat was investigated at Wagga Wagga, New South Wales, during 1979 and 1980. Bromoxynil, bromoxynil + MCPA, terbutryne and methabenzthiazuron + 2,4-D were used in both years, dicamba + MCPA in 1979, and dicamba and the experimental herbicide DPX4189 (Glean™) in 1980. All herbicides reduced weed densities and improved crop yields. Terbutryne gave the greatest control of weed populations, in excess of 98% in both years, and grain yield was more than doubled in each case. Glean™ produced the highest grain yield in 1980, although its weed control was not as good as that of some other herbicides; however, undersown pasture legumes, particularly subterranean clover, survived poorly after this herbicide. Dicamba and dicamba + MCPA were inferior to the other chemicals in the control of these weeds.



2015, Vol 66 (4), pp. 349
Author(s): Julianne M. Lilley, Lindsay W. Bell, John A. Kirkegaard

Recent expansion of cropping into Australia’s high-rainfall zone (HRZ) has involved dual-purpose crops suited to long growing seasons that produce both forage and grain. Early adoption of dual-purpose cropping involved cereals; however, dual-purpose canola (Brassica napus) can provide grazing and grain as well as a break crop for cereals and grass-based pastures. Grain yield and grazing potential of canola (up until bud-visible stage) were simulated, using APSIM, for four canola cultivars at 13 locations across Australia’s HRZ over 50 years. The influence of sowing date (fortnightly sowing dates from early March to late June), nitrogen (N) availability at sowing (50, 150 and 250 kg N/ha), and crop density (20, 40, 60, 80 plants/m2) on forage and grain production was explored in a factorial combination with the four canola cultivars. The cultivars represented winter, winter × spring intermediate, slow spring, and fast spring cultivars, which differed in response to vernalisation and photoperiod. Overall, there was significant potential for dual-purpose use of winter and winter × spring cultivars in all regions across Australia’s HRZ. Mean simulated potential yields exceeded 4.0 t/ha at most locations, with the highest mean simulated grain yields (4.5–5.0 t/ha) in southern Victoria and lower yields (3.3–4.0 t/ha) in central and northern New South Wales. Winter cultivars sown early (March–mid-April) provided the most forage (>2000 dry sheep equivalent (DSE) grazing days/ha) at most locations because of the extended vegetative stage linked to their high vernalisation requirement. At locations with Mediterranean climates, the low frequency (<30% of years) of early sowing opportunities before mid-April limited the utility of winter cultivars. Winter × spring cultivars (not yet commercially available), which have an intermediate phenology, had a longer, more reliable sowing window, high grazing potential (up to 1800 DSE-days/ha) and high grain-yield potential. Spring cultivars provided less forage, but still offered commercially useful grazing (300–700 DSE-days/ha) and grain yields similar to those of early-sown cultivars. Significant unrealised potential for dual-purpose canola crops of winter × spring and slow spring cultivars was suggested in the south-west of Western Australia, on the Northern Tablelands and Slopes of New South Wales, and in southern Queensland. The simulations emphasised the importance of early sowing, adequate N supply and sowing density to maximise grazing potential from dual-purpose crops.
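The scale of the simulation study follows directly from its factorial design; a minimal sketch (the cultivar names and the arbitrary base year are placeholders, not APSIM configuration):

    # Minimal sketch of the factorial treatment structure described above.
    from itertools import product
    from datetime import date, timedelta

    cultivars = ["winter", "winter_x_spring", "slow_spring", "fast_spring"]
    nitrogen_kg_ha = [50, 150, 250]
    density_plants_m2 = [20, 40, 60, 80]

    # Fortnightly sowing dates from early March to late June (year arbitrary).
    sowing_dates = [date(2000, 3, 1) + timedelta(weeks=2 * i) for i in range(9)]

    locations, years = 13, 50   # HRZ sites, seasons simulated per site

    treatments = list(product(cultivars, sowing_dates, nitrogen_kg_ha, density_plants_m2))
    print(f"{len(treatments)} treatment combinations per location-year")
    print(f"~{len(treatments) * locations * years} simulated crop-seasons in total")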



2003, Vol 43 (1), pp. 71
Author(s): M. K. Conyers, C. L. Mullen, B. J. Scott, G. J. Poile, B. D. Braysher

The cost of buying, carting and spreading limestone, relative to the value of broadacre crops, makes investment in liming a questionable proposition for many farmers. The longer the beneficial effects of limestone persist, however, the more the investment in liming becomes economically favourable. We re-established previous lime trials with the aim of measuring the long-term effects of limestone on surface acidity (pH run-down), subsurface acidity (lime movement) and grain yield. The study made use of experiments where there was adequate early data on soil chemical properties and cereal yields. We report data from 6 trials located at 4 sites between Dubbo and Albury in New South Wales. The rate of surface soil (0–10 cm) pH decline after liming was proportional to the pH attained 1 year after liming; that is, the higher the pH achieved, the more rapid the rate of subsequent pH decline. Since yields (product removal) and nitrification (also acid producing) may both vary with pH, the post-liming pH acts as a surrogate for the productivity and acid-generating rate of the soil–plant system. The apparent lime loss rate of the surface soils ranged from the equivalent of nearly 500 kg limestone/ha.year at pH approaching 7, to almost zero at pH approaching 4. At commercial application rates of 2–2.5 t/ha, the movement of alkali below the layer of application was restricted; however, significant calcium (Ca) movement sometimes occurred to below 20 cm depth. At rates of limestone application exceeding the typical commercial rate of 2.5 t/ha, or at surface pH greater than about 5.5, alkali and Ca movement into acidic subsurface soil was clearly observed. It is therefore technically feasible to ameliorate subsurface soil acidity by applying heavy rates of limestone to the soil surface; however, the cost and risks of this option should be weighed against the use of acid-tolerant cultivars in combination with more moderate limestone rates worked into the surface soil. There was a positive residual benefit of limestone on cereal grain yield (either barley, wheat, triticale, or oats) at all sites in both the 1992 and 1993 seasons. While acid-tolerant cultivars were less lime responsive than acid-sensitive ones, the best yields were generally obtained using a combination of liming and acid-tolerant cultivars. The long-term residual benefits of limestone were shown to extend beyond 8–12 years and indicate that liming should be profitable in the long term.
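The pH dependence of the apparent lime loss rate can be made concrete with a minimal sketch, assuming a straight-line interpolation between the two reported end points (about zero near pH 4 and about 500 kg/ha.year near pH 7); the paper reports proportionality, not this exact formula:

    # Minimal sketch: linear interpolation of the apparent limestone loss
    # rate between the reported end points (an assumption, not a fitted model).
    def apparent_lime_loss(ph):
        """Approximate limestone dissolution rate (kg/ha.year) from surface pH."""
        ph = min(max(ph, 4.0), 7.0)   # clamp to the reported pH range
        return 500.0 * (ph - 4.0) / (7.0 - 4.0)

    for ph in (4.5, 5.5, 6.5):
        print(f"pH {ph}: ~{apparent_lime_loss(ph):.0f} kg limestone/ha.year")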



1969, Vol 9 (40), pp. 541
Author(s): PW Grogan, DS Teakle

Seven out of eight maize inbred lines developed at Lawes in Queensland from open-pollinated varieties were resistant to maize dwarf mosaic disease when exposed to natural infection in the field. Five of the seven resistant inbred lines failed to become systemically infected when inoculated with infectious sap in the glasshouse. By contrast, only three out of twenty lines introduced from the U.S.A., and two out of eight lines developed at the Grafton and Glen Innes Breeding Stations in New South Wales, were resistant in the field. All three resistant lines from the U.S.A. were systemically infected when inoculated in the glasshouse, but the two resistant lines from Grafton in New South Wales were not. The resistant Lawes and Grafton maize inbred lines would appear to be better sources of genes conferring resistance to maize dwarf mosaic disease than the other lines tested.



1967, Vol 7 (24), pp. 7
Author(s): P McInnes, TJ Grainger, MD Smith

Data are presented on the recovery and reproductive performance of 2 1/2-year-old maiden Merino ewes after a prolonged period of undernutrition. The 217 sheep had been hand-fed on a submaintenance ration in pen feeding trials at Glenfield, New South Wales; during the seven months of the trials they had lost 6 kg of body weight (from 28 to 22 kg). They were transported to Condobolin in south-western New South Wales, divided into two treatment groups and run on good quality pastures. One group was joined immediately (May 1959) and again ten months later; the other group was mated after six months at Condobolin (October 1959) and again 12 months later. The ewes recovered rapidly: the mean weight of both groups had reached 30 kg within six weeks and 40 kg within six months. In the first year, 73 of the 100 May-mated ewes bore lambs, but only 38 of these lambs were weaned. Ewes bearing lambs had a higher body weight at the start of joining and gained more during joining than the barren ewes. At the other three joinings (October 1959, May 1960, October 1960) the lambing percentage was 86–89 and the weaning percentage 62–69, both normal for the district. The proportion of twin lambs (3–6 per cent) was low. Wool weight in 1959 was not affected by time of mating or by pregnancy.



1962, Vol 2 (6), pp. 185
Author(s): RR Storrier

Fluctuations in the level of mineral nitrogen (ammonia plus nitrate-nitrogen) in a red-brown earth soil from Wagga Wagga, and its availability to wheat under growing-period rainfalls of 6 inches and 16 inches, were studied. Ammonia-nitrogen did not exceed 8 lb nitrogen per acre in the top 6 inches, but showed statistically significant short-term fluctuations. Mineral nitrogen decreased steadily from the 4-5 leaf stage of plant growth, reaching minimum values in the ear-emergence period, when a temporary nitrogen deficiency occurred. Following rainfalls of about one inch or more, conditions favoured biological activity and nitrogen was mineralized, absorbed by the crop and/or leached down the profile. In one season a release of mineral nitrogen about two weeks before flowering contributed an estimated 20-30 per cent of the total nitrogen uptake of the crop. Nitrogen uptake by the wheat crop ceased after flowering, and subsequent changes in mineral nitrogen level reflected the net result of mineralization and immobilization processes, and nitrogen uptake by weeds, particularly skeleton weed. Absorption of nitrogen from the profile depended upon seasonal conditions, with the surface 18 inches supplying the greater part of the nitrogen absorbed by the crop. This indicates the need to sample regularly to at least a depth of 18 inches, particularly during the period from the 4-5 leaf stage to flowering, when studying the relation between mineral nitrogen and crop growth. The data suggest that the response of wheat, as measured by grain yield and protein content, to the higher levels of mineral nitrogen in the improved soils of southern New South Wales is determined by soil moisture levels, particularly in the post-flowering period.



1992, Vol 32 (4), pp. 465
Author(s): AD Doyle, RW Kingston

The effect of sowing rate (10-110 kg/ha) on the grain yield of barley (Hordeum vulgare L.) was determined from a total of 20 field experiments conducted in northern New South Wales from 1983 to 1986. Effects of sowing rate on kernel weight and grain protein percentage were also determined from 12 experiments conducted in 1985 and 1986. Two barley varieties were tested each year, and in all years fallow plus winter rainfall was equal to or greater than average. Grain yield increased with higher sowing rates in most experiments, with the response curve reaching a plateau above 60-70 kg/ha. For 13 of the 40 variety × year combinations, grain yield fell at the highest sowing rates; only in an experiment where lodging increased substantially with higher sowing rates was there a reduction in yield at a sowing rate of 60 kg/ha. The average sowing rate at which 5 kg of grain was produced per kg of seed sown was 63 kg/ha. Grain protein percentage usually fell, and kernel weight invariably fell, with increasing sowing rate. Increasing the sowing rate from the normal commercial rate of 35 kg/ha to 60 kg/ha typically increased grain yield by 100-400 kg/ha, decreased kernel weight by 0.4-2.0 mg, and decreased grain protein by up to 0.5 percentage points. In no case was kernel weight reduced to below malting specifications. It was concluded that sowing rates for barley in northern New South Wales should be increased to about 60 kg/ha.
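The figure of 5 kg grain per kg of seed is a marginal return, which a short sketch can make concrete. The response curve below is hypothetical (a Mitscherlich-type plateau with parameters chosen so the marginal return falls to about 5 kg grain/kg seed near 63 kg/ha); it is not fitted to the trial data:

    # Hypothetical plateau-type yield response; parameters are illustrative only.
    import math

    def grain_yield(seed_rate_kg_ha):
        """Yield (kg/ha) approaching a plateau above 60-70 kg/ha of seed."""
        return 2300.0 * (1.0 - math.exp(-0.05 * seed_rate_kg_ha))

    def marginal_return(seed_rate, step=1.0):
        """Extra kg of grain per extra kg of seed sown."""
        return (grain_yield(seed_rate + step) - grain_yield(seed_rate)) / step

    for rate in (35, 60, 63, 80):
        print(f"{rate} kg/ha seed: {marginal_return(rate):.1f} kg grain per kg seed")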


