A comparison of three fallow management strategies for the long-term productivity of wheat in northern New South Wales

1995 ◽  
Vol 35 (7) ◽  
pp. 915 ◽  
Author(s):  
WL Felton ◽  
H Marcellos ◽  
RJ Martin

Four experiments were commenced after a 1980 wheat crop, and a fifth after the 1981 crop, at different sites representing the major soil types of northern New South Wales in the 550-700 mm rainfall zone, to examine the influence of 3 fallow management practices [no tillage (NT); stubble retention after harvest, cultivation (SM); stubble burning after harvest, cultivation (SB)] on wheat production. Data considered in this paper cover the continuous wheat subtreatments of the 5 experiments (1981-90). Nitrogen applied at 50 kg N/ha in addition to the basal treatment was included as a treatment from 1986 to 1988. Across all sites and seasons, grain yields were in the order SB > SM = NT, stubble retention having a greater effect than tillage. In some years at some sites, differences in grain yield and grain N yield were not significant. In others, when significant yield differences occurred, variations in grain yield and grain N yield were highly correlated with differences in soil N available for the crop. The data show that the influence of fallow management interacted with season and crop nutrition, and required long-term study for proper assessment.

2003 ◽  
Vol 43 (1) ◽  
pp. 71 ◽  
Author(s):  
M. K. Conyers ◽  
C. L. Mullen ◽  
B. J. Scott ◽  
G. J. Poile ◽  
B. D. Braysher

The cost of buying, carting and spreading limestone, relative to the value of broadacre crops, makes investment in liming a questionable proposition for many farmers. The longer the beneficial effects of limestone persist, however, the more the investment in liming becomes economically favourable. We re-established previous lime trials with the aim of measuring the long-term effects of limestone on surface acidity (pH run-down), subsurface acidity (lime movement) and grain yield. The study made use of experiments where there was adequate early data on soil chemical properties and cereal yields. We report data from 6 trials located at 4 sites between Dubbo and Albury in New South Wales. The rate of surface soil (0–10 cm) pH decline after liming was proportional to the pH attained 1 year after liming. That is, the higher the pH achieved, the more rapid the rate of subsequent pH decline. Since yields (product removal) and nitrification (also acid producing) may both vary with pH, the post-liming pH acts as a surrogate for the productivity and acid-generating rate of the soil–plant system. The apparent lime loss rate of the surface soils ranged from the equivalent of nearly 500 kg limestone/ha.year at pH approaching 7, to almost zero at pH approaching 4. At commercial application rates of 2–2.5 t/ha, the movement of alkali below the layer of application was restricted. However, significant calcium (Ca) movement sometimes occurred to below 20 cm depth. At rates of limestone application exceeding the typical commercial rate of 2.5 t/ha, or at surface pH greater than about 5.5, alkali and Ca movement into acidic subsurface soil was clearly observed. It is therefore technically feasible to ameliorate subsurface soil acidity by applying heavy rates of limestone to the soil surface. However, the cost and risks of this option should be weighed against the use of acid-tolerant cultivars in combination with more moderate limestone rates worked into the surface soil. There was a positive residual benefit of limestone on cereal grain yield (either barley, wheat, triticale, or oats) at all sites in both the 1992 and 1993 seasons. While acid-tolerant cultivars were less lime responsive than acid-sensitive ones, the best yields were generally obtained using a combination of liming and acid-tolerant cultivars. The long-term residual benefits of limestone were shown to extend beyond 8–12 years and indicate that liming should be profitable in the long term.
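The pH-dependent loss rate quoted above lends itself to a simple numerical illustration. The Python sketch below linearly interpolates the apparent lime loss rate between the two endpoints reported in the abstract (near zero at pH approaching 4, nearly 500 kg limestone/ha.year at pH approaching 7); the linear form is an assumption made here for illustration, not the fitted function from the paper.

```python
# A minimal sketch of the reported pH / lime-loss relationship, assuming
# simple linear interpolation between the two endpoints quoted in the
# abstract. The paper's actual fitted relationship may differ.

def apparent_lime_loss(ph_after_liming: float) -> float:
    """Approximate annual limestone loss (kg/ha.year) for a given surface
    soil (0-10 cm) pH attained 1 year after liming."""
    PH_LOW, PH_HIGH = 4.0, 7.0        # pH endpoints quoted in the abstract
    LOSS_LOW, LOSS_HIGH = 0.0, 500.0  # kg limestone/ha.year at those endpoints
    # Clamp to the observed pH range, then interpolate linearly.
    ph = max(PH_LOW, min(PH_HIGH, ph_after_liming))
    frac = (ph - PH_LOW) / (PH_HIGH - PH_LOW)
    return LOSS_LOW + frac * (LOSS_HIGH - LOSS_LOW)

for ph in (4.5, 5.5, 6.5):
    print(f"pH {ph}: ~{apparent_lime_loss(ph):.0f} kg limestone/ha.year")
```

On this reading, a surface soil held at pH 5.5 would dissipate roughly 250 kg limestone/ha.year, consistent with the abstract's point that the higher the post-liming pH, the faster the subsequent pH decline.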


2002 ◽  
Vol 42 (8) ◽  
pp. 1087 ◽  
Author(s):  
C. R. Kidd ◽  
G. M. Murray ◽  
J. E. Pratley ◽  
A. R. Leys

Winter cleaning is the removal of grasses from pasture using selective herbicides applied during winter. We compared the effectiveness of an early (June) and late (July) winter cleaning with an early spring herbicide fallow (September), spring (October) herbicide and no disturbance of the pasture on development of the root disease take-all in the subsequent wheat crop. Experiments were done at 5 sites in the eastern Riverina of New South Wales in 1990 and 1991. The winter clean treatments reduced soil inoculum of Gaeumannomyces graminis var. tritici (Ggt) compared with the other treatments at all sites as measured by a bioassay, with reductions from the undisturbed treatments of 52–79% over 5 sites. The winter clean treatments also significantly reduced the amount of take-all that developed in the subsequent wheat crop by between 52 and 83%. The early and late winter clean treatments increased the number of heads/m2 at 3 sites and 1 site, respectively. Dry matter at anthesis was increased by the winter clean treatments at 3 sites. Grain yield was increased by the winter cleaning treatments over the other treatments at the 4 sites harvested, with yield increases of the early winter clean over the undisturbed treatment from 13 to 56%. The autumn bioassay of Ggt was positively correlated with spring take-all and negatively correlated with grain yield of the subsequent wheat crop at each site. However, there were significant site and site × bioassay interactions, so the autumn bioassay could not be used to predict the amount of take-all that would develop.


1986 ◽  
Vol 26 (6) ◽  
pp. 709 ◽  
Author(s):  
AC Taylor ◽  
WJ Lill

Regular hand-weeding was undertaken in experiments located in 167 wheat crops in southern New South Wales from 1967 to 1970 to quantify the effect of weeds on 10 wheat attributes at flowering or maturity. Short annual grasses, skeleton weed, wild oats and annual legumes were the most widespread weeds, all of which tended to occur in mixed stands. At wheat flowering, over all sites, wheat DM, nitrogen concentration, nitrogen uptake, phosphorus uptake and number of ears were increased (P < 0.05) by 11.2, 3.3, 14.4, 13.6 and 7.8%, respectively, by weeding; wheat phosphorus concentrations did not respond to weeding. At maturity, grain yield and nitrogen yield increased after weeding (P < 0.05) by 17.3 and 17.0%, respectively, but grain protein and kernel weight did not respond to weeding. Regression procedures were used to relate wheat responses to total weed DM and the DM of 8 weed classes. At flowering, for every 100 g/m2 of weed DM removed, wheat DM, nitrogen uptake, phosphorus uptake and ear number increased by 52.3 g/m2, 958 mg/m2, 92.6 mg/m2 and 18.7/m2, respectively. At maturity, grain yield and grain nitrogen yield increased by 31.9 g/m2 and 665 mg/m2, respectively, for every 100 g/m2 of weed DM present at flowering. The regressions also showed that, at both flowering and maturity, fumitory, annual grasses and sundry weeds (a group made up of weeds not sufficiently widespread to consider separately) appeared to be the most aggressive weeds. Consideration of standardised responses of the wheat attributes increased by weeding showed that they all responded similarly when corrected for scale of measurement.
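The per-unit regression slopes above translate directly into predicted crop responses. The Python sketch below applies the slopes quoted in the abstract to a hypothetical weed burden; the linear scaling is the abstract's own framing, but the example weed DM value (250 g/m2) is invented here for illustration.

```python
# Predicted wheat responses per 100 g/m2 of weed DM removed, using the
# regression slopes quoted in the abstract. The weed DM figure below is
# hypothetical, chosen only to show the arithmetic.

SLOPES_PER_100G_WEED_DM = {
    "wheat DM at flowering (g/m2)":      52.3,
    "N uptake at flowering (mg/m2)":     958.0,
    "P uptake at flowering (mg/m2)":     92.6,
    "ears at flowering (per m2)":        18.7,
    "grain yield at maturity (g/m2)":    31.9,
    "grain N yield at maturity (mg/m2)": 665.0,
}

weed_dm = 250.0  # hypothetical weed dry matter at flowering, g/m2

for attribute, slope in SLOPES_PER_100G_WEED_DM.items():
    gain = slope * weed_dm / 100.0  # linear scaling, per the abstract
    print(f"{attribute}: +{gain:.1f} from removing {weed_dm:.0f} g/m2 of weeds")
```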


2000 ◽  
Vol 22 (1) ◽  
pp. 44 ◽  
Author(s):  
SJ Holdaway ◽  
PC Fanning ◽  
DC Witter

Recent erosion in arid regions of western NSW has exposed large areas that are scattered with stone artefacts manufactured by Aboriginal people in prehistory. These exposures offer an opportunity for archaeologists to study the artefacts abandoned by Aboriginal people through time and to compare those artefacts that accumulate in different parts of the landscape. To reconstruct the nature of prehistoric behaviour in the rangelands, two approaches are needed. First, the geomorphological context of the artefacts needs to be considered since exposure of the artefacts is a function of landscape history. Second, large areas (measured in thousands of square metres) and large numbers of artefacts need to be considered if patterns reflecting long-term abandonment behaviour by Aboriginal people are to be identified. This paper reports on the Western New South Wales Archaeological Program (WNSWAP) which was initiated in 1995 to study surface archaeology in the rangelands. Geomorphological studies are combined with artefact analysis using geographic information system software to investigate Aboriginal stone artefact scatters and associated features such as heat retainer hearths, in a landscape context. Results suggest that apparently random scatters of stone artefacts are in fact patterned in ways which inform on prehistoric Aboriginal settlement of the rangelands. Key words: Aboriginal stone artefacts; rangelands; landscape archaeology; geomorphology; GIS


2015 ◽  
Vol 66 (4) ◽  
pp. 349 ◽  
Author(s):  
Julianne M. Lilley ◽  
Lindsay W. Bell ◽  
John A. Kirkegaard

Recent expansion of cropping into Australia’s high-rainfall zone (HRZ) has involved dual-purpose crops suited to long growing seasons that produce both forage and grain. Early adoption of dual-purpose cropping involved cereals; however, dual-purpose canola (Brassica napus) can provide grazing and grain and a break crop for cereals and grass-based pastures. Grain yield and grazing potential of canola (up until bud-visible stage) were simulated, using APSIM, for four canola cultivars at 13 locations across Australia’s HRZ over 50 years. The influence of sowing date (2-weekly sowing dates from early March to late June), nitrogen (N) availability at sowing (50, 150 and 250 kg N/ha), and crop density (20, 40, 60, 80 plants/m2) on forage and grain production was explored in a factorial combination with the four canola cultivars. The cultivars represented winter, winter × spring intermediate, slow spring, and fast spring cultivars, which differed in response to vernalisation and photoperiod. Overall, there was significant potential for dual-purpose use of winter and winter × spring cultivars in all regions across Australia’s HRZ. Mean simulated potential yields exceeded 4.0 t/ha at most locations, with highest mean simulated grain yields (4.5–5.0 t/ha) in southern Victoria and lower yields (3.3–4.0 t/ha) in central and northern New South Wales. Winter cultivars sown early (March–mid-April) provided most forage (>2000 dry sheep equivalent (DSE) grazing days/ha) at most locations because of the extended vegetative stage linked to the high vernalisation requirement. At locations with Mediterranean climates, the low frequency (<30% of years) of early sowing opportunities before mid-April limited the utility of winter cultivars. Winter × spring cultivars (not yet commercially available), which have an intermediate phenology, had a longer, more reliable sowing window, high grazing potential (up to 1800 DSE-days/ha) and high grain-yield potential. Spring cultivars provided less forage, but had commercially useful grazing opportunities (300–700 DSE-days/ha) and similar yields to early-sown cultivars. Significant unrealised potential for dual-purpose canola crops of winter × spring and slow spring cultivars was suggested in the south-west of Western Australia, on the Northern Tablelands and Slopes of New South Wales and in southern Queensland. The simulations emphasised the importance of early sowing, adequate N supply and sowing density to maximise grazing potential from dual-purpose crops.
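The simulation experiment described above is a straightforward factorial, and its size is easy to sketch. The Python snippet below enumerates the treatment combinations; the exact fortnightly calendar dates (and the arbitrary year) are assumptions consistent with "2-weekly sowing dates from early March to late June", and APSIM itself is not invoked.

```python
# Sketch of the simulation factorial described in the abstract: 4 cultivar
# phenology types x fortnightly sowing dates (early March to late June) x
# 3 sowing N levels x 4 plant densities. Dates are assumed; year is arbitrary.

from datetime import date, timedelta
from itertools import product

cultivars = ["winter", "winter x spring", "slow spring", "fast spring"]
n_at_sowing = [50, 150, 250]   # kg N/ha available at sowing
densities = [20, 40, 60, 80]   # plants/m2

# Fortnightly sowing dates from early March to late June (assumed calendar).
sowing_dates, d = [], date(2015, 3, 2)
while d <= date(2015, 6, 29):
    sowing_dates.append(d)
    d += timedelta(weeks=2)

treatments = list(product(cultivars, sowing_dates, n_at_sowing, densities))
print(f"{len(treatments)} treatment combinations per location-year")
# Each combination would correspond to one APSIM run, repeated across
# 13 locations and 50 seasons in the study.
```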


2021 ◽  
Author(s):  
Christopher Dowling ◽  
Anthony Morgan

The criminal mobility of outlaw motorcycle gang (OMCG) members presents a significant challenge to Australian governments and police. Examining patterns of mobility can help to better understand the opportunity structures that underpin offending by OMCGs and to drive national collaborative responses to these gangs. This study examines the prevalence and patterns of criminal mobility in a sample of almost 4,000 OMCG members in more than 400 chapters. Around one in 10 members showed evidence of criminal mobility over the long term, while more than one-third of chapters comprised criminally mobile members. Criminally mobile gang members were heavily concentrated in a small number of chapters. Patterns of criminal mobility primarily involve movements into east coast jurisdictions. New South Wales and Queensland emerged as the most common destinations for criminally mobile OMCG members.


Author(s):  
Craig Tibbitts

This chapter highlights the long-term influence of Scottish military traditions and identity in Australia, dating back to the arrival of a battalion of the 73rd Highland Regiment in New South Wales in 1810. From the 1860s, several home-grown ‘Scottish’ volunteer militia units were established in the Australian colonies. This coincided with a peak period of Scottish emigration to Australia with some 265,000 settling between 1850 and 1914. With the outbreak of the First World War, Australia quickly raised a contingent to assist the Empire. Several Scottish-Australian militia regiments sought incorporation into the Australian Imperial Force (AIF) but with limited success. This chapter highlights how the existence of Scottish military identities conflicted with the desire of the AIF that its identity be entirely Australian as a means of forging the identity of the new Commonwealth of Australia. At the same time, a small number of AIF units managed to maintain some small degree of Scottish flavour about them. Those such as the 4th, 5th and 56th Battalions, which had many members join en masse from the pre-war ‘Scottish’ militia regiments, provide examples of how this identity survived and was influenced by some key officers and NCOs of Scots heritage.


1962 ◽  
Vol 2 (6) ◽  
pp. 185 ◽  
Author(s):  
RR Storrier

In a red-brown earth soil from Wagga Wagga the fluctuations in the level of mineral nitrogen (ammonia plus nitrate-nitrogen) and its availability to wheat under growing period rainfalls of 6 inches and 16 inches were studied. Ammonia-nitrogen did not exceed 8 lb nitrogen per acre in the surface 6 inches, but showed statistically significant short-term fluctuations. Mineral nitrogen decreased steadily from the 4-5 leaf stage of plant growth, reaching minimum values in the ear-emergence period when a temporary nitrogen deficiency occurred. Following rainfalls of about one inch or more, conditions favoured biological activity and nitrogen was mineralized, absorbed by the crop and/or leached down the profile. In one season a release of mineral nitrogen about two weeks before flowering contributed an estimated 20-30 per cent of the total nitrogen uptake of the crop. Nitrogen uptake by the wheat crop ceased after flowering, and subsequent changes in mineral nitrogen level reflected the net result of mineralization and demineralization processes, and nitrogen uptake by weeds, particularly skeleton weed. Absorption of nitrogen from the profile depended upon seasonal conditions, with the surface 18 inches supplying the greater part of the nitrogen absorbed by the crop. This indicates the need to sample regularly to at least a depth of 18 inches, particularly during the period from 4-5 leaf to flowering, when studying the relation between mineral nitrogen and crop growth. The data suggest that the response of wheat, as measured by grain yield and protein content, to the higher levels of mineral nitrogen in the improved soils of southern New South Wales is determined by soil moisture levels, particularly in the post-flowering period.

