Efficiency of aerial techniques for long term control of serrated tussock (Nassella trichotoma)

1974 ◽  
Vol 14 (68) ◽  
pp. 405 ◽  
Author(s):  
MH Campbell

In three experiments near Rockley and Gunning, New South Wales, non-arable areas of serrated tussock (Nassella trichotoma) were aerially sprayed with 2,2-DPA in late summer and, in the following autumn, aerially fertilized and sown with pasture species. Measurements of tussock density and the botanical composition of the pastures were made for up to ten years after sowing. The experiments commenced in 1963, 1964 and 1966. About 90 per cent of the tussock plants were killed by applying the herbicide at 16.6 kg a.e. ha-1; higher rates effected little improvement. Aerially sown pastures further reduced tussock numbers and provided long term control. Best results were obtained on fertile soil and where tussock had not been retarded by burning. Of the pasture species sown, Trifolium subterraneum and Phalaris tuberosa were the most effective for long term control.

1985 ◽  
Vol 7 (2) ◽  
pp. 80 ◽  
Author(s):  
WE Mulham

Following a sequence of favourable years in which pasture growth over much of the arid zone of Australia reached very high levels, controlled burns were carried out on two contrasting vegetation types in the extreme north-west of New South Wales. A wheel-point apparatus was used to measure subsequent changes in botanical composition and foliage cover over a four-year period. On a pasture periodically dominated by Mitchell grass (Astrebla spp.), burning while growing conditions were favourable resulted in only a small long-term decrease in the cover of Mitchell grass. In the short term all chenopod species were eliminated and a wider range and greater abundance of annual forbs were promoted in the following spring. On a similar area burned by wildfire in a year of low summer rainfall, the response from Mitchell grass was much poorer and the botanical composition of the pasture present in the following spring differed from that which developed in the spring following the controlled burn. It also differed from that of the unburnt pasture. The major differences were due to the response of forb species and are attributed to variation in seasonal rainfall. On a dune-system pasture the dominant grasses were species of Aristida and Enneapogon. These are relatively short-lived and appear to have little ability to regrow from the butt after fire. Their slow regeneration after the burn was reflected in the substantial increase in relative abundance of perennial forbs in the following autumn, and of annual forbs the next spring. Although fire appeared to have no long-term effect on the pasture, it dramatically reduced tree and shrub numbers. It is suggested that during years in which abnormal quantities of Mitchell grass are present in this region, controlled burning could be a useful form of management. A mosaic of patches burnt at different times would reduce the potential for wide-scale wildfires, provide refuge areas for stock and wildlife in the event of wildfire, and promote a wider choice of plant material for grazing animals. However, in dune-system vegetation, removal of the pasture cover and reduction of the tree and shrub density would constitute an erosion risk.
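The wheel-point calculations behind "botanical composition and foliage cover" are not given in the abstract, but the underlying arithmetic is simple: each point strike records a species, bare ground or litter, and cover and composition are proportions of points. A minimal sketch with entirely hypothetical point records (species names and counts are illustrative only, not the study's data):

```python
from collections import Counter

# Hypothetical wheel-point records: the species (or ground condition)
# struck at each point along a transect. Names and counts are illustrative.
points = (["Astrebla spp."] * 180 + ["annual forb"] * 60 +
          ["chenopod"] * 40 + ["bare ground"] * 120)

counts = Counter(points)
total_points = len(points)
vegetated = total_points - counts["bare ground"]

# Foliage cover: proportion of all points striking live plant material.
foliage_cover = 100 * vegetated / total_points

# Botanical composition: each species as a share of vegetated strikes.
composition = {sp: 100 * n / vegetated
               for sp, n in counts.items() if sp != "bare ground"}

print(f"Foliage cover: {foliage_cover:.1f}%")
for sp, pct in sorted(composition.items(), key=lambda kv: -kv[1]):
    print(f"  {sp}: {pct:.1f}% of green cover")
```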


1975 ◽  
Vol 26 (2) ◽  
pp. 269 ◽  
Author(s):  
A Lazenby ◽  
JV Lovett

The production of herbage by five pasture species—Phalaris tuberosa (phalaris), Festuca arundinacea (tall fescue), Lolium perenne (perennial ryegrass), Trifolium repens (white clover) and Medicago sativa (lucerne)—was measured when they were grown in the field in monoculture, and by phalaris and white clover when grown in mixture. The plots were irrigated to prevent water deficits, and five levels of nitrogen were included; the mixture was also grown under dryland conditions. All plots were defoliated at intervals during a period of 3 years. A capacitance probe was used in an attempt to determine harvest times more objectively, and to establish long-term relationships between meter readings and components of plant yield. Major differences in production were detected between the species, lucerne producing most in the first 2 years of the experiment. Nitrogen and available soil moisture affected both production and botanical composition, and significant differences were detected in species' responses to applied nitrogen and in nitrogen recovery. The performance of lucerne and tall fescue suggests that both species deserve to be more widely grown on the Northern Tablelands of New South Wales.
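The capacitance-probe approach described above rests on relating meter readings to harvested yield so that later readings can stand in for destructive sampling. The paper's actual calibration data and regression form are not reproduced here; the following is only a generic sketch of such a calibration, with hypothetical readings and yields:

```python
import numpy as np

# Hypothetical paired observations: capacitance meter reading (arbitrary
# units) and measured herbage yield (kg DM/ha) from calibration cuts.
readings = np.array([12, 18, 25, 31, 40, 47, 55, 63], dtype=float)
yields   = np.array([450, 700, 980, 1150, 1600, 1850, 2100, 2500], dtype=float)

# Simple linear calibration: yield ~ a * reading + b.
a, b = np.polyfit(readings, yields, deg=1)

# Coefficient of determination as a rough check on the fit.
pred = a * readings + b
r2 = 1 - np.sum((yields - pred) ** 2) / np.sum((yields - yields.mean()) ** 2)
print(f"yield ~ {a:.1f} * reading + {b:.1f}  (R^2 = {r2:.3f})")

# Predict yield for a new, uncut plot from its meter reading alone.
print(f"Predicted yield at reading 35: {a * 35 + b:.0f} kg DM/ha")
```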


2007 ◽  
Vol 47 (5) ◽  
pp. 563 ◽  
Author(s):  
G. M. Lodge ◽  
S. Harden

Two studies to evaluate annual pasture legumes were sown in replicated plots near Tamworth, New South Wales. In the first (experiment 1), 24 entries were sown in 1995 and in a second study (experiment 2) 33 entries were sown in 1996. Green herbage mass (kg DM/ha) was assessed in the year of sowing (spring) and thereafter four times per year until spring 2000. Limited data were also collected to estimate maturity grading, seed yield and seedling regeneration. For each experiment, green herbage mass data were examined using cubic smoothing splines and at the end of each study, green herbage mass values predicted from the model were used to assess the significance (P = 0.05) of differences between cultivars or lines. In spring 2000 (experiment 1), Trifolium subterraneum var. brachycalycinum cv. Clare had the highest rank of the cultivars and lines, and T. michelianum cv. Paradana the lowest (previously cultivated site). For the native pasture site, CPI 70056B subterranean clover had the highest rank and Ornithopus compressus cv. Paros the lowest. In experiment 2, Clare had the highest rank in spring 2000 and T. resupinatum cv. Bolta had the lowest ranking. Long-term green herbage mass appeared to be strongly influenced by maturity grading, but other factors may have affected the performance of annual Medicago spp., O. compressus, T. resupinatum, and T. michelianum. Results from the current study and previous reported research indicated that T. subterraneum var. subterraneum cvv. York (evaluated as CPI 89846B) and Junee and T. subterraneum var. brachycalycinum cv. Clare performed best in northern New South Wales.
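The green herbage mass data above were analysed with cubic smoothing splines, but the software and smoothing criterion are not stated in the abstract. The snippet below is therefore only a generic sketch of fitting a cubic smoothing spline to a quarterly herbage-mass series using SciPy; the assessment times, values and smoothing factor are hypothetical:

```python
import numpy as np
from scipy.interpolate import UnivariateSpline

# Hypothetical green herbage mass (kg DM/ha) assessed four times a year
# from the spring of sowing through to spring 2000, as in the trial design.
t = np.arange(20, dtype=float)            # assessment number
y = np.array([800, 400, 250, 600, 1200, 500, 300, 700, 1400, 600,
              350, 800, 1500, 650, 400, 850, 1600, 700, 450, 900.0])

# k=3 gives a cubic spline; s sets the smoothing (residual) tolerance.
spline = UnivariateSpline(t, y, k=3, s=len(t) * np.var(y) * 0.1)

# Smoothed value at the final spring assessment, analogous to the
# end-of-study predicted values that were compared across entries.
print(f"Smoothed value at final assessment: {float(spline(t[-1])):.0f} kg DM/ha")
```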


2000 ◽  
Vol 22 (1) ◽  
pp. 44 ◽  
Author(s):  
SJ Holdaway ◽  
PC Fanning ◽  
DC Witter

Recent erosion in arid regions of western NSW has exposed large areas that are scattered with stone artefacts manufactured by Aboriginal people in prehistory. These exposures offer an opportunity for archaeologists to study the artefacts abandoned by Aboriginal people through time and to compare those artefacts that accumulate in different parts of the landscape. To reconstruct the nature of prehistoric behaviour in the rangelands, two approaches are needed. First, the geomorphological context of the artefacts needs to be considered, since exposure of the artefacts is a function of landscape history. Second, large areas (measured in thousands of square metres) and large numbers of artefacts need to be considered if patterns reflecting long-term abandonment behaviour by Aboriginal people are to be identified. This paper reports on the Western New South Wales Archaeological Program (WNSWAP), which was initiated in 1995 to study surface archaeology in the rangelands. Geomorphological studies are combined with artefact analysis using geographic information system software to investigate Aboriginal stone artefact scatters and associated features, such as heat-retainer hearths, in a landscape context. Results suggest that apparently random scatters of stone artefacts are in fact patterned in ways which inform on prehistoric Aboriginal settlement of the rangelands. Key words: Aboriginal stone artefacts; rangelands; landscape archaeology; geomorphology; GIS
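The GIS software used by WNSWAP is not named in the abstract, so the following is only a minimal illustration of the kind of spatial tabulation the approach implies: assigning artefact point locations to mapped geomorphic units and counting artefacts per unit. Coordinates, unit names and the use of Shapely are all assumptions for the sketch, not details from the study:

```python
from collections import Counter
from shapely.geometry import Point, Polygon

# Hypothetical geomorphic units mapped during survey (coordinates in
# metres on a local grid); real units would come from geomorphic mapping.
landforms = {
    "scalded flat": Polygon([(0, 0), (100, 0), (100, 60), (0, 60)]),
    "dune flank":   Polygon([(0, 60), (100, 60), (100, 120), (0, 120)]),
}

# Hypothetical artefact locations recorded in the field.
artefacts = [Point(10, 20), Point(55, 30), Point(70, 90),
             Point(20, 100), Point(90, 50), Point(40, 110)]

# Tabulate artefact counts per landform unit (a simple spatial join).
counts = Counter()
for pt in artefacts:
    for name, poly in landforms.items():
        if poly.contains(pt):
            counts[name] += 1
            break

for name, n in counts.items():
    print(f"{name}: {n} artefacts")
```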


2021 ◽  
Author(s):  
Christopher Dowling ◽  
Anthony Morgan

The criminal mobility of outlaw motorcycle gang (OMCG) members presents a significant challenge to Australian governments and police. Examining patterns of mobility can help to better understand the opportunity structures that underpin offending by OMCGs and to drive national collaborative responses to these gangs. This study examines the prevalence and patterns of criminal mobility in a sample of almost 4,000 OMCG members in more than 400 chapters. Around one in 10 members showed evidence of criminal mobility over the long term, while more than one-third of chapters comprised criminally mobile members. Criminally mobile gang members were heavily concentrated in a small number of chapters. Patterns of criminal mobility primarily involve movements into east coast jurisdictions. New South Wales and Queensland emerged as the most common destinations for criminally mobile OMCG members.


Author(s):  
Craig Tibbitts

This chapter highlights the long-term influence of Scottish military traditions and identity in Australia, dating back to the arrival of a battalion of the 73rd Highland Regiment in New South Wales in 1810. From the 1860s, several home-grown ‘Scottish’ volunteer militia units were established in the Australian colonies. This coincided with a peak period of Scottish emigration to Australia, with some 265,000 settling between 1850 and 1914. With the outbreak of the First World War, Australia quickly raised a contingent to assist the Empire. Several Scottish-Australian militia regiments sought incorporation into the Australian Imperial Force (AIF) but with limited success. This chapter highlights how the existence of Scottish military identities conflicted with the desire of the AIF that its identity be entirely Australian as a means of forging the identity of the new Commonwealth of Australia. At the same time, a small number of AIF units managed to maintain some small degree of Scottish flavour about them. Those such as the 4th, 5th and 56th Battalions, which had many join en masse from the pre-war ‘Scottish’ militia regiments, provide examples of how this identity survived and was influenced by some key officers and NCOs of Scots heritage.


1994 ◽  
Vol 34 (1) ◽  
pp. 33 ◽  
Author(s):  
GM Lodge

Burrs were collected from paddocks on 3 properties in northern New South Wales where the age of the Trifolium subterraneum var. brachycalycinum cv. Clare swards varied from 19 to 28 years. At 1 site burrs were also sampled from swards sown 2 and 10 years previously. Twenty seedlings from these burrs and 20 plants of certified cv. Clare were grown as spaced plants in a nursery. These were assessed for vegetative and floral characters, flowering time, number of seeds per burr, seed weight, and percentage hardseed after storage at 25/25°C for 6 months and 25/45°C for a further 6 months. For most plants the mean number of days from sowing to first flower was similar to that of Clare. Compared with the naturalised strains, Clare had the lowest (P<0.05) mean number of seeds per burr: about 25% below the mean of the strains (2.7 seeds per burr). While the lowest mean seed weights of the strains were not significantly different from those of Clare, the seed weights of plants from 3 sites were higher (P<0.05) than those of Clare. After storage for either 6 or 12 months, hardseed levels were also lowest (P<0.05) for Clare. Plants from the 2-year-old sward had the same median number of seeds per burr (2.0) as Clare. As sward age increased, the median number of seeds per burr increased to 2.8. Hardseed percentages were lowest for plants of Clare and for those from the 2-year-old sward after 6 months, and for Clare after 12 months. These studies indicated the presence of divergent strains in old swards of Clare in a summer rainfall environment. Natural selection among variability within Clare is the most likely reason for the development of these strains in an environment marginal for the long-term persistence of this softseeded cultivar. Although strains had the same vegetative and floral markings as Clare, differences in ecologically important characters such as number of seeds per burr, seed weight, and hardseededness may result in plants that are better adapted to the environment in which they evolved. From these studies 23 plants of T. subterraneum var. brachycalycinum were selected for further evaluation.


2003 ◽  
Vol 43 (1) ◽  
pp. 71 ◽  
Author(s):  
M. K. Conyers ◽  
C. L. Mullen ◽  
B. J. Scott ◽  
G. J. Poile ◽  
B. D. Braysher

The cost of buying, carting and spreading limestone, relative to the value of broadacre crops, makes investment in liming a questionable proposition for many farmers. The longer the beneficial effects of limestone persist, however, the more the investment in liming becomes economically favourable. We re-established previous lime trials with the aim of measuring the long-term effects of limestone on surface acidity (pH run-down), subsurface acidity (lime movement) and grain yield. The study made use of experiments where there was adequate early data on soil chemical properties and cereal yields. We report data from 6 trials located at 4 sites between Dubbo and Albury in New South Wales. The rate of surface soil (0–10 cm) pH decline after liming was proportional to the pH attained 1 year after liming. That is, the higher the pH achieved, the more rapid the rate of subsequent pH decline. Since yields (product removal) and nitrification (also acid producing) may both vary with pH, the post-liming pH acts as a surrogate for the productivity and acid-generating rate of the soil–plant system. The apparent lime loss rate of the surface soils ranged from the equivalent of nearly 500 kg limestone/ha.year at pH approaching 7, to almost zero at pH approaching 4. At commercial application rates of 2–2.5 t/ha, the movement of alkali below the layer of application was restricted. However, significant calcium (Ca) movement sometimes occurred to below 20 cm depth. At rates of limestone application exceeding the typical commercial rate of 2.5 t/ha, or at surface pH greater than about 5.5, alkali and Ca movement into acidic subsurface soil was clearly observed. It is therefore technically feasible to ameliorate subsurface soil acidity by applying heavy rates of limestone to the soil surface. However, the cost and risks of this option should be weighed against the use of acid-tolerant cultivars in combination with more moderate limestone rates worked into the surface soil. There was a positive residual benefit of limestone on cereal grain yield (either barley, wheat, triticale, or oats) at all sites in both the 1992 and 1993 seasons. While acid-tolerant cultivars were less lime responsive than acid-sensitive ones, the best yields were generally obtained using a combination of liming and acid-tolerant cultivars. The long-term residual benefits of limestone were shown to extend beyond 8–12 years and indicate that liming should be profitable in the long term.
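The reported loss rates (nearly 500 kg limestone/ha.year near pH 7, close to zero near pH 4) suggest a roughly linear dependence of apparent lime loss on post-liming surface pH. The sketch below is only an interpolation of those two quoted anchor points, not the authors' fitted model:

```python
import numpy as np

# Anchor points taken from the abstract: ~500 kg limestone/ha.year of
# apparent loss at surface pH approaching 7, near zero at pH approaching 4.
pH_points   = np.array([4.0, 7.0])
loss_points = np.array([0.0, 500.0])   # kg limestone/ha.year

def apparent_lime_loss(pH):
    """Linearly interpolated apparent lime loss rate (illustrative only)."""
    return float(np.interp(pH, pH_points, loss_points))

for pH in (4.5, 5.5, 6.5):
    print(f"pH {pH}: ~{apparent_lime_loss(pH):.0f} kg limestone/ha.year")
```

At this illustrative rate, a 2.5 t/ha dressing held near pH 5.5 (about 250 kg/ha.year) would be drawn down in roughly a decade, which is broadly consistent with the 8–12 year residual benefit reported.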


2001 ◽  
Vol 41 (2) ◽  
pp. 187 ◽  
Author(s):  
R. Aldaoud ◽  
W. Guppy ◽  
L. Callinan ◽  
S. F. Flett ◽  
K. A. Wratten ◽  
...  

In 1995–96, a survey of soil samples from subterranean clover (Trifolium subterraneum L.) paddocks was conducted across Victoria, South Australia, New South Wales and Western Australia, to determine the distribution and the prevalence of races of Phytophthora clandestina (as determined by the development of root rot on differential cultivars), and the association of its occurrence with paddock variables. In all states, there was a weak but significant association between P. clandestina detected in soil samples and subsequent root rot susceptibility of differential cultivars grown in these soil samples. Phytophthora clandestina was found in 38% of the sampled sites, with a significantly lower prevalence in South Australia (27%). There were significant positive associations between P. clandestina detection and increased soil salinity (Western Australia), early growth stages of subterranean clover (Victoria), mature subterranean clover (South Australia), recently sown subterranean clover (South Australia), paddocks with higher subterranean clover content (Victoria), where herbicides were not applied (South Australia), irrigation (New South Wales and Victoria), cattle grazing (South Australia and Victoria), early sampling dates (Victoria and New South Wales), sampling shortly after the autumn break or first irrigation (Victoria), shorter soil storage time (Victoria) and farmer’s perception of root rot being present (Victoria and New South Wales). Only 29% of P. clandestina isolates could be classified under the 5 known races. Some of the unknown races were virulent on cv. Seaton Park LF (most resistant) and others were avirulent on cv. Woogenellup (most susceptible). Race 1 was significantly less prevalent in South Australia than Victoria and race 0 was significantly less prevalent in New South Wales than in South Australia and Western Australia. This study revealed extremely wide variation in the virulence of P. clandestina. The potential importance of the results for programs to breed for resistance to root rot is discussed.
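The abstract reports associations between P. clandestina detection and paddock variables without naming the statistical test. One common way to assess such presence/absence associations is a chi-squared test on a contingency table, sketched below with purely hypothetical counts (the survey's actual data and analysis may have differed):

```python
from scipy.stats import chi2_contingency

# Hypothetical 2x2 contingency table: rows are irrigated / not irrigated
# paddocks, columns are P. clandestina detected / not detected.
table = [[45, 55],    # irrigated:     detected, not detected
         [30, 120]]   # not irrigated: detected, not detected

chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi-squared = {chi2:.2f}, df = {dof}, P = {p_value:.4f}")
```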

