Land management within capability: a new scheme to guide sustainable land management in New South Wales, Australia

Soil Research, 2015, Vol 53 (6), pp. 683. Author(s): Jonathan M. Gray, Greg A. Chapman, Brian W. Murphy

A new evaluation scheme, land management within capability (LMwC), used to guide sustainable land management in New South Wales (NSW), is presented. The scheme semi-quantitatively categorises the potential impacts of specific land-management actions and compares these with the inherent physical capability of the land in relation to a range of land-degradation hazards. This comparison yields LMwC indices, which signify the sustainability of land-management practices at scales from individual sites up to broader regions. The LMwC scheme can be used to identify lands at greatest risk from various land-degradation hazards, and it can help natural resource agencies at local, regional and state levels to target priorities and promote sustainable land management across their lands. Few other schemes in the literature assess the sustainability of a given land-management regime in a semi-quantitative yet pragmatic manner. The scheme has particular application for regional soil-monitoring programs, and it was applied in such a program across NSW in 2008–09. The results suggested that the most poorly managed hazards across the state are wind erosion, soil acidification and soil organic carbon decline. The LMwC scheme, or at least its underlying concepts, could readily be applied in other jurisdictions.
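The abstract does not spell out how the LMwC indices are derived, so the Python sketch below is only an illustration of the underlying idea: an ordinal management-impact rating is compared against an ordinal land-capability rating for each hazard, and practices whose impact exceeds capability are flagged. All ratings, scales and hazard names here are hypothetical, not the authors' actual formulation.

```python
# Illustrative LMwC-style comparison (NOT the published formulation).
# Both ratings are on the same hypothetical ordinal scale: higher capability
# = land more tolerant of disturbance, higher impact = greater pressure.

HAZARDS = ["water erosion", "wind erosion", "acidification", "organic carbon decline"]

def lmwc_index(capability: dict, impact: dict) -> dict:
    """Per-hazard index: positive values flag management pressure
    beyond the land's inherent capability."""
    return {h: impact[h] - capability[h] for h in HAZARDS}

# Hypothetical ratings for a single site:
capability = {"water erosion": 3, "wind erosion": 2,
              "acidification": 4, "organic carbon decline": 3}
impact = {"water erosion": 2, "wind erosion": 4,
          "acidification": 5, "organic carbon decline": 4}

for hazard, idx in lmwc_index(capability, impact).items():
    status = "beyond capability" if idx > 0 else "within capability"
    print(f"{hazard}: index {idx:+d} ({status})")
```

Site-level indices like these could then be aggregated over a region to produce the kind of regional sustainability summary the scheme describes.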

CATENA, 2022, Vol 211, pp. 105956. Author(s): Xihua Yang, John Leys, Jonathan Gray, Mingxi Zhang

Soil Research, 2010, Vol 48 (3), pp. 248. Author(s): Matthew Miklos, Michael G. Short, Alex B. McBratney, Budiman Minasny

The reliable assessment of soil carbon stock is of key importance for soil conservation and for mitigation strategies aimed at reducing atmospheric carbon. Measuring and monitoring soil carbon is complex because carbon pools cycle and rates of carbon sequestration vary across the landscape with climate, soil type, and management practices. A new methodology has been developed and applied to assess the distribution of total, organic, and inorganic carbon at high spatial resolution on a grains research and grazing property in northern New South Wales. In this study, baseline soil carbon maps were created from fine-resolution, geo-referenced proximal sensor data. These data were coupled with a digital elevation model and secondary terrain attributes, and all layers were combined by k-means clustering to develop a stratified random soil-sampling scheme for the survey area. Soil samples taken at 0.15-m increments to a depth of 1 m were scanned with a mid-infrared spectrometer, calibrated against a subset of samples analysed in the laboratory for total and inorganic carbon content. This combination of new methodologies and technologies can provide the large volumes of reliable, fine-resolution, timely data required for baseline assessment, mapping, monitoring, and verification. By quantifying carbon stock to a depth of 1 m at high spatial resolution, the method could make soil carbon management and trading possible at the farm scale.
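The clustering step lends itself to a short sketch. The following is a minimal, hypothetical reconstruction (layer values, cluster count and sample sizes all invented) of how co-located sensor and terrain layers can be combined by k-means and then used to draw a stratified random sample, one of the core steps described above.

```python
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(42)

# Hypothetical co-located data layers for each grid cell of the survey area:
# proximal sensor readings plus DEM-derived terrain attributes.
n_cells = 10_000
layers = np.column_stack([
    rng.normal(50, 10, n_cells),    # apparent electrical conductivity
    rng.normal(1.2, 0.3, n_cells),  # gamma radiometrics (arbitrary units)
    rng.normal(250, 30, n_cells),   # elevation from DEM (m)
    rng.normal(5, 2, n_cells),      # slope (degrees)
])

# Standardise so no single layer dominates the distance metric, then cluster.
X = StandardScaler().fit_transform(layers)
k = 6  # number of strata; chosen arbitrarily for this sketch
strata = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)

# Stratified random sampling: draw the same number of cells from each stratum.
samples_per_stratum = 5
sample_ids = np.concatenate([
    rng.choice(np.flatnonzero(strata == s), samples_per_stratum, replace=False)
    for s in range(k)
])
print(f"{len(sample_ids)} sampling sites drawn across {k} strata")
```

Stratifying on the clustered layers ensures the field samples span the property's full range of sensor and terrain conditions rather than clumping in any one soil environment.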


Soil Research, 2009, Vol 47 (3), pp. 340. Author(s): B. Kelly, C. Allan, B. P. Wilson

'Soil health' programs and projects in Australia's agricultural districts are designed to influence farmers' management behaviours, usually to produce better outcomes for production, conservation, and sustainability. These programs usually examine soil management practices from a soil science perspective, but how soils are understood by farmers, and how that understanding informs their farm management decisions, is poorly documented. The research presented in this paper sought to better understand how dryland farmers in the Billabong catchment of southern New South Wales use soil indicators to inform their management decisions. Thematic content analysis of transcripts of semi-structured, face-to-face interviews with farmers suggests several themes with implications for soil scientists and other professionals wishing to promote soil health in the dryland farming regions of south-eastern Australia. In particular, all soil indicators, including those related to soil 'health', must have a clear, practical use for farmers if they are to inform farm decision making. The research also highlights the participants' reliance on agronomists; reliance on agronomists for soil management decisions may lead to an increasing loss of connectivity between farmers and their land. If this reflects a wider trend, soil health projects may need to consider where best to direct their capacity-building activities, and/or how to re-empower individual farmers.


1994, Vol 34 (7), pp. 921. Author(s): DC Godwin, WS Meyer, U Singh

Evidence exists that night temperatures <18°C immediately preceding flowering in rice crops can adversely affect floret fertility and, hence, yields. It has been suggested that sterility induced by low temperature is also influenced by floodwater depth and nitrogen (N) rate. In southern New South Wales, low night-time temperatures are believed to be a major constraint to the achievement of consistently high yields. The availability of a comprehensive model of rice growth and yield that is sensitive to this constraint would aid the development of better management practices. CERES RICE is a comprehensive model that simulates the phasic development of a rice crop, the growth of its leaves, stems, roots, and panicles, and their response to weather. It also simulates the water and N balances of the crop and the effects of water and N stresses on the yield-forming processes. The model has been extensively tested in many rice-growing systems in both tropical and temperate environments. However, the original model was unable to simulate the level of chilling injury evident from yield data from southern New South Wales. This paper reports modifications made to the model to simulate these effects, and the evaluation of the model in low night-temperature environments. Inclusion of the chilling injury effect greatly improved the accuracy of estimated yields from treatments in an extensive field experiment. However, additional testing with a wider range of data sets is needed to confirm the international applicability of the modifications.
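The paper's actual modification to CERES RICE is not reproduced in the abstract. As a hedged illustration of the general approach, the sketch below accumulates cooling degrees below the 18°C night-temperature threshold over a pre-flowering sensitive window and maps the total to a floret sterility fraction. The threshold comes from the abstract; the window length and response slope are invented for the example.

```python
# Illustrative chilling-injury model, NOT the CERES RICE formulation:
# cold damage is accumulated as degree-nights below a threshold during
# the sensitive period before flowering, then converted to sterility.

CHILL_THRESHOLD_C = 18.0   # night temperature below which injury accrues (per abstract)
SENSITIVE_DAYS = 10        # assumed pre-flowering window (hypothetical)
STERILITY_PER_DEGREE_NIGHT = 0.04  # hypothetical response slope

def sterility_fraction(night_temps_c: list[float]) -> float:
    """Fraction of florets sterile, given nightly minima over the
    sensitive window (capped at 1.0)."""
    window = night_temps_c[-SENSITIVE_DAYS:]
    chill = sum(max(0.0, CHILL_THRESHOLD_C - t) for t in window)
    return min(1.0, chill * STERILITY_PER_DEGREE_NIGHT)

# A cool spell in southern NSW before flowering (invented minima, °C):
night_minima = [19.0, 17.5, 15.0, 13.5, 14.0, 16.0, 17.0, 18.5, 16.5, 15.5]
frac = sterility_fraction(night_minima)
print(f"Estimated floret sterility: {frac:.0%}")  # grain yield scales down accordingly
```

In a full crop model this sterility fraction would discount the number of filled grains at harvest, which is how a cool pre-flowering spell translates into the yield losses described above.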


Soil Research, 2009, Vol 47 (8), pp. 781. Author(s): Brian Wilson, Subhadip Ghosh, Phoebe Barnes, Paul Kristiansen

There is a widespread and growing need for information relating to soil condition and its changes in response to land management pressures. To provide the information needed to quantify land management effects on soil condition, monitoring systems are now being put in place, and these programs will generate large numbers of samples; streamlined procedures for analysing them are therefore required. Bulk density (BD) is considered one of several key indicators of soil physical condition and is also required to estimate soil carbon density. The standard analytical technique for BD requires drying the soil at 105°C, but this procedure creates several logistical and analytical problems. Our initial objective was to derive correction factors between drying temperatures to allow rapid, low-temperature estimation of BD on large sample numbers. Soil samples were collected from 3 contrasting soil types (basalt, granite, and meta-sediments) under 4 land uses (cultivation, sown pasture, native pasture, woodland) in northern New South Wales to test the effect of drying temperature on BD determination. Cores were divided into 4 depths (0–0.05, 0.05–0.10, 0.10–0.20, 0.20–0.30 m) and oven-dried at 40, 70, and 105°C. Drying temperature had no significant effect on BD, but the effects of soil type, depth, and land use were significant and varied as expected from previous studies: higher BD in granite-derived soils and lower BD in basalt-derived soils, increasing BD with depth, and increasing BD with increasing management intensity. These results indicate that a lower drying temperature (40°C) is adequate for the efficient determination of BD, especially where other soil properties are to be analysed from the same sample. However, before this approach is applied more widely, further calibration of BD against drying temperature should be undertaken across a wider range of soils, especially clay-rich soils.
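Bulk density itself is a simple calculation, and the study's comparison amounts to computing it from dry masses obtained at different drying temperatures. The sketch below shows the standard core method, BD = oven-dry mass / core volume; the core dimensions and masses are invented sample values, not data from the study.

```python
import math

# Standard core method: bulk density (Mg/m^3) = oven-dry soil mass / core volume.
# The study compared drying at 40, 70 and 105 °C; values below are invented.

def core_volume_m3(diameter_m: float, height_m: float) -> float:
    """Volume of a cylindrical soil core."""
    return math.pi * (diameter_m / 2) ** 2 * height_m

def bulk_density(dry_mass_kg: float, volume_m3: float) -> float:
    """Bulk density in Mg/m^3 (numerically equal to g/cm^3)."""
    return (dry_mass_kg / 1000) / volume_m3  # kg -> Mg

vol = core_volume_m3(diameter_m=0.073, height_m=0.05)  # a 0-0.05 m depth core

# Hypothetical dry masses for the same core at the three drying temperatures:
dry_mass_at = {40: 0.2642, 70: 0.2631, 105: 0.2625}  # kg
for temp_c, mass in dry_mass_at.items():
    print(f"{temp_c:>3} °C: BD = {bulk_density(mass, vol):.3f} Mg/m^3")
```

Because residual moisture at 40°C inflates the dry mass only slightly, the BD values differ in the third decimal place here, which is consistent with the study's finding that drying temperature had no significant effect.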


2012, Vol 52 (7), pp. 675. Author(s): R. D. Bush, R. Barnett, I. J. Links, P. A. Windsor

The prevalence of caseous lymphadenitis (CLA) in Australia was estimated at 5.2% using 2009 abattoir surveillance data supplied by Animal Health Australia, covering 5029 lines comprising 1 339 463 sheep from all states. This is a decrease from the 26% estimated in a similar study in 1995. There was a significant difference (P < 0.001) in CLA prevalence between all states except Tasmania and Victoria (P = 0.75), which had prevalences of 12.8% and 12.9%, respectively. Western Australia recorded the lowest prevalence, at 1.0%. The average CLA prevalence for New South Wales was 5.3%; within the three surveyed Livestock Health and Pest Authority regions (Tablelands, Central North and Central West) it was 2.9%, 4.9% and 4.4%, respectively. The majority (75%) of producers surveyed in these three Livestock Health and Pest Authority areas regarded CLA as of little or no significance, yet most were aware of the need for CLA control, with ~68% using 6-in-1 vaccine, though only 39.9% using it as recommended. It appears that the prolonged use of CLA vaccination has been successful in reducing the prevalence of CLA across Australia, and particularly in New South Wales. Further improvements are indicated in communicating preventative management practices associated with lice control, in promoting the use of an approved vaccination program, and in increasing producers' awareness of the importance of CLA control.
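The headline figures are apparent prevalences pooled over inspection lines. A minimal sketch of that calculation follows; the line-level counts are invented for illustration and do not reproduce the survey data.

```python
# Apparent prevalence from abattoir surveillance: pooled over inspection
# lines, prevalence = affected carcasses / carcasses inspected.
# Line-level counts below are invented for illustration only.

lines = [  # (state, sheep_inspected, cla_affected)
    ("NSW", 310, 18),
    ("NSW", 255, 12),
    ("VIC", 280, 36),
    ("WA",  400, 4),
]

def prevalence_by_state(records):
    """Pool inspected and affected counts per state, then divide."""
    totals: dict[str, list[int]] = {}
    for state, inspected, affected in records:
        t = totals.setdefault(state, [0, 0])
        t[0] += inspected
        t[1] += affected
    return {s: affected / inspected for s, (inspected, affected) in totals.items()}

for state, p in prevalence_by_state(lines).items():
    print(f"{state}: {p:.1%}")
```

Pooling over lines rather than averaging per-line prevalences weights each line by the number of sheep it contains, which is the natural estimator when line sizes vary.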


2001, Vol 23 (2), pp. 194. Author(s): Andy Sharp, Kerry Holmes, Melinda Norton, Adam Marks

Between winter 1995 and winter 1998, seasonal spotlight counts of rabbits were conducted along three transects in western New South Wales. Rabbit Calicivirus (RCV) arrived at the study site in spring 1996 and had an immediate, marked effect on rabbit densities. Prior to the advent of Rabbit Calicivirus Disease (RCD), rabbit abundance followed the expected annual pattern of positive growth from winter to summer and negligible or negative growth from summer to winter. With the arrival of RCV, rabbit abundance declined by 47% and 75% within the low-density populations and by 84% within the medium-density population. Over the subsequent 21 months, the low-density populations returned to levels approximating those before the arrival of RCV; in contrast, abundance in the medium-density population remained consistently low. These data suggest that RCD will have a minimal effect on semi-arid zone rabbit populations below a density of 0.4/ha, and that additional management actions will be required to further reduce rabbit abundance.
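As a small worked illustration of how spotlight counts translate into the densities and percentage declines quoted above, the sketch below converts transect counts to rabbits per hectare and computes a pre- vs post-RCV decline. The transect dimensions and counts are invented, not the study's data.

```python
# Spotlight-transect density: rabbits/ha = count / (transect length x strip width).
# Dimensions and counts below are invented for illustration only.

TRANSECT_KM = 20.0
STRIP_WIDTH_M = 100.0  # effective spotlit strip either side of the vehicle
area_ha = TRANSECT_KM * 1000 * STRIP_WIDTH_M / 10_000  # m^2 -> ha

pre_rcv_count, post_rcv_count = 68, 17

pre_density = pre_rcv_count / area_ha
post_density = post_rcv_count / area_ha
decline = (pre_density - post_density) / pre_density

print(f"Pre-RCV: {pre_density:.2f}/ha, post-RCV: {post_density:.2f}/ha "
      f"(decline {decline:.0%})")
```

On these invented numbers the pre-RCV density sits below the 0.4/ha threshold discussed above, the range in which the study suggests RCD alone gives only temporary suppression.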

