Cradle-to-farmgate greenhouse gas emissions for 2-year wheat monoculture and break crop–wheat sequences in south-eastern Australia

2016 ◽  
Vol 67 (8) ◽  
pp. 812 ◽  
Author(s):  
Philippa M. Brock ◽  
Sally Muir ◽  
David F. Herridge ◽  
Aaron Simmons

We used life cycle assessment methodology to determine the cradle-to-farmgate GHG emissions for rainfed wheat grown in monoculture or in sequence with the break crops canola (Brassica napus) and field peas (Pisum sativum), and for the break crops, in the south-eastern grains region of Australia. Total GHG emissions were 225 kg carbon dioxide equivalents (CO2-e)/t grain for a 3 t/ha wheat crop following wheat, compared with 199 and 172 kg CO2-e/t for wheat following canola and field peas, respectively. On an area basis, calculated emissions were 676, 677 and 586 kg CO2-e/ha for wheat following wheat, canola and field peas, respectively. The highest emissions were associated with the production and transport of fertilisers (23–28% of total GHG emissions) and their use in the field (16–23% of total GHG emissions). Production, transport and use of lime accounted for an additional 19–21% of total GHG emissions. The lower emissions for wheat after break crops were associated with higher yields, improved use of fertiliser nitrogen (N) and, in the case of wheat after field peas, reduced fertiliser N inputs. GHG emissions for the production and harvesting of canola were calculated at 841 kg CO2-e/ha, equivalent to 420 kg CO2-e/t grain; those of field peas were 530 kg CO2-e/ha, equivalent to 294 kg CO2-e/t grain. When the gross margin returns for the crops were considered together with their GHG emissions, the field pea–wheat sequence had the highest value per unit emissions, at AU$787/t CO2-e, followed by wheat–wheat ($703/t CO2-e) and canola–wheat ($696/t CO2-e). Uncertainties associated with emission factor values for fertiliser N, legume-fixed N and mineralised soil organic matter N are discussed, together with the potentially high C cost of legume N2 fixation and the potential for relatively small changes in soil C during grain cropping either to offset all or most pre- and on-farm GHG emissions or to add to them.
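The per-area and per-tonne emission figures in this abstract are linked through crop yield. A minimal sketch of that conversion (the function name is illustrative; the field pea yield of ~1.8 t/ha is back-calculated from the reported 530 kg CO2-e/ha and 294 kg CO2-e/t, not stated directly in the abstract):

```python
def emissions_per_tonne(kg_co2e_per_ha, grain_yield_t_per_ha):
    """Convert area-based emissions (kg CO2-e/ha) to a
    grain-based intensity (kg CO2-e/t)."""
    return kg_co2e_per_ha / grain_yield_t_per_ha

# Wheat after wheat: 676 kg CO2-e/ha at 3 t/ha -> ~225 kg CO2-e/t
print(round(emissions_per_tonne(676, 3.0)))  # -> 225
# Field peas: 530 kg CO2-e/ha at ~1.8 t/ha -> ~294 kg CO2-e/t
print(round(emissions_per_tonne(530, 1.8)))  # -> 294
```

The same relationship explains why wheat after field peas has lower per-tonne intensity despite a similar per-hectare footprint: the denominator (yield) is higher.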

2012 ◽  
Vol 63 (7) ◽  
pp. 593 ◽  
Author(s):  
J. G. Nuttall ◽  
G. J. O'Leary ◽  
N. Khimashia ◽  
S. Asseng ◽  
G. Fitzgerald ◽  
...  

Under a future climate for south-eastern Australia, there is the likelihood that the net effect of elevated CO2 (eCO2), lower growing-season rainfall and higher temperatures will increase haying-off and thus limit production of rain-fed wheat crops. We used a modelling approach to assess the impact of an expected future climate on wheat growth across four cropping regions in Victoria. A wheat model, APSIM-Nwheat, was performance-tested against three datasets: (i) a field experiment at Wagga Wagga, NSW; (ii) the Australian Grains Free Air Carbon dioxide Enrichment (AGFACE) experiment at Horsham, Victoria; and (iii) a broad-acre wheat crop survey in western Victoria. For down-scaled climate predictions for 2050, average rainfall during October, which coincides with crop flowering, decreased by 32, 29, 26, and 18% for the semiarid regions of the northern Mallee, the southern Mallee and the Wimmera, and the higher rainfall zone (HRZ) in the Western District, respectively. Mean annual minimum and maximum temperatures over the four regions increased by 1.9 and 2.2°C, respectively. A pair-wise comparison of the yield/anthesis biomass ratio across climate scenarios, used to assess the haying-off response, revealed a 39, 49 and 47% increase in the frequency of haying-off for the northern Mallee, southern Mallee and Wimmera, respectively, when crops were sown near the historically optimal time (1 June). This translated to a reduction in yield from 1.6 to 1.4 t/ha (northern Mallee), 2.5 to 2.2 t/ha (southern Mallee) and 3.7 to 3.6 t/ha (Wimmera) under a future climate. Sowing earlier (1 May) reduced the impact of a future climate on haying-off, where decreases in the yield/anthesis biomass ratio were 24, 28 and 23% for the respective regions. Heavy-textured soils exacerbated the impact of a future climate on haying-off within the Wimmera.
Within the HRZ of the Western District, crops were not water-limited during grain filling, so there was no evidence of haying-off, and average crop yields increased by 5% under a future climate (6.4 to 6.7 t/ha). The simulated effect of eCO2 alone (FACE conditions) increased average yields by 18–38% in the semiarid regions but not in the HRZ, and there was no evidence of haying-off. For a future climate, sowing earlier limited the impact of hotter, drier conditions by reducing pre-anthesis plant growth, grain set and resource depletion, and shifted the grain-filling phase earlier, which reduced the impact of future drier conditions in spring. Overall, earlier sowing in a Mediterranean-type environment appears to be an important management strategy for maintaining wheat production in semiarid cropping regions into the future, although this has to be balanced against other agronomic considerations such as frost risk and weed control.
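The haying-off metric used in the study is the ratio of grain yield to biomass at anthesis; a decline in this ratio under a future climate flags poorer conversion of pre-anthesis growth into grain. A minimal sketch (function names and the example biomass/yield values are illustrative, not taken from the paper):

```python
def haying_off_ratio(grain_yield_t_ha, anthesis_biomass_t_ha):
    """Yield/anthesis-biomass ratio; a decline between climate
    scenarios indicates haying-off."""
    return grain_yield_t_ha / anthesis_biomass_t_ha

def pct_change(baseline, future):
    """Percentage change of a future value relative to a baseline."""
    return 100 * (future - baseline) / baseline

# Illustrative: a ratio falling from 0.40 to 0.30 is a 25% decline,
# similar in magnitude to the 23-28% declines reported for early sowing.
baseline = haying_off_ratio(3.2, 8.0)   # 0.40
future = haying_off_ratio(2.4, 8.0)     # 0.30
print(round(pct_change(baseline, future)))  # -> -25
```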


2014 ◽  
Vol 40 (2) ◽  
pp. 170-177 ◽  
Author(s):  
Sarsha Gorissen ◽  
Jacqueline Mallinson ◽  
Matthew Greenlees ◽  
Richard Shine

Web Ecology ◽  
2008 ◽  
Vol 8 (1) ◽  
pp. 47-54 ◽  
Author(s):  
T. D. Auld ◽  
M. K. J. Ooi

Abstract. We examine the patterns of germination response to fire in the fire-prone flora of the Sydney basin, south-eastern Australia, using examples from several decades of research. The flora shows a strong response to fire-related germination cues. Most species show an interaction between heat and smoke, a number respond only to heat, whilst a few are likely to respond only to smoke. Many recruit in the first 12 months after fire and show no obvious seasonal patterns of recruitment, whilst several species have a strong seasonal germination requirement, even in this essentially aseasonal rainfall region. Key remaining challenges include designing future seed germination studies to characterise the germination response surface for heat and smoke interactions, and incorporating the impact of varying soil moisture on post-fire seed germination, including its effect on the resetting of seed dormancy. An understanding of the resilience of species to frequent fire also requires further work, to identify the species and functional types most at risk. This work must ideally be integrated within the framework of managing fire regimes that will change under a changing climate. We suggest that the functional classification of plant types in relation to fire could be enhanced by considering both the type of germination response to fire (the cues required) and the timing of the response (seasonally driven in response to seed dormancy characteristics, or independent of season). We provide a simplified version of such an addition to functional trait classification in relation to fire.


2006 ◽  
Vol 46 (10) ◽  
pp. 1323 ◽  
Author(s):  
K. L. Hollaway ◽  
R. S. Kookana ◽  
D. M. Noy ◽  
J. G. Smith ◽  
N. Wilhelm

Grain growers in south-eastern Australia have reported unexpected crop failures despite observing theoretically safe recropping periods for acetolactate synthase herbicides in alkaline soils. This experience has led to concern that these herbicides may degrade very slowly in alkaline soils, and herbicide residues have at times been blamed for unexplained crop losses. To address this issue, we established 5 recropping trials across Victoria and South Australia with 5 acetolactate synthase herbicides (chlorsulfuron, triasulfuron, metsulfuron-methyl, imazethapyr, and flumetsulam). The herbicides were applied to separate plots in years 1, 2 or 3, and sensitive crop species were sown in year 4 to measure the impact of herbicide residues. We observed that the persistence of the sulfonylureas (chlorsulfuron, triasulfuron, metsulfuron-methyl) varied between herbicides, but all persisted longer in alkaline soils than in acid soils and were, therefore, more likely to damage crops in alkaline soil. Imazethapyr persisted longer in clay soils than in sandy soils and was, therefore, more likely to damage crops in clay soils. All herbicides persisted longer when rainfall was below average. Canola was more sensitive to imazethapyr than pea, lentil or medic, but was less sensitive to the sulfonylureas; in contrast, lentil and medic were the most sensitive to the sulfonylureas. Despite some damage, we found that safe recropping periods could be predicted from the product labels in all but one situation: metsulfuron-methyl reduced dry matter and yield of lentil and medic sown 10 months after application in a soil with pH 8.5. We hypothesise that the real cause of crop failure in many situations is not unusual herbicide persistence, but failure to take full account of soil type (pH and clay content, including variation within the paddock) and rainfall when deciding to recrop after using acetolactate synthase herbicides.


2011 ◽  
Vol 2011 ◽  
pp. 1-11 ◽  
Author(s):  
David Nash ◽  
Craig Butler ◽  
Justine Cody ◽  
Michael St. J. Warne ◽  
Mike J. McLaughlin ◽  
...  

Biosolids were applied to a pasture and a vineyard in south-eastern Australia. At both sites, soil Cd, Cu, and Zn concentrations linearly increased with biosolids application rates although not to the extent of exceeding soil quality guidelines. Biosolids marginally increased soil C and N concentrations at the pasture site but significantly increased P concentrations. With lower overall soil fertility at the vineyard, biosolids increased C, N, and P concentrations. At neither site did biosolids application affect soil microbial endpoints. Biosolids increased pasture production compared to the unfertilised control but had little effect on grape production or quality. Interestingly, over the 3-year trial, there was no difference in pasture production between the biosolids treated plots and plots receiving inorganic fertiliser. These results suggest that biosolids could be used as a fertiliser to stimulate pasture production and as a soil conditioner to improve vineyard soils in this region.


2001 ◽  
Vol 41 (8) ◽  
pp. 1167 ◽  
Author(s):  
Philip J. Newton

Use of urea fertiliser for cereal cropping in south-eastern Australia has increased rapidly in recent years to arrest a general decline in grain protein and to increase yields. In conservation cropping systems, crop stubbles provide a source of carbon, which has the potential to retain a portion of the fertiliser nitrogen in the soil. The impact of fertiliser nitrogen was compared under 4 stubble management regimes for efficiency of nitrogen uptake by a wheat crop in a long-term cereal–grain legume rotation. The experiment was established on a duplex red-brown earth in 1985 to compare stubble retention (standing, shredded, incorporated) with stubble burning. In 1995, wheat following a failed lupin crop was topdressed with urea fertiliser at 50 kg nitrogen per hectare to split plots of each stubble treatment at the third-leaf stage of growth. The urea significantly increased nitrogen uptake by wheat grown on burnt stubbles and increased grain yield by 1 t/ha. Nitrogen applied to wheat grown on stubbles retained above-ground increased yield by 0.5 t/ha, whereas there was no significant yield increase from nitrogen when stubble was incorporated, due to less transfer of dry matter to grain. Efficiency of urea-nitrogen uptake in grain was reduced under stubble retention. Total grain nitrogen uptake in response to stubble burning increased by 17.6 kg/ha, equivalent to a conversion efficiency of 35%, compared with only 26, 24 and 16% of the applied 50 kg nitrogen per hectare for the stubble standing, shredding and incorporation treatments, respectively. Soil organic carbon and total nitrogen levels were 1 and 0.1%, respectively, irrespective of stubble treatment. Added urea increased microbial decomposition of cellulose (calico cloth buried beneath stubbles retained above-ground) by 30%, compared with the stubble-incorporated or burnt treatments.
These results suggest that where low levels of available nitrogen exist in cropping systems that use stubble retention, higher nitrogen inputs may be needed, due to less efficient uptake of nitrogen from urea fertiliser.
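The conversion efficiencies quoted above follow from the extra grain N uptake relative to the 50 kg N/ha of urea applied. A minimal sketch of that calculation (the function name is illustrative):

```python
def n_conversion_efficiency_pct(extra_grain_n_kg_ha, applied_n_kg_ha=50):
    """Share of applied fertiliser N recovered as additional grain N,
    expressed as a percentage."""
    return 100 * extra_grain_n_kg_ha / applied_n_kg_ha

# Burnt stubble: 17.6 kg/ha extra grain N from 50 kg N/ha applied
print(round(n_conversion_efficiency_pct(17.6)))  # -> 35
```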


2013 ◽  
pp. n/a-n/a ◽  
Author(s):  
Stephanie J. Kermode ◽  
Martin R. Gibling ◽  
Brian G. Jones ◽  
Tim J. Cohen ◽  
David M. Price ◽  
...  

Soil Research ◽  
2015 ◽  
Vol 53 (3) ◽  
pp. 227 ◽  
Author(s):  
Sally Jane Officer ◽  
Frances Phillips ◽  
Gavin Kearney ◽  
Roger Armstrong ◽  
John Graham ◽  
...  

Although large areas of semi-arid land are extensively cropped, few studies have investigated the effect of nitrogen (N) fertiliser on nitrous oxide (N2O) emissions in these regions (Galbally et al. 2010). These emissions need to be measured in order to estimate N losses and calculate national greenhouse gas inventories. We examined the effect of different agronomic management practices applied to wheat (Triticum aestivum) grown on an alkaline Vertosol in south-eastern Australia on N2O emissions. In 2007, N2O emissions were measured over 12 months, during which N fertiliser (urea) was applied at sowing or N fertiliser plus supplementary irrigation (50 mm) was applied during the vegetative stage and compared with a treatment of no N fertiliser or irrigation. In a second experiment (2008), the effect of source of N on N2O emissions was examined. Wheat was grown on plots where either a pulse (field peas, Pisum sativum) or pasture legume (barrel medic, Medicago truncatula) crop had been sown in the previous season compared with a non-legume crop (canola, Brassica napus). To account for the N supplied by the legume phase, N fertiliser (50 kg N ha–1 as urea) was applied only to the wheat in the plots previously sown to canola. Fluxes of N2O were measured on a sub-daily basis (up to 16 measurements per chamber) by using automated chamber enclosures and a tuneable diode laser, and treatment differences were evaluated by a linear mixed model including cubic smoothing splines. Fluxes were low and highly variable, ranging from –3 to 28 ng N2O-N m–2 s–1. The application of N fertiliser at sowing increased N2O emissions for ~2 months after the fertiliser was applied. Applying irrigation (50 mm) during the vegetative growth stage produced a temporary (~1-week) but non-significant increase in N2O emissions compared with plots that received N fertiliser at sowing but were not irrigated. 
Including a legume in the rotation significantly increased soil inorganic N at sowing of the following wheat crop, by 38 kg N ha–1 (field peas) or 57 kg N ha–1 (barrel medic), compared with a canola crop. However, N2O emissions were greater in wheat plots where N fertiliser was applied than where wheat was sown into legume plots without N fertiliser. Over the 2 years of the field study, N2O emissions attributed to fertiliser ranged from 41 to 111 g N2O-N ha–1, and averaged 75 g N2O-N ha–1, or 0.15% of the applied N fertiliser. Our findings confirm that the proportion of N fertiliser emitted as N2O from rainfed grain crops grown in Australian semi-arid regions is less than the international average of 1.0%.
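The 0.15% emission factor reported here is simply the N2O-N emitted as a fraction of the fertiliser N applied, with units reconciled (emissions in g/ha, fertiliser in kg/ha). A minimal sketch (the function name is illustrative):

```python
def emission_factor_pct(n2o_n_g_per_ha, applied_n_kg_per_ha):
    """N2O-N emitted as a percentage of fertiliser N applied.
    Emissions are given in g/ha, fertiliser in kg/ha."""
    return 100 * (n2o_n_g_per_ha / 1000) / applied_n_kg_per_ha

# Average of 75 g N2O-N/ha from 50 kg N/ha applied as urea
print(round(emission_factor_pct(75, 50), 2))  # -> 0.15
```

The same formula with the international default factor of 1.0% would imply 500 g N2O-N/ha from the same fertiliser rate, which is why the measured semi-arid values sit well below the default.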


Soil Research ◽  
2007 ◽  
Vol 45 (8) ◽  
pp. 607 ◽  
Author(s):  
P. Hopmans ◽  
N. Collett ◽  
R. Bickford

A study was undertaken to assess the effects of fire retardant application, unmodified by heat of fire, on soil properties in 2 fire-prone heathland communities at Marlo and the Grampians in south-eastern Australia. Fire retardant (Phos-Chek D75-R at 0.144 g/L) was applied at rates of 0.5, 1.0, and 1.5 L/m2 and compared with control treatments of nil and 1.0 L/m2 of water. Monitoring of surface soils showed that pH at both sites decreased while soil salinity increased immediately after application, followed by a rapid decline to pre-treatment values within 12 months. The impact of retardant on total carbon and nitrogen was minor and within the range of natural variation of C and N in surface soils at both sites. Levels of readily available or labile forms of N increased at both sites but declined rapidly to background values after 12 months. Applications of retardant progressively increased extractable P in the surface soil at Marlo, in contrast to the Grampians, where a rapid increase was observed after 2 months followed by a decline after 12 months. These results showed a significant increase in labile P in the surface soil after 12 months and also indicated that a large proportion of the phosphate applied had leached into the subsoil. Likewise, fire retardant applied at the highest rate caused increases in labile sulfate after 2 months at both sites, followed by a rapid decline to background levels. It is expected that the elevated levels of soil phosphate in particular could have long-term impacts on the growth and composition of heathland vegetation known to be sensitive to elevated levels of phosphate in soil.

