Correction

Soil Research ◽  
1995 ◽  
Vol 33 (4) ◽  
pp. 733
Author(s):  
SR Walker ◽  
WM Blacklow

Adsorption and Degradation of Triazine Herbicides in Soils Used for Lupin Production in Western Australia: Laboratory Studies and a Simulation Model (Vol 32, Pg 1189, 1994)


Soil Research ◽  
1994 ◽  
Vol 32 (6) ◽  
pp. 1189
Author(s):  
SR Walker ◽  
WM Blacklow

Most lupins (Lupinus angustifolius L. and L. albus L.) grown in Western Australia are sown with simazine, and some with atrazine, to give persistent control of a broad spectrum of weeds. Rates of application are adjusted for soil types, yet there can be ineffective weed control and crop damage. The kinetics of degradation in four soils were studied in the laboratory to determine how degradation varied between soils and was modified by soil temperature, pH, moisture and gamma irradiation. The time for half the herbicide to be lost from the soils (HL) varied from 42 to 110 days at 20°C and -0.08 MPa water potential. Loss was rapid in the first day of incubation, and subsequent losses were described precisely by first-order functions. However, the first-order half-lives (t1/2) were 3–21 days greater than the corresponding HLs, because the first-day losses were unaccounted for by the first-order functions. Gamma irradiation had no influence on degradation kinetics, which supported chemical hydrolysis as the mechanism of degradation. The t1/2 values were correlated positively with the proportion of applied herbicide that was adsorbed by the soils (PAd). Atrazine was more persistent than simazine and had higher PAd values. The PAd values increased with soil pH, organic matter and clay content. The t1/2 values decreased exponentially with temperatures from 28 to 9°C, and decreased with soil water potentials from -0.08 to -1.50 MPa for a loamy sand at near-neutral pH. A computer simulation model gave good agreement with observed residue decays and showed that the initially rapid losses from the soils could be explained by high rates of hydrolysis when all the applied herbicide was in the soil solution and, consequently, herbicide concentrations were high (87–100 mM). 
Rapid losses of the triazines in the field are likely in warm, acidic soils, particularly if the herbicide concentrations in the soil solution are high because of limited vertical distribution of the applied herbicides through the soil profile.
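The gap between the empirical half-loss time (HL) and the fitted first-order half-life (t1/2) described above can be sketched numerically: if a rapid first-day loss precedes first-order decay, the residue crosses half its initial value sooner than the fitted half-life implies. This is a minimal illustration with invented numbers (15% first-day loss, 60-day t1/2), not the study's values.

```python
import math

# Assumed, illustrative values (not from the study)
C0 = 100.0               # applied herbicide, arbitrary units
first_day_loss = 0.15    # fraction lost rapidly on day 1 (assumption)
k = math.log(2) / 60.0   # first-order rate for an assumed 60-day t1/2

def residue(t):
    """Residue at day t: linear first-day loss, then first-order decay."""
    if t < 1:
        return C0 * (1 - first_day_loss * t)
    return C0 * (1 - first_day_loss) * math.exp(-k * (t - 1))

# Fitted first-order half-life of the post-day-1 phase
t_half = math.log(2) / k

# Empirical HL: first time the residue falls to half the applied amount
t = 0.0
while residue(t) > C0 / 2:
    t += 0.1
HL = t

# HL comes out roughly 13 days shorter than t1/2 here, the same direction
# as the 3-21 day differences reported in the abstract.
print(f"t1/2 = {t_half:.0f} d, HL = {HL:.1f} d")
```

The first-day loss shifts the whole decay curve down, so the fitted exponential (which ignores that loss) always overstates how long half the applied dose persists.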



2005 ◽  
Vol 53 (5) ◽  
pp. 283
Author(s):  
Fabien Aubret ◽  
Xavier Bonnet ◽  
David Pearson ◽  
Richard Shine

On a small island off south-western Australia, tiger snakes (Notechis scutatus, Elapidae) continue to survive, feed, grow and reproduce successfully after being blinded by seagulls defending their chicks. We propose two alternative hypotheses to explain this surprising result: either vision is of trivial importance in tiger snake foraging, or the blinded snakes survive on a diet of abundant immobile prey that cannot escape their approach. Laboratory studies in which we blindfolded snakes falsified the first hypothesis: snakes that were unable to see had great difficulty in capturing mobile prey. Field data support the second hypothesis: blind snakes feed almost entirely on seagull chicks, whereas normal-sighted animals also took fast-moving prey (lizards and mice). Thus, the ability of tiger snakes on Carnac Island to survive without vision is attributable to the availability of abundant helpless prey (seagull chicks) in this insular ecosystem.



1981 ◽  
Vol 113 (11) ◽  
pp. 1025-1033 ◽  
Author(s):  
B. D. Frazer ◽  
B. Gill

Hunger and abundance of adult Coccinella californica (Mannerheim) preying on pea aphids, Acyrthosiphon pisum (Harris), on alfalfa, Medicago sativa L., were assessed in the field. There was no consistent relationship between abundance or hunger and aphid density. The field beetles encountered in sampling were 4 times hungrier at each observed aphid density than expected from a simulation model of predation that is driven by hunger. Laboratory studies revealed a circadian rhythm of beetle activity that was modified by hunger. An index of predatory potential (number of beetles times average hunger) was found to be a better indicator of the impact of the beetles than absolute numbers of all coccinellids or hunger separately. It is suggested that the beetles encountered in simply walking through a field produce the most useful index of numbers of hungry beetles. This census must be done daily, however, because of the high vagility of adult coccinellids.
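The predatory-potential index described above is simply beetle count multiplied by average hunger, so a few very hungry beetles can outscore many satiated ones. A minimal sketch, with invented counts and an assumed 0-1 hunger scale:

```python
def predatory_potential(n_beetles, avg_hunger):
    """Index of predatory potential: beetle count x mean hunger.

    avg_hunger is assumed here to be scaled 0 (satiated) to 1 (starved);
    the original paper's hunger units may differ.
    """
    return n_beetles * avg_hunger

# Invented example: 5 hungry beetles vs 20 nearly satiated ones
few_hungry = predatory_potential(5, 0.9)    # index 4.5
many_sated = predatory_potential(20, 0.1)   # index 2.0
print(few_hungry > many_sated)              # the smaller, hungrier group ranks higher
```

This is why the index can track predation impact where raw coccinellid counts, which ignore hunger, do not.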



1997 ◽  
Vol 37 (8) ◽  
pp. 845 ◽  
Author(s):  
D. C. Lewis ◽  
M. D. A. Bolland ◽  
R. J. Gilkes ◽  
L. J. Hamilton

Summary. Most of the research on the effectiveness of phosphorus (P) fertilisers in Australia has involved comparing phosphate rock (PR) or partially acidulated PR (PAPR) with superphosphate (SP) or other water-soluble P fertilisers. There are many estimates of effectiveness (current relative effectiveness or CRE) that compare freshly applied (current) PR with freshly applied (current) SP. The CRE values for PR range from <0.1 to 2.5, with mean values of 0.26 for apatite PR and 0.43 for calcined calcium iron aluminium PR (Calciphos). As measured in field experiments in the years after application, and using current SP as a basis for comparison, the residual effectiveness of PR (residual value or RV) is low and constant for up to 11 years after application. Phosphate rock is 5–30% as effective as current SP. The average value of RV for SP declines by about 40% in the first year after application, followed by a further 15% in the second year, and a further 30% over the remaining 6 years. Values of relative effectiveness and RV, and the rate of decline in RV, differ substantially between sites and sometimes between plant species. Laboratory studies of reactions between PR and soil have shown that the poor effectiveness of PR is primarily due to the limited extent and rate of dissolution of these fertilisers compared with the almost complete and rapid dissolution of water-soluble P fertilisers. Many Australian soils are only moderately acid (pH in water >5.5) with low pH buffering capacities, and they cannot quickly contribute a large supply of hydrogen ions to promote rapid dissolution of PR. Soils are commonly sandy and have low water-holding capacities; in the strongly seasonal Mediterranean climate of south-western and southern Australia, the fertilised surface soil rapidly dries between rains, thereby restricting PR dissolution. This restricted dissolution contributes to the poor agronomic effectiveness of PR fertilisers. 
Studies in Western Australia have shown that the effectiveness of current and residual PR relative to current SP generally decreases with increasing level of application. Therefore, relative to current SP, PR fertilisers become less effective per unit of PR as more is applied to the soil. Consequently, PR fertilisers frequently cannot support the same maximum yield as current SP. Published work indicates that PR fertilisers cannot be regarded as economic substitutes for SP for most agricultural applications in Australia. However, much Australian research has used low-reactivity PRs in conditions that are not likely to favour even highly reactive PRs. The soils dry out between rains during the growing season and have insufficient hydrogen ions to cause rapid, extensive dissolution of even reactive PR. Research elsewhere has suggested that reactive apatite PRs can be as effective as SP for suitable soils and environments. These are soils that remain wet for the whole growing season and which contain sufficient hydrogen ions to cause rapid dissolution of reactive PR. Laboratory studies, in which there is no P leaching, on 254 different soils collected from throughout south-western Australia showed that 29 soils, all collected from >800 mm average annual rainfall areas, dissolved >40% of highly reactive North Carolina PR, suggesting that in the field these soils could be suitable for highly reactive PRs. Insufficient research has been conducted in the high rainfall areas of Australia, where the environment is more likely to favour highly reactive PR, and PAPR made from highly reactive PR. Therefore, a national program was undertaken in 6 Australian states to identify circumstances under which PRs, including reactive PR and PAPR made from reactive PR, may be economic fertilisers for acidic soils in the high rainfall areas of Australia where agricultural production is largely based on pasture production.
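The stated decline in the residual value (RV) of superphosphate, about 40% in the first year, a further 15% in the second, and a further 30% over the remaining 6 years, can be tabulated year by year. This sketch assumes the percentages are points of the fresh-SP value of 1.0 and that the final 30% is spread evenly over years 3-8; both readings are assumptions, not stated in the abstract.

```python
# RV of SP relative to freshly applied SP (year 0 = 1.0 by definition)
rv = [1.0]
rv.append(rv[-1] - 0.40)        # year 1: about 40% decline
rv.append(rv[-1] - 0.15)        # year 2: a further 15%
for year in range(3, 9):        # years 3-8: a further 30% in total,
    rv.append(rv[-1] - 0.30 / 6)  # assumed evenly spread (0.05/year)

# RV falls from 1.0 at application to about 0.15 after 8 years
print([round(v, 2) for v in rv])
```

Under this reading, SP still retains roughly 0.15 of its fresh effectiveness after 8 years, which is the low end of the 5-30% range quoted for PR, consistent with the abstract's point that PR's flat RV only looks competitive against well-aged SP.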



2017 ◽  
Vol 57 (10) ◽  
pp. 2082 ◽  
Author(s):  
E. Hussein ◽  
D. T. Thomas ◽  
L. W. Bell ◽  
D. Blache

Grazing immature cereal crops, particularly different varieties of wheat, has become widely adopted in the high rainfall areas of southern Australia. Recently, there has been growing interest in applying this technology in drier parts of the mixed farming zones of Western Australia. A modelling study was conducted to examine farm business returns with or without the grazing of immature wheat (winter and spring varieties) in different locations of Western Australia (Merredin, Wickepin and Kojonup), representing the low to high rainfall (319–528 mm) cropping regions, respectively. A combination of APSIM (a crop simulation model) and GrassGro (a pasture and livestock simulation model) was used to evaluate the changes in farm gross margins with the grazing of cereal crops at the three locations. The results of the study showed that grazing the two wheat varieties (dual-purpose winter and spring) at the high rainfall location increased the profitability of the livestock enterprise 2.5 times more than grazing crops at both lower rainfall locations (P < 0.05). Across all years and sites, the average supplementary feeding costs were reduced by the inclusion of grazed winter (12%) and spring (2%) wheat crops in the lamb production system. The comparative reduction in the cost of supplementary feeding varied between locations and by crop variety within locations, due to both the frequency and average duration of the grazing of wheat crops in these regions, and the farm stocking rate that was chosen. Both wheat varieties were grazed frequently at the lowest rainfall site (68% and 30% of years for winter and spring wheat varieties respectively), whereas grazing spring wheat was less frequent at the higher rainfall location and averaged 16% of years, due to a greater difference in the relative availability of wheat crops versus pasture for grazing among regions. 
The grazing model assumed that there was abundant, productive mixed ryegrass and subterranean clover pasture in the farming system. Overall, this study suggests that both winter and spring wheat crops are likely to supply green feed during the winter feed shortage (April–July) and reduce supplementary feed requirements for a short period of time in some seasons. The value of grazing crops is likely to be higher on farms with poorer soils and less productive pastures.





