Long-term effects of stubble management on the incidence of infection of wheat by Fusarium graminearum Schw. Group 1

1993 · Vol 33 (4) · pp. 451
Author(s): LW Burgess, D Backhouse, BA Summerell, AB Pattison, TA Klein, ...

The effect of 3 stubble management regimes (burning after harvest, incorporation with a disc plough, retention on the surface) on the incidence of infection of wheat with Fusarium graminearum Schw. Group 1 was studied for 5 seasons at 2 sites at Moree, New South Wales. One site had high initial incidence (site A) and the other low initial incidence (site B). There were no differences in incidence of infection between retained and incorporated treatments. Stubble burning reduced the increase in incidence of infection in 2 of 5 years at site A and 3 of 4 years at site B. Failure of control in other years was attributed to susceptible weed hosts and poor burns. When stubble was retained on the plots at site B that had been burnt, incidence of infection in the next season increased to a level not significantly different from the retained or incorporated treatments. Incidence of infection at the fourth consecutive wheat crop at both sites was close to the maximum recorded, which was 92% at site A and 65% at site B. There was no evidence of a decline in incidence by the time of the most recent season assessed (eighth year of continuous wheat cultivation at site A, and sixth year at site B). In most years, the differences in yield between treatments were not significant.


1980 · Vol 31 (2) · pp. 239
Author(s): ICR Holford

The long-term effects of varying durations of lucerne ley, extended fallowing, and continuous wheat growing on the growth, yield, and nitrogen uptake of subsequent wheat crops were determined on two contrasting soils in northern New South Wales. Durations of lucerne ley were 3+, 2+ and 1+ years on a black earth and 5+, 3+ and 1+ years on a red-brown earth. With the exception of the first wheat crop, wheat production for several years following lucerne exceeded that following extended fallow or continuous wheat growing, whether measured as vegetative yield at anthesis, grain yield, nitrogen uptake, or grain protein. The beneficial effects of lucerne on vegetative yield, nitrogen uptake, and grain protein reached a maximum in the second crop after lucerne, and the effects of 2+ or more years of lucerne remained significant for the next five crops on the black earth and the next two crops on the red-brown earth. Grain yields fluctuated widely with season, the magnitude of the lucerne effect being much more dependent on rainfall, but the duration of the effect was similar for grain and vegetative parameters. The shorter duration of the lucerne effect on the red-brown earth appeared to be associated with its more freely draining nature and consequent loss of accumulated nitrogen. The optimum duration of lucerne for maintaining nitrogen-dependent wheat yields was 3+ years on both soil types; it eliminated the need for nitrogen fertilizer for the following five wheat crops on the black earth and three wheat crops on the red-brown earth. Extended fallowing also had a beneficial effect on all parameters, particularly in the first and second crops after the fallow ended. Its effect was generally significantly smaller than the lucerne effect, except in the first crop after fallow.



2006 · Vol 83 (3) · pp. 219-222
Author(s): L. Yap, T. Butler, J. Richters, K. Kirkwood, L. Grant, ...




1995 · Vol 35 (6) · pp. 765
Author(s): KE Nelson, LW Burgess

We investigated the incidence of Fusarium graminearum Group 1 (infection, stem colonisation) and crown rot in 3-year crop sequences of 1 or 2 years of barley, oats, or mown oats followed by wheat, compared with 3 years of wheat. Seed was sown into the stubble of the previous crop. Stubble production was estimated for each cereal treatment. Plants of each cereal were infected by the crown rot pathogen. Oats were susceptible to infection but did not express symptoms of crown rot in 2 years of the trial. Oats can, therefore, be considered a symptomless host that may contribute to the maintenance of inoculum. The overall mean incidence of infected plants increased from 12% in 1987 to 81% in 1989. The various treatments did not significantly reduce the incidence of infected wheat plants in November of the final year. The incidence of crown rot of wheat in 1989 was greatest after 2 prior wheat crops and lowest after 1 or 2 years of mown oats. Wheat, barley, and unmown oats produced similar amounts of straw by weight, whereas mown oats produced significantly less. Oat straw decomposed more rapidly than that of the other cereals under controlled conditions.



1987 · Vol 27 (4) · pp. 533
Author(s): SM Bromfield, RW Cumming, DJ David, CH Williams

Soil profiles from limed and unlimed commercial pastures and from lime trials on pastures in the Crookwell district of the Southern Tablelands of New South Wales were sampled and pH measured at 2- or 5-cm intervals to depths ranging from 10 to 60 cm. A single application of lime (3.6-5.6 t/ha depending on the soil) incorporated into the surface 10 cm had a long-term effect and maintained pH above 5.5 in the top 30 cm for at least 12 years. Lime applied as a topdressing to soils on granite raised the pH by at least 0.2 pH units to a depth of 15 cm after 6 years. The depth affected was less on the heavier-textured basaltic soils and on the initially more acid sedimentary soils. There appears to be a role for topdressing with lime to prevent subsurface acidity from developing under pastures and to correct it in the upper layers of light-textured soils. The pH profiles from a given treatment were variable and highlighted the problem of obtaining a field measurement of soil pH that is representative of the plant's environment.



1995 · Vol 35 (7) · pp. 915
Author(s): WL Felton, H Marcellos, RJ Martin

Four experiments were commenced after the 1980 wheat crop, and a fifth after the 1981 crop, at different sites representing the major soil types of northern New South Wales in the 550-700 mm rainfall zone, to examine the influence of 3 fallow management practices [no tillage (NT); stubble retention after harvest, cultivation (SM); stubble burning after harvest, cultivation (SB)] on wheat production. Data considered in this paper cover the continuous wheat subtreatments of the 5 experiments (1981-90). Nitrogen applied at 50 kg N/ha in addition to the basal treatment was included as a treatment from 1986 to 1988. Across all sites and seasons, grain yields were in the order SB > SM = NT, stubble retention having a greater effect than tillage. In some years at some sites, differences in grain yield and grain N yield were not significant. In others, where significant yield differences occurred, variations in grain yield and grain N yield were highly correlated with differences in the soil N available to the crop. The data show that the influence of fallow management interacted with season and crop nutrition, and required long-term study for proper assessment.



Soil Research · 1994 · Vol 32 (4) · pp. 795
Author(s): ICR Holford, BE Schweitzer, GJ Crocker

Measurements of phosphorus (P) sorption and of isotopically exchangeable, KCl-soluble, and extractable (Bray 1) P were carried out on limed and unlimed soils from eight pasture experiments on the Northern Tablelands of New South Wales at intervals of 1, 2 and 3 years after lime application. Lime increased soil pH by between 0.5 and 1.55 units, with corresponding decreases in soluble aluminium and manganese. Lime decreased P sorptivity in every soil and at every sampling, but decreases were usually largest at the first sampling. They were attributed to the pH-induced increase in surface negative charge and to the smaller increases in calcium concentrations in these freely drained soils, compared with the undrained potted soils of a previous glasshouse experiment. Isotopically exchangeable P was increased by the highest lime rate (5 t/ha) in all but one soil at the first sampling, while soluble P was increased by both lime rates in all soils. Increases in exchangeable P tended to decline at successive samplings, whereas increases in soluble P sometimes rose and sometimes fell with time. In general, lime-induced increases in soluble P were consistent with decreases in P sorptivity, although the primary cause of the increases was probably the dissolution of iron and aluminium phosphates. All these changes were conducive to increased plant availability and uptake of soil and fertilizer P.



1986 · Vol 8 (2) · pp. 140
Author(s): BH Downing

Examination of data on the dietary preferences of sheep, goats and cattle suggests that different grazing systems are desirable for each of the three major woodland types examined (belah-rosewood, mulga, poplar box). Competition for herbs, which are frequently palatable to all animal species, indicates that goats and sheep are unsuitable for joint use either in heavily wooded country or where annual herbaceous production is less than 200 kg/ha. Supplementary feeding, fire and judicious stocking are proposed as a strategy for inducing goats to eat a proportion of unpalatable shrubs. The literature provides little helpful information on how rangelands in the Western Division should be managed. No reports compare grazing systems such as rotational grazing, rotational resting, and continuous grazing, and no guidance is given on grazing after burning of the rangeland. Recommendations are generally against the use of goats for control of woody plants, whereas local observation shows this to be an apparently effective practice. The recommendations are mostly based on experimental procedures which, although suitable for detecting animal dietary preferences in the short term, are less appropriate for investigating the effects of grazing on range condition in the long term. Some suggestions are made towards a different approach for investigating the effects of grazing by sheep and goats on rangeland condition, and the economic implications of this for animal production.



2020
Author(s): John Nell

Abstract The 120-year-old Sydney rock oyster industry in New South Wales (NSW) and southern Queensland is one of the oldest aquaculture industries in Australia. The industry has been forced to adapt to competition from other species, tighter harvesting, storage and handling requirements, and eroding profit margins. Recent changes in farming practices include the move away from stick culture to single-seed culture, as the half-shell market demands a more uniformly shaped oyster. When selective breeding demonstrated that it could reduce the time to market (50 g whole weight) by nearly a year from an industry average of 3.5 years, the industry wanted to try hatchery technology. Although the industry had never used hatchery technology before, it purchased 10 million spat, or 8% of its annual spat requirement, from hatcheries in 2003-2004, the first year that they were made available to farmers. The industry also embraced the Australian Shellfish Quality Assurance Program, which requires that shellfish harvest areas be classified on the basis of a sanitary survey and the results of an ongoing strategic water-sampling programme. This programme ensures product safety for consumers and helps to provide the industry with a long-term future.



2021 · Vol 36 (Supplement_1)
Author(s): Natalia Stepanova, Ganna Tolstanova, Valentyn Nepomnyashchii, Iryna Akulenko, Svitlana Savchenko, ...

Abstract Background and Aims Gut microbiota is considered an important factor affecting oxalate handling in the intestine. It has been demonstrated that intestinal oxalate secretion provides a complementary route of excretion, which becomes more evident when kidney function declines. A diversity of gut oxalate-degrading bacteria (ODB) has been hypothesized to play a role in this process. However, there is a general lack of research on the long-term effects of acute kidney injury (AKI) on ODB and their total oxalate-degrading activity (ODA) in fecal microbiota. In this study, we evaluated whether renal dysfunction could affect intestinal ODB and their total ODA in a rat model of glycerol-induced AKI. Method Male Wistar rats (200-300 g, n=20) on an oxalate-free diet were randomly divided into 2 groups. After 24 h of water deprivation, Group 1 (n=10) received an intramuscular injection of 50% glycerol (10 ml/kg of body weight), and Group 2 (n=10) served as control. The numbers of ODB (incubated in a highly selective Oxalate Medium and determined by culture method) and total fecal ODA were measured after injection on days 7 and 70. The method of redoximetric titration with a KMnO4 solution was adopted to evaluate total ODA in fecal microbiota; the results were expressed as % of oxalate degradation per 0.01 g of feces. Renal injury was assessed by histopathological examination and by serum creatinine and daily proteinuria levels after the animals were removed from the experiment on day 70. Cortical interstitial fibrosis was measured by computerized image analysis on sections stained with picrosirius red. The median (Me) and interquartile ranges (Q25; Q75) were calculated and compared using the nonparametric Mann-Whitney test. The Spearman correlation coefficient was used to evaluate associations between the examined parameters.
Results The obtained results demonstrated that: 1) on day 7 after glycerol injection, no differences were found in the numbers of ODB or total fecal ODA between the experimental and control groups: 5.9 (5.4-6.0) vs 6.0 (5.4-6.4) CFU/g, p=0.65 and 2.0 (0.1-5.0) vs 2.5 (2.0-9.0) %/0.01 g, p=0.24, respectively; 2) on day 70 after AKI initiation, the numbers of ODB and total fecal ODA were significantly lower in Group 1 than in control Group 2 (Fig. 1); 3) the higher the percentage of renal interstitial fibrosis, the higher the total fecal ODA in the experimental rats (Fig. 2). In addition, the number of ODB in feces in Group 1 was inversely associated with serum creatinine (r=-0.52, p=0.006) and 24-h proteinuria levels (r=-0.86, p<0.0001). Conclusion AKI had long-term negative effects on the quantitative and qualitative characteristics of ODB in fecal microbiota in rats. Moreover, the results of our study confirmed an increasing trend in total fecal ODA with aggravation of renal interstitial fibrosis.
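The statistical workflow named in the abstract (Mann-Whitney comparison of two independent groups, Spearman rank correlation of paired measurements) can be sketched in a few lines. This is an illustrative implementation with made-up numbers, not the study's data; the helper names and the sample values are assumptions for demonstration only.

```python
# Rank-based nonparametric statistics, as used in the abstract above.
# All data below are hypothetical; only the methods mirror the abstract.

def ranks(values):
    """Assign 1-based ranks, averaging over ties."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    r = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        # extend j over a run of tied values
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # average rank of the tie group
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def mann_whitney_u(a, b):
    """U statistic for sample a versus sample b (rank-sum form)."""
    r = ranks(list(a) + list(b))
    rank_sum_a = sum(r[: len(a)])
    return rank_sum_a - len(a) * (len(a) + 1) / 2

def spearman_rho(x, y):
    """Spearman correlation: Pearson correlation of the ranks."""
    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    sx = sum((a - mx) ** 2 for a in rx) ** 0.5
    sy = sum((b - my) ** 2 for b in ry) ** 0.5
    return cov / (sx * sy)

# Hypothetical fecal ODA values (%/0.01 g) for an AKI and a control group:
aki = [0.5, 1.0, 1.5, 2.0]
ctl = [2.5, 3.0, 4.0, 9.0]
print(mann_whitney_u(aki, ctl))   # U = 0.0: the two samples do not overlap

# A perfectly monotonic inverse relation gives rho = -1:
print(spearman_rho([1, 2, 3, 4], [10, 8, 5, 1]))   # -1.0
```

In practice one would use `scipy.stats.mannwhitneyu` and `scipy.stats.spearmanr`, which also return p-values; the pure-Python version above just makes the rank arithmetic explicit.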


