Seed Size and Burial Effects on Giant Ragweed (Ambrosia trifida) Emergence and Seed Demise

Weed Science, 2007, Vol. 55(1), pp. 16-22
Author(s):  
S. K. Harrison ◽  
E. E. Regnier ◽  
J. T. Schmoll ◽  
J. M. Harrison

Giant ragweed is a competitive, allergenic weed that persists in agricultural fields and early successional sites. Field experiments were conducted to determine the effects of seed size and seed burial depth on giant ragweed emergence and seed demise. In a seedling emergence experiment, small (< 4.8 mm in diameter) and large (> 6.6 mm in diameter) seeds were buried 0, 5, 10, and 20 cm in fall 1997, and weed emergence was monitored over the next seven growing seasons. A generalized linear mixed model fit to the cumulative emergence data showed that maximum emergence for both seed sizes occurred at the 5-cm burial depth, where probability of emergence was 19% for small seeds and 49% for large seeds. Emergence probability at the 10-cm burial depth was 9% for small seeds and 30% for large seeds, and no seedlings emerged from the 20-cm burial depth. The model predicted that ≥ 98% of total cumulative emergence was completed after four growing seasons for large seeds buried 5 cm, five growing seasons for small seeds buried 5 cm and large seeds buried 10 cm, and seven growing seasons for small seeds buried 10 cm. Seed size and burial treatment effects on seed demise were tested in a second experiment using seed packets. Rates of seed demise were inversely proportional to burial depth, and the percentage of viable seeds remaining after 4 yr ranged from 0% on the soil surface to 19% at the 20-cm burial depth. Some seeds recovered from the 20-cm burial depth were viable after 9 yr of burial. These results, coupled with previous research, suggest that seed size polymorphism facilitates giant ragweed adaptation across habitats and that a combination of no-tillage cropping practices, habitat modification, and timely weed control measures can reduce its active seed bank in agricultural fields by 90% or more after 4 yr.
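
As a rough illustration of the kind of model described above, the sketch below fits a fixed-effects binomial GLM relating emergence counts to seed size and burial depth using Python and statsmodels. It is a simplification, not the authors' analysis: the paper used a generalized linear mixed model, and the counts below are invented round numbers loosely consistent with the reported emergence probabilities.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

# Hypothetical emergence counts per seed-size/depth combination; the
# study's real data, replication structure, and random effects are not shown.
df = pd.DataFrame({
    "size":    ["small", "small", "small", "large", "large", "large"],
    "depth":   [5, 10, 20, 5, 10, 20],            # burial depth, cm
    "emerged": [19, 9, 0, 49, 30, 0],             # seedlings emerged
    "total":   [100, 100, 100, 100, 100, 100],    # seeds buried
})

df["large"] = (df["size"] == "large").astype(float)
exog = sm.add_constant(df[["large", "depth"]])

# Two-column endog (successes, failures) gives a binomial GLM with the
# default logit link
endog = np.column_stack([df["emerged"], df["total"] - df["emerged"]])
fit = sm.GLM(endog, exog, family=sm.families.Binomial()).fit()

print(fit.summary())
print(fit.predict(exog))   # fitted emergence probabilities
```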

2020, Vol. 110(10), pp. 1623-1631
Author(s):  
Karyn L. Reeves ◽  
Clayton R. Forknall ◽  
Alison M. Kelly ◽  
Kirsty J. Owen ◽  
Joshua Fanning ◽  
...  

The root lesion nematode (RLN) species Pratylenchus thornei and P. neglectus are widely distributed within cropping regions of Australia and have been shown to limit grain production. Field experiments that compare cultivar performance in the presence of RLNs inform management options for growers by identifying cultivars with resistance, which limit nematode reproduction, and cultivars with tolerance, which yield well in the presence of nematodes. A novel experimental design approach for RLN experiments is proposed in which the observed RLN density, measured prior to sowing, is used to condition the randomization of cultivars to field plots. This approach ensured that all cultivars were exposed to consistent ranges of RLN density, so that valid assessments of relative cultivar tolerance and resistance could be derived. Using data from a field experiment at Formartin, Australia, designed with the conditioned randomization approach, the analysis of tolerance and resistance was undertaken in a linear mixed model framework. Yield response curves were derived using a random regression approach, and curves modeling the change in RLN densities between sowing and harvest were derived using splines to account for nonlinearity. Groups of cultivars sharing similar resistance levels could be identified. Comparing the slopes of yield response curves of cultivars belonging to the same resistance class identified differing tolerance levels among cultivars with equivalent exposures to both presowing and postharvest RLN densities. As such, the proposed design and analysis approach allowed tolerance to be assessed independently of resistance.
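
The conditioned-randomization idea can be sketched in a few lines: stratify plots into bins by their observed pre-sowing RLN density and randomize the cultivars within each bin, so every cultivar spans a comparable density range. The code below is a hypothetical illustration, not the authors' design software; all names and data are invented.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(42)

n_plots = 60
plots = pd.DataFrame({
    "plot": np.arange(n_plots),
    # observed nematodes per kg of soil, measured before sowing (simulated)
    "rln_density": rng.lognormal(mean=7.0, sigma=1.0, size=n_plots),
})

cultivars = ["A", "B", "C", "D", "E", "F"]

# Stratify plots into equal-sized density bins, one slot per cultivar
plots["bin"] = pd.qcut(plots["rln_density"], q=n_plots // len(cultivars),
                       labels=False)

# Within each bin, assign a random permutation of the cultivars
assignments = []
for _, group in plots.groupby("bin"):
    order = rng.permutation(cultivars)
    assignments.append(pd.Series(order, index=group.index))
plots["cultivar"] = pd.concat(assignments)

# Check: each cultivar should span a similar range of RLN densities
print(plots.groupby("cultivar")["rln_density"].describe())
```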


2012, Vol. 102(9), pp. 867-877
Author(s):  
A. B. Kriss ◽  
P. A. Paul ◽  
L. V. Madden

A multilevel analysis of heterogeneity of disease incidence was conducted based on observations of Fusarium head blight (caused by Fusarium graminearum) in Ohio during the 2002–11 growing seasons. Sampling consisted of counting the number of diseased and healthy wheat spikes per 0.3 m of row at 10 sites (about 30 m apart) in a total of 67 to 159 sampled fields in 12 to 32 sampled counties per year. Incidence was then determined as the proportion of diseased spikes at each site. Spatial heterogeneity of incidence among counties, fields within counties, and sites within fields and counties was characterized by fitting a generalized linear mixed model to the data, using a complementary log-log link function, with the assumption that the disease status of spikes was binomially distributed conditional on the effects of county, field, and site. Based on the estimated variance terms, there was highly significant spatial heterogeneity among counties and among fields within counties each year; magnitude of the estimated variances was similar for counties and fields. The lowest level of heterogeneity was among sites within fields, and the site variance was either 0 or not significantly greater than 0 in 3 of the 10 years. Based on the variances, the intracluster correlation of disease status of spikes within sites indicated that spikes from the same site were somewhat more likely to share the same disease status relative to spikes from other sites, fields, or counties. The estimated best linear unbiased predictor (EBLUP) for each county was determined, showing large differences across the state in disease incidence (as represented by the link function of the estimated probability that a spike was diseased) but no consistency between years for the different counties. The effects of geographical location, corn and wheat acreage per county, and environmental conditions on the EBLUP for each county were not significant in the majority of years.
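
The intracluster correlations referred to above have a simple closed form on the latent scale: under a complementary log-log link, the latent residual follows a standard Gumbel distribution with variance π²/6, so the correlation for two spikes sharing a level of the hierarchy is the sum of the shared variance components divided by the total variance. The variance components below are invented for illustration, not the paper's estimates.

```python
import math

var_county = 1.2   # hypothetical county-level variance
var_field = 1.1    # hypothetical field-within-county variance
var_site = 0.1     # hypothetical site-within-field variance
var_resid = math.pi ** 2 / 6   # Gumbel residual variance under cloglog

total = var_county + var_field + var_site + var_resid

# Latent-scale correlation of disease status for two spikes in the same...
icc_same_county = var_county / total
icc_same_field = (var_county + var_field) / total
icc_same_site = (var_county + var_field + var_site) / total

print(f"same county: {icc_same_county:.3f}")
print(f"same field:  {icc_same_field:.3f}")
print(f"same site:   {icc_same_site:.3f}")
```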


2019, Vol. 157(5), pp. 382-398
Author(s):  
N. A. Cocks ◽  
T. J. March ◽  
T. B. Biddulph ◽  
A. B. Smith ◽  
B. R. Cullis

The frost susceptibility of Australian commercial cereal crops, in particular wheat and barley, has become an economically devastating issue for growers. The relative frost-damage risk of the currently available varieties is obtained by testing varieties in a series of field experiments at locations susceptible to frost events (FEs). The experimental design, measurement protocols and resultant data from these frost expression experiments (FEEs) are complex because of the unpredictable timing and severity of FEs and the varying maturity of the plants at the time of the events. Design and protocol complexities include the use of multiple sowing dates and the recording of plant maturity. Data difficulties include a high degree of imbalance and, when a FEE experiences multiple frosts, a longitudinal structure. A linear mixed model analysis was adopted to accommodate these characteristics, both for individual FEEs and for the multi-environment trial analysis of 17 FEEs. Finally, an approach is demonstrated for disseminating results in a form useful to both growers and breeders.
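
As a minimal sketch of the kind of linear mixed model involved, the code below regresses frost-induced sterility on plant maturity with random variety intercepts and fixed block effects for a single hypothetical FEE, using statsmodels. The authors' actual models are far richer (multiple sowing dates, multiple frost events, and a multi-environment analysis across 17 FEEs), and all data here are simulated.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 120
df = pd.DataFrame({
    "sterility": rng.beta(2, 8, size=n) * 100,   # % frost-induced sterility
    "maturity": rng.normal(60, 10, size=n),      # growth-stage score at the FE
    "variety": rng.choice([f"v{i}" for i in range(20)], size=n),
    "block": rng.choice(["b1", "b2", "b3"], size=n),
})

# Random variety intercepts; maturity and block enter as fixed effects
fit = smf.mixedlm("sterility ~ maturity + C(block)", data=df,
                  groups="variety").fit()
print(fit.summary())
```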


Author(s):  
Tony Arioli ◽  
Scott W. Mattner ◽  
Graham Hepworth ◽  
David McClintock ◽  
Rachael McClintock

Seaweed extracts are agricultural biostimulants that have been shown to increase the productivity of many crops. The aim of this study was to determine the effect of a seaweed extract from the brown algae Durvillaea potatorum and Ascophyllum nodosum, applied as a soil treatment, on the yield of wine grapes grown under Australian production and climate conditions. The study used a series of seven field experiments (2012–2017) across five locations in three Australian states, covering four cultivars, and analysed the data using a linear mixed model approach. The analysis revealed that recurring soil applications of the seaweed extract significantly increased wine grape yield by an average of 14.7% across multiple growing years that experienced climate extremes. Partial budget analysis showed that use of the seaweed extract increased profits, with the size of the gain depending on the grape cultivar. This study is the most extensive investigation of its type in Australian viticulture into the effect of a soil-applied seaweed extract on wine grape production.
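
A partial budget of the kind mentioned above reduces to simple arithmetic: extra revenue from the yield gain minus the cost of the treatment. In the sketch below, only the 14.7% average yield response comes from the study; the baseline yield, grape price, and treatment cost are invented placeholders.

```python
baseline_yield = 10.0    # t/ha of grapes, hypothetical
yield_gain = 0.147       # 14.7% average response reported in the study
grape_price = 800.0      # $/t, hypothetical and cultivar-dependent
treatment_cost = 250.0   # $/ha/yr for product + application, hypothetical

extra_revenue = baseline_yield * yield_gain * grape_price
net_benefit = extra_revenue - treatment_cost
print(f"extra revenue: ${extra_revenue:.0f}/ha, "
      f"net benefit: ${net_benefit:.0f}/ha")
# -> extra revenue: $1176/ha, net benefit: $926/ha
```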


2021, Vol. 17(3), e1008669
Author(s):  
Jailos Lubinda ◽  
Yaxin Bi ◽  
Busiku Hamainza ◽  
Ubydul Haque ◽  
Adrian J. Moore

While mortality from malaria continues to decline globally, incidence rates in many countries are rising. Within countries, spatial and temporal patterns of malaria vary across communities due to many different physical and social environmental factors. To identify the areas most suitable for malaria elimination or targeted control interventions, we used Bayesian models to estimate the spatiotemporal variation of malaria risk, rates, and trends, and to determine areas of high or low malaria burden compared to their geographical neighbours. We present a methodology using Bayesian hierarchical models with Markov chain Monte Carlo (MCMC) inference to fit a generalised linear mixed model with a conditional autoregressive structure. We modelled clusters of similar spatiotemporal trends in malaria risk using trend functions with constrained shapes, and visualised high- and low-burden districts using a multi-criterion index combining the spatiotemporal risk, rates and trends of districts in Zambia. Our results indicate that over 3 million people in Zambia live in high-burden districts with either a high mortality burden or a high incidence burden coupled with an increasing trend over 16 years (2000 to 2015), for the all-age, under-five and over-five cohorts. Approximately 1.6 million people live in high-incidence burden areas alone. Using our method, we have developed a platform that can enable malaria programs in countries like Zambia to target high-burden areas with intensive control measures while pursuing elimination efforts in all other areas. Our method enhances conventional approaches by identifying the districts with higher rates, higher risk, and increasing trends. This study provides a means for policy makers to evaluate intervention impact over time and to adopt geographically targeted strategies that address both high-burden areas, through intensive control approaches, and low-burden areas, via specific elimination programs.
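
A minimal sketch of a Poisson model with a conditional autoregressive (CAR) spatial random effect, in the spirit of the approach described above, can be written in PyMC as follows. The adjacency matrix, expected counts, and priors are simulated stand-ins, not the Zambian data or the authors' exact specification.

```python
import numpy as np
import pymc as pm

rng = np.random.default_rng(0)
n_districts = 10

# Symmetric binary adjacency matrix for a simple chain of districts
W = np.zeros((n_districts, n_districts))
for i in range(n_districts - 1):
    W[i, i + 1] = W[i + 1, i] = 1

expected = rng.uniform(50, 500, size=n_districts)   # expected case counts
observed = rng.poisson(expected * 1.2)              # simulated observed counts

with pm.Model() as car_model:
    beta0 = pm.Normal("beta0", 0.0, 2.0)
    tau = pm.Gamma("tau", 1.0, 1.0)
    alpha = pm.Beta("alpha", 2.0, 2.0)   # strength of spatial dependence
    phi = pm.CAR("phi", mu=np.zeros(n_districts), W=W,
                 alpha=alpha, tau=tau)
    # log relative risk = intercept + spatial effect, offset by expected counts
    mu = pm.math.exp(beta0 + phi) * expected
    y = pm.Poisson("y", mu=mu, observed=observed)
    idata = pm.sample(1000, tune=1000, chains=2)
```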


2020
Author(s):  
Hsiang-Yu Yuan ◽  
Jingbo Liang ◽  
Pei-Sheng Lin ◽  
Kathleen Sucipto ◽  
Mesfin Mengesha Tsegaye ◽  
...  

In recent years, dengue has been rapidly spreading and growing in the tropics and subtropics. Located in southern China, Hong Kong’s subtropical monsoon climate may favour dengue vector populations and increase the chance of disease transmission during the rainy summer season. An increase in local dengue incidence has been observed in Hong Kong since the first local case in 2002, with a 2018 outbreak reaching historically high case numbers. However, the effects of seasonal climate variability on recent outbreaks are unknown. Because the local cases were found to be spatially clustered, we developed a Poisson generalized linear mixed model using pre-summer monthly total rainfall and mean temperature to predict annual dengue incidence (the majority of local cases occur during or after the summer months) over the period 2002-2018 in three pre-defined areas of Hong Kong. Under leave-one-out cross-validation, the model predicted 5 of the 6 area-specific outbreaks during the major outbreak years 2002 and 2018, and 42 of 51 observations overall (82.4%) fell within the 95% confidence interval of the predicted annual incidence. Our study found that rainfall before and during the East Asian monsoon (pre-summer) rainy season is negatively correlated with annual incidence in Hong Kong, while temperature is positively correlated. Because mosquito control measures in Hong Kong are intensified mainly when heavy rainfall occurs during or close to summer, our study suggests that lower-than-average pre-summer rainfall should also be taken into account as an indicator of increased dengue risk.
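
The modelling and validation loop described above can be sketched as follows: fit a Poisson regression of annual case counts on pre-summer rainfall and temperature, leaving one year out at a time. This simplification drops the paper's area-level random effect (so it is a GLM rather than a GLMM), and all data are simulated.

```python
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(7)
n_years = 17
rainfall = rng.normal(1200, 300, size=n_years)    # pre-summer total, mm
temperature = rng.normal(26, 1.0, size=n_years)   # pre-summer mean, deg C

# Simulate counts with a negative rainfall effect and a positive
# temperature effect, mimicking the reported correlations
eta = 1.0 - 0.002 * (rainfall - 1200) + 0.3 * (temperature - 26)
cases = rng.poisson(np.exp(eta))

X = sm.add_constant(np.column_stack([rainfall, temperature]))

# Leave-one-out cross-validation: refit without each year, then predict it
preds = np.empty(n_years)
for i in range(n_years):
    mask = np.arange(n_years) != i
    fit = sm.GLM(cases[mask], X[mask], family=sm.families.Poisson()).fit()
    preds[i] = fit.predict(X[[i]])

print(np.column_stack([cases, preds.round(1)]))
```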


2020
Author(s):  
James L. Peugh ◽  
Sarah J. Beal ◽  
Meghan E. McGrady ◽  
Michael D. Toland ◽  
Constance Mara

2020, Vol. 641, pp. 159-175
Author(s):  
J Runnebaum ◽  
KR Tanaka ◽  
L Guan ◽  
J Cao ◽  
L O’Brien ◽  
...  

Bycatch remains a global problem in managing sustainable fisheries. A critical aspect of management is understanding the timing and spatial extent of bycatch. Fisheries management often relies on observed bycatch data, which are not always available due to a lack of reporting or observer coverage. Alternatively, analyzing the overlap in suitable habitat for the target and non-target species can provide a spatial management tool to understand where bycatch interactions are likely to occur. Potential bycatch hotspots based on suitable habitat were predicted for cusk Brosme brosme incidentally caught in the Gulf of Maine American lobster Homarus americanus fishery. Data from multiple fisheries-independent surveys were combined in a delta-generalized linear mixed model to generate spatially explicit density estimates for use in an independent habitat suitability index. The habitat suitability indices for American lobster and cusk were then compared to predict potential bycatch hotspot locations. Suitable habitat for American lobster has increased between 1980 and 2013 while suitable habitat for cusk decreased throughout most of the Gulf of Maine, except for Georges Basin and the Great South Channel. The proportion of overlap in suitable habitat varied interannually but decreased slightly in the spring and remained relatively stable in the fall over the time series. As Gulf of Maine temperatures continue to increase, the interactions between American lobster and cusk are predicted to decline as cusk habitat continues to constrict. This framework can contribute to fisheries managers’ understanding of changes in habitat overlap as climate conditions continue to change and alter where bycatch interactions could occur.
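
Once the two habitat suitability indices are on a common grid, the overlap calculation itself is straightforward: threshold each index and measure the shared suitable area. The sketch below uses random grids and an arbitrary cutoff purely to illustrate the computation; it is not the authors' code or data.

```python
import numpy as np

rng = np.random.default_rng(3)
grid_shape = (50, 50)
hsi_lobster = rng.random(grid_shape)   # 0-1 suitability index, simulated
hsi_cusk = rng.random(grid_shape)      # 0-1 suitability index, simulated

threshold = 0.7   # hypothetical cutoff for "suitable habitat"
suit_lobster = hsi_lobster >= threshold
suit_cusk = hsi_cusk >= threshold

overlap = suit_lobster & suit_cusk     # potential bycatch hotspot cells
prop_overlap = overlap.sum() / suit_lobster.sum()
print(f"proportion of lobster habitat shared with cusk: {prop_overlap:.2f}")
```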


2019, Vol. 24(2), pp. 200-208
Author(s):  
Ravindra Arya ◽  
Francesco T. Mangano ◽  
Paul S. Horn ◽  
Sabrina K. Kaul ◽  
Serena K. Kaul ◽  
...  

OBJECTIVE: There are emerging data that adults with temporal lobe epilepsy (TLE) without a discrete lesion on brain MRI have surgical outcomes comparable to those with hippocampal sclerosis (HS). However, pediatric TLE differs from its adult counterpart. In this study, the authors investigated whether the presence of a potentially epileptogenic lesion on presurgical brain MRI influences long-term seizure outcomes after pediatric temporal lobectomy.
METHODS: Children who underwent temporal lobectomy between 2007 and 2015 and had at least 1 year of seizure outcome data were identified. They were classified into lesional and MRI-negative groups based on whether an epilepsy-protocol brain MRI showed a lesion sufficiently specific to guide surgical decisions. Patients were also categorized into pure TLE and temporal plus epilepsies based on the neurophysiological localization of the seizure-onset zone. Seizure outcomes at each follow-up visit were incorporated into a repeated-measures generalized linear mixed model (GLMM) with MRI status as a grouping variable and clinical variables as covariates.
RESULTS: One hundred nine patients (44 females) aged 5 to 21 years were included and classified as lesional (73%), MRI negative (27%), pure TLE (56%), and temporal plus (44%). After a mean follow-up of 3.2 years (range 1.2–8.8 years), 66% of the patients were seizure free for ≥ 1 year at last follow-up. GLMM analysis revealed that lesional patients were more likely to be seizure free over the long term than MRI-negative patients, both for the overall cohort (OR 2.58, p < 0.0001) and for temporal plus epilepsies (OR 1.85, p = 0.0052). The effect of an MRI lesion was not significant for pure TLE (OR 2.64, p = 0.0635). Concordance of ictal electroencephalography (OR 3.46, p < 0.0001), concordance of magnetoencephalography (OR 4.26, p < 0.0001), and later age of seizure onset (OR 1.05, p = 0.0091) were associated with a higher likelihood of seizure freedom. The most common histological findings were cortical dysplasia types 1B and 2A, HS (40% with dual pathology), and tuberous sclerosis.
CONCLUSIONS: A lesion on presurgical brain MRI is an important determinant of long-term seizure freedom after pediatric temporal lobectomy. Pediatric TLE is heterogeneous in its etiologies and in the organization of seizure-onset zones, with many patients qualifying for the temporal plus nosology. The presence of an MRI lesion determined seizure outcomes in patients with temporal plus epilepsies, whereas pure TLE had comparable surgical outcomes in the lesional and MRI-negative groups.
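
Repeated binary outcomes like these can be modelled in Python as sketched below. statsmodels lacks a full binomial GLMM, so this example uses GEE with an exchangeable within-patient correlation, a population-averaged alternative to the paper's subject-specific GLMM; the data and effect sizes are simulated, not the study's.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm
import statsmodels.formula.api as smf

rng = np.random.default_rng(11)
n_patients, n_visits = 109, 4
df = pd.DataFrame({
    "patient": np.repeat(np.arange(n_patients), n_visits),
    "mri_lesion": np.repeat(rng.integers(0, 2, n_patients), n_visits),
    "age_onset": np.repeat(rng.uniform(1, 15, n_patients), n_visits),
})

# Simulate seizure freedom at each visit with a lesion benefit
logit = -0.5 + 0.9 * df["mri_lesion"] + 0.05 * df["age_onset"]
df["seizure_free"] = rng.binomial(1, 1 / (1 + np.exp(-logit)))

fit = smf.gee("seizure_free ~ mri_lesion + age_onset", groups="patient",
              data=df, family=sm.families.Binomial(),
              cov_struct=sm.cov_struct.Exchangeable()).fit()
print(fit.summary())
print(np.exp(fit.params))   # odds ratios
```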

