Effects of Timing of Copper Sprays, Defoliation, Rainfall, and Inoculum Concentration on Incidence of Olive Knot Disease

Plant Disease ◽  
2004 ◽  
Vol 88 (2) ◽  
pp. 131-135 ◽  
Author(s):  
Beth L. Teviotdale ◽  
William H. Krueger

The olive knot pathogen, Pseudomonas savastanoi, causes galls on shoots, branches, fruit, and leaves. Shoots girdled by galls die. Any fresh wound is susceptible to infection, but the most common entry sites are leaf scars. Leaf scars are most susceptible to infection during the first 2 days after leaf fall and remain susceptible for 7 more days. Simulated leaf scars on ‘Manzanillo’ olive trees were created by removing leaves from healthy shoots at approximately monthly intervals from December through June 1997-98, 1998-99, and 1999-2000. Trees were treated with a water suspension of cupric hydroxide (Kocide DF40) at 3 g/liter one, two, or three times in 1998-99 and 1999-2000 with a hand-gun sprayer. Generally, disease control improved with more applications (P = 0.008 and 0.032 in 1999 and 2000, respectively). Disease incidence was greatest on shoots that were defoliated in March 1998, April and June 1999, and March and May 2000. Cumulative rainfall 2 and 9 days after each defoliation was recorded. Disease incidence was positively correlated (P = 0.031 and 0.023 for 2 and 9 days, respectively) with spring (March through June) but not winter (December through February) rainfall. Comparable simulated leaf scars were inoculated in December and April 1997-98 and 1998-99 with 10^4, 10^6, and 10^8 CFU/ml of the pathogen and treated with a water suspension of cupric hydroxide at 3 g/liter using a handheld pump sprayer. Inoculated and noninoculated, nontreated shoots were included. More disease developed in April than in December inoculations (P < 0.0001) in both years. Disease incidence increased with increasing inoculum concentration (P < 0.0001) in both years and was lower in shoots treated with Kocide DF40 (P < 0.0001). Our work demonstrated that the common grower practice of one post-harvest application of copper bactericide provides only minimal protection against olive knot, and that additional sprays in spring are needed to substantially improve disease control.
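The rainfall-incidence relationship reported above can be outlined with a standard Pearson correlation on pairs of cumulative rainfall and disease incidence per defoliation date. The sketch below is a minimal illustration with hypothetical values; it is not the authors' data or analysis code.

```python
# Minimal sketch of the rainfall-incidence correlation described above.
# The numbers are hypothetical placeholders, not data from the study.
from scipy.stats import pearsonr

# Cumulative rainfall (mm) in the 9 days after each spring defoliation date,
# paired with olive knot incidence (% of simulated leaf scars with galls).
rainfall_mm = [2.0, 15.5, 31.0, 8.4, 22.7, 40.2]
incidence_pct = [5.0, 18.0, 35.0, 9.5, 24.0, 41.0]

r, p_value = pearsonr(rainfall_mm, incidence_pct)
print(f"Pearson r = {r:.2f}, P = {p_value:.3f}")
```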

2013 ◽  
Vol 368 (1623) ◽  
pp. 20120148 ◽  
Author(s):  
Diane S. Saint-Victor ◽  
Saad B. Omer

As multiple papers within this special issue illustrate, the dynamics of disease eradication are different from disease control. When it comes to disease eradication, ‘the last mile is longest’. For social and ecological reasons such as vaccine refusal, driving the incidence of a disease down further once it has reached low levels is frequently complex. Issues of non-compliance within a target population often influence the outcome of disease eradication efforts. Past eradication efforts confronted such obstacles towards the tail end of the campaign, when disease incidence was lowest. This article provides a comparison of non-compliance within polio, measles and smallpox campaigns, demonstrating the tendency of vaccine refusal to rise as disease incidence falls. In order to overcome one of the most intractable challenges to eradication, future disease eradication efforts must prioritize vaccine refusal from the start, i.e. ‘walk the last mile first’.


Author(s):  
AS Shastin ◽  
VG Gazimova ◽  
OL Malykh ◽  
TS Ustyugova ◽  
TM Tsepilova

Introduction: In the context of a decreasing size of the working-age population, monitoring the health status and disease incidence of this cohort is one of the most important tasks of public and occupational health professionals. Health risk management for the working population in the Russian Federation requires complete and reliable data on its morbidity, especially because the average age of that population is steadily rising. It is, therefore, crucial to have precise and consistent information about the morbidity of the working-age population. Objective: The study aimed to assess incidence rates of diseases with temporary incapacity for work in the constituent entities of the Ural Federal District of the Russian Federation. Materials and methods: We reviewed data on disease incidence rates published by the Federal State Statistics Service in the Common Interdepartmental System of Statistical Information, Section 15.12, Causes of Temporary Disability, and Section 2.9.I.4, Federal Project for Public Health Promotion. The constituent entities under study were ranked according to the number of cases and days of temporary incapacity per 100 workers, and the E.L. Notkin scale was used to grade the incidence. The statistical analysis was performed using STATISTICA 10 software. Long-term average values of certain indicators, median values, standard deviation (σ), and coefficients of variation were estimated. Differences in the indices were assessed using the Mann-Whitney test. Results: Compared to 2010, incidence rates of diseases with temporary incapacity for work in the constituent entities of the Ural Federal District in 2019 demonstrated a significant decline. The sharpest drop was observed in 2015. We also established that the Common Interdepartmental System of Statistical Information contains contradictory information on disease incidence. Conclusion: It is expedient to consider revising the guidelines for organizing federal statistical monitoring of morbidity with temporary incapacity for work and to include this indicator in the system of public health monitoring.
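The descriptive statistics and the Mann-Whitney comparison named in the methods can be sketched as below. The study itself used STATISTICA 10; the regions and rates here are invented placeholders, not figures from the Ural Federal District data.

```python
# Hypothetical sketch of the statistics described above (median, standard
# deviation, coefficient of variation, Mann-Whitney U test). Values are
# invented placeholders, not figures from the study.
import numpy as np
from scipy.stats import mannwhitneyu

# Cases of temporary incapacity per 100 workers, two periods, several regions.
rates_2010 = np.array([62.1, 58.4, 70.3, 65.0, 55.7, 61.2])
rates_2019 = np.array([48.9, 44.2, 52.5, 50.1, 41.8, 47.3])

for year, rates in (("2010", rates_2010), ("2019", rates_2019)):
    cv = rates.std(ddof=1) / rates.mean() * 100
    print(f"{year}: median={np.median(rates):.1f}, sigma={rates.std(ddof=1):.1f}, CV={cv:.1f}%")

u_stat, p_value = mannwhitneyu(rates_2010, rates_2019, alternative="two-sided")
print(f"Mann-Whitney U = {u_stat:.1f}, P = {p_value:.4f}")
```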


2001 ◽  
Vol 28 (1) ◽  
pp. 28-33 ◽  
Author(s):  
J. P. Damicone ◽  
K. E. Jackson

Abstract Two trials with iprodione and three trials with fluazinam were conducted to assess the effects of application method and rate on the control of Sclerotinia blight of peanut with fungicide. In order to concentrate the fungicides near the crown area where the disease causes the most damage, applications were made through a canopy opener with a single nozzle centered over the row to achieve a 30.5-cm-wide band (canopy opener), and through a single nozzle centered over the row to achieve a 46-cm-wide band (band). Broadcast applications were compared to these methods at rates of 0, 0.28, 0.56, and 1.12 kg/ha on the susceptible cultivar Okrun. Sclerotinia blight was severe, with >70% disease incidence and <2000 kg/ha yield for the untreated controls in each trial. Linear reductions in area under the disease progress curve (AUDPC), but not final disease incidence, with iprodione rate were significant (P < 0.05) for all methods of application. However, the rate of decrease did not differ among application methods. Linear increases in yield with rate of iprodione were greater for the canopy opener compared to the band or broadcast applications. Only a 50% reduction in AUDPC and a maximum yield of <2700 kg/ha were achieved with iprodione using the best method. At the maximum rate of 1.12 kg/ha, fluazinam provided >75% disease control and >4000 kg/ha yield for all application methods. Differences in disease control and yield among application methods only occurred at the 0.28 and 0.56 kg/ha rates of fluazinam. Reductions in AUDPC with fluazinam rate were quadratic for all application methods, but AUDPC values were less for the canopy opener and band methods at 0.28 and 0.56 kg/ha compared to the broadcast method. The yield response to rate for broadcast applications of fluazinam was linear. However, predicted yield responses to fluazinam rate were quadratic for the band and canopy opener methods and approached the maximum response at 0.84 kg/ha. Targeting fungicide applications using the band and/or canopy opener methods was beneficial for fluazinam at reduced rates. Disease control with iprodione was not adequate regardless of application method.
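Area under the disease progress curve (AUDPC), the response variable analyzed above, is commonly computed by the trapezoidal rule over repeated disease ratings, and dose-response fits like the linear and quadratic models mentioned can be obtained by polynomial regression. The sketch below uses made-up ratings and rates purely for illustration; it is not the trial data or the authors' analysis.

```python
# Sketch of AUDPC (trapezoidal rule) and a dose-response fit, as used above.
# Ratings and rates below are hypothetical, not the trial data.
import numpy as np

def audpc(days, incidence):
    """Area under the disease progress curve via the trapezoidal rule."""
    return np.trapz(incidence, days)

days = np.array([0, 14, 28, 42, 56])          # days after first rating
incidence = np.array([0, 10, 30, 55, 72])     # % plants with Sclerotinia blight
print(f"AUDPC = {audpc(days, incidence):.0f} %-days")

# Linear and quadratic fits of AUDPC against fungicide rate (kg/ha).
rates = np.array([0.0, 0.28, 0.56, 1.12])
audpc_values = np.array([3500, 2600, 1900, 1400])   # hypothetical
linear = np.polyfit(rates, audpc_values, 1)
quadratic = np.polyfit(rates, audpc_values, 2)
print("linear coefficients:", linear)
print("quadratic coefficients:", quadratic)
```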


Plant Disease ◽  
2008 ◽  
Vol 92 (8) ◽  
pp. 1252-1252 ◽  
Author(s):  
J. Moral ◽  
R. De la Rosa ◽  
L. León ◽  
D. Barranco ◽  
T. J. Michailides ◽  
...  

Traditional olive orchards in Spain have been planted at a density of 70 to 80 trees per ha with three trunks per tree. During the last decade, the hedgerow orchard, in which planting density is approximately 2,000 trees per ha, was developed. In 2006 and 2007, we noted a severe outbreak of fruit rot in FS-17, a new cultivar from Italy, in an experimental hedgerow planting in Córdoba, southern Spain. The incidence of fruit rot in ‘FS-17’ was 80% in January of 2006 and 24% in January of 2007. Cvs. Arbosana, IRTA-i18 (a selected clone from ‘Arbequina’), and Koroneiki had no symptoms in either year of the study. Disease incidence in ‘Arbequina’ was <0.1% only in 2006. Affected fruits were soft with gray-white skin and they eventually mummified. Black-green sporodochia were observed on the surface of diseased fruits. A fungus was isolated from diseased fruits on potato dextrose agar (PDA) and incubated at 22 to 26°C with a 12-h photoperiod. After 8 days of growing on PDA, fungal colonies formed conidial chains having a main axis with up to 10 conidia and secondary and tertiary short branches with two to four conidia. Conidia were obpyriform, ovoid, or ellipsoidal, without a beak or with a short beak, had up to four transverse septa, and measured 11.7 to 24.7 (mean 19.6) μm long and 7.7 to 13.0 (mean 9.6) μm wide at the broadest part of the conidium. The length of the beak of conidia was variable, ranging from 0 to 28.6 (mean 5.5) μm. The fungus was identified as Alternaria alternata (1). Pathogenicity tests were performed by spraying 40 mature fruits of ‘FS-17’ with a spore suspension (1 × 10^6 spores per ml). The same number of control fruits was treated with water. After 21 days, inoculated fruit developed symptoms that had earlier been observed in the field. A. alternata was reisolated from lesions on all infected fruits. The fungus was not isolated from any of the control fruits. The experiment was performed twice. The new growing system and the high susceptibility of some olive cultivars, such as FS-17, may result in a high incidence of disease caused by a pathogen that is generally characterized as weakly virulent. To our knowledge, this is the first report of A. alternata causing a severe outbreak of fruit rot on olive trees in the field. References: (1) B. M. Pryor and T. J. Michailides. Phytopathology 92:406, 2002.


Plant Disease ◽  
2020 ◽  
Author(s):  
Madison Stahr ◽  
Lina Quesada-Ocampo

Black rot, caused by Ceratocystis fimbriata, is a devastating postharvest disease of sweetpotato that recently re-emerged in 2014. Although the disease is known to develop in storage and during export to overseas markets, little is known as to how pathogen dispersal occurs. This study was designed to investigate dump tank water as a means of dispersal through four different types of water treatments: inoculum concentration (0, 5, 5 × 10^1, 5 × 10^2, and 5 × 10^3 spores/ml), inoculum age (0, 24, 48, 96, and 144 h), water temperature (10, 23, 35, and 45°C), and presence of a water sanitizer (DryTec, Sanidate, FruitGard, and Selectrocide). Wounded and non-wounded sweetpotato storage roots were soaked in each water treatment for 20 min, stored at 29°C for a 14-day period, and rated for disease incidence every other day. Disease was observed in sweetpotato storage roots in all water treatments tested, except in the negative controls. Disease incidence decreased with both inoculum concentration and inoculum age, yet values of 16.26% and up to 50% were observed for roots exposed to 5 spores/ml and 144 h water treatments, respectively. Sanitizer products that contained a form of chlorine as the active ingredient significantly reduced disease incidence in storage roots when compared to control roots and roots exposed to a hydrogen-peroxide based product. Finally, no significant differences in final incidence were detected in wounded sweetpotato storage roots exposed to water treatments of any temperature, but a significant reduction in disease progression was observed in the 45°C treatment. These findings indicate that if packing line dump tanks are improperly managed, they can aid C. fimbriata dispersal through the build-up of inoculum as infected roots are unknowingly washed after storage. Chlorine-based sanitizers can reduce infection when applied after root washing and not in the presence of high organic matter typically found in dump tanks.
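The inoculum-concentration treatments above imply serial dilution from a counted stock suspension; the routine dilution arithmetic (C1·V1 = C2·V2) is sketched below. The stock concentration and tank volume are assumed values for illustration only, not the authors' protocol.

```python
# Dilution arithmetic (C1*V1 = C2*V2) for preparing the inoculum
# concentrations listed above. Stock concentration and tank volume are
# assumed values for illustration, not from the study.
stock_spores_per_ml = 5e5        # assumed hemocytometer count of the stock
tank_volume_ml = 20_000.0        # assumed final water volume (20 liters)

targets = [5.0, 5e1, 5e2, 5e3]   # target spores/ml used in the study
for target in targets:
    stock_needed_ml = target * tank_volume_ml / stock_spores_per_ml
    print(f"{target:>7.0f} spores/ml -> dilute {stock_needed_ml:.1f} ml of stock "
          f"to a final volume of {tank_volume_ml:.0f} ml")
```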


Author(s):  
Samuël Coghe

Disease control and public health have been key aspects of social and political life in sub-Saharan Africa since time immemorial. With variations across space and time, many societies viewed disease as the result of imbalances in persons and societies and combined the use of materia medica from the natural world, spiritual divination, and community healing to redress these imbalances. While early encounters between African and European healing systems were still marked by mutual exchanges and adaptations, the emergence of European germ theory-based biomedicine and the establishment of racialized colonial states in the 19th century increasingly challenged the value of African therapeutic practices for disease control on the continent. Initially, colonial states focused on preserving the health of European soldiers, administrators, and settlers, who were deemed particularly vulnerable to tropical climate and its diseases. Around 1900, however, they started paying more attention to diseases among Africans, whose health and population growth were now deemed crucial for economic development and the legitimacy of colonial rule. Fueled by new insights and techniques provided by tropical medicine, antisleeping sickness campaigns would be among the first major interventions. After World War I, colonial health services expanded their campaigns against epidemic diseases, but also engaged with broader public health approaches that addressed reproductive problems and the social determinants of both disease and health. Colonial states were not the only providers of biomedical healthcare in colonial Africa. Missionary societies and private companies had their own health services, with particular logics, methods, and focuses. And after 1945, international organizations such as the World Health Organization (WHO) and the United Nations Children’s Fund (UNICEF) increasingly invested in health campaigns in Africa as well. Moreover, Africans actively participated in colonial disease control, most notably as nurses, midwives, and doctors. Nevertheless, Western biomedicine never gained hegemony in colonial Africa. Many Africans tried to avoid or minimize participation in certain campaigns or continued to utilize the services of local healers and diviners, often in combination with particular biomedical approaches. To what extent colonial disease control impacted on disease incidence and demography is still controversially debated.


Author(s):  
Chandar Kala ◽  
S. Gangopadhyay ◽  
S. L. Godara

The antagonistic potential of Trichoderma viride, T. harzianum, and Pseudomonas fluorescens was evaluated against Fusarium oxysporum f. sp. ciceri under in vivo conditions. The effect of organic amendments, viz. farmyard manure, vermicompost, and mustard cake, on the disease control potential of the test antagonists against chickpea wilt and on the population dynamics of the antagonists and pathogen in soil was also studied. Maximum inhibition of mycelial growth of F. oxysporum f. sp. ciceri was recorded in the presence of P. fluorescens (%), followed by T. harzianum (%) and T. viride (%). Seed treatment with P. fluorescens was more effective in suppressing disease incidence than T. harzianum and T. viride. The disease control efficacy and population dynamics of all three test antagonists were enhanced by the application of organic amendments. Among the three organic amendments tested, mustard cake was most effective in enhancing the disease control potential of these antagonists.
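The percent inhibition of mycelial growth referred to above (the specific values are not given here) is conventionally computed as I = (C − T)/C × 100, where C is growth of the pathogen alone and T is growth in the presence of the antagonist. The sketch below applies that standard formula to invented measurements; the diameters are hypothetical, not values from this study.

```python
# Standard percent-inhibition formula for antagonism assays:
#   I = (C - T) / C * 100
# where C is radial growth of the pathogen alone and T is growth in the
# presence of the antagonist. Diameters below are invented placeholders.
def percent_inhibition(control_mm: float, treated_mm: float) -> float:
    return (control_mm - treated_mm) / control_mm * 100

control = 85.0  # colony diameter (mm) of F. oxysporum f. sp. ciceri alone
for antagonist, treated in [("P. fluorescens", 28.0),
                            ("T. harzianum", 34.0),
                            ("T. viride", 40.0)]:
    print(f"{antagonist}: {percent_inhibition(control, treated):.1f}% inhibition")
```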


2009 ◽  
Vol 27 (15_suppl) ◽  
pp. 7544-7544
Author(s):  
S. Schild ◽  
D. Graham ◽  
S. Hillman ◽  
S. Vora ◽  
G. Yolanda ◽  
...  

Background: NCCTG N0028 was a trial that determined that the MTD of RT that could be given with carboplatin & paclitaxel was 74 Gy/34 fractions. This secondary analysis was performed to determine the survival of pts treated on this trial. Methods: Eligible pts had medically or surgically unresectable NSCLC, PS=0–1, weight loss <10% in the prior 3 months (mo), no prior therapy, and adequate laboratory & pulmonary function. Included were 25 pts with clinical stages I (4 pts), II (1 pt), IIIa (12 pts), & IIIb (8 pts). Treatment included weekly I.V. paclitaxel (50 mg/m2) & carboplatin (AUC=2) during RT. The RT included 2 Gy daily to an initial dose of 70 Gy. The total dose was increased in 4 Gy increments until the MTD was determined. RT was delivered with 3-D treatment planning but no elective nodal RT. Three pts received 70 Gy, 18 pts received 74 Gy, & 4 pts received 78 Gy. Results: Pts were followed until death, or for 10–67 mo (median: 28 mo) in those alive at last evaluation. The median survival (MS) of the entire cohort was 42 mo. The 5 stage I–II pts had an MS of 53 mo & the 20 stage III pts had an MS of 42 mo. Conclusions: Standard-dose RT is unable to sterilize disease in the majority of pts with unresectable NSCLC. While the addition of chemotherapy has significantly improved survival of these pts, the MS is generally 15–24 mo. These preliminary results suggest that higher than standard doses of RT may improve disease control & prolong survival. A phase III trial comparing standard-dose RT (60 Gy) to high-dose RT (74 Gy) is open and should more definitively address the issue of RT dose with concurrent chemotherapy for unresectable NSCLC. Future technological improvements in imaging & targeting will provide methods to safely administer even greater RT doses, which will likely further improve disease control. No significant financial relationships to disclose.
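Median survival figures like those reported above are typically read off a Kaplan-Meier curve as the earliest time at which estimated survival drops to 50%. The sketch below estimates that quantity from hypothetical follow-up times and event indicators; it is illustrative only and not the trial's analysis.

```python
# Minimal Kaplan-Meier median survival estimate. Follow-up data below are
# hypothetical placeholders, not patient data from NCCTG N0028.
import numpy as np

def km_median_survival(times, events):
    """Return the first time at which the Kaplan-Meier survival estimate
    falls to 0.5 or below. times: follow-up in months; events: 1 = death
    observed, 0 = censored. Assumes unique event times for simplicity."""
    order = np.argsort(times)
    times, events = np.asarray(times)[order], np.asarray(events)[order]
    at_risk = len(times)
    survival = 1.0
    for t, e in zip(times, events):
        if e:
            survival *= 1 - 1 / at_risk
        at_risk -= 1
        if survival <= 0.5:
            return t
    return None  # median not reached within follow-up

months = [10, 15, 22, 28, 33, 41, 42, 50, 55, 67]   # hypothetical follow-up (mo)
death  = [1,  1,  0,  1,  1,  0,  1,  1,  0,  0]    # 1 = death, 0 = censored
print("estimated median survival (mo):", km_median_survival(months, death))
```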

