The relationship between SPAD chlorophyll and disease severity index in Ganoderma-infected oil palm seedlings

2019, Vol 17 (3), pp. 355–358
Author(s): M.R.M. Rakib, A.H. Borhan, A.N. Jawahir

Establishment of disease in oil palm seedlings through artificial inoculation with Ganoderma is widely used in studies of various aspects of plant pathology, including epidemiology, etiology, disease resistance, host-parasite interaction and disease control. Estimating chlorophyll content in infected seedlings could provide a good indicator of the degree of disease or infection and of changes during pathogenesis. Thus, the objective of this study was to evaluate the relationship between disease severity index (DSI) and chlorophyll content in Ganoderma-infected oil palm seedlings. Three-month-old oil palm seedlings were infected with Ganoderma inoculum on rubber wood blocks (RWB); 44 isolates of Ganoderma were tested. DSI and chlorophyll content, measured with a SPAD (Soil Plant Analysis Development) chlorophyll meter, were recorded at 4-week intervals for 24 weeks after inoculation (WAI). Pearson's correlation analysis and regression analysis were performed to evaluate the relationship between the variables. DSI and SPAD chlorophyll value were inversely related (r = -0.92) in a linear trend (R² = 0.85). Furthermore, the increase in DSI across the weeks fitted a quadratic model (R² = 0.99), whereas the SPAD chlorophyll value declined in a linear trend (R² = 0.98). The SPAD chlorophyll value could be considered a better alternative to the DSI, as it was strongly related to DSI and was able to detect physiological changes in infected oil palm seedlings at the early stages of pathogenesis. J Bangladesh Agril Univ 17(3): 355–358, 2019
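
As an illustration of the correlation-and-regression step described above, here is a minimal sketch in Python; the DSI and SPAD arrays are invented placeholder readings, not data from the study.

```python
# Minimal sketch of the analysis reported above: Pearson correlation and a
# least-squares linear fit between DSI and SPAD readings. The two arrays are
# hypothetical placeholder values; only the method is illustrated.
import numpy as np
from scipy import stats

dsi = np.array([0, 10, 25, 40, 55, 70, 85])    # disease severity index (%)
spad = np.array([52, 48, 41, 35, 28, 22, 15])  # SPAD chlorophyll readings

r, p = stats.pearsonr(dsi, spad)        # strength of the inverse relationship
fit = stats.linregress(dsi, spad)       # linear trend: spad ~ slope*dsi + intercept

print(f"Pearson r = {r:.2f} (p = {p:.3g})")
print(f"SPAD = {fit.slope:.2f} * DSI + {fit.intercept:.2f}, R2 = {fit.rvalue**2:.2f}")
```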

HortScience, 1992, Vol 27 (6), pp. 607b-607
Author(s): W. Tietjen, P.J. Nitzsche, W.P. Cowgill, M.H. Maletta, S.A. Johnston

`Market Prize' and `Bravo' cabbage (Brassica oleracea var. capitata L.), transplanted as peat plug and bareroot plants into a field naturally infested with Plasmodiophora brassicae Woronin, were treated immediately after planting with a liquid or a granular surfactant. APSA 80™, applied in transplant water, significantly reduced percent clubbing and disease severity index (DSI) compared to control treatments. Miller Soil Surfactant Granular™ did not significantly reduce percent clubbing or DSI. There was a significant effect of cultivar on percent clubbing and DSI, but no significant effect of transplant type on either measure. This year's study culminates five years of investigation of surfactants for clubroot control. Specific surfactants have proven to be an effective control of clubroot in cabbage. Chemical names used: nonylphenoxypolyethoxyethanol (APSA 80™); alpha-alkanoic-hydro omega-hydroxy poly (oxyethylene) (Miller Soil Surfactant Granular™).
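
Disease severity indices like the ones reported throughout this collection are typically computed as the rating-weighted share of the maximum possible score. A minimal sketch, assuming a 0–4 ordinal rating scale (the scale and the example counts are illustrative, not taken from the study):

```python
# A common form of disease severity index (DSI): the rating-weighted share of
# the maximum possible score, expressed as a percentage. The 0-4 scale and the
# example counts below are assumptions for illustration, not the study's data.
def disease_severity_index(counts_by_rating, max_rating):
    """counts_by_rating[i] = number of plants scored with rating i (0..max_rating)."""
    total_plants = sum(counts_by_rating)
    weighted = sum(rating * n for rating, n in enumerate(counts_by_rating))
    return 100.0 * weighted / (total_plants * max_rating)

# e.g. 10 healthy plants, 5 rated 1, 3 rated 2, 2 rated 3, none rated 4:
print(disease_severity_index([10, 5, 3, 2, 0], max_rating=4))  # -> 21.25
```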


2014, Vol 104 (4), pp. 387-395
Author(s): Jay Ram Lamichhane, Alfredo Fabi, Leonardo Varvaro

Cytospora canker, caused by the fungus Cytospora corylicola, is present in hazelnut production areas worldwide, and the disease is widespread throughout the main production areas of Italy. The causal agent is considered a secondary invader of damaged tissue that attacks mainly stressed plants. However, little is known about disease severity and the stress factors that predispose plants to infection; this study investigated, in particular, the role of pedoclimatic factors. Direct survey indicated that disease severity varied across the study sites. Geostatistical analysis showed a strong positive correlation between the disease severity index and summer heat (r = 0.80 and 0.91 for July and August, respectively) and a strong negative correlation with soil organic matter (r = –0.78). A moderate positive correlation with the magnesium/potassium ratio (r = 0.58) and moderate negative correlations with total soil nitrogen (r = –0.53), thermal shock (r = –0.46), and rainfall (r = –0.53) were determined. No significant correlation was found with soil aluminum (r = –0.35), soil pH (r = –0.01), or plant age (r = –0.38).


2008, Vol 6 (1), pp. 22-32
Author(s): Axel Diederichsen, Tatiana A. Rozhmina, Ljudmilla P. Kudrjavceva

Germplasm of 153 flax (Linum usitatissimum) accessions from 24 countries held at Plant Gene Resources of Canada (PGRC) was evaluated for resistance to fusarium wilt (Fusarium oxysporum), anthracnose (Colletotrichum lini) and pasmo (Septoria linicola). The screening was conducted at the All-Russian Flax Research Institute (VNIIL) at Torzhok, Russia, over 3 years for fusarium wilt and anthracnose, and over 2 years for pasmo. A disease severity index ranging from 0% (no infection) to 100% (heavy infection) was calculated based on observations after artificial inoculation with the pathogens in the greenhouse (fusarium wilt) or in field nurseries (anthracnose and pasmo). The average disease severity index for fusarium wilt was 56.6 ± 34.4% (range 0–100.0%), for anthracnose 59.8 ± 8.1% (range 43.8–83.9%) and for pasmo 74.2 ± 11.8% (range 27.3–100.0%). The variation of disease severity indices among the years and within each accession was highest for fusarium wilt. Higher than average resistance for all three diseases was found in accessions from East Asia, while germplasm from the Indian subcontinent showed considerably lower than average resistance. Germplasm from North America and South America (mostly linseed) displayed above average resistance to fusarium wilt, while European accessions (mostly fibre flax) showed lower than average resistance to this disease. The different resistance levels reflected the improvements made by plant breeding and differences in the environments under which the germplasm accessions evolved. Accessions with potential use in linseed and fibre flax breeding were identified.


2018, Vol 108 (4), pp. 469-478
Author(s): Mamadou L. Fall, John F. Boyse, Dechun Wang, Jaime F. Willbur, Damon L. Smith, et al.

Sclerotinia sclerotiorum is a significant threat to soybean production worldwide. In this study, an epidemiological approach was used to examine 11 years of historical data from a soybean management performance trial in order to advance our understanding of Sclerotinia stem rot (SSR) development and to identify environmental predictors of SSR epidemics and associated yield losses. Recursive partitioning analysis suggested that average air temperature and total precipitation in July were the variables most strongly associated with disease severity. High SSR disease severity index (DSI) values were observed when the average July temperature was below 19.5°C and total July precipitation was moderate, between 20 and 108.5 mm. A biphasic sigmoidal curve accurately described the relationship between DSI and yield, with a DSI threshold of 22, below which minimal yield loss was observed. A 10% increase in the DSI, from 22.0 to 24.2, led to an 11% decrease in yield, from 3,308.14 to 2,951.29 kg/ha. A yield threshold (3,353 kg/ha) higher than the annual U.S. average soybean yield (3,039.7 kg/ha) was also suggested as the expected yield under low or no SSR pressure in the U.S. Midwest. These thresholds can allow soybean stakeholders to assess the value of disease control and establish an SSR baseline for cost-effective management to protect yields. Because S. sclerotiorum has more than 400 plant host species, and because quantitative information on crop losses is crucial for decision making, this study shows the usefulness of historical data on SSR and can serve as a model for other S. sclerotiorum pathosystems (canola, dry bean, potato, pea, and so on).
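
The recursive partitioning analysis mentioned above amounts to fitting a regression tree on weather covariates. A minimal sketch with scikit-learn, using fabricated July weather and DSI records rather than the trial data:

```python
# Sketch of recursive partitioning (a regression tree) relating July weather
# to SSR severity, in the spirit of the analysis above. The records are
# fabricated placeholders; only the technique is illustrated.
import numpy as np
from sklearn.tree import DecisionTreeRegressor, export_text

# columns: [mean July air temperature (deg C), total July precipitation (mm)]
X = np.array([[18.2, 60], [19.0, 95], [21.5, 40], [22.3, 110],
              [18.8, 30], [20.1, 75], [17.9, 100], [23.0, 55]])
y = np.array([38, 45, 8, 5, 12, 10, 50, 4])  # plot-level DSI values

tree = DecisionTreeRegressor(max_depth=2, random_state=0).fit(X, y)
# Print the learned splits, analogous to the temperature/precipitation
# thresholds reported in the study:
print(export_text(tree, feature_names=["july_temp", "july_precip"]))
```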


2002, Vol 139 (1), pp. 47-53
Author(s): M. A. DI RENZO, N. C. BONAMICO, D. D. DÍAZ, J. C. SALERNO, M. M. IBAÑEZ, et al.

No genetic estimates of resistance to Mal de Río Cuarto (MRC) disease in Zea mays (L.) are currently available in the literature. Therefore, the objectives of this investigation were (i) to estimate the variance and heritability of partial resistance to MRC disease and of other agronomic traits in maize families and (ii) to examine associations among MRC disease severity values across different environments and between MRC and other agronomic traits. These estimates, obtained in an endemic area, could contribute to the design of efficient enhancement programmes and evaluation activity for the improvement of MRC resistance. The research was conducted by testing 227 F3-derived lines from a cross between a susceptible dent line, Mo17, and a partially resistant flint line, BLS14, for MRC disease at two Río Cuarto locations in each of 2 years. The resistance of the lines, measured with a disease severity index (DSI), was normally distributed across environments. Genotypic variances were highly significant in all scoring environments. Estimates of genotype–environment interaction were also significant, suggesting that certain genotypes have little stability across environments. For the DSI, all heritability estimates were moderate, ranging from 0.44 to 0.56, and were similar whether based on individual environments or across environments. Confidence interval widths ranged from 34.88% to 50.30% of the heritability point estimate. The correlations between environments were small enough to indicate that families did not rank similarly in individual environments for MRC resistance. The DSI correlated significantly (P < 0.01) with plant height, leaf surface, leaf border, leaf length and tassel type. Heritability estimates for plant height and tassel type were 0.48 and 0.38, respectively, and heritability values for the various leaf traits were very low. On the basis of the substantial genotype–environment interaction and the weak association between DSI values in the different environments, selection for increased resistance to MRC disease would require evaluation of germplasm across multiple years and locations. Tassel type would be a useful predictor of DSI and could be used effectively to improve screening procedures.
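
For reference, heritability on a family-mean basis is commonly estimated from variance components as follows (a standard textbook formulation; the notation is generic and not taken from the paper):

```latex
% Family-mean heritability from variance components (a standard form;
% symbols are generic, not the paper's notation):
%   \sigma^2_G   genotypic variance among families
%   \sigma^2_{GE} genotype-by-environment interaction variance
%   \sigma^2_e   residual (error) variance
%   E, r         numbers of environments and replicates per environment
h^2 = \frac{\sigma^2_G}{\sigma^2_G + \dfrac{\sigma^2_{GE}}{E} + \dfrac{\sigma^2_e}{rE}}
```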


2012, Vol 63 (4), p. 351
Author(s): B. H. Yu, Z. B. Nan, Y. Z. Li, H. L. Lin

Yellow stunt and root rot caused by Embellisia astragali are major factors contributing to declining yields of standing milkvetch (Astragalus adsurgens). The resistance of ten varieties of standing milkvetch to E. astragali was evaluated under laboratory, greenhouse, and field conditions. Seed germination/emergence, shoot and root length, plant dry weight, disease incidence, mortality, and disease severity index were monitored. The results showed that the varieties Shanxi and Zhongsha No. 1 had the best agronomic traits and the lowest levels of disease in all experiments, while the varieties Neimeng and Ningxia were the most susceptible to disease. Germination/emergence differed significantly (P < 0.05) between varieties after inoculation; compared with the control, germination/emergence in inoculated treatments of nine varieties decreased on average by 1.5% in laboratory experiments and by 4.1% in greenhouse experiments at 15 days after inoculation. Inoculation reduced shoot length by an average of 24.4% and 41.5% (P < 0.05) in laboratory and greenhouse experiments, respectively, in six of ten varieties. All varieties showed significantly (P < 0.05) lower plant dry weight following inoculation, with reductions ranging from 0.3 to 0.6 mg in the laboratory and from 82.6 to 149.4 mg in the greenhouse. Resistance to the pathogen was evaluated on the basis of disease incidence, a disease severity index (DSI), and mortality, and varieties were grouped by resistance level using cluster analysis. There were significant correlations between the results of laboratory and greenhouse experiments (r = 0.79; P < 0.01) and between greenhouse and field experiments (r = 0.83; P < 0.01) across all varieties. Multiple regression analysis of DSI between laboratory/greenhouse and field experiments suggested that screening in the laboratory/greenhouse could be an alternative method of rapidly estimating DSI under field conditions.
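
The multiple regression relating laboratory/greenhouse DSI to field DSI can be sketched as an ordinary least-squares fit; the values below are invented placeholders, one row per variety:

```python
# Sketch of the multiple regression described above: predicting field DSI from
# laboratory and greenhouse DSI. The ten rows are invented placeholder values
# (one per variety), not the study's measurements.
import numpy as np

lab = np.array([20, 35, 50, 15, 60, 45, 30, 55, 25, 40], dtype=float)
greenhouse = np.array([25, 40, 55, 18, 70, 50, 33, 62, 28, 45], dtype=float)
field = np.array([22, 38, 52, 16, 66, 47, 31, 58, 26, 42], dtype=float)

# design matrix with an intercept column; solve the coefficients by least squares
X = np.column_stack([np.ones_like(lab), lab, greenhouse])
beta, *_ = np.linalg.lstsq(X, field, rcond=None)
print("field_DSI = %.2f + %.2f*lab_DSI + %.2f*greenhouse_DSI" % tuple(beta))
```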


2004, Vol 14 (2), pp. 240-242
Author(s): Steve Rose, Zamir K. Punja

Eighteen cucumber (Cucumis sativus L.) cultivars (long English type) were screened for their susceptibility to fusarium root and stem rot, caused by Fusarium oxysporum Schlechtend.: Fr. f.sp. radicis-cucumerinum D.J. Vakalounakis, using seedlings at the third true-leaf stage. Roots were trimmed and dipped into a spore suspension (10⁵ spores/mL) of the pathogen and the plants were re-potted. A disease severity index (DSI) was used to assess disease responses 4 or 8 weeks later, based on plant mortality and the height of surviving plants relative to noninoculated controls. `Sienna', `Amazing' and `Dominica' were the most susceptible to infection, with DSI values significantly (P ≤ 0.05) higher than those of noninoculated control plants. The cultivars `Korinda', `Euphoria' and `Aviance' displayed significantly lower DSI values that did not differ significantly from those of noninoculated control plants. The remaining 12 cultivars displayed DSI values intermediate between these two classes of response. The results from this study indicate that there is potential to identify and develop cultivars and breeding lines of greenhouse cucumber with enhanced resistance to fusarium root and stem rot.


2005, Vol 73 (2), pp. 61-68
Author(s): G. Xue, R. Hall

The effects of surface wetness duration, temperature, and inoculum concentration on the development of scald in winter barley (Hordeum vulgare) inoculated with race SOI of Rhynchosporium secalis from southern Ontario, Canada, were examined. On barley line 'GW8614' sprayed with a spore suspension (2 × 10⁵ conidia ml⁻¹), wet periods of 2–48 h and constant temperatures of 10–25°C during the wet and dry periods, 10–25°C during the wet period and 20°C during the dry period, or 20°C during the wet period and 10–30°C during the dry period allowed scald to develop 8.3–11.5 d after inoculation. The disease developed most rapidly and most severely when the wet period after inoculation was 48 h and the temperature of the wet period and subsequent dry period was 20°C. Scald did not develop within 14 d at temperatures of 30°C during the wet period or of 5°C during the wet or dry periods. At inoculum densities of 10²–10⁶ conidia ml⁻¹, disease severity index values (0–100 scale) increased from 53 to 100 in line 'GW8614' and from 0 to 90 in cultivar OAC Acton, and latent periods decreased from 13.3 to 7.8 d in line 'GW8614' and from more than 14 to 8.5 d in cv. OAC Acton. This information should facilitate screening of barley for resistance to scald.


2017, Vol 7 (5), pp. 266-269
Author(s): Fariha Kanwal, Changrui Lu, Ishtiaq Qadri, Muhammad Sohail, et al.
