An integrated strategy for managing Grapevine leafroll-associated virus 3 in red berry cultivars in New Zealand vineyards

2021 ◽  
Author(s):  
Vaughn Antony Bell

To sustain growth and revenue projections, the New Zealand wine sector aims to produce premium quality wine to supply lucrative export markets. In grapevines, however, the presence of virus and virus-like diseases can negatively influence qualitative parameters of wine production. Where such risks are identified, sustainable remediation protocols should be developed. One risk factor is Grapevine leafroll-associated virus 3 (GLRaV-3), an economically important virus of Vitis. In this thesis, I develop components of an integrated management plan with the aim of reducing and sustaining GLRaV-3 incidence at <1%. In Hawke’s Bay vineyard study blocks, three aspects of GLRaV-3 management were explored between 2008 and 2013.

Firstly, herbicide-treating vines and/or leaving land fallow after removing infected vines may mitigate the effects of GLRaV-3. Historically, though, vine root removal was not well implemented, meaning persistent roots may be long-term reservoirs for GLRaV-3. I tested this virus reservoir hypothesis in vineyard blocks where virus incidence of ≥95% necessitated removing all vines. Enzyme-linked immunosorbent assay (ELISA) and/or real-time polymerase chain reaction (real-time PCR) detected GLRaV-3 in most remnant root samples tested, independent of the herbicide active ingredient applied (glyphosate, triclopyr, or metsulfuron) or the fallow duration (6 months to 4 years). On some virus-positive root samples, the GLRaV-3 mealybug vector, Pseudococcus calceolariae, was found, and real-time PCR detected virus in some of these mealybugs. Thus, without effective vine removal, unmanaged sources of virus inoculum and viruliferous vectors could pose a risk to the health of replacement vines.

Secondly, in most red berry cultivars, GLRaV-3 infection is characterised by dark red, downward-curling leaves with green veins. With visual diagnostics predicted to be a reliable identifier of GLRaV-3-symptomatic red berry vines, early identification could support a cost-effective and sustainable virus management plan. In blocks planted in Merlot, Cabernet Sauvignon, Syrah, and Malbec vines, the reliability of visual symptom identification was compared with ELISA. In terms of sensitivity (binomial generalised linear model, 0.966) and specificity (0.998), late-season visual diagnostics reliably predicted virus infection. Moreover, accuracy appeared unaffected by the genetically divergent GLRaV-3 populations detected in Hawke’s Bay.

Thirdly, by acting to visually identify and remove (rogue) symptomatic vines while GLRaV-3 incidence is low (<20%), an epidemic may be averted. In this ongoing study, an integrated approach to virus management was adopted in 13 well-established Hawke’s Bay vineyard study blocks, each planted in one of five red berry cultivars. When monitoring commenced in 2009, all visually identified symptomatic vines (n=2,544, or 12%) were rogued. Thereafter, integrating visual diagnostics with roguing reduced virus incidence so that by 2013, just 434 vines (2.0%) were identified with virus symptoms. Annual monitoring revealed that within-row vines immediately on either side of an infected vine were most at risk of vector-mediated virus transmission, although by 2013, just 4% of these vines had virus symptoms. Hence, roguing symptomatic vines only was recommended. In 2013, virus management was tracking positively in four individual study blocks, while in another four, results were inconclusive. In the remaining five blocks, contrasting but definitive results were evident. In three of those blocks, mean virus incidence of 10% in 2009 was reduced to and sustained at ≤0.3% within 2-3 years of roguing commencing; in the other two blocks, mean incidence was 12%, but cumulative vine losses of 37% (2011) and 46% (2013) culminated in roguing being replaced with whole-block removal. In all five blocks, roguing protocols were standardised, but in those with effective virus control, mealybug numbers were significantly lower in all years (mean: <0.2 per vine leaf; p≤0.036) than in those where all vines were removed (mean: 0.4-2.3 per vine leaf).

Overall, these results suggest that rather than adopting a single management tactic in isolation, effective GLRaV-3 control requires an integrated plan implemented annually.
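For readers unfamiliar with the 0.966/0.998 figures above, this is how sensitivity and specificity fall out of a two-by-two comparison of visual diagnosis against ELISA. The counts below are hypothetical, chosen only to mirror the reported values; the thesis fitted a binomial generalised linear model, which this simple tabulation does not reproduce.

```python
# Hypothetical 2x2 counts comparing visual diagnosis against ELISA;
# the thesis's raw data are not reported in the abstract.
def diagnostic_accuracy(tp, fn, tn, fp):
    """Return (sensitivity, specificity) from a 2x2 confusion table.

    tp: symptomatic AND ELISA-positive      fn: missed infections
    tn: asymptomatic AND ELISA-negative     fp: false alarms
    """
    sensitivity = tp / (tp + fn)   # P(visual positive | infected)
    specificity = tn / (tn + fp)   # P(visual negative | not infected)
    return sensitivity, specificity

# Counts chosen only to mirror the reported 0.966 / 0.998.
sens, spec = diagnostic_accuracy(tp=425, fn=15, tn=20_000, fp=40)
print(f"sensitivity={sens:.3f}, specificity={spec:.3f}")
```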


2005 ◽  
Vol 71 (12) ◽  
pp. 8954-8957 ◽  
Author(s):  
Ezekiel T. Neeley ◽  
Trevor G. Phister ◽  
David A. Mills

ABSTRACT Oenococcus oeni is often employed to perform the malolactic fermentation in wine production, while nonoenococcal lactic acid bacteria often contribute to wine spoilage. Two real-time PCR assays were developed to enumerate the total, and nonoenococcal, lactic acid bacterial populations in wine. Used together, these assays can assess the spoilage risk of juice or wine from lactic acid bacteria.
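Real-time PCR enumeration of this kind typically reads bacterial counts off a log-linear standard curve relating the cycle threshold (Ct) to the starting cell number. A minimal sketch under assumed curve parameters; the paper's actual calibrations are not given in the abstract.

```python
def cells_per_ml(ct, slope=-3.32, intercept=38.0):
    """Convert a real-time PCR Ct value to cells/ml via a log-linear
    standard curve, Ct = slope * log10(N) + intercept.

    A slope of -3.32 corresponds to ~100% amplification efficiency;
    both parameters here are hypothetical and would be calibrated
    against dilutions of known cell counts for each assay."""
    return 10 ** ((ct - intercept) / slope)

total_lab = cells_per_ml(ct=24.7)   # total-LAB assay
non_oeno  = cells_per_ml(ct=31.3)   # nonoenococcal assay
print(f"total LAB ~ {total_lab:.2e} cells/ml; "
      f"non-Oenococcus ~ {non_oeno:.2e} cells/ml")
# A large nonoenococcal fraction would flag elevated spoilage risk.
```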


2014 ◽  
Vol 35 (6) ◽  
pp. 667-673 ◽  
Author(s):  
Hoonmo L. Koo ◽  
John N. Van ◽  
Meina Zhao ◽  
Xunyan Ye ◽  
Paula A. Revell ◽  
...  

Objective. To evaluate the accuracy of real-time polymerase chain reaction (PCR) for Clostridium difficile–associated disease (CDAD) detection, after hospital CDAD rates significantly increased following real-time PCR initiation for CDAD diagnosis.
Design. Hospital-wide surveillance study following examination of CDAD incidence density rates by interrupted time series design.
Setting. Large university-based hospital.
Participants. Hospitalized adult patients.
Methods. CDAD rates were compared before and after real-time PCR implementation in a university hospital and in the absence of physician and infection control practice changes. After real-time PCR introduction, all hospitalized adult patients were screened for C. difficile by testing a fecal specimen by real-time PCR, toxin enzyme-linked immunosorbent assay, and toxigenic culture.
Results. CDAD hospital rates significantly increased after changing from a cell culture cytotoxicity assay to a real-time PCR assay. One hundred ninety-nine hospitalized subjects were enrolled, and 101 fecal specimens were collected. C. difficile was detected in 18 subjects (18%), including 5 (28%) with either definite or probable CDAD and 13 (72%) with asymptomatic C. difficile colonization.
Conclusions. The majority of healthcare-associated diarrhea is not attributable to CDAD, and the prevalence of asymptomatic C. difficile colonization exceeds CDAD rates in healthcare facilities. PCR detection of asymptomatic C. difficile colonization among patients with non-CDAD diarrhea may be contributing to rising CDAD rates and a significant number of CDAD false positives. PCR may be useful for CDAD screening, but further study is needed to guide interpretation of PCR detection of C. difficile and the value of confirmatory tests. A gold-standard CDAD diagnostic assay is needed.
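The interrupted time series design mentioned above is commonly analysed with a segmented regression that separates the pre-existing trend from a level shift and a trend change at the intervention. Everything below (monthly rates, change point, OLS fit) is a hypothetical sketch, not the study's actual data or model.

```python
import numpy as np
import statsmodels.api as sm

# Hypothetical monthly CDAD incidence densities (cases per 10,000
# patient-days); real-time PCR is introduced at month 12.
rates = np.array([4.1, 3.8, 4.3, 4.0, 4.2, 3.9, 4.4, 4.1, 4.0, 4.2, 3.9, 4.1,
                  6.8, 7.1, 6.9, 7.3, 7.0, 7.2, 6.7, 7.4, 7.1, 6.9, 7.2, 7.0])
months = np.arange(len(rates))
post = (months >= 12).astype(float)                 # 1 after the switch
time_since = np.where(post == 1, months - 12, 0)    # elapsed post-switch time

# Segmented regression: baseline trend, level change, post-switch trend change.
X = sm.add_constant(np.column_stack([months, post, time_since]))
fit = sm.OLS(rates, X).fit()
print(fit.params)  # [intercept, pre-trend, level shift, trend change]
```

A large, statistically significant level-shift coefficient with no practice changes is exactly the pattern that prompted the authors to question what the more sensitive assay was detecting.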


2012 ◽  
Vol 11 (2) ◽  
pp. 1
Author(s):  
B. A. Jarullah, J. Aed Gati, and A. Saleh

The current study investigated the prevalence of bovine viral diarrhea (BVD) virus in Basrah and Nassirya by ELISA and RT-PCR. Two hundred and eighty-two serum samples from non-vaccinated cattle were collected from two regions of Iraq (188 samples from Nassirya and 92 samples from Basrah). Samples were tested by antigen-capture enzyme-linked immunosorbent assay (ELISA); 20 samples were positive (8 from Thi-Qar and 12 from Basrah). All samples were also submitted to an indirect ELISA (IDEXX HerdCheck) to detect BVDV antibodies. The 20 antigen-positive samples were genotyped by real-time PCR using the Cador BVDV 1/2 kit after viral RNA extraction with a QIAamp mini kit. The results showed 20 samples positive by direct (antigen-detection) ELISA and 66 positive by indirect ELISA, while RT-PCR identified two samples positive for BVDV type 1 (one from each city).
Key words: BVDV, genotype, ELISA, Iraq, real-time PCR.
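For context on how much sampling uncertainty these counts carry, here is a small sketch computing each reported proportion with a Wilson score interval. The counts are taken from the abstract; the choice of interval is ours, not the paper's.

```python
import math

def wilson_ci(positives, n, z=1.96):
    """95% Wilson score interval for a binomial proportion."""
    p = positives / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - half, centre + half

# Counts as reported in the abstract.
for label, pos, n in [("antigen-capture ELISA", 20, 282),
                      ("indirect (antibody) ELISA", 66, 282),
                      ("RT-PCR, BVDV type 1", 2, 20)]:
    lo, hi = wilson_ci(pos, n)
    print(f"{label}: {pos}/{n} = {pos/n:.1%} (95% CI {lo:.1%}-{hi:.1%})")
```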


2021 ◽  
Vol 28 (3) ◽  
pp. 187-194
Author(s):  
Rodica Sturza ◽  
Valentin Mitin ◽  
Irina Mitina ◽  
Dan Zgardan ◽  
...  

Agro-industrial waste management is an important problem for modern society, as agriculture and the food industry are major sources of waste. Wine production generates a considerable amount of winemaking waste (grape marc). Grape marc can be a source of natural dyes and antioxidants and could have various applications, provided it is confirmed to be free of technogenic contaminants and unwanted microorganisms such as mycotoxin producers. This paper develops a real-time polymerase chain reaction (real-time PCR) methodology, applicable before grape marc processing, for testing for the presence of potentially mycotoxigenic fungal species capable of producing ochratoxin A (OTA). Primers were developed against the non-ribosomal peptide synthetase gene sequence involved in ochratoxin biosynthesis, enabling detection of microorganisms potentially capable of producing OTA.
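Primer development of the kind described usually starts with basic screening of candidate oligonucleotides. A minimal sketch with a hypothetical 20-mer (the paper's NRPS-targeted primer sequences are not given in the abstract); a real design would add nearest-neighbour Tm estimates and specificity checks against non-target genomes.

```python
def primer_stats(seq):
    """Quick screening values for a candidate primer: GC fraction and a
    rough Wallace-rule melting temperature (2 C per A/T, 4 C per G/C)."""
    seq = seq.upper()
    gc = sum(seq.count(b) for b in "GC")
    at = sum(seq.count(b) for b in "AT")
    return gc / len(seq), 2 * at + 4 * gc

# Hypothetical 20-mer, for illustration only.
gc_frac, tm = primer_stats("ATGGCTCGTGACAAGGTCCT")
print(f"GC = {gc_frac:.0%}, Wallace Tm ~ {tm} C")
```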


2005 ◽  
Vol 12 (11) ◽  
pp. 1322-1327 ◽  
Author(s):  
Jennifer M. Scotter ◽  
Stephen T. Chambers

ABSTRACT The performance of different in vitro diagnostic tests for the diagnosis of invasive aspergillosis (IA) was investigated in a transiently neutropenic rat model. Rats were immunosuppressed with cyclophosphamide and then inoculated intravenously with 1.5 × 10⁴ CFU Aspergillus fumigatus spores. Animals were then either treated with caspofungin acetate, 1 mg/kg/day for 7 days, or not treated. PCR-enzyme-linked immunosorbent assay (ELISA), real-time PCR, and galactomannan (GM) detection were performed on postmortem blood samples, along with culture of liver, lung, and kidney homogenate. Caspofungin-treated animals showed a decrease in residual tissue burden of A. fumigatus from organ homogenate compared to untreated animals (P < 0.002). PCR-ELISA returned positive results for 11/17 animals treated with antifungal agents and for 10/17 untreated animals. Galactomannan was positive in 8/17 caspofungin-treated animals and 4/17 untreated animals. Real-time PCR was positive in 2/17 treated and 3/17 untreated animals. This study demonstrates that PCR-ELISA is a more sensitive test than either GM detection (P = 0.052) or real-time PCR (P < 0.01) for diagnosis of IA but that any of the three tests may return false-negative results in cases of histologically proven disease. Galactomannan indices from animals treated with antifungal agents showed a trend (P = 0.1) towards higher levels than those of untreated animals, but no effect was observed with PCR-ELISA indices (P = 0.29). GM detection, as previously described, may be enhanced by the administration of caspofungin, but PCR-ELISA appears not to be affected in the same way. We conclude that PCR-ELISA is a more sensitive and reliable method for laboratory diagnosis of IA.
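The abstract does not state which statistical test produced P = 0.052 and P < 0.01. As a rough illustration only, Fisher's exact test on the pooled positives (21/34 for PCR-ELISA, 12/34 for galactomannan, 5/34 for real-time PCR) yields p-values of similar magnitude; note this ignores the pairing of tests within animals, which a McNemar-style analysis would exploit.

```python
from scipy.stats import fisher_exact

# Positives out of 34 animals (treated + untreated combined), as reported:
# PCR-ELISA 11+10 = 21, galactomannan 8+4 = 12, real-time PCR 2+3 = 5.
n = 34
pcr_elisa_pos = 21
for name, pos in [("galactomannan", 12), ("real-time PCR", 5)]:
    table = [[pcr_elisa_pos, n - pcr_elisa_pos], [pos, n - pos]]
    _, p = fisher_exact(table)   # two-sided by default
    print(f"PCR-ELISA vs {name}: p = {p:.3f}")
```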


Plant Disease ◽  
2009 ◽  
Vol 93 (6) ◽  
pp. 649-659 ◽  
Author(s):  
Neil C. Gudmestad ◽  
Ipsita Mallik ◽  
Julie S. Pasche ◽  
Nolan R. Anderson ◽  
Kasia Kinzer

Clavibacter michiganensis subsp. sepedonicus, causal agent of bacterial ring rot (BRR) of potato (Solanum tuberosum), is a globally important quarantine pathogen that is managed in North America using zero tolerance regulations in the certified seed industry. C. michiganensis subsp. sepedonicus is well documented to cause symptomless infections in potato, contributing to its persistence in certified seed stocks. Reliable laboratory methods to detect symptomless infections with a high degree of sensitivity could assist in the reduction of inoculum in certified seed potato stocks. A real-time polymerase chain reaction (PCR) assay was developed using the cellulase A (CelA) gene sequence as the basis for primer design. CelA primers were specific to C. michiganensis subsp. sepedonicus grown in vitro and did not detect any other coryneform bacteria or potato pathogenic bacteria but did detect 69 strains of C. michiganensis subsp. sepedonicus. The CelA real-time PCR assay was more sensitive than immunofluorescence (IFA) and Cms50/72a PCR assays in detecting C. michiganensis subsp. sepedonicus in infected potato tuber cores blended with healthy tuber cores in simulated seed lot contamination experiments. CelA primers detected nonmucoid and mucoid strains with equivalent sensitivity. In naturally infected seed lots, CelA PCR primers also were more sensitive in detecting symptomless infections of C. michiganensis subsp. sepedonicus in seed tubers prior to planting compared to Cms50/72a PCR primers, IFA, and enzyme-linked immunosorbent assay. A real-time PCR format using the newly developed CelA primers proved to be a very robust detection tool for C. michiganensis subsp. sepedonicus with the added advantage of detecting only virulent strains of the ring rot bacterium.
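Zero-tolerance seed certification hinges on the probability that a sampling-plus-assay scheme catches a low-level infection, which is why assay sensitivity matters so much here. A sketch of that calculation under hypergeometric sampling, with all numbers hypothetical; the paper reports assay comparisons, not this sampling model.

```python
from math import comb

def p_detect(lot_size, infected, sample_n, assay_sens=0.95):
    """Probability that sampling `sample_n` tubers without replacement
    from a lot containing `infected` infected tubers yields at least one
    detected infection, given per-tuber assay sensitivity `assay_sens`."""
    p = 0.0
    for k in range(min(infected, sample_n) + 1):
        p_k = (comb(infected, k) * comb(lot_size - infected, sample_n - k)
               / comb(lot_size, sample_n))
        p += p_k * (1 - (1 - assay_sens) ** k)
    return p

# A more sensitive assay materially changes the odds of catching
# a 0.5%-infected seed lot with a 400-tuber sample.
for sens in (0.50, 0.80, 0.95):
    print(f"sensitivity {sens:.0%}: P(detect) = "
          f"{p_detect(10_000, 50, 400, sens):.3f}")
```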


2009 ◽  
Vol 75 (19) ◽  
pp. 6331-6339 ◽  
Author(s):  
Amanda B. Herzog ◽  
S. Devin McLennan ◽  
Alok K. Pandey ◽  
Charles P. Gerba ◽  
Charles N. Haas ◽  
...  

ABSTRACT Used for decades for biological warfare, Bacillus anthracis (category A agent) has proven to be highly stable and lethal. Quantitative risk assessment modeling requires descriptive statistics of the limit of detection to assist in defining the exposure. Furthermore, the sensitivities of various detection methods in environmental matrices are vital information for first responders. A literature review of peer-reviewed journal articles related to methods for detection of B. anthracis was undertaken. Articles focused on the development or evaluation of various detection approaches, such as PCR, real-time PCR, immunoassay, etc. Real-time PCR and PCR were the most sensitive methods for the detection of B. anthracis, with median instrument limits of detection of 430 and 440 cells/ml, respectively. There were very few peer-reviewed articles on the detection methods for B. anthracis in the environment. The most sensitive limits of detection for the environmental samples were 0.1 CFU/g for soil using PCR-enzyme-linked immunosorbent assay (ELISA), 17 CFU/liter for air using an ELISA-biochip system, 1 CFU/liter for water using cultivation, and 1 CFU/cm2 for stainless steel fomites using cultivation. An exponential dose-response model for the inhalation of B. anthracis estimates of risk at concentrations equal to the environmental limit of detection determined the probability of death if untreated to be as high as 0.520. Though more data on the environmental limit of detection would improve the assumptions made for the risk assessment, this study's quantification of the risk posed by current limitations in the knowledge of detection methods should be considered when employing those methods in environmental monitoring and cleanup strategies.
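The exponential dose-response model referred to above has the closed form P = 1 − exp(−k·d). A sketch evaluating it at illustrative doses; k below is hypothetical, chosen so that P lands near the abstract's 0.520 at the largest dose shown, and the study's fitted value may differ.

```python
import math

def p_response(dose, k):
    """Exponential dose-response model: P = 1 - exp(-k * dose),
    where k is the per-organism probability of initiating infection."""
    return 1.0 - math.exp(-k * dose)

# Hypothetical dose-response parameter for inhalational anthrax.
k = 1.65e-5
for dose in (430, 4_300, 43_000):
    print(f"dose = {dose:>6} organisms -> P(death if untreated) = "
          f"{p_response(dose, k):.3f}")
```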


2005 ◽  
Vol 79 (22) ◽  
pp. 13924-13933 ◽  
Author(s):  
Joanne Macdonald ◽  
Jessica Tonry ◽  
Roy A. Hall ◽  
Brent Williams ◽  
Gustavo Palacios ◽  
...  

ABSTRACT The West Nile virus (WNV) nonstructural protein NS1 is a protein of unknown function that is found within, associated with, and secreted from infected cells. We systematically investigated the kinetics of NS1 secretion in vitro and in vivo to determine the potential use of this protein as a diagnostic marker and to analyze NS1 secretion in relation to the infection cycle. A sensitive antigen capture enzyme-linked immunosorbent assay (ELISA) for detection of WNV NS1 (polyclonal-ACE) was developed, as well as a capture ELISA for the specific detection of NS1 multimers (4G4-ACE). The 4G4-ACE detected native NS1 antigens at high sensitivity, whereas the polyclonal-ACE had a higher specificity for recombinant forms of the protein. Applying these assays we found that only a small fraction of intracellular NS1 is secreted and that secretion of NS1 in tissue culture is delayed compared to the release of virus particles. In experimentally infected hamsters, NS1 was detected in the serum between days 3 and 8 postinfection, peaking on day 5, the day prior to the onset of clinical disease; immunoglobulin M (IgM) antibodies were detected at low levels on day 5 postinfection. Although real-time PCR gave the earliest indication of infection (day 1), the diagnostic performance of the 4G4-ACE was comparable to that of real-time PCR during the time period when NS1 was secreted. Moreover, the 4G4-ACE was found to be superior in performance to both the IgM and plaque assays during this time period, suggesting that NS1 is a viable early diagnostic marker of WNV infection.
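Antigen-capture ELISAs like the ACE assays described need a positivity cutoff before serum samples can be scored. One generic convention, not stated in the abstract, is the mean of negative-control optical densities plus three standard deviations; a minimal sketch under that assumption:

```python
import statistics

def elisa_cutoff(negative_ods, k=3.0):
    """A common convention for an ELISA positivity cutoff: the mean of
    negative-control optical densities plus k standard deviations.
    The cutoff criteria actually used for the ACE assays are not
    given in the abstract."""
    return statistics.mean(negative_ods) + k * statistics.stdev(negative_ods)

# Hypothetical negative-control OD450 readings.
neg = [0.061, 0.055, 0.049, 0.066, 0.058, 0.052, 0.063, 0.057]
print(f"positivity cutoff OD = {elisa_cutoff(neg):.3f}")
# A serum sample reading above this OD would be scored NS1-positive.
```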


2013 ◽  
Vol 15 (12) ◽  
pp. 1063-1069 ◽  
Author(s):  
Kathryn S Jenkins ◽  
Keren E Dittmer ◽  
Jonathan C Marshall ◽  
Séverine Tasker
