Evolution of tree killing in bark beetles (Coleoptera: Curculionidae): trade-offs between the maddening crowds and a sticky situation

2013, Vol. 145 (5), pp. 471-495
Author(s): B.S. Lindgren, K.F. Raffa

Abstract: Bark beetles (Coleoptera: Curculionidae: Scolytinae) play important roles in temperate conifer ecosystems, and also cause substantial economic losses. Although their general life histories are relatively similar, different species vary markedly in the physiological condition of the hosts they select. Most of the ∼6000 known species colonise dead or stressed trees, a resource they share with a large diversity of insects and other organisms. A small number of bark beetle species kill healthy, live trees. These few are of particular interest because they compete directly with humans for resources. We propose that tree killing evolved when intense interspecific competition for the ephemeral, scarce resource of defence-impaired trees selected for genotypes that could escape this limitation by attacking relatively healthy trees. These transitions were uncommon, and we suggest they were facilitated by (a) genetically and phenotypically flexible host selection behaviours, (b) biochemical adaptations for detoxifying a wide range of defence compounds, and (c) associations with symbionts, which together aided bark beetles in overcoming formidable constitutive and induced host defences. The ability to detoxify terpenes influenced the evolutionary course of pheromonal communication. Specifically, a mate attraction system, which was exploited by intraspecific competitors in locating poorly defended hosts, became a system of cooperative attack in which emitters benefit from the contributions responders make to overcoming defence. This functional shift in communication was driven in part by the linkage of beetle semiochemistry to host defence chemistry. Behavioural and phenological adaptations also improved the beetles' abilities to detect when tree defences are impaired and, where compatible with life history adaptations to other selective forces, to time flight to coincide with seasonally predictable host stress agents. We propose a conceptual model whereby the above mechanisms enable beetles to concentrate on those trees that offer an optimal trade-off between host defence and interspecific competition, along dynamic gradients of tree vigour and stand-level beetle density. We offer suggestions for future research on testing elements of this model.
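
To make the proposed trade-off concrete, the toy sketch below (an illustration, not a model from the paper; all functional forms and coefficients are assumed) computes a hypothetical colonisation payoff that falls with tree defensive vigour and with crowding on defence-impaired trees, and shows the optimal host shifting toward healthier trees as local beetle density rises.

```python
# Illustrative sketch (not from the paper): a toy formalization of the proposed
# trade-off, in which the expected payoff of colonising a tree declines with the
# tree's defensive vigour and with interspecific competition, which is assumed
# to be most intense on the most defence-impaired trees.
import numpy as np

def expected_payoff(vigour, beetle_density,
                    resource=1.0, defence_cost=2.0, competition_cost=1.5):
    """Hypothetical payoff of attacking a tree of given vigour (0 = dying, 1 = healthy).

    Defence costs rise with vigour; competition costs fall with vigour but rise
    with local beetle density. All coefficients are illustrative assumptions.
    """
    defence = defence_cost * vigour**2                                 # healthier trees defend more
    competition = competition_cost * beetle_density * (1.0 - vigour)   # crowding on weak trees
    return resource - defence - competition

vigour = np.linspace(0.0, 1.0, 101)
for density in (0.2, 1.0, 3.0):
    best = vigour[np.argmax(expected_payoff(vigour, density))]
    print(f"beetle density {density:3.1f} -> optimal host vigour ~{best:.2f}")
```

Under these assumed coefficients, the payoff-maximising host shifts from heavily defence-impaired trees at low beetle density toward relatively healthy trees as density (and hence competition) increases, mirroring the verbal model above.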

2020, Vol. 7 (4), pp. 162
Author(s): Ahmed Gareh, Mahmoud Soliman, Amira A. Saleh, Fatma A. El-Gohary, Heba M. M. El-Sherbiny, et al.

Sarcocystosis is considered one of the major parasitic diseases with a worldwide distribution. It is caused by obligatory intracellular protozoan parasites of the genus Sarcocystis. Besides its public health implications, sarcocystosis results in significant economic losses due to its impact on productivity and milk yield. A wide range of final and intermediate hosts have been identified, including mammals, birds, and reptiles; however, few studies have investigated the contribution of camels to maintaining the epidemiological foci of the disease in countries such as Egypt. The present study was conducted to determine, grossly and histopathologically, the prevalence of Sarcocystis spp. in camels (N = 100) from the Aswan Governorate, Egypt. Furthermore, the major risk factors related to the development of sarcocystosis in camels were investigated. Samples from the diaphragm, cardiac muscle, esophagus, and testes of the slaughtered camels were collected. Interestingly, Sarcocystis was detected in 75% of the examined camels. Among the variables studied, camels aged 5 years or more were at higher risk, with an infection rate of 87.7% (57 of 65), than those younger than 5 years. The infection rate was 81.4% (57 of 70) in males and 60% (18 of 30) in females. The esophagus was the most affected organ (49%), followed by the diaphragm (26%) and cardiac muscle (17%), whereas none of the testes samples were affected. Taken together, the present study demonstrates the high prevalence of Sarcocystis in the examined camels and suggests the importance of these animals in preserving the epidemiological foci of sarcocystosis in Egypt. Future research should map the circulating strains in Egypt and aim to raise public health awareness about the importance of sarcocystosis and other related zoonotic diseases.
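
As a quick arithmetic check of the figures above, the sketch below recomputes the reported prevalence values from the counts given in the abstract. The under-5-years counts (18 of 35) are derived from the reported totals, and the chi-square test is an illustrative addition, not the authors' stated analysis.

```python
# Minimal sketch: recompute the prevalence figures reported in the abstract from
# the raw counts, and run a chi-square test of association as an illustrative
# (not the authors') check. The <5-year counts (18 of 35) are derived from the
# reported totals (75 infected of 100 camels; 57 of 65 camels aged >= 5 years).
from scipy.stats import chi2_contingency

def prevalence(positive, total):
    return 100.0 * positive / total

groups = {
    "overall":        (75, 100),
    "age >= 5 years": (57, 65),
    "age < 5 years":  (18, 35),   # derived from the reported totals
    "males":          (57, 70),
    "females":        (18, 30),
}
for name, (pos, n) in groups.items():
    print(f"{name:15s}: {prevalence(pos, n):5.1f}% ({pos} of {n})")

# Illustrative chi-square test of infection vs. age group
age_table = [[57, 65 - 57],   # infected / not infected, age >= 5
             [18, 35 - 18]]   # infected / not infected, age < 5
chi2, p, dof, _ = chi2_contingency(age_table)
print(f"age association: chi2 = {chi2:.2f}, p = {p:.4f}")
```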


2010, Vol. 67 (7), pp. 1086-1097
Author(s): Christian Jørgensen, Øyvind Fiksen

When trade-offs involving predation and mortality are perturbed by human activities, behaviour and life histories are expected to change, with consequences for natural mortality rates. We present a general life history model for fish in which three common relationships link natural mortality to life history traits and behaviour. First, survival increases with body size. Second, survival declines with growth rate because of the risks involved in resource acquisition and allocation. Third, fish that invest heavily in reproduction suffer decreased survival because costly reproductive behaviour or morphology makes escapes from predators less successful. The model predicts an increased natural mortality rate as an adaptive response to harvesting. This extends previous models showing that harvesting may cause smaller body size, higher growth rates, and higher investment in reproduction. The predicted increase in natural mortality is roughly half the fishing mortality over a wide range of harvest levels and parameter combinations; in effect, once evolutionary adaptation has taken place, fishing that removes two fish kills three.
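
The rule of thumb in the last sentence can be made explicit with a short worked example (an illustration, assuming instantaneous mortality rates): if the adaptive rise in natural mortality is about half the fishing mortality, a harvest that removes two fish causes roughly one additional natural death, i.e., three deaths in total.

```python
# Worked arithmetic for the abstract's rule of thumb (assuming instantaneous
# mortality rates): the evolved increase in natural mortality is roughly half
# the fishing mortality, so the total extra mortality caused by fishing is
# about 1.5 times the fishing mortality itself -- "fishing two fish kills three".

def total_fishing_induced_mortality(F, evolved_fraction=0.5):
    """Fishing mortality F plus the predicted adaptive rise in natural mortality."""
    delta_M = evolved_fraction * F        # model prediction: roughly 0.5 * F
    return F + delta_M

for F in (0.1, 0.2, 0.4):
    print(f"F = {F:.1f} -> total fishing-induced mortality ~ "
          f"{total_fishing_induced_mortality(F):.2f}")
# e.g., a harvest that removes 2 fish leads to ~1 additional natural death: 3 in total.
```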


Plant Disease, 2012, Vol. 96 (11), pp. 1588-1600
Author(s): Leah L. Granke, Lina Quesada-Ocampo, Kurt Lamour, Mary K. Hausbeck

Since L. H. Leonian's first description of Phytophthora capsici as a pathogen of chile pepper in 1922, we have made many advances in our understanding of this pathogen's biology, host range, dissemination, and management. P. capsici causes foliar blighting, damping-off, wilting, and root, stem, and fruit rot of susceptible hosts, and economic losses are experienced annually in vegetable crops including cucurbits and peppers. Symptoms of P. capsici infection may manifest as stunting, girdling, or cankers for some cultivars or crops that are less susceptible. P. capsici continues to be a constraint on production, and implementation of an aggressive integrated management scheme can still result in insufficient control when weather is favorable for disease. Management of diseases caused by P. capsici is currently limited by the long-term survival of the pathogen as oospores in the soil, a wide host range, long-distance movement of the pathogen in surface water used for irrigation, the presence of fungicide-resistant pathogen populations, and a lack of commercially acceptable resistant host varieties. P. capsici can infect a wide range of hosts under laboratory and greenhouse conditions including cultivated crops, ornamentals, and native plants belonging to diverse plant families. As our understanding of P. capsici continues to grow, future research should focus on developing novel and effective solutions to manage this pathogen and prevent economic losses due to the diseases it causes.


2021, Vol. 12
Author(s): Markus Rienth, Nicolas Vigneron, Robert P. Walker, Simone Diego Castellarin, Crystal Sweetman, et al.

The grapevine is subject to a high number of fungal and viral diseases, which are responsible for substantial economic losses in the global wine sector every year. These pathogens deteriorate grapevine berry quality either directly, via the modulation of fruit metabolic pathways and the production of endogenous compounds associated with off-tastes and/or off-flavors, or indirectly, via their impact on vine physiology. The most common and devastating fungal diseases in viticulture are gray mold, downy mildew (DM), and powdery mildew (PM), caused, respectively, by Botrytis cinerea, Plasmopara viticola, and Erysiphe necator. Whereas B. cinerea mainly infects and deteriorates the ripening fruit directly, damage by DM and PM is mostly indirect, via a reduction of photosynthetic leaf area. Nevertheless, the mildews can also infect berries at certain developmental stages and directly alter fruit quality via the biosynthesis of unpleasant flavor compounds that impair ultimate wine quality. The grapevine is furthermore host to a wide range of viruses that reduce vine longevity, productivity, and berry quality in different ways. The most widespread virus-related diseases known today are grapevine leafroll disease (GLRD), grapevine fanleaf disease (GFLD), and the more recently characterized grapevine red blotch disease (GRBD). Future climatic conditions are expected to create a more favorable environment for the proliferation of most insect vectors of these viruses, so the spread of virus-related diseases is likely to increase in most wine-growing regions. The impact of climate change on fungal disease pressure, however, will vary by region and pathogen, although the mildews will almost certainly remain the major phytosanitary threat in most regions because their development rate is to a large extent temperature-driven. This paper provides a review of the published literature on the most important grapevine fungal and viral pathogens and their impact on grape berry physiology and quality. Our overview highlights gaps in our understanding of plant-pathogen interactions that are valuable for conceiving future research programs dealing with the different pathogens and their impacts on grapevine berry quality and metabolism.


Plant Disease, 2002, Vol. 86 (11), p. 1275
Author(s): T. Jung, G. Dobler

Pinus occidentalis Sw. is an endemic species of the Caribbean island of Hispaniola (Dominican Republic and Haiti). It shows extreme ecological plasticity and grows on a wide range of soil types from 0 to 3,175 m in elevation, with annual mean temperatures ranging from 6 to 25°C and annual precipitation of 800 to 2,300 mm. P. occidentalis is a major component of forests above 800 m in elevation and forms pure climax forests above 2,000 m (4). For more than 10 years, stands of P. occidentalis in the Sierra (Cordillera Central) growing under a wide range of site conditions have suffered from a serious, widespread disease. Symptoms include yellowing and dwarfing of needles, progressive defoliation and dieback of the crown, and, finally, death of the weakened trees, often hastened by attacks of secondary bark beetles. Mature stands are mainly affected, but the disease is also present in plantations and in natural regeneration older than 10 years. Disease spread is rapid and occurs mainly along roads and downslope from diseased trees, following the path of water runoff. Initially, Leptographium serpens was isolated from necrotic roots and was thought to be the causal agent (1). However, the symptoms of the disease more closely resemble those of littleleaf disease of P. echinata and P. taeda in the southeastern United States, which is caused by the aggressive fine-root pathogen Phytophthora cinnamomi Rands (3). Moreover, the spread and dynamics of the disease are similar to the diebacks of Chamaecyparis lawsoniana in Oregon and of Eucalyptus spp. in western Australia, which are caused by the introduced soilborne pathogens Phytophthora lateralis and Phytophthora cinnamomi, respectively. Soil samples containing the rhizosphere and fine roots of diseased P. occidentalis trees were collected in February 2002 at five sites near Celestina and Los Montones (Dominican Republic) and transported to the Bavarian State Institute of Forestry. The pathogen was baited from the soil by floating 3- to 7-day-old leaves of Quercus robur seedlings over flooded soil and placing the leaves on selective PARPNH agar (2). Phytophthora cinnamomi was isolated from the soil of all five sites. Crossing with A1 and A2 tester strains of Phytophthora cinnamomi confirmed that all isolates belong to the A2 mating type. In cross sections of necrotic fine roots, characteristic structures of Phytophthora cinnamomi, such as nonseptate hyphae and chlamydospores, could be observed. Our results indicate that the disease of P. occidentalis is caused by the introduced pathogen Phytophthora cinnamomi. Because of the ecological and economic importance of P. occidentalis, the disease poses a major threat to forestry in the Dominican Republic. Future research should include mapping of the disease, pathogenicity tests on P. occidentalis and alternative pine species (in particular P. caribaea), screening for resistance in the field, and testing of systemic fungicides such as potassium phosphonate, which is known to be effective against Phytophthora cinnamomi. References: (1) G. Dobler. Manejo y Tablas de Rendimiento de Pinus occidentalis. Plan Sierra, San José de las Matas, Dominican Republic, 1999. (2) T. Jung et al. Plant Pathol. 49:706, 2000. (3) S. W. Oak and F. H. Tainter. How to Identify and Control Littleleaf Disease. Protection Rep. R8-PR12, USDA Forest Service Southern Region, Atlanta, Georgia, 1988. (4) L. Sprich. Allg. Forst. Jagdztg. 168:67, 1997.


2015, Vol. 72 (3), pp. 319-342
Author(s): Neala W. Kendall, John R. McMillan, Matthew R. Sloat, Thomas W. Buehrens, Thomas P. Quinn, et al.

Oncorhynchus mykiss form partially migratory populations, with anadromous fish that undergo marine migrations and residents that complete their life cycle in fresh water. Many populations' anadromous components are threatened or endangered, prompting interest in understanding the ecological and evolutionary processes underlying anadromy and residency. In this paper, we synthesize information to better understand genetic and environmental influences on O. mykiss life histories, identify critical knowledge gaps, and suggest next steps. Anadromy and residency appear to reflect interactions among genetics, individual condition, and environmental influences. First, an increasing body of literature suggests that anadromous and resident individuals differ in the expression of genes related to growth, smoltification, and metabolism. Second, the literature supports conditional strategy theory, in which individuals adopt a life history pattern based on their condition relative to genetically determined thresholds, together with the ultimate effects of size and age at maturation and iteroparity. However, except for a generally positive association between residency and high lipid content plus a large attainable size in fresh water, the effects of body size and growth are inconsistent; thus, individuals can exhibit plasticity in variable environments. Finally, patterns of anadromy and residency among and within populations suggest a wide range of possible environmental influences at different life stages, from freshwater temperature to marine survival. Although we document a number of interesting correlations, direct tests of mechanisms are scarce and little data exist on the extent of residency and anadromy. Consequently, we identified as many data gaps as conclusions, leaving ample room for future research.
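
The conditional strategy theory mentioned above lends itself to a simple threshold illustration. The toy simulation below is a sketch under assumed parameters, not the authors' model: each individual is assigned a condition value and a genetically variable threshold, and it "chooses" residency when condition exceeds the threshold, so the resident fraction rises as freshwater growth conditions improve.

```python
# Toy illustration (assumed parameters, not the authors' model) of the
# conditional strategy theory: each individual compares its condition (e.g.,
# lipid level or growth) against a genetically determined threshold and becomes
# a freshwater resident if condition exceeds the threshold, otherwise it
# smolts and migrates to sea.
import numpy as np

rng = np.random.default_rng(0)

def fraction_resident(mean_condition, n=10_000, threshold_mean=0.5,
                      condition_sd=0.15, threshold_sd=0.1):
    """Share of individuals whose condition exceeds their own genetic threshold."""
    condition = rng.normal(mean_condition, condition_sd, n)
    threshold = rng.normal(threshold_mean, threshold_sd, n)   # genetic variation
    return np.mean(condition > threshold)

for env in (0.35, 0.5, 0.65):   # poorer to richer freshwater growth conditions
    print(f"mean condition {env:.2f} -> ~{fraction_resident(env):.0%} resident")
```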


2019, Vol. 50 (4), pp. 693-702
Author(s): Christine Holyfield, Sydney Brooks, Allison Schluterman

Purpose: Augmentative and alternative communication (AAC) is an intervention approach that can promote communication and language in children with multiple disabilities who are beginning communicators. While a wide range of AAC technologies are available, little is known about the comparative effects of specific technology options. Given that engagement can be low for beginning communicators with multiple disabilities, the current study provides initial information about the comparative effects of 2 AAC technology options—high-tech visual scene displays (VSDs) and low-tech isolated picture symbols—on engagement. Method: Three elementary-age beginning communicators with multiple disabilities participated. The study used a single-subject, alternating treatment design with each technology serving as a condition. Participants interacted with their school speech-language pathologists using each of the 2 technologies across 5 sessions in a block randomized order. Results: According to visual analysis and nonoverlap of all pairs calculations, all 3 participants demonstrated more engagement with the high-tech VSDs than the low-tech isolated picture symbols, as measured by their seconds of gaze toward each technology option. Despite the difference in engagement observed, there was no clear difference across the 2 conditions in engagement toward the communication partner or use of the AAC. Conclusions: Clinicians can consider measuring engagement when evaluating AAC technology options for children with multiple disabilities and should consider evaluating high-tech VSDs as 1 technology option for them. Future research must explore the extent to which differences in engagement with particular AAC technologies result in differences in communication and language learning over time, as might be expected.
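
For readers unfamiliar with the nonoverlap of all pairs (NAP) statistic cited in the Results, the sketch below implements the standard calculation on hypothetical gaze-duration data (the numbers are invented for illustration, not the study's data).

```python
# Sketch of the nonoverlap of all pairs (NAP) calculation referred to in the
# Results: every session from the VSD condition is paired with every session
# from the isolated-symbol condition, and NAP is the share of pairs in which
# the VSD value is higher (ties count half). The data below are hypothetical.
from itertools import product

def nap(treatment, comparison):
    """Nonoverlap of all pairs for two lists of session-level scores."""
    pairs = list(product(treatment, comparison))
    wins = sum(1.0 for t, c in pairs if t > c)
    ties = sum(0.5 for t, c in pairs if t == c)
    return (wins + ties) / len(pairs)

# Hypothetical seconds of gaze toward each display across 5 sessions
gaze_vsd = [62, 71, 58, 66, 74]
gaze_isolated = [31, 45, 29, 52, 38]
print(f"NAP = {nap(gaze_vsd, gaze_isolated):.2f}")   # 1.0 = complete nonoverlap
```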


2015, Vol. 25 (1), pp. 15-23
Author(s): Ryan W. McCreery, Elizabeth A. Walker, Meredith Spratford

The effectiveness of amplification for infants and children can be mediated by how much the child uses the device. Existing research suggests that establishing hearing aid use can be challenging. A wide range of factors can influence hearing aid use in children, including the child's age, degree of hearing loss, and socioeconomic status. Audiological interventions, including using validated prescriptive approaches and verification, performing on-going training and orientation, and communicating with caregivers about hearing aid use can also increase hearing aid use by infants and children. Case examples are used to highlight the factors that influence hearing aid use. Potential management strategies and future research needs are also discussed.


2009, Vol. 23 (4), pp. 191-198
Author(s): Suzannah K. Helps, Samantha J. Broyd, Christopher J. James, Anke Karl, Edmund J. S. Sonuga-Barke

Background: The default mode interference hypothesis (Sonuga-Barke & Castellanos, 2007) predicts (1) the attenuation of very low frequency oscillations (VLFO; e.g., 0.05 Hz) in brain activity within the default mode network during the transition from rest to task, and (2) that failures to attenuate in this way will lead to an increased likelihood of periodic attention lapses that are synchronized to the VLFO pattern. Here, we tested these predictions using DC-EEG recordings within and outside of a previously identified network of electrode locations hypothesized to reflect DMN activity (i.e., the S3 network; Helps et al., 2008). Method: 24 young adults (mean age 22.3 years; 8 male), sampled to include a wide range of ADHD symptoms, took part in a study of rest-to-task transitions. Two conditions were compared: 5 min of rest (eyes open) and a 10-min simple 2-choice RT task with a relatively high event rate (ISI 1 s). DC-EEG was recorded during both conditions, the low-frequency spectrum was decomposed, and measures of the power within specific bands were extracted. Results: The shift from rest to task led to an attenuation of VLFO activity within the S3 network, which was inversely associated with ADHD symptoms. RT during the task also showed a VLFO signature. During the task there was a small but significant degree of synchronization between EEG and RT in the VLFO band. Attenuators showed a lower degree of synchrony than nonattenuators. Discussion: The results provide some initial EEG-based support for the default mode interference hypothesis and suggest that failure to attenuate VLFO in the S3 network is associated with higher synchrony between low-frequency brain activity and RT fluctuations during a simple RT task. Although significant, the effects were small, and future research should employ tasks with a higher event rate to increase the possibility of extracting robust and stable signals.
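
The band-power measure described in the Method can be sketched as follows; the sampling rate, band limits, and window length below are assumptions for illustration, not parameters reported by the authors.

```python
# Minimal sketch (assumed band limits and sampling rate) of extracting power in
# a very low frequency band from a DC-EEG channel, in the spirit of the analysis
# described above: estimate the power spectrum with Welch's method using long
# windows, then integrate the power around ~0.05 Hz.
import numpy as np
from scipy.signal import welch
from scipy.integrate import trapezoid

fs = 256.0                                  # assumed DC-EEG sampling rate (Hz)
rng = np.random.default_rng(1)
eeg = rng.standard_normal(int(fs * 300))    # placeholder 5-min signal

def band_power(signal, fs, band=(0.02, 0.2), window_s=100.0):
    """Integrate the Welch PSD over a VLFO band; long windows resolve ~0.05 Hz."""
    freqs, psd = welch(signal, fs=fs, nperseg=int(fs * window_s))
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return trapezoid(psd[mask], freqs[mask])

print(f"VLFO power (rest segment): {band_power(eeg, fs):.4f}")
# Comparing this quantity between rest and task segments gives the attenuation measure.
```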


2020
Author(s): Sina Faizollahzadeh Ardabili, Amir Mosavi, Pedram Ghamisi, Filip Ferdinand, Annamaria R. Varkonyi-Koczy, et al.

Several outbreak prediction models for COVID-19 are being used by officials around the world to make informed decisions and enforce relevant control measures. Among the standard models for COVID-19 global pandemic prediction, simple epidemiological and statistical models have received the most attention from authorities, and they are popular in the media. Owing to a high level of uncertainty and a lack of essential data, standard models have shown low accuracy for long-term prediction. Although the literature includes several attempts to address this issue, the generalization and robustness of existing models need to be improved. This paper presents a comparative analysis of machine learning and soft computing models to predict the COVID-19 outbreak as an alternative to SIR and SEIR models. Among a wide range of machine learning models investigated, two showed promising results: the multi-layered perceptron (MLP) and the adaptive network-based fuzzy inference system (ANFIS). Based on the results reported here, and given the highly complex nature of the COVID-19 outbreak and the variation in its behavior from nation to nation, this study suggests machine learning as an effective tool for modeling the outbreak. This paper provides an initial benchmarking to demonstrate the potential of machine learning for future research. The paper further suggests that real novelty in outbreak prediction can be realized by integrating machine learning and SEIR models.
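
As a minimal illustration of the machine learning approach, the sketch below fits a multi-layered perceptron (one of the two model families the abstract highlights) to a synthetic cumulative-case curve using lagged values as features; the data, features, and hyperparameters are assumptions for demonstration, not the authors' pipeline.

```python
# Illustrative sketch (synthetic data, not the authors' pipeline) of using a
# multi-layered perceptron to predict a cumulative-case curve from lagged values.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(42)
days = np.arange(120)
# Synthetic logistic outbreak curve with noise, standing in for real case counts
cases = 1e5 / (1 + np.exp(-(days - 60) / 8)) + rng.normal(0, 500, days.size)
scaled = cases / cases.max()          # simple scaling helps the MLP converge

def lagged_features(series, n_lags=7):
    """Predict each value from the previous n_lags values."""
    X = np.array([series[i:i + n_lags] for i in range(len(series) - n_lags)])
    y = series[n_lags:]
    return X, y

X, y = lagged_features(scaled)
split = 90                            # train on the first 90 usable days
model = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=5000, random_state=0)
model.fit(X[:split], y[:split])
rel_err = np.mean(np.abs(model.predict(X[split:]) - y[split:]) / y[split:])
print(f"mean relative error on held-out days: {rel_err:.2%}")
```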

