Interactions of sapsuckers and Cytospora canker can facilitate decline of riparian willows

Botany ◽  
2014 ◽  
Vol 92 (7) ◽  
pp. 485-493 ◽  
Author(s):  
Kristen M. Kaczynski ◽  
David J. Cooper ◽  
William R. Jacobi

Drought has caused large-scale plant mortality in ecosystems around the globe. Most diebacks have affected upland forest species. In the past two decades, a large-scale decline of riparian willows (Salix L.) has occurred in Rocky Mountain National Park, Colorado. We examined whether climatic or biotic factors drive and maintain the willow community decline. We compared annual growth and dieback of willows inside and outside of 14-year-old ungulate exclosures and measured groundwater depth and predawn xylem pressures of stems as indicators of drought stress. We also performed an aerial photo analysis to determine the temporal dynamics of the decline. Aerial photo analysis indicated willow decline occurred between 2001 and 2005 and was best explained by an increase in moose population and a decrease in peak stream flows. A new mechanism for willow stem dieback was identified, initiated by red-naped sapsuckers wounding willow bark. The wounds became infected with a fungus that girdled the stem. DNA analyses confirmed Valsa sordida (Cytospora chrysosperma) as the lethal fungus. Captured sapsuckers had V. sordida spores on their feet and beaks, identifying them as one possible vector of spread. Predawn xylem pressure potentials remained high through the growing season in all study willows regardless of depth to groundwater. Our results indicate that additional mechanisms may be involved in tall willow decline.

2020 ◽  
Author(s):  
Lungwani Muungo

The purpose of this review is to evaluate progress in molecular epidemiology over the past 24 years in cancer etiology and prevention, and to draw lessons for future research incorporating the new generation of biomarkers. Molecular epidemiology was introduced in the study of cancer in the early 1980s, with the expectation that it would help overcome some major limitations of epidemiology and facilitate cancer prevention. The expectation was that biomarkers would improve exposure assessment, document early changes preceding disease, and identify subgroups in the population with greater susceptibility to cancer, thereby increasing the ability of epidemiologic studies to identify causes and elucidate mechanisms in carcinogenesis. The first generation of biomarkers has indeed contributed to our understanding of risk and susceptibility related largely to genotoxic carcinogens. Consequently, interventions and policy changes have been mounted to reduce risk from several important environmental carcinogens. Several new and promising biomarkers are now becoming available for epidemiologic studies, thanks to the development of high-throughput technologies and theoretical advances in biology. These include toxicogenomics, alterations in gene methylation and gene expression, proteomics, and metabonomics, which allow large-scale studies, including discovery-oriented as well as hypothesis-testing investigations. However, most of these newer biomarkers have not been adequately validated, and their role in the causal paradigm is not clear. There is a need for their systematic validation using principles and criteria established over the past several decades in molecular cancer epidemiology.


1987 ◽  
Vol 19 (5-6) ◽  
pp. 701-710 ◽  
Author(s):  
B. L. Reidy ◽  
G. W. Samson

A low-cost wastewater disposal system was commissioned in 1959 to treat domestic and industrial wastewaters generated in the Latrobe River valley in the province of Gippsland, within the State of Victoria, Australia. The Latrobe Valley is the centre for large-scale generation of electricity and for the production of pulp and paper. In addition, other industries have utilized the region's brown coal resource, e.g., gasification and char production. Consequently, industrial wastewaters have been dominant in the disposal system for the past twenty-five years. The mixed industrial-domestic wastewaters were to be transported some eighty kilometres to be treated and disposed of by irrigation to land. Several important lessons have been learnt during twenty-five years of operating this system. Firstly, the composition of the mixed waste stream has varied significantly with the passage of time and the development of the industrial base in the Valley, so that what was appropriate treatment in 1959 is not necessarily acceptable in 1985. Secondly, the magnitude of the adverse environmental impacts engendered by this low-cost disposal procedure was not imagined when the proposal was implemented. As a consequence, clean-up procedures that could remedy the adverse effects of twenty-five years of impact are likely to be costly. The question may then be asked: when the total costs, including rehabilitation, are considered, is there really a low-cost solution for the environmentally safe disposal of complex wastewater streams?


2019 ◽  
Vol 19 (1) ◽  
pp. 4-16 ◽  
Author(s):  
Qihui Wu ◽  
Hanzhong Ke ◽  
Dongli Li ◽  
Qi Wang ◽  
Jiansong Fang ◽  
...  

Over the past decades, peptides as therapeutic candidates have received increasing attention in drug discovery, especially antimicrobial peptides (AMPs), anticancer peptides (ACPs), and anti-inflammatory peptides (AIPs). Peptides are thought to be able to modulate complex diseases that were previously untreatable. In recent years, the critical problem of antimicrobial resistance has driven the pharmaceutical industry to look for new therapeutic agents. Compared to small-molecule drugs, peptide-based therapy exhibits high specificity and minimal toxicity. Thus, peptides are widely used in the design and discovery of new potent drugs. However, large-scale screening of peptide activity with traditional approaches is costly, time-consuming, and labor-intensive. Hence, in silico methods, mainly machine learning approaches, have been introduced to predict peptide activity because of their accuracy and effectiveness. In this review, we document recent progress in machine learning-based prediction of peptide activity, which will be of great benefit to the discovery of potentially active AMPs, ACPs, and AIPs.
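To make the in silico screening idea concrete, here is a minimal sketch of a machine learning peptide activity predictor built from amino acid composition features and a random forest; the feature representation, model choice, and example sequences are illustrative assumptions, not a specific pipeline from the review.

```python
# Minimal sketch of machine-learning-based peptide activity prediction,
# assuming amino acid composition (AAC) features and a random forest;
# the training sequences and labels below are hypothetical placeholders.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def aac_features(sequence: str) -> np.ndarray:
    """Fraction of each of the 20 standard amino acids in the peptide."""
    counts = np.array([sequence.count(aa) for aa in AMINO_ACIDS], dtype=float)
    return counts / max(len(sequence), 1)

# Hypothetical training set: 1 = active (e.g., antimicrobial), 0 = inactive.
train_peptides = ["KWKLFKKIEKVGQNIRDGIIKAGPAVAVVGQATQIAK",
                  "GIGKFLHSAKKFGKAFVGEIMNS",
                  "AAAAGGGGSSSSTTTT",
                  "DEDEDEDEDEDE"]
train_labels = [1, 1, 0, 0]

X = np.stack([aac_features(p) for p in train_peptides])
model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(X, train_labels)

# Score a new candidate peptide.
candidate = "KLWKKIEKLIKKFLG"
prob_active = model.predict_proba(aac_features(candidate).reshape(1, -1))[0, 1]
print(f"Predicted probability of activity: {prob_active:.2f}")
```

In practice, published predictors use richer features (physicochemical descriptors, sequence embeddings) and validated benchmark datasets; the sketch only shows the overall shape of such a pipeline.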


Author(s):  
Jeasik Cho

This book provides the qualitative research community with insight into how to evaluate the quality of qualitative research. This topic has gained little attention during the past few decades. We, qualitative researchers, read journal articles, serve on master's and doctoral committees, and also make decisions on whether conference proposals, manuscripts, or large-scale grant proposals should be accepted or rejected. It is assumed that various perspectives or criteria, depending on various paradigms, theories, or fields of discipline, have been used in assessing the quality of qualitative research. Nonetheless, until now, no textbook has been specifically devoted to exploring theories, practices, and reflections associated with the evaluation of qualitative research. This book constructs a typology of evaluating qualitative research, examines actual information from websites and qualitative journal editors, and reflects on some challenges that are currently encountered by the qualitative research community. Many different kinds of journal review guidelines and available assessment tools are collected and analyzed. Consequently, core criteria that stand out among these evaluation tools are presented. Readers are invited to join the author to confidently proclaim: “Fortunately, there are commonly agreed, bold standards for evaluating the goodness of qualitative research in the academic research community. These standards are a part of what is generally called ‘scientific research.’ ”


2021 ◽  
Vol 8 (1) ◽  
Author(s):  
Sungmin O. ◽  
Rene Orth

While soil moisture information is essential for a wide range of hydrologic and climate applications, spatially continuous soil moisture data are only available from satellite observations or model simulations. Here we present SoMo.ml, a global, long-term dataset of soil moisture derived through machine learning trained with in-situ measurements. We train a Long Short-Term Memory (LSTM) model to extrapolate daily soil moisture dynamics in space and time, based on in-situ data collected from more than 1,000 stations across the globe. SoMo.ml provides multi-layer soil moisture data (0–10 cm, 10–30 cm, and 30–50 cm) at 0.25° spatial and daily temporal resolution over the period 2000–2019. The performance of the resulting dataset is evaluated through cross-validation and inter-comparison with existing soil moisture datasets. SoMo.ml performs especially well in terms of temporal dynamics, making it particularly useful for applications requiring time-varying soil moisture, such as anomaly detection and memory analyses. Given its distinct derivation, SoMo.ml complements the existing suite of modelled and satellite-based datasets to support large-scale hydrological, meteorological, and ecological analyses.
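For readers curious about the general approach, the sketch below shows an LSTM that maps a year of daily meteorological forcing to soil moisture at three depths; the input variables, layer sizes, and toy training step are illustrative assumptions, not the published SoMo.ml configuration.

```python
# Illustrative sketch of an LSTM mapping daily forcing to multi-layer soil
# moisture, in the spirit of SoMo.ml; dimensions and inputs are assumptions.
import torch
import torch.nn as nn

class SoilMoistureLSTM(nn.Module):
    def __init__(self, n_features: int = 5, hidden: int = 64, n_layers: int = 1):
        super().__init__()
        self.lstm = nn.LSTM(n_features, hidden, n_layers, batch_first=True)
        self.head = nn.Linear(hidden, 3)  # three depths: 0-10, 10-30, 30-50 cm

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        out, _ = self.lstm(x)          # x: (batch, days, n_features)
        return self.head(out[:, -1])   # predict soil moisture on the last day

# Toy forward/backward pass on random data standing in for forcing sequences.
model = SoilMoistureLSTM()
forcing = torch.randn(32, 365, 5)   # 32 grid cells, one year of daily forcing
target = torch.rand(32, 3)          # in-situ soil moisture at three depths
loss = nn.MSELoss()(model(forcing), target)
loss.backward()
print(f"toy training loss: {loss.item():.4f}")
```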


2021 ◽  
Vol 4 (1) ◽  
Author(s):  
Mateusz Taszarek ◽  
John T. Allen ◽  
Mattia Marchio ◽  
Harold E. Brooks

Globally, thunderstorms are responsible for a significant fraction of rainfall, and in the mid-latitudes they often produce extreme weather, including large hail, tornadoes, and damaging winds. Despite this importance, how the global frequency of thunderstorms and their accompanying hazards has changed over the past four decades remains unclear. Large-scale diagnostics applied to global climate models have suggested that the frequency and intensity of thunderstorms are likely to increase in the future. Here, we show that, according to ERA5, convective available potential energy (CAPE) and convective precipitation (CP) have decreased over the tropics and subtropics, with simultaneous increases in 0–6 km wind shear (BS06). Conversely, rawinsonde observations paint a different picture across the mid-latitudes, with increasing CAPE and significant decreases in BS06. Differing trends and disagreement between ERA5 and rawinsondes over some regions suggest that the results should be interpreted with caution, especially for CAPE and CP across the tropics, where uncertainty is highest and reliable long-term rawinsonde observations are missing.
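For reference, BS06 is conventionally computed as the magnitude of the vector difference between the horizontal wind at 6 km above ground level and at the surface; the sketch below illustrates the calculation with made-up sounding values.

```python
# Sketch of the 0-6 km bulk wind shear (BS06) calculation: the magnitude of
# the vector wind difference between 6 km above ground and the surface.
# The u/v values below are made-up sounding winds (m/s), for illustration only.
import math

u_sfc, v_sfc = 2.0, 1.0    # surface wind components (m/s)
u_6km, v_6km = 18.0, 9.0   # wind components at 6 km AGL (m/s)

bs06 = math.hypot(u_6km - u_sfc, v_6km - v_sfc)
print(f"BS06 = {bs06:.1f} m/s")  # values near 20 m/s mark strongly sheared environments
```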


2021 ◽  
Vol 7 ◽  
pp. 237802312110201
Author(s):  
Thomas A. DiPrete ◽  
Brittany N. Fox-Williams

Social inequality is a central topic of research in the social sciences. Decades of research have deepened our understanding of the characteristics and causes of social inequality. At the same time, social inequality has markedly increased during the past 40 years, and progress on reducing poverty and improving the life chances of Americans in the bottom half of the distribution has been frustratingly slow. How useful has sociological research been to the task of reducing inequality? The authors analyze the stance taken by sociological research on the subject of reducing inequality. They identify an imbalance in the literature between the discipline’s continual efforts to motivate the plausibility of large-scale change and its lesser efforts to identify feasible strategies of change either through social policy or by enhancing individual and local agency with the potential to cumulate into meaningful progress on inequality reduction.


2021 ◽  
Vol 54 (3) ◽  
pp. 1-33
Author(s):  
Blesson Varghese ◽  
Nan Wang ◽  
David Bermbach ◽  
Cheol-Ho Hong ◽  
Eyal De Lara ◽  
...  

Edge computing is the next Internet frontier, leveraging computing resources located near users, sensors, and data stores to provide more responsive services. It is therefore envisioned that a large-scale, geographically dispersed, and resource-rich distributed system will emerge and play a key role in the future Internet. However, given the loosely coupled nature of such complex systems, their operational conditions are expected to change significantly over time. In this context, the performance characteristics of such systems will need to be captured rapidly, a process referred to as performance benchmarking, to support application deployment, resource orchestration, and adaptive decision-making. Edge performance benchmarking is a nascent research avenue that has started gaining momentum over the past five years. This article first reviews articles published over the past three decades to trace the history of performance benchmarking from tightly coupled to loosely coupled systems. It then systematically classifies previous research to identify the system under test, the techniques analyzed, and the benchmark runtime in edge performance benchmarking.


Atmosphere ◽  
2021 ◽  
Vol 12 (2) ◽  
pp. 215
Author(s):  
Na Cheng ◽  
Shuli Song ◽  
Wei Li

The ionosphere is a significant component of the geospace environment. Storm-induced ionospheric anomalies severely affect the performance of Global Navigation Satellite System (GNSS) Positioning, Navigation, and Timing (PNT) and human space activities, e.g., Earth observation, deep space exploration, and space weather monitoring and prediction. In this study, we present and discuss multi-scale ionospheric anomaly monitoring over China using GNSS observations from the Crustal Movement Observation Network of China (CMONOC) during the 2015 St. Patrick’s Day storm. Total Electron Content (TEC), Ionospheric Electron Density (IED), and an ionospheric disturbance index are used to monitor the storm-induced ionospheric anomalies. The study reveals the occurrence of large-scale ionospheric storms and small-scale ionospheric scintillation during the storm. The results show that this magnetic storm was accompanied by a positive-phase and a negative-phase ionospheric storm. At the beginning of the main phase of the magnetic storm, both TEC and IED were significantly enhanced, and there was long-duration depletion in the topside ionospheric TEC during the recovery phase of the storm. The study also examines the response and variations of regional ionospheric scintillation. The Rate of TEC Index (ROTI) was used to investigate ionospheric scintillation and was compared with the temporal dynamics of vertical TEC. The ROTI analysis confirmed the storm-induced TEC depletions, which suppressed the occurrence of ionospheric scintillation. To improve the spatial resolution of ionospheric anomaly monitoring, a regional Three-Dimensional (3D) ionospheric model was reconstructed using the Computerized Ionospheric Tomography (CIT) technique, capturing the spatial-temporal dynamics of ionospheric anomalies during the severe geomagnetic storm in detail. The IED varied dramatically with latitude and altitude; the maximum IED decreased, and the region of maximum IED moved southward.
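As a brief illustration of the ROTI metric used in this study: ROT is the time derivative of slant TEC (in TECU/min), and ROTI is the standard deviation of ROT over a 5-minute window. The sketch below applies this standard definition to a synthetic TEC series, since the CMONOC observations themselves are not reproduced here.

```python
# Sketch of the Rate of TEC Index (ROTI): ROT is the time derivative of
# slant TEC (TECU/min), and ROTI is the standard deviation of ROT over a
# 5-minute window. The TEC series here is synthetic, for illustration only.
import numpy as np

dt_sec = 30.0                                  # typical GNSS sampling interval
t = np.arange(0, 3600, dt_sec)                 # one hour of epochs
tec = 20 + 0.5 * np.sin(2 * np.pi * t / 1800)  # synthetic slant TEC (TECU)
tec += 0.05 * np.random.default_rng(0).standard_normal(t.size)  # receiver noise

rot = np.diff(tec) / (dt_sec / 60.0)           # rate of TEC, TECU/min

window = int(300 / dt_sec)                     # 5-minute window = 10 samples
roti = np.array([rot[i:i + window].std()
                 for i in range(0, rot.size - window + 1, window)])
print("ROTI per 5-min window (TECU/min):", np.round(roti, 3))
```

Elevated ROTI values flag rapid TEC fluctuations associated with scintillation, which is why the study uses ROTI alongside vertical TEC dynamics.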

