A Novel Metagenomic Workflow for Biomonitoring across the Tree of Life using PCR-free Ultra-deep Sequencing of Extracellular eDNA

Author(s):  
Shivakumara Manu ◽  
Govindhaswamy Umapathy

Biodiversity is declining on a planetary scale at an alarming rate due to anthropogenic factors. Classical biodiversity monitoring approaches are time-consuming, resource-intensive, and not scalable enough to address the current biodiversity crisis. The environmental DNA-based next-generation biomonitoring framework provides an efficient, scalable, and holistic solution for evaluating changes in various ecological entities. However, its scope is currently limited to monitoring targeted groups of organisms using metabarcoding, which suffers from various PCR-induced biases. To utilise the full potential of next-generation biomonitoring, we aimed to develop PCR-free genomic technologies that can deliver unbiased biodiversity data across the tree of life in a single assay. Here, we describe a novel metagenomic workflow comprising a customised extracellular DNA enrichment protocol for large-volume filtered water samples, a completely PCR-free library preparation step, ultra-deep next-generation sequencing, and a pseudo-taxonomic assignment strategy based on a dual lowest common ancestor algorithm. We demonstrate the utility of our approach in a pilot-scale, spatially replicated experimental setup in Chilika, a large, hyper-diverse brackish lagoon ecosystem in India. Using incidence-based statistics, we show that biodiversity across the tree of life, from microorganisms to relatively low-abundance macroorganisms such as arthropods and fishes, can be effectively detected with about one billion paired-end reads using our reproducible workflow. With the decreasing cost of sequencing and the increasing availability of genomic resources from the Earth BioGenome Project, our approach can be tested in different ecosystems and adapted for large-scale rapid assessment of biodiversity across the tree of life.
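The incidence-based reasoning can be illustrated with a standard richness estimator such as Chao2 computed over the spatial replicates. The sketch below is a generic example under that assumption, not the authors' analysis code; the function name and the toy incidence matrix are illustrative.

```python
# Illustrative sketch: incidence-based richness (bias-corrected Chao2) across
# spatial replicates, as one might apply to presence/absence detections derived
# from metagenomic read assignments. Not the authors' code; toy data only.
import numpy as np

def chao2(incidence):
    """Bias-corrected Chao2 richness estimate.

    incidence : 2D array (replicates x taxa) of 0/1 detections.
    """
    incidence = np.asarray(incidence, dtype=bool)
    m = incidence.shape[0]                      # number of sampling replicates
    detections = incidence.sum(axis=0)          # replicates in which each taxon occurs
    s_obs = int((detections > 0).sum())         # observed richness
    q1 = int((detections == 1).sum())           # taxa seen in exactly one replicate
    q2 = int((detections == 2).sum())           # taxa seen in exactly two replicates
    return s_obs + ((m - 1) / m) * q1 * (q1 - 1) / (2 * (q2 + 1))

# Toy example: 4 spatial replicates, 6 taxa (e.g., detections collapsed to genus level).
toy = [[1, 1, 0, 0, 1, 0],
       [1, 0, 1, 0, 0, 0],
       [1, 1, 0, 1, 0, 0],
       [1, 0, 0, 0, 0, 1]]
print(f"Chao2 estimate: {chao2(toy):.1f} taxa")
```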


2019 ◽  
Vol 25 (31) ◽  
pp. 3350-3357 ◽  
Author(s):  
Pooja Tripathi ◽  
Jyotsna Singh ◽  
Jonathan A. Lal ◽  
Vijay Tripathi

Background: With the advent of high-throughput next-generation sequencing (NGS), biological research in drug discovery has been directed towards the oncology and infectious disease therapeutic areas, with extensive use in biopharmaceutical development and vaccine production. Method: In this review, an effort was made to address the basic background of NGS technologies and the potential applications of NGS in drug design. Our purpose is also to provide a brief introduction to the various next-generation sequencing techniques. Discussion: The high-throughput methods execute Large-scale Unbiased Sequencing (LUS), which comprises Massively Parallel Sequencing (MPS) or NGS technologies. These are related terms describing a DNA sequencing technology that has revolutionized genomic research. Using NGS, an entire human genome can be sequenced within a single day. Conclusion: Analysis of NGS data unravels important clues in the quest for the treatment of various life-threatening diseases and other scientific problems related to human welfare.


Author(s):  
Mohsen Dadfarnia ◽  
Petros Sofronis ◽  
Ian Robertson ◽  
Brian P. Somerday ◽  
Govindarajan Muralidharan ◽  
...  

The technology for large-scale hydrogen transmission from central production facilities to refueling stations and stationary power sites is at present undeveloped. Among the problems confronting the implementation of this technology is the deleterious effect of hydrogen on structural material properties, in particular at a gas pressure of 1000 psi, the transmission pressure suggested by economic studies for efficient transport. In this paper, a hydrogen transport methodology for the calculation of hydrogen accumulation ahead of a crack tip in a pipeline steel is outlined. The approach accounts for stress-driven transient diffusion of hydrogen and trapping at microstructural defects whose density may evolve dynamically with deformation. The results are used to discuss a lifetime prediction methodology for failure of materials used for pipelines and welds exposed to high-pressure hydrogen. Developing such predictive capability and strategies is of paramount importance to the rapid assessment both of using the natural-gas pipeline distribution system for hydrogen transport and of the susceptibility of new alloys tailored for use in the new hydrogen economy.
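For background, transport formulations of this kind typically couple stress-driven lattice diffusion with trapping at microstructural defects. A commonly used Sofronis–McMeeking-type form, reproduced here as general context rather than taken from the paper (notation is illustrative), is:

```latex
% Lattice hydrogen concentration C_L and trapped population C_T, with
% diffusion driven by hydrostatic-stress gradients and Oriani-type trapping.
\frac{\partial C_L}{\partial t} + \frac{\partial C_T}{\partial t}
  = \nabla \cdot \left( D_L \nabla C_L \right)
  - \nabla \cdot \left( \frac{D_L C_L \bar{V}_H}{R T}\, \nabla \sigma_h \right),
\qquad
\frac{\theta_T}{1-\theta_T} = \frac{\theta_L}{1-\theta_L}\,
  \exp\!\left(\frac{W_B}{R T}\right)
```

Here D_L is the lattice diffusivity, V̄_H the partial molar volume of hydrogen, σ_h the hydrostatic stress, W_B the trap binding energy, and θ_L, θ_T the lattice and trap site occupancies; the trap density (with C_T proportional to θ_T) may be taken to evolve with plastic strain, consistent with the dynamically evolving defect density described above.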


2021 ◽  
Vol 17 (4) ◽  
pp. 1-21
Author(s):  
He Wang ◽  
Nicoleta Cucu Laurenciu ◽  
Yande Jiang ◽  
Sorin Cotofana

Design and implementation of artificial neuromorphic systems able to provide brain-akin computation and/or bio-compatible interfacing ability are crucial for understanding the human brain's complex functionality and unleashing the full potential of brain-inspired computation. To this end, the realization of energy-efficient, low-area, and bio-compatible artificial synapses, which sustain the signal transmission between neurons, is of particular interest for any large-scale neuromorphic system. Graphene is a prime candidate material with excellent electronic properties, atomic dimensions, and low-energy-envelope perspectives, and it has already proven effective for logic gate implementations. Furthermore, distinct from any other material used in current artificial synapse implementations, graphene is biocompatible, which offers perspectives for neural interfaces. In view of this, we investigate the feasibility of graphene-based synapses to emulate various synaptic plasticity behaviors and look into their potential area and energy consumption for large-scale implementations. In this article, we propose a generic graphene-based synapse structure that can emulate the fundamental synaptic functionalities, i.e., Spike-Timing-Dependent Plasticity (STDP) and Long-Term Plasticity. Additionally, the graphene synapse is programmable by means of a back-gate bias voltage and can exhibit both excitatory and inhibitory behavior. We investigate its capability to obtain different potentiation/depression time scales for STDP with identical synaptic weight change amplitude when the input spike duration varies. Our simulation results, for various synaptic plasticities, indicate that a maximum 30% synaptic weight change and potentiation/depression time scale ranges from [-1.5 ms, 1.1 ms] to [-32.2 ms, 24.1 ms] are achievable. We further explore the effect of our proposal at the Spiking Neural Network (SNN) level by performing NEST-based simulations of a small SNN implemented with 5 leaky integrate-and-fire neurons connected via graphene-based synapses. Our experiments indicate that the number of SNN firing events exhibits a strong connection with the synaptic plasticity type and varies monotonically with respect to the input spike frequency. Moreover, for graphene-based Hebbian STDP and a spike duration of 20 ms, we obtain an SNN behavior relatively similar to the one provided by the same SNN with biological STDP. The proposed graphene-based synapse requires a small area (max. 30 nm²), operates at low voltage (200 mV), and can emulate various plasticity types, which makes it an outstanding candidate for implementing large-scale brain-inspired computation systems.
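The SNN-level experiment can be reproduced in spirit with a minimal NEST script. The sketch below assumes the NEST 3.x Python API and uses the built-in stdp_synapse as a stand-in for the graphene-based synapse model; the rates, weights, and simulation time are illustrative, not the paper's values.

```python
# Minimal NEST 3.x sketch: 5 LIF neurons recurrently connected through STDP
# synapses, driven by Poisson input, with the total number of firing events
# recorded. The built-in stdp_synapse stands in for the graphene synapse.
import nest

nest.ResetKernel()

neurons = nest.Create("iaf_psc_alpha", 5)                         # 5 leaky integrate-and-fire neurons
noise = nest.Create("poisson_generator", params={"rate": 50.0})   # external input spikes
recorder = nest.Create("spike_recorder")

# Drive all neurons with the Poisson input via static synapses.
nest.Connect(noise, neurons, syn_spec={"weight": 200.0})

# Recurrent all-to-all connectivity through plastic (STDP) synapses, no self-connections.
nest.Connect(neurons, neurons,
             conn_spec={"rule": "all_to_all", "allow_autapses": False},
             syn_spec={"synapse_model": "stdp_synapse", "weight": 50.0})

nest.Connect(neurons, recorder)

nest.Simulate(1000.0)                                              # simulate 1 second
print("Number of network firing events:", recorder.get("n_events"))
```

Sweeping the generator rate in such a script is one way to probe how the firing-event count varies with input spike frequency, as discussed above.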


2010 ◽  
Vol 133 (3) ◽  
Author(s):  
Myung Gwan Hahm ◽  
Young-Kyun Kwon ◽  
Ahmed Busnaina ◽  
Yung Joon Jung

Due to their unique one-dimensional nanostructure along with excellent mechanical, electrical, and optical properties, carbon nanotubes (CNTs) have become a promising material for diverse nanotechnology applications. However, large-scale and structure-controlled synthesis of CNTs still faces many difficulties due to the lack of understanding of the fundamental growth mechanism of CNTs, as well as the difficulty of controlling atomic-scale physical and chemical reactions during the nanotube growth process. In particular, controlling the number of graphene walls, the diameter, and the chirality of CNTs are the most important issues that need to be solved to harness the full potential of CNTs. Here we report the large-scale selective synthesis of vertically aligned single-walled carbon nanotubes (SWNTs) and double-walled carbon nanotubes (DWNTs) by controlling the size of catalyst nanoparticles in a highly effective oxygen-assisted thermal chemical vapor deposition (CVD) process. We also demonstrate a simple but powerful strategy for synthesizing ultrahigh-density, diameter-selected vertically aligned SWNTs through the precise control of carbon flow during a thermal CVD process.


2021 ◽  
Vol 11 (1) ◽  
Author(s):  
Niloufar Nouri ◽  
Naresh Devineni ◽  
Valerie Were ◽  
Reza Khanbilvardi

The annual frequency of tornadoes during 1950–2018 across the major tornado-impacted states was examined and modeled using anthropogenic and large-scale climate covariates in a hierarchical Bayesian inference framework. Anthropogenic factors include increases in population density and better detection systems since the mid-1990s. Large-scale climate variables include the El Niño Southern Oscillation (ENSO), Southern Oscillation Index (SOI), North Atlantic Oscillation (NAO), Pacific Decadal Oscillation (PDO), Arctic Oscillation (AO), and Atlantic Multi-decadal Oscillation (AMO). The model provides a robust way of estimating the response coefficients by pooling information across groups of states belonging to Tornado Alley, Dixie Alley, and Other States, thereby reducing their uncertainty. The influences of the anthropogenic factors and the large-scale climate variables are modeled in a nested framework to disentangle the secular trend from cyclical variability. Population density explains the long-term trend in Dixie Alley. The step increase induced by the installation of the Doppler radar systems explains the long-term trend in Tornado Alley. NAO and the interplay between NAO and ENSO explain the interannual to multi-decadal variability in Tornado Alley. PDO and AMO also contribute to this multi-time-scale variability. SOI and AO explain the cyclical variability in Dixie Alley. This improved understanding of the variability and trends in tornadoes should be of immense value to public planners, businesses, and insurance-based risk management agencies.
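As a rough illustration of partial pooling across groups of states, a generic hierarchical Poisson regression can be written in PyMC as below. This is only a sketch of the modelling idea, not the authors' specification; the covariates, group labels, priors, and data arrays are placeholders.

```python
# Generic sketch of a partially pooled Poisson regression for annual tornado
# counts: state-level coefficients are drawn from group-level (alley) priors,
# which shrinks noisy state estimates toward their group. Placeholder data.
import numpy as np
import pymc as pm

n_years, n_states, n_cov = 69, 12, 3                   # 1950-2018; toy state/covariate counts
rng = np.random.default_rng(0)
counts = rng.poisson(20, size=(n_years, n_states))     # placeholder annual tornado counts
X = rng.normal(size=(n_years, n_states, n_cov))        # e.g. population density, NAO, ENSO
group = np.repeat([0, 1, 2], 4)                        # 0 = Tornado Alley, 1 = Dixie Alley, 2 = Other

with pm.Model() as model:
    # Group-level (alley) hyperpriors shared by the member states.
    mu_alpha = pm.Normal("mu_alpha", 0.0, 2.0, shape=3)
    mu_beta = pm.Normal("mu_beta", 0.0, 1.0, shape=(3, n_cov))
    sigma = pm.HalfNormal("sigma", 1.0)

    # State-level intercepts and covariate coefficients, partially pooled.
    alpha = pm.Normal("alpha", mu_alpha[group], sigma, shape=n_states)
    beta = pm.Normal("beta", mu_beta[group], sigma, shape=(n_states, n_cov))

    # Log-rate per year/state: intercept plus covariate effects.
    log_rate = alpha + (X * beta).sum(axis=-1)
    pm.Poisson("obs", mu=pm.math.exp(log_rate), observed=counts)

    idata = pm.sample(1000, tune=1000, target_accept=0.9)
```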


2021 ◽  
Vol MA2021-02 (30) ◽  
pp. 944-944
Author(s):  
Keita Sahara ◽  
Ryo Yokogawa ◽  
Tappei Nishihara ◽  
Naomi Sawamoto ◽  
Tianzhuo Zhan ◽  
...  

PLoS ONE ◽  
2015 ◽  
Vol 10 (10) ◽  
pp. e0139868 ◽  
Author(s):  
Mohan A. V. S. K. Katta ◽  
Aamir W. Khan ◽  
Dadakhalandar Doddamani ◽  
Mahendar Thudi ◽  
Rajeev K. Varshney

2021 ◽  
Vol 11 (22) ◽  
pp. 10537
Author(s):  
Adi A. AlQudah ◽  
Mostafa Al-Emran ◽  
Khaled Shaalan

Understanding the factors affecting the use of healthcare technologies is a crucial topic that has been extensively studied, specifically during the last decade. These factors have been studied using different technology acceptance models and theories. However, a systematic review that offers extensive insight into what affects the acceptance of healthcare technologies and services and covers distinctive trends in large-scale research remains lacking. Therefore, this review aims to systematically review the articles published on technology acceptance in healthcare. From a yield of 1768 studies collected, 142 empirical studies met the eligibility criteria and were extensively analyzed. The key findings confirmed that TAM and UTAUT are the most prevailing models for explaining what affects the acceptance of various healthcare technologies across different user groups, settings, and countries. Apart from the core constructs of TAM and UTAUT, the results showed that anxiety, computer self-efficacy, innovativeness, and trust are the most influential factors affecting the acceptance of various healthcare technologies. The results also revealed that Taiwan and the USA are leading the research on technology acceptance in healthcare, with a remarkable increase in studies focusing on telemedicine and electronic medical records solutions. This review is believed to enhance our understanding through a number of theoretical contributions and practical implications by unveiling the full potential of technology acceptance in healthcare and opening the door for further research opportunities.


Author(s):  
N. I. Mikshis ◽  
P. Yu. Popova ◽  
A. P. Semakova ◽  
V. V. Kutyrev

The high pathogenicity of the anthrax agent, combined with the unique insensitivity of its spore forms to environmental stresses, places it among extremely dangerous biological agents. Registered and effectively used anthrax vaccines have made an invaluable contribution to the improvement of the epidemiological situation around the world. Nevertheless, neglect of non-specific prophylaxis may result in dramatic scenarios and require large-scale measures to rectify the consequences. Efforts to develop next-generation vaccines are aimed at improving safety, decreasing the frequency of administration, and enhancing manufacturing technologies. The review contains the key information on licensed anthrax vaccines designed for medical use, both in the territory of the Russian Federation and abroad. Among the multiple experimental developments, emphasis is placed on preparations manufactured by various biopharmaceutical companies in compliance with GMP standards that were at different phases of clinical trials in 2016.

