Three Reagents for In-Solution Enrichment of Ancient Human DNA at More than a Million SNPs

2022
Author(s): Nadin Rohland, Swapan Mallick, Matthew Mah, Robert M. Maier, Nick J. Patterson, et al.

In-solution enrichment for hundreds of thousands of single nucleotide polymorphisms (SNPs) has been the source of >70% of all genome-scale ancient human DNA data published to date. This approach has made it possible to generate data at one to two orders of magnitude lower cost than random shotgun sequencing, making it economical to study ancient samples with low proportions of human DNA and increasing the rate of conversion of sampled remains into working data, thereby facilitating ethical stewardship of human remains. So far, nearly all ancient DNA data obtained using in-solution enrichment has been generated using a set of bait sequences targeting about 1.24 million SNPs (the 1240k reagent). These sequences were published in 2015, but synthesis of the reagent has been cost-effective for only a few laboratories. In 2021, two companies made available reagents that target the same core set of SNPs along with supplementary content. Here, we test the properties of the three reagents on a common set of 27 ancient DNA libraries spanning a range of richness of DNA content and percentages of human molecules. All three reagents are highly effective at enriching many hundreds of thousands of SNPs. For all three reagents and a wide range of conditions, one round of enrichment produces data that are as useful as two rounds when tens of millions of sequences are read out, as is typical for such experiments. In our testing, the Twist Ancient DNA reagent produces the highest coverage, the greatest uniformity on targeted positions, and almost no bias toward enriching one allele more than another relative to shotgun sequencing. Allelic bias in 1240k enrichment has made it challenging to carry out joint analysis of these data with shotgun data, creating a situation where the ancient DNA community has been publishing two important bodies of data that cannot easily be co-analyzed by population genetic methods. To address this challenge, we introduce a subset of hundreds of thousands of SNPs for which 1240k data can be effectively co-analyzed with all other major data types.
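
The allelic-bias comparison lends itself to a quick sanity check. Below is a minimal Python sketch, assuming a hypothetical tab-separated file snp_counts.tsv with per-SNP reference and alternative read counts for a capture library and a shotgun library from the same individual; at heterozygous sites an unbiased reagent should give reference-allele fractions close to the shotgun values.

```python
# Minimal sketch: compare per-SNP reference-allele fractions between a
# captured library and shotgun data. File name and columns are hypothetical:
# snp_counts.tsv with columns snp_id, capture_ref, capture_alt, shotgun_ref, shotgun_alt.
import csv

def ref_fraction(ref: int, alt: int):
    total = ref + alt
    return ref / total if total > 0 else None

capture_fracs, shotgun_fracs = [], []
with open("snp_counts.tsv") as fh:
    for row in csv.DictReader(fh, delimiter="\t"):
        c = ref_fraction(int(row["capture_ref"]), int(row["capture_alt"]))
        s = ref_fraction(int(row["shotgun_ref"]), int(row["shotgun_alt"]))
        if c is not None and s is not None:
            capture_fracs.append(c)
            shotgun_fracs.append(s)

# Mean reference-allele fraction; at heterozygous sites an unbiased
# experiment is expected to sit near 0.5 in both columns.
mean_capture = sum(capture_fracs) / len(capture_fracs)
mean_shotgun = sum(shotgun_fracs) / len(shotgun_fracs)
print(f"capture: {mean_capture:.3f}  shotgun: {mean_shotgun:.3f}  "
      f"bias: {mean_capture - mean_shotgun:+.3f}")
```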

2019
Vol. 36 (1), pp. 26-32
Author(s): Davoud Torkamaneh, Jérôme Laroche, Brian Boyle, François Belzile

Abstract
Motivation: Identification of DNA sequence variations such as single nucleotide polymorphisms (SNPs) is a fundamental step toward genetic studies. Reduced-representation sequencing methods have been developed as alternatives to whole-genome sequencing to reduce costs and enable the analysis of many more individuals. Amongst these methods, restriction site associated sequencing (RSAS) methodologies have been widely used for rapid and cost-effective discovery of SNPs and for high-throughput genotyping in a wide range of species. Despite extensive improvements to RSAS methods in the last decade, estimating the number of reads (i.e. read depth) required per sample for efficient and effective genotyping remains based mostly on trial and error.
Results: Herein we describe a bioinformatics tool, DepthFinder, designed to estimate the required read counts for RSAS methods. To illustrate its performance, we estimated required read counts in six different species (human, cattle, spruce budworm, salmon, barley and soybean) that cover a range of different biological (genome size, level of genome complexity, level of DNA methylation and ploidy) and technical (library preparation protocol and sequencing platform) factors. To assess the prediction accuracy of DepthFinder, we compared DepthFinder-derived results with independent datasets obtained from an RSAS experiment. This analysis yielded an estimated accuracy of nearly 94%. Moreover, we present DepthFinder as a powerful tool to predict the most effective size-selection interval in RSAS work. We conclude that DepthFinder constitutes an efficient, reliable and useful tool for a broad array of users in different research communities.
Availability and implementation: https://bitbucket.org/jerlar73/DepthFinder
Supplementary information: Supplementary data are available at Bioinformatics online.
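
As a rough illustration of the kind of calculation such a tool performs (this is not DepthFinder's actual model), the sketch below estimates the reads required per sample from genome size, the restriction enzyme's recognition-site length, the fraction of fragments retained by size selection, and a target per-locus depth, assuming a random genome with equal base frequencies.

```python
# Toy estimate of reads required per sample for an RSAS experiment.
# A simplified illustration, NOT DepthFinder's actual model: it assumes
# a random genome with equal base frequencies and ignores methylation,
# repeat content and ploidy.

def expected_cut_sites(genome_size_bp: int, site_len: int) -> float:
    # A specific k-bp recognition sequence occurs with probability (1/4)^k.
    return genome_size_bp * (0.25 ** site_len)

def required_reads(genome_size_bp: int, site_len: int,
                   size_selected_fraction: float, target_depth: float) -> float:
    # One fragment spans each pair of adjacent cut sites; size selection
    # retains only a fraction of them, and each retained fragment yields
    # a locus that must be covered ~target_depth times.
    fragments = expected_cut_sites(genome_size_bp, site_len)
    retained = fragments * size_selected_fraction
    return retained * target_depth

# Example: a soybean-sized genome (~1.1 Gbp), a 6-bp recognition site,
# 20% of fragments in the size-selection window, 10x target depth.
print(f"{required_reads(1_100_000_000, 6, 0.20, 10.0):,.0f} reads")
```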


2020
Author(s): Luise Schulte, Nadine Bernhardt, Kathleen Stoof-Leichsenring, Heike Zimmermann, Luidmila Pestryakova, et al.

Siberian larch (Larix Mill.) forests dominate vast areas of northern Russia and contribute important ecosystem services to the earth. To be able to predict the future responses of these forests to a changing climate, it is important to also understand the past dynamics of larch populations. One well-preserved archive for studying past vegetation change is sedimentary ancient DNA (sedaDNA) extracted from lake sediment cores. We studied a lake sediment core covering 6700 calibrated years BP from the Taymyr region in northern Siberia. To enrich the sedaDNA for DNA of our focal species Larix, we combined shotgun sequencing and hybridization capture with long-range PCR-generated baits covering the complete Larix chloroplast genome. In comparison to shotgun sequencing, hybridization capture results in an increase in taxonomically classified reads by several orders of magnitude and the recovery of near-complete chloroplast genomes of Larix. Variation in the chloroplast reads confirms an invasion of Larix gmelinii into the range of Larix sibirica before 6700 years ago. Throughout this time span, both species can be detected at the site, although larch populations have decreased from a forested area to single-tree tundra at present. This study demonstrates for the first time that hybridization capture applied to ancient DNA from lake sediments can provide genome-scale information and is a viable tool for studying past changes in a specific taxon.
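
To make the enrichment claim concrete, a minimal sketch of the fold-enrichment calculation is given below; the read counts are placeholders, not values from the study, and enrichment is computed per sequenced read so that differing library depths do not distort the ratio.

```python
# Minimal sketch: fold enrichment of taxonomically classified reads in a
# capture library relative to a shotgun library from the same extract.
# The counts below are placeholders, not values from the study.

def classified_fraction(classified_reads: int, total_reads: int) -> float:
    return classified_reads / total_reads

shotgun = classified_fraction(classified_reads=150, total_reads=40_000_000)
capture = classified_fraction(classified_reads=900_000, total_reads=30_000_000)

# Comparing per-read fractions rather than raw counts keeps differences
# in sequencing depth between the two libraries from distorting the ratio.
print(f"fold enrichment: {capture / shotgun:,.0f}x")
```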


2021
Vol. 48 (2)
Author(s): Muhammad U. Ghani, Muhammad F. Sabar, Muhammad Akram, et al.

Single nucleotide polymorphisms (SNPs) are a prime focus of genomic studies for their probable roles in the diagnosis and prognosis of diseases and in forensic science. SNaPshot/minisequencing-based genotyping of targeted SNPs is a method of choice due to its fast and reliable detection. Here we describe smart modifications to the minisequencing reaction that make it cost-effective to detect 15 SNPs in a single assay. The target SNPs were amplified in a multiplex PCR from genomic DNA, and these multiplex PCR amplicons were used as templates in the modified SNaPshot reaction for SNP identification. The modified protocol was assessed for reproducibility on more than 50 human DNA samples, and the modified method proved at least five times more productive than the original protocol recommended by the manufacturer. The current smart modifications to the SNaPshot reaction were successfully optimized for asthma-susceptibility SNPs; however, they can be applied for cost-effective genotyping of any genomic single nucleotide polymorphism.
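
The decoding step of such an assay can be illustrated with a short sketch. In SNaPshot multiplexing, each SNP is interrogated by an extension primer of a distinct length, so products separate by size in capillary electrophoresis while the dye of the single incorporated ddNTP reports the allele. The primer lengths, SNP names, and peak data below are hypothetical.

```python
# Minimal sketch of decoding a multiplexed SNaPshot readout: product size
# (extension-primer length) identifies the SNP, and the base call from the
# dye colour identifies the allele. All values here are hypothetical.

# Map electrophoresis product size (nt) -> SNP name.
SIZE_TO_SNP = {24: "rs1042713", 30: "rs1800925", 36: "rs20541"}

# Each observed peak: (product size, base call from dye colour).
peaks = [(24, "A"), (24, "G"),   # two peaks at 24 nt: heterozygote
         (30, "C"),              # single peak: homozygote
         (36, "T")]

calls = {}
for size, base in peaks:
    snp = SIZE_TO_SNP.get(size)
    if snp is not None:
        calls.setdefault(snp, set()).add(base)

for snp, alleles in sorted(calls.items()):
    # Two distinct bases -> heterozygote; one base -> homozygote.
    genotype = "/".join(sorted(alleles)) if len(alleles) > 1 else next(iter(alleles)) * 2
    print(snp, genotype)
```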


Author(s): Luise Schulte, Nadine Bernhardt, Kathleen R. Stoof-Leichsenring, Heike H. Zimmermann, Luidmila A. Pestryakova, et al.

Abstract
Siberian larch (Larix Mill.) forests dominate vast areas of northern Russia and contribute important ecosystem services to the world. It is important to understand the past dynamics of larches in order to predict their likely response to a changing climate in the future. Sedimentary ancient DNA extracted from lake sediment cores can serve as an archive to study past vegetation. However, the traditional method of studying sedimentary ancient DNA – metabarcoding – focuses on small fragments which can neither resolve Larix to species level nor allow the detailed study of population dynamics. Here we use shotgun sequencing and hybridization capture with long-range PCR-generated baits covering the complete Larix chloroplast genome to study Larix populations from a sediment core reaching back up to 6700 years in age from the Taymyr region in northern Siberia. In comparison to shotgun sequencing, hybridization capture results in an increase in taxonomically classified reads by several orders of magnitude and the recovery of near-complete chloroplast genomes of Larix. Variation in the chloroplast reads corroborates an invasion of Larix gmelinii into the range of Larix sibirica before 6700 years ago. Since then, both species have been present at the site, although larch populations have decreased, with only a few trees remaining in what was once a forested area. This study demonstrates for the first time that hybridization capture applied to ancient DNA from lake sediments can provide genome-scale information and is a viable tool for studying past changes in a specific taxon.
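
As an illustration of how chloroplast variation can separate the two species, the sketch below tallies read support at species-diagnostic positions for a single sediment layer; the positions, alleles, and counts are placeholders rather than variants identified in the study.

```python
# Minimal sketch: assign chloroplast reads to Larix species using
# diagnostic positions. Positions, alleles and counts are placeholders,
# not the variants identified in the study.

# Chloroplast position -> (L. gmelinii base, L. sibirica base).
DIAGNOSTIC = {10_412: ("A", "G"), 57_230: ("T", "C"), 98_101: ("G", "A")}

# Observed base counts per diagnostic position for one sediment layer,
# e.g. extracted from a pileup of capture reads.
observed = {10_412: {"A": 14, "G": 3},
            57_230: {"T": 9, "C": 2},
            98_101: {"G": 11, "A": 5}}

gmelinii = sibirica = 0
for pos, (g_base, s_base) in DIAGNOSTIC.items():
    counts = observed.get(pos, {})
    gmelinii += counts.get(g_base, 0)
    sibirica += counts.get(s_base, 0)

total = gmelinii + sibirica
print(f"L. gmelinii support: {gmelinii/total:.2f}, "
      f"L. sibirica support: {sibirica/total:.2f}")
```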


BMC Genomics
2020
Vol. 21 (1)
Author(s): Tatiana R. Feuerborn, Eleftheria Palkopoulou, Tom van der Valk, Johanna von Seth, Arielle R. Munters, et al.

Abstract
Background: After over a decade of developments in field collection, laboratory methods and advances in high-throughput sequencing, contamination remains a key issue in ancient DNA research. Currently, human and microbial contaminant DNA still impose challenges on cost-effective sequencing and accurate interpretation of ancient DNA data.
Results: Here we investigate whether contaminating human DNA can be found in ancient faunal sequencing datasets. We identify variable levels of human contamination, which persists even after the sequence reads have been mapped to the faunal reference genomes. This contamination has the potential to affect a range of downstream analyses.
Conclusions: We propose a fast and simple method, based on competitive mapping, which allows identifying and removing human contamination from ancient faunal DNA datasets with limited losses of true ancient data. This method could represent an important tool for the ancient DNA field.
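
A minimal sketch of the competitive-mapping idea is shown below, assuming reads have already been aligned to a combined reference in which human contigs carry a distinguishing prefix (here "hg38_", an assumption of this sketch). It uses pysam and simply discards reads whose best alignment is to a human contig; this illustrates the general approach, not the authors' released tool.

```python
# Minimal sketch of competitive mapping: reads were aligned to a combined
# faunal + human reference, with human contigs prefixed "hg38_" (an
# assumption of this sketch); reads whose best placement is on a human
# contig are treated as contamination and removed.
import pysam  # assumes pysam is installed

IN_BAM, OUT_BAM = "combined_ref.bam", "faunal_only.bam"

with pysam.AlignmentFile(IN_BAM, "rb") as bam, \
     pysam.AlignmentFile(OUT_BAM, "wb", template=bam) as out:
    kept = dropped = 0
    for read in bam:
        if read.is_unmapped:
            continue
        # The aligner has already chosen each read's best placement.
        if read.reference_name.startswith("hg38_"):
            dropped += 1
        else:
            out.write(read)
            kept += 1

print(f"kept {kept} faunal reads, removed {dropped} putative human reads")
```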


2020
pp. 1192-1198
Author(s): M.S. Mohammad, Tibebe Tesfaye, Kim Ki-Seong

Ultrasonic thickness gauges are easy to operate and reliable, and can be used to measure a wide range of thicknesses and inspect all engineering materials. Supplementing the simple ultrasonic thickness gauges, which present results either as a digital readout or as an A-scan, with systems that correlate the measured values to their positions on the inspected surface to produce a two-dimensional (2D) thickness representation can extend their benefits and provide a cost-effective alternative to expensive advanced C-scan machines. In previous work, the authors introduced a system for positioning and mapping the values measured by ultrasonic thickness gauges and flaw detectors (Tesfaye et al. 2019). The system is an alternative to systems that use mechanical scanners, encoders, and sophisticated UT machines. It used a camera to record the probe's movement and a projected laser grid, produced by a laser pattern generator, to locate the probe on the inspected surface. In this paper, a novel system is proposed that can be applied to flat surfaces and that overcomes the other limitations arising from the use of laser projection. The proposed system uses two video cameras, one to monitor the probe's movement on the inspected surface and the other to capture the corresponding digital readout of the thickness gauge. The acquired images of the probe's position and thickness gauge readout are processed to plot the measured data in a 2D color-coded map. The system is meant to be simpler and more effective than the previous development.
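
A minimal sketch of the final plotting step is given below, assuming (x, y, thickness) samples have already been recovered from the two video streams; the sample values are placeholders. Scattered readings are interpolated onto a regular grid and rendered as a color-coded map.

```python
# Minimal sketch: turn (x, y, thickness) samples recovered from the two
# video streams into a 2D colour-coded thickness map. The sample values
# are placeholders for readings parsed from the camera images.
import numpy as np
import matplotlib.pyplot as plt
from scipy.interpolate import griddata

# (x mm, y mm, thickness mm) samples along the probe's path.
samples = np.array([[10, 10, 9.8], [40, 12, 9.6], [70, 15, 8.9],
                    [20, 45, 9.7], [55, 50, 7.5], [85, 48, 9.2],
                    [15, 80, 9.9], [50, 85, 9.4], [90, 82, 9.0]])

# Interpolate the scattered readings onto a regular grid for display.
gx, gy = np.meshgrid(np.linspace(0, 100, 200), np.linspace(0, 100, 200))
gz = griddata(samples[:, :2], samples[:, 2], (gx, gy), method="linear")

plt.pcolormesh(gx, gy, gz, cmap="jet", shading="auto")
plt.colorbar(label="Thickness (mm)")
plt.xlabel("x (mm)")
plt.ylabel("y (mm)")
plt.title("2D thickness map")
plt.show()
```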


Author(s): Allan Matthews, Adrian Leyland

Over the past twenty years or so, there have been major steps forward both in the understanding of tribological mechanisms and in the development of new coating and treatment techniques to better “engineer” surfaces to achieve reductions in wear and friction. Particularly in the coatings tribology field, improved techniques and theories which enable us to study and understand the mechanisms occurring at the “nano”, “micro” and “macro” scale have allowed considerable progress to be made in (for example) understanding contact mechanisms and the influence of “third bodies” [1–5]. Over the same period, we have seen the emergence of the discipline which we now call “Surface Engineering”, by which, ideally, a bulk material (the ‘substrate’) and a coating are combined in a way that provides a cost-effective performance enhancement of which neither would be capable without the presence of the other. It is probably fair to say that the emergence and recognition of Surface Engineering as a field in its own right has been driven largely by the availability of “plasma”-based coating and treatment processes, which can provide surface properties which were previously unachievable. In particular, plasma-assisted (PA) physical vapour deposition (PVD) techniques, allowing wear-resistant ceramic thin films such as titanium nitride (TiN) to be deposited on a wide range of industrial tooling, gave a step-change in industrial productivity and manufactured product quality, and caught the attention of engineers due to the remarkable cost savings and performance improvements obtained. Subsequently, so-called 2nd- and 3rd-generation ceramic coatings (with multilayered or nanocomposite structures) have recently been developed [6–9], to further extend tool performance — the objective typically being to increase coating hardness further, or extend hardness capabilities to higher temperatures.


Biostatistics
2019
Author(s): Dane R. Van Domelen, Emily M. Mitchell, Neil J. Perkins, Enrique F. Schisterman, Amita K. Manatunga, et al.

Summary
Measuring a biomarker in pooled samples from multiple cases or controls can lead to cost-effective estimation of a covariate-adjusted odds ratio, particularly for expensive assays. But pooled measurements may be affected by assay-related measurement error (ME) and/or pooling-related processing error (PE), which can induce bias if ignored. Building on recently developed methods for a normal biomarker subject to additive errors, we present two related estimators for a right-skewed biomarker subject to multiplicative errors: one based on logistic regression and the other based on a Gamma discriminant function model. Applied to a reproductive health dataset with a right-skewed cytokine measured in pools of size 1 and 2, both methods suggest no association with spontaneous abortion. The fitted models indicate little ME but fairly severe PE, the latter of which is much too large to ignore. Simulations mimicking these data with a non-unity odds ratio confirm the validity of the estimators and illustrate how PE can detract from pooling-related gains in statistical efficiency. These methods address a key issue associated with the homogeneous-pools study design and should facilitate valid odds ratio estimation at lower cost in a wide range of scenarios.
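
A toy simulation (not the authors' estimators) can illustrate how multiplicative processing error erodes the precision gained from pooling: a right-skewed Gamma biomarker is averaged in pools of size 2, a mean-one lognormal error multiplies each pooled measurement, and the spread of the resulting sample-mean estimate is compared with and without the error.

```python
# Toy simulation (not the authors' estimators): a right-skewed biomarker
# is measured in pools of size 2 with multiplicative processing error (PE),
# and we compare the sampling variability of the estimated biomarker mean
# with and without PE.
import numpy as np

rng = np.random.default_rng(42)
n_pools, pool_size, n_rep, sigma_pe = 250, 2, 2000, 0.3

means_no_pe, means_pe = [], []
for _ in range(n_rep):
    # Right-skewed (Gamma) biomarker, pooled by averaging pool members.
    x = rng.gamma(shape=2.0, scale=1.0, size=(n_pools, pool_size))
    pooled = x.mean(axis=1)
    # Multiplicative, mean-one lognormal processing error on each pool.
    pe = rng.lognormal(mean=-0.5 * sigma_pe**2, sigma=sigma_pe, size=n_pools)
    means_no_pe.append(pooled.mean())
    means_pe.append((pooled * pe).mean())

print(f"sd of biomarker-mean estimate, no PE:   {np.std(means_no_pe):.4f}")
print(f"sd of biomarker-mean estimate, with PE: {np.std(means_pe):.4f}")
```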


Author(s): Gary Sutlieff, Lucy Berthoud, Mark Stinchcombe

Abstract
CBRN (Chemical, Biological, Radiological, and Nuclear) threats are becoming more prevalent, as more entities gain access to modern weapons, industrial technologies, and chemicals. This has produced a need for improvements to the modelling, detection, and monitoring of these events. While there are currently no dedicated satellites for CBRN purposes, there is a wide range of possibilities for satellite data to contribute to this field, from atmospheric composition and chemical detection to cloud cover, land mapping, and surface property measurements. This study looks at currently available satellite data, including meteorological data such as wind and cloud profiles, surface properties like temperature and humidity, chemical detection, and sounding. Results of this survey revealed several gaps in the available data, particularly concerning biological and radiological detection. The results also suggest that publicly available satellite data largely does not meet the requirements of spatial resolution, coverage, and latency that CBRN detection requires, outside of providing terrain use and building height data for constructing models. Lastly, the study evaluates upcoming instruments, platforms, and satellite technologies to gauge the impact these developments will have in the near future. Improvements in spatial and temporal resolution as well as latency are already becoming possible, and new instruments will fill in the gaps in detection by imaging a wider range of chemicals and other agents and by collecting new data types. This study shows that, with developments coming within the next decade, satellites should begin to provide valuable augmentations to CBRN event detection and monitoring.
Article Highlights
- There is a wide range of existing satellite data in fields that are of interest to CBRN detection and monitoring.
- The data is mostly of insufficient quality (resolution or latency) for the demanding requirements of CBRN modelling for incident control.
- Future technologies and platforms will improve resolution and latency, making satellite data more viable in the CBRN management field.

