A flexible ChIP-sequencing simulation toolkit

2021 ◽  
Vol 22 (1) ◽  
Author(s):  
An Zheng ◽  
Michael Lamkin ◽  
Yutong Qiu ◽  
Kevin Ren ◽  
Alon Goren ◽  
...  

Abstract
Background: A major challenge in evaluating quantitative ChIP-seq analyses, such as peak calling and differential binding, is a lack of reliable ground truth data. Accurate simulation of ChIP-seq data can mitigate this challenge, but existing frameworks are either too cumbersome to apply genome-wide or unable to model a number of important experimental conditions in ChIP-seq.
Results: We present ChIPs, a toolkit for rapidly simulating ChIP-seq data using statistical models of key experimental steps. We demonstrate how ChIPs can be used for a range of applications, including benchmarking analysis tools and evaluating the impact of various experimental parameters. ChIPs is implemented as a standalone command-line program written in C++ and is available from https://github.com/gymreklab/chips.
Conclusions: ChIPs is an efficient ChIP-seq simulation framework that generates realistic datasets over a flexible range of experimental conditions. It can serve as an important component in various ChIP-seq analyses where ground truth data are needed.
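The core simulation idea, drawing sequencing reads from a mixture of a binding-site model and a uniform background, can be sketched in a few lines. This is an illustrative toy, not ChIPs itself; the peak positions, pulldown efficiency, and fragment-size parameters below are all invented:

```python
import numpy as np

rng = np.random.default_rng(0)

GENOME_LEN = 100_000        # toy genome length (bp)
FRAG_MEAN, FRAG_SD = 200, 20
N_READS = 20_000
PULLDOWN_EFFICIENCY = 0.7   # fraction of reads drawn from bound regions

# Hypothetical ground-truth binding sites: (position, relative strength)
peaks = np.array([[10_000, 1.0], [40_000, 0.5], [75_000, 2.0]])
weights = peaks[:, 1] / peaks[:, 1].sum()

def simulate_reads(n):
    """Return 5' read-start positions for a toy ChIP experiment."""
    from_peak = rng.random(n) < PULLDOWN_EFFICIENCY
    n_peak = int(from_peak.sum())
    # Reads from binding events: fragment roughly centered on a peak
    centers = rng.choice(peaks[:, 0], size=n_peak, p=weights)
    frag_len = rng.normal(FRAG_MEAN, FRAG_SD, size=n_peak)
    starts = centers - frag_len / 2 + rng.normal(0, 30, size=n_peak)
    # Background reads: uniform over the genome
    bg = rng.uniform(0, GENOME_LEN, size=n - n_peak)
    return np.clip(np.concatenate([starts, bg]), 0, GENOME_LEN - 1)

reads = simulate_reads(N_READS)
coverage, _ = np.histogram(reads, bins=GENOME_LEN // 100,
                           range=(0, GENOME_LEN))  # 100 bp bins
print("bin with most reads starts at bp:", coverage.argmax() * 100)
```

Because the simulated peak positions are known, any peak caller run on such data can be scored against an exact ground truth, which is the point the abstract makes.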

2019 ◽  
Author(s):  
An Zheng ◽  
Michael Lamkin ◽  
Yutong Qiu ◽  
Kevin Ren ◽  
Alon Goren ◽  
...  

Abstract
A major challenge in evaluating quantitative ChIP-seq analyses, such as peak calling and differential binding, is a lack of reliable ground truth data. We present Tulip, a toolkit for rapidly simulating ChIP-seq data using statistical models of the experimental steps. Tulip may be used for a range of applications, including power analysis for experimental design, benchmarking of analysis tools, and modeling effects of processes such as replication on ChIP-seq signals.


2020 ◽  
Author(s):  
Gabriel Wright ◽  
Anabel Rodriguez ◽  
Jun Li ◽  
Patricia L. Clark ◽  
Tijana Milenković ◽  
...  

Abstract
Improved computational modeling of protein translation rates, including better prediction of where translational slowdowns along an mRNA sequence may occur, is critical for understanding co-translational folding. Because codons within a synonymous codon group are translated at different rates, many computational translation models rely on analyzing synonymous codons. Some models rely on genome-wide codon usage bias (CUB), on the premise that globally rare and common codons are the most informative of slow and fast translation, respectively. Others use the CUB observed only in highly expressed genes, which should be under selective pressure to be translated efficiently (and whose CUB may therefore be more indicative of translation rates). No prior work has analyzed these models for their ability to predict translational slowdowns. Here, we evaluate five models for their association with slowly translated positions as denoted by two independent ribosome footprint (RFP) count experiments from S. cerevisiae, because RFP data is often considered a “ground truth” for translation rates across mRNA sequences. We show that all five considered models strongly associate with the RFP data and therefore have potential for estimating translational slowdowns. However, we also show that there is a weak correlation between RFP counts for the same genes originating from independent experiments, even when their experimental conditions are similar. This raises concerns about the efficacy of using current RFP experimental data for estimating translation rates and highlights a potential advantage of using computational models to understand translation rates instead.
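A CUB-based slowdown score and its rank association with footprint counts can be sketched as follows. The sequence and the RFP counts below are synthetic stand-ins (not S. cerevisiae data), and the Poisson link between rarity and counts is an assumption made purely for illustration:

```python
import numpy as np
from collections import Counter

rng = np.random.default_rng(1)
BASES = "ACGT"
CODONS = [a + b + c for a in BASES for b in BASES for c in BASES]

# Toy coding sequence: codons drawn non-uniformly so some are genuinely rare
codon_probs = rng.dirichlet(np.ones(64))
seq_codons = rng.choice(CODONS, size=5_000, p=codon_probs)

# Genome-wide codon usage bias (CUB): observed frequency of each codon,
# with a pseudocount so unseen codons get a finite score
counts = Counter(seq_codons)
freq = {c: (counts.get(c, 0) + 1) / (len(seq_codons) + 64) for c in CODONS}

# Rarity score per position: rarer codon -> higher predicted slowdown
rarity = np.array([-np.log(freq[c]) for c in seq_codons])

# Hypothetical RFP counts: Poisson noise around a rarity-driven rate
rfp = rng.poisson(3.0 * rarity)

def spearman(x, y):
    """Spearman rank correlation via Pearson on ranks (no tie handling)."""
    rx = np.argsort(np.argsort(x))
    ry = np.argsort(np.argsort(y))
    return np.corrcoef(rx, ry)[0, 1]

rho = spearman(rarity, rfp)
print(f"rank correlation (rarity vs RFP): {rho:.2f}")
```

The same rank-correlation machinery, applied to two independent synthetic RFP replicates instead of score-vs-counts, would expose the replicate-to-replicate inconsistency the abstract reports.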


Author(s):  
N. Milisavljevic ◽  
D. Closson ◽  
F. Holecz ◽  
F. Collivignarelli ◽  
P. Pasquali

Land-cover changes generally occur in a progressive and gradual way, but they can also happen rapidly and abruptly. Very high resolution remotely sensed data acquired at different time intervals can help in analyzing the rate of change and its causal factors. In this paper, we present an approach for detecting changes related to disasters such as earthquakes and for mapping the impact zones. The approach is based on pieces of information derived from SAR (Synthetic Aperture Radar) data and on their combination. The case study is the 22 February 2011 Christchurch earthquake.

The identification of damaged or destroyed buildings using SAR data is a challenging task. The approach proposed here consists of finding amplitude changes as well as coherence changes before and after the earthquake, and then combining these changes in order to obtain richer and more robust information on the origin of the various types of changes possibly induced by an earthquake. This approach does not need any specific knowledge source about the terrain, but if such sources are present, they can easily be integrated into the method as more specific descriptions of the possible classes.

A special task in our approach is to develop a scheme that translates the obtained combinations of changes into ground information. Several algorithms are developed and validated using optical remote sensing images of the city taken two days after the earthquake, as well as our own ground-truth data. The validation results show that the proposed approach is promising.
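The amplitude-plus-coherence combination described above can be illustrated schematically. The sketch below uses synthetic images and hand-picked thresholds; all values are assumptions for illustration, not the paper's actual algorithm or data:

```python
import numpy as np

rng = np.random.default_rng(2)
H = W = 64

# Synthetic pre-/post-event amplitude images (placeholders for real SAR data)
pre = rng.gamma(4.0, 25.0, (H, W))
post = pre.copy()
post[20:40, 20:40] *= 3.0            # hypothetical damaged block: brighter
coherence = rng.uniform(0.6, 1.0, (H, W))
coherence[20:40, 20:40] = rng.uniform(0.0, 0.3, (20, 20))  # coherence loss

# Amplitude change: the log-ratio is standard for multiplicative speckle
log_ratio = np.log(post / pre)
amp_change = np.abs(log_ratio) > 0.5
coh_change = coherence < 0.4

# Combine the two sources: both indicators together suggest structural change
combined = amp_change & coh_change
print("flagged pixels:", int(combined.sum()))
```

Keeping the two indicators separate before combining them is what lets the method distinguish classes of change (e.g. brighter-and-decorrelated rubble vs. amplitude-only changes), rather than collapsing everything into a single change mask up front.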


2019 ◽  
Author(s):  
Pakhrur Razi

Located in a mountainous area, the Kelok Sembilan flyover area in West Sumatra, Indonesia has a long history of land deformation, so continuous monitoring and analysis are necessary to minimize the impact. Land deformation occurs along this area notably in the rainy season. The zone is crucial as a center of transportation connections in the middle of Sumatra. The Quasi-Persistent Scatterer (Q-PS) Interferometry technique was applied to extract information on land deformation in the field over time. Not only does the method perform well in detecting land deformation, but it also increases the number of PS points, especially in non-urban areas. This research was supported by 90 scenes of Sentinel-1A (C-band) imagery taken from October 2014 to November 2017, for ascending and descending orbits with VV and VH polarization at 5 × 20 m (range × azimuth) resolution. Both satellite orbits detected two critical locations of land deformation, namely Zone A and Zone B, located on steep slopes with more than 500 mm of movement in the Line of Sight (LOS) during the acquisition period. Deformations in the vertical and horizontal directions are 778.9 mm and 795.7 mm for Zone A, and 730.5 mm and 751.7 mm for Zone B, respectively. Finally, the results were confirmed against ground truth data from Unmanned Aerial Vehicle (UAV) observations.
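Recovering vertical and horizontal motion from ascending and descending LOS measurements, as reported above, follows a standard two-equation decomposition. A minimal sketch, in which the displacement values and incidence angles are hypothetical rather than taken from the paper, and north-south sensitivity is ignored:

```python
import numpy as np

# Hypothetical LOS displacements (mm) and incidence angles for one point
d_asc, d_desc = 420.0, 380.0                      # ascending / descending LOS
theta_asc, theta_desc = np.deg2rad(39.0), np.deg2rad(41.0)

# Simplified geometry: for each orbit pass,
#   d_los = d_up * cos(theta) +/- d_east * sin(theta)
# (sign of the east term flips between ascending and descending looks)
A = np.array([
    [np.cos(theta_asc),  np.sin(theta_asc)],
    [np.cos(theta_desc), -np.sin(theta_desc)],
])
d_up, d_east = np.linalg.solve(A, [d_asc, d_desc])
print(f"vertical: {d_up:.1f} mm, east-west: {d_east:.1f} mm")
```

With both geometries available, the 2 × 2 system is well conditioned; with a single orbit direction only the LOS projection is observable, which is why the paper uses ascending and descending stacks together.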


Author(s):  
T. Wu ◽  
B. Vallet ◽  
M. Pierrot-Deseilligny ◽  
E. Rupnik

Abstract. Stereo dense matching is a fundamental task for 3D scene reconstruction. Recently, deep learning based methods have proven effective on several benchmark datasets, for example Middlebury and KITTI stereo. However, it is not easy to find a training dataset for aerial photogrammetry, as generating ground truth data for real scenes is a challenging task. In the photogrammetry community, many evaluation methods use digital surface models (DSM) to generate the ground truth disparity for stereo pairs, but in this case interpolation may introduce errors into the estimated disparity. In this paper, we publish a stereo dense matching dataset based on the ISPRS Vaihingen dataset and use it to evaluate several traditional and deep learning based methods. The evaluation shows that learning-based methods significantly outperform traditional methods when fine-tuning is done on a similar landscape. The benchmark also investigates the impact of the base-to-height ratio on the performance of the evaluated methods. The dataset can be found at https://github.com/whuwuteng/benchmark_ISPRS2021.
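Evaluating dense-matching output against ground-truth disparity typically reports an end-point error and a bad-pixel rate over valid pixels. A minimal sketch on synthetic maps; the noise level, threshold, and map sizes are illustrative assumptions, not the benchmark's protocol:

```python
import numpy as np

rng = np.random.default_rng(3)

# Toy ground-truth and estimated disparity maps (missing ground truth = NaN)
gt = rng.uniform(5, 60, (100, 100))
gt[rng.random(gt.shape) < 0.1] = np.nan
est = gt + rng.normal(0, 1.5, gt.shape)          # imperfect matcher output

def evaluate(gt, est, bad_thresh=3.0):
    """End-point error and bad-pixel rate over valid ground-truth pixels."""
    valid = ~np.isnan(gt)
    err = np.abs(gt[valid] - est[valid])
    return err.mean(), (err > bad_thresh).mean()

epe, bad3 = evaluate(gt, est)
print(f"EPE: {epe:.2f} px, bad-{3.0}: {100 * bad3:.1f} %")
```

Masking invalid pixels before scoring matters for DSM-derived ground truth in particular, since interpolated or occluded regions would otherwise contaminate both metrics.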


2021 ◽  
Author(s):  
John S H Danial ◽  
Yuri Quintana ◽  
Uris Ros ◽  
Raed Shalaby ◽  
Eleonora Germana Margheritis ◽  
...  

Analysis of single-molecule brightness allows subunit counting of high-order oligomeric biomolecular complexes. Although the theory behind the method has been extensively assessed, systematic analysis of the experimental conditions required to accurately quantify the stoichiometry of biological complexes remains challenging. In this work, we develop a high-throughput, automated computational pipeline for single-molecule brightness analysis that requires minimal human input. We use this strategy to systematically quantify the accuracy of counting under a wide range of experimental conditions in simulated ground-truth data, and then validate its use on experimentally obtained data. Our approach defines a set of conditions under which subunit counting by brightness analysis can be expected to work optimally, and helps establish the experimental limits of quantifying the number of subunits in a complex of interest. Finally, we combine these features into a powerful yet simple software package that can be easily used for stoichiometry analysis of such complexes.
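Brightness-based subunit counting rests on the idea that a spot's intensity is approximately the sum of its labeled fluorophores. A toy sketch of that principle follows; the monomer brightness, labeling efficiency, and spot counts are invented, and this is not the authors' pipeline:

```python
import numpy as np

rng = np.random.default_rng(4)

MONOMER = 1000.0      # mean brightness of one fluorophore (arbitrary units)
CV = 0.15             # relative brightness spread per fluorophore
N_SPOTS = 2000
true_n = 4            # hypothetical tetramer

# Each spot: sum of its fluorescent subunits; some subunits are dark
labeled = rng.binomial(true_n, 0.9, size=N_SPOTS)   # 90% labeling efficiency
brightness = np.array([
    rng.normal(MONOMER, CV * MONOMER, size=k).sum() if k else 0.0
    for k in labeled
])

# Estimate subunit count per spot from the calibrated monomer brightness
est_n = np.rint(brightness / MONOMER).astype(int)
counts = np.bincount(est_n, minlength=true_n + 1)
print("mode of estimated subunit count:", counts.argmax())
```

The sketch already shows why labeling efficiency and brightness spread limit the method: lowering either parameter blurs the per-spot estimates into neighboring integers, which is exactly the kind of condition the paper's simulations map out systematically.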


2021 ◽  
Vol 73 (04) ◽  
pp. 24-28
Author(s):  
Judy Feder

“We’re going to the Moon, and we’re going there to stay this time,” has become a NASA mantra as the US competes with other countries, including China and Russia (https://jpt.spe.org/esa-roscosmos-to-mine-oxygen-water-from-moon-rocks-as-nasa-eyes-first-artemis-lunar-mission), to be the first to put humans on the Moon and Mars. The race will rely heavily on using resources available on the planetary bodies - or in-situ resource utilization (ISRU). Chief among these is water, which has been called “the oil of space.” As NASA prepares for Artemis mission astronauts to land on the Moon in 2024, it will fly at least two preliminary missions to look for water and gather information about the lunar south pole. The Polar Resources Ice-Mining Experiment (PRIME-1) and Volatiles Investigating Polar Exploration Rover (VIPER) missions, which will be launched in late 2022 and 2023, respectively, will be the first missions to study ISRU on another celestial body. They will also mark the first time NASA will robotically sample and analyze for ice from below the surface. And they will use technologies transferred and adapted from oil and gas exploration.
Reconnaissance Missions
Data from nearly 3 decades of lunar orbiter and impactor missions suggest that the Moon’s “soils,” particularly at its south pole and other regions, could contain hundreds of millions of gallons of water that could eventually be harvested and converted to oxygen, fuel, or drinkable water for human use on the Moon, Mars, and beyond. But, at what concentrations? In what kinds of soils? And is the water in a form that’s accessible? Most of the information we have about the presence of water-ice on the Moon comes from orbital measurements. The only direct evidence acquired to date came in 2009 from a sensing satellite aboard a spacecraft that was purposely crashed in the Cabeus crater.
The material ejected as a result of the impact was analyzed with a spectrometer to reveal the presence of 5.6%±2.9% water-ice by mass. The form, distribution, composition, and quantity of the water-ice remain largely uncertain. The only way to reduce this uncertainty is to obtain ground-truth data by drilling exploratory boreholes in the crater. This will be the purpose of the PRIME-1 and VIPER missions. PRIME-1 will last a week to 10 days, during which a robot will deploy a drill and mass spectrometer to harvest and preliminarily evaluate moon-ice for quality and regional heights and to determine how much of the ice is lost to a process known as sublimation, wherein the water transforms directly from solid ice into vapor, rather than first going through a liquid phase. In addition to ice, PRIME-1 will gather samples including rock samples to help date the sequence of impact events on the Moon, core tube samples to capture ancient solar wind trapped in regolith layers (unconsolidated, inorganic rocky material), and paired samples of material to characterize the presence of volatiles and to assess geotechnical differences between materials inside and outside permanent shadows. The samples will be returned to Earth and studied to characterize and document the regional geology, including the small, permanently shadowed regions. The data from the mission will help scientists understand how a mobile robot to be used on the subsequent VIPER mission can search for water at the Moon’s pole, and how much water may be available to use as NASA plans to establish a sustainable human presence on the Moon by the end of the decade (Fig. 1).


2004 ◽  
Vol 3 (3) ◽  
pp. 265-271 ◽  
Author(s):  
D.P. Glavin ◽  
J.P. Dworkin ◽  
M. Lupisella ◽  
G. Kminek ◽  
J.D. Rummel

Chemical and microbiological studies of the impact of terrestrial contamination of the lunar surface during the Apollo missions could provide valuable data to help refine future Mars surface exploration plans and planetary protection requirements for a human mission to Mars. NASA and ESA have outlined new visions for solar system exploration that will include a series of lunar robotic missions to prepare for and support a human return to the Moon, and future human exploration of Mars and other destinations. Under the Committee on Space Research's (COSPAR's) current planetary protection policy for the Moon, no decontamination procedures are required for outbound lunar spacecraft. Nonetheless, future in situ investigations of a variety of locations on the Moon by highly sensitive instruments designed to search for biologically derived organic compounds would help assess the contamination of the Moon by lunar spacecraft and Apollo astronauts. These studies could also provide valuable ‘ground truth’ data for Mars sample return missions and help define planetary protection requirements for future Mars bound spacecraft carrying life detection experiments.


1993 ◽  
Vol 1993 (1) ◽  
pp. 141-145 ◽  
Author(s):  
Irving A. Mendelssohn ◽  
Mark W. Hester ◽  
John M. Hill

ABSTRACT The impact of oil spills on coastal environments and the ability of these systems to exhibit long-term recovery has received increased attention in recent years. Although oil spills can have significant short-term impacts on coastal marshes, the long-term effects and eventual recovery are not well documented. Estuarine marshes have sometimes been reported to exhibit slow recovery after oil spills, whereas in other instances they appear to have great resiliency, with complete recovery after one or two years. To document and understand this phenomenon better, we have investigated the long-term recovery of a south Louisiana estuarine marsh exposed to an accidental spill of crude oil. Although a pipeline rupture releasing Louisiana crude oil caused the near complete mortality of a brackish marsh dominated by Spartina patens and S. alterniflora, this marsh completely recovered four years after the spill with no differences in plant species cover between oiled and reference marshes. Remotely sensed imagery of the study site confirmed the relatively rapid recovery demonstrated by the ground truth data. Louisiana's coastal marshes are naturally experiencing rapid rates of deterioration. Land loss rates, determined from aerial imagery, at the spill site and adjacent reference areas before and after the spill demonstrated that the long-term loss rates were not affected by the spill event.


Author(s):  
B.H. Magorrian ◽  
M. Service ◽  
W. Clarke

As part of an investigation into the impact of commercial trawling on the benthos of Strangford Lough a map of the distribution of the benthic communities in the Lough was required. To provide this an acoustic bottom classification survey of the Lough was carried out using a commercially available system, RoxAnn. RoxAnn processes the information from a conventional echo-sounder to determine the nature of different substrata. Underwater cameras were used to obtain ground truth data to compare with the RoxAnn data. Used in conjunction, the two surveys provided valuable information on the different bottom substrata and associated epibenthic communities present in the Lough.

