Overcoming High Nanopore Basecaller Error Rates for DNA Storage Via Basecaller-Decoder Integration and Convolutional Codes

2019 ◽  
Author(s):  
Shubham Chandak ◽  
Joachim Neu ◽  
Kedar Tatwawadi ◽  
Jay Mardia ◽  
Billy Lau ◽  
...  

Abstract: As magnetization- and semiconductor-based storage technologies approach their limits, bio-molecules such as DNA have been identified as promising media for future storage systems, due to their high storage density (petabytes/gram) and long-term durability (thousands of years). Furthermore, nanopore DNA sequencing enables high-throughput sequencing using devices as small as a USB thumb drive and is thus ideally suited for DNA storage applications. Due to the high insertion/deletion error rates associated with basecalled nanopore reads, current approaches rely heavily on consensus among multiple reads and thus incur very high reading costs. We propose a novel approach which overcomes the high error rates in basecalled sequences by integrating a Viterbi error correction decoder with the basecaller, enabling the decoder to exploit the soft information available in the deep-learning-based basecaller pipeline. Using convolutional codes for error correction, we experimentally observed 3x lower reading costs than state-of-the-art techniques at comparable writing costs. The code, data, and Supplementary Material are available at https://github.com/shubhamchandak94/nanopore_dna_storage.
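A minimal sketch of the convolutional-coding idea behind this abstract: a rate-1/2, constraint-length-3 encoder (generator polynomials 7 and 5 in octal) with a hard-decision Viterbi decoder. The paper's pipeline instead feeds the decoder soft information from the basecaller and must handle insertions/deletions; this toy illustrates only the code and substitution-error correction.

```python
G = [0b111, 0b101]  # generator polynomials (7, 5 octal), constraint length K = 3

def conv_encode(bits):
    """Rate-1/2 convolutional encoder: two output bits per input bit."""
    state, out = 0, []
    for b in bits:
        reg = (b << 2) | state                 # 3-bit register: [new bit, state]
        for g in G:
            out.append(bin(reg & g).count("1") % 2)
        state = reg >> 1                       # newest two bits become the state
    return out

def viterbi_decode(received, n_bits):
    """Hard-decision Viterbi decoder over the 4-state trellis."""
    INF = float("inf")
    metric = [0, INF, INF, INF]                # encoder starts in state 0
    paths = [[] for _ in range(4)]
    for t in range(n_bits):
        r = received[2 * t: 2 * t + 2]
        new_metric, new_paths = [INF] * 4, [None] * 4
        for s in range(4):
            if metric[s] == INF:
                continue
            for b in (0, 1):
                reg = (b << 2) | s
                out = [bin(reg & g).count("1") % 2 for g in G]
                ns = reg >> 1
                m = metric[s] + sum(o != x for o, x in zip(out, r))
                if m < new_metric[ns]:         # keep the survivor path
                    new_metric[ns], new_paths[ns] = m, paths[s] + [b]
        metric, paths = new_metric, new_paths
    return paths[metric.index(min(metric))]
```

The free distance of this code is 5, so isolated substitution errors in the received stream are corrected by the survivor-path search.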

PLoS ONE ◽  
2021 ◽  
Vol 16 (1) ◽  
pp. e0245506
Author(s):  
Weiping Peng ◽  
Shuang Cui ◽  
Cheng Song

To address the low computational security of the encoding mapping and the practical difficulty of the biological experiments in DNA-based one-time-pad cryptography, we propose a one-time-pad cipher algorithm based on confusion mapping and DNA storage technology. In the constructed algorithm, confusion mapping methods such as a chaos map, encoding mapping, a confusion encoding table, and a simulated biological operation process are used to increase the key space. Among them, the encoding mapping and the confusion encoding table provide the conditions for converting between digital data and biological information. By selecting security parameters and confounding parameters, the algorithm realizes a more random dynamic encryption and decryption process than similar algorithms. In addition, the use of DNA storage technologies, including DNA synthesis and high-throughput sequencing, ensures a viable biological encryption process. Theoretical analysis and simulation experiments show that the algorithm provides both mathematical and biological security: it benefits from the difficulty of reproducing the DNA biological experiments while also providing relatively high computational security.
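A toy sketch of two ingredients this abstract combines: a chaotic map (here the logistic map) used to derive a keystream, and an encoding table mapping 2-bit pairs to DNA bases. The parameters and the fixed A/C/G/T table below are illustrative placeholders, not the paper's confusion encoding table or its actual key-derivation scheme.

```python
BASE = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}
UNBASE = {v: k for k, v in BASE.items()}

def logistic_keystream(x0, n, r=3.99):
    """Derive n pseudo-random bytes from logistic-map iterates x -> r*x*(1-x)."""
    x, out = x0, []
    for _ in range(n):
        x = r * x * (1.0 - x)
        out.append(int(x * 256) & 0xFF)
    return bytes(out)

def encrypt_to_dna(plaintext: bytes, x0: float) -> str:
    """XOR with the chaotic keystream, then map each byte to four bases."""
    key = logistic_keystream(x0, len(plaintext))
    cipher = bytes(p ^ k for p, k in zip(plaintext, key))
    return "".join(BASE[(b >> s) & 0b11] for b in cipher for s in (6, 4, 2, 0))

def decrypt_from_dna(dna: str, x0: float) -> bytes:
    """Invert the base mapping, then XOR with the same keystream."""
    v = [UNBASE[c] for c in dna]
    cipher = bytes((v[i] << 6) | (v[i+1] << 4) | (v[i+2] << 2) | v[i+3]
                   for i in range(0, len(v), 4))
    key = logistic_keystream(x0, len(cipher))
    return bytes(c ^ k for c, k in zip(cipher, key))
```

The initial condition x0 plays the role of a shared secret here; a real one-time pad would require a truly random, never-reused key at least as long as the message.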


2015 ◽  
Vol 22 (04) ◽  
pp. 26-50
Author(s):  
Ngoc Tran Thi Bich ◽  
Huong Pham Hoang Cam

This paper aims to examine the main determinants of inflation in Vietnam during the period from 2002Q1 to 2013Q2. Cointegration theory and the Vector Error Correction Model (VECM) approach are used to examine the impact of domestic credit, the interest rate, the budget deficit, and crude oil prices on inflation in both the long and short term. The results show that while there are long-term relations between inflation and the other variables, factors such as oil prices, domestic credit, and the interest rate have no short-run impact on fluctuations in inflation. The budget deficit does have a short-run impact, but a fundamentally weak one. Current inflation is driven mainly by the public's expectations of inflation in the previous period. Although the error correction term from the long-run relationship affects inflation in the short run, its coefficient is small and insignificant; in other words, the speed of adjustment is very low, or near zero. This also implies that once the relationship among inflation, domestic credit, the interest rate, the budget deficit, and crude oil prices deviates from its long-term trend, the economy will take a long time to return to equilibrium.
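The "speed of adjustment" discussed above can be illustrated with a two-step Engle–Granger error correction model on synthetic data: regress the levels to obtain the long-run relation, then regress the differenced series on the lagged residual; that residual's coefficient is the adjustment speed. The data-generating parameters below are invented for illustration and are unrelated to the paper's estimates.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 500
x = np.cumsum(rng.normal(size=n))          # I(1) driver, e.g. domestic credit
y = np.empty(n)                            # cointegrated response, e.g. inflation
y[0] = x[0]
for t in range(1, n):
    # y error-corrects toward x with speed 0.3 (true adjustment coefficient -0.3)
    y[t] = y[t-1] + 0.3 * (x[t-1] - y[t-1]) + rng.normal(scale=0.5)

# Step 1: long-run relation y = a + b*x; residual = deviation from equilibrium
A = np.column_stack([np.ones(n), x])
a, b = np.linalg.lstsq(A, y, rcond=None)[0]
ect = y - (a + b * x)                      # error correction term

# Step 2: regress dy_t on ect_{t-1}; the slope estimates the adjustment speed
dy = np.diff(y)
Z = np.column_stack([np.ones(n - 1), ect[:-1]])
_, speed = np.linalg.lstsq(Z, dy, rcond=None)[0]
```

A speed estimate near zero, as the paper reports for Vietnam, would mean deviations from the long-run relation die out only very slowly.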


Author(s):  
Behnam Jahangiri ◽  
Punyaslok Rath ◽  
Hamed Majidifard ◽  
William G. Buttlar

Various agencies have begun to research and introduce performance-related specifications (PRS) for the design of modern asphalt paving mixtures. The focus of most recent studies has been directed toward simplified cracking test development and evaluation. In some cases, development and validation of PRS has been performed, building on these new tests, often by comparing test values to accelerated pavement test studies and/or to limited field data. This study describes the findings of a comprehensive research project conducted for the Illinois Tollway, leading to a PRS for the design of mainline and shoulder asphalt mixtures. A novel approach was developed, involving the systematic establishment of specification requirements based on: 1) selection of baseline values from minimally acceptable field performance thresholds; 2) elevation of thresholds to account for differences between short-term lab aging and expected long-term field aging; 3) further elevation of thresholds to account for variability in lab testing, plus variability in the testing of field cores; and 4) final adjustment and rounding of thresholds based on a consensus process. After a thorough evaluation of different candidate cracking tests in the course of the project, the Disk-shaped Compact Tension, DC(T), test was chosen to be retained in the Illinois Tollway PRS and is presented in this study for the design of crack-resistant mixtures. The DC(T) test was selected because of its high degree of correlation with field results and its excellent repeatability. Tailored Hamburg rut depth and stripping inflection point thresholds were also established for mainline and shoulder mixes.
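The four-step threshold-setting logic described above can be sketched as a small calculation. All numbers below are invented placeholders, not the Illinois Tollway's actual DC(T) fracture-energy requirements or adjustment factors.

```python
def prs_threshold(field_baseline, aging_uplift, lab_cv, field_core_cv,
                  k=1.0, round_to=25):
    """Build a specification threshold from a field-performance baseline.

    1) start from the minimally acceptable field value;
    2) raise it for the gap between short-term lab aging and long-term field aging;
    3) raise it further for lab and field-core testing variability
       (k standard deviations, coefficients of variation added in quadrature);
    4) round to a convenient specification value.
    """
    t = field_baseline                                      # step 1
    t *= (1.0 + aging_uplift)                               # step 2
    t *= (1.0 + k * (lab_cv**2 + field_core_cv**2) ** 0.5)  # step 3
    return round_to * round(t / round_to)                   # step 4

# e.g. a 400 J/m^2 baseline, 15% aging uplift, 8% lab CV, 10% field-core CV
spec = prs_threshold(400, 0.15, 0.08, 0.10)
```

The quadrature combination in step 3 is one common way to pool two independent variability sources; the actual consensus-based adjustments in the project need not take this form.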


Nature ◽  
2021 ◽  
Vol 595 (7867) ◽  
pp. 383-387
Author(s):  
Zijun Chen ◽  
Kevin J. Satzinger ◽  
Juan Atalaya ◽  
Alexander N. Korotkov ◽  
...  

Abstract: Realizing the potential of quantum computing requires sufficiently low logical error rates (ref. 1). Many applications call for error rates as low as 10⁻¹⁵ (refs. 2–9), but state-of-the-art quantum platforms typically have physical error rates near 10⁻³ (refs. 10–14). Quantum error correction (refs. 15–17) promises to bridge this divide by distributing quantum logical information across many physical qubits in such a way that errors can be detected and corrected. Errors on the encoded logical qubit state can be exponentially suppressed as the number of physical qubits grows, provided that the physical error rates are below a certain threshold and stable over the course of a computation. Here we implement one-dimensional repetition codes embedded in a two-dimensional grid of superconducting qubits that demonstrate exponential suppression of bit-flip or phase-flip errors, reducing logical error per round more than 100-fold when increasing the number of qubits from 5 to 21. Crucially, this error suppression is stable over 50 rounds of error correction. We also introduce a method for analysing error correlations with high precision, allowing us to characterize error locality while performing quantum error correction. Finally, we perform error detection with a small logical qubit using the 2D surface code on the same device (refs. 18, 19) and show that the results from both one- and two-dimensional codes agree with numerical simulations that use a simple depolarizing error model. These experimental demonstrations provide a foundation for building a scalable fault-tolerant quantum computer with superconducting qubits.
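The exponential suppression reported above can be pictured with the classical analogue of the repetition code: copy the logical bit onto n physical bits, flip each independently with probability p, and recover the value by majority vote. This Monte Carlo sketch (with made-up p and trial counts) ignores the quantum measurement cycle entirely; it only shows why the logical error rate falls roughly exponentially in n when p is below 1/2.

```python
import random

def logical_error_rate(n, p, trials=20000, seed=1):
    """Estimate the majority-vote failure rate of an n-bit repetition code."""
    rng = random.Random(seed)
    errors = 0
    for _ in range(trials):
        flips = sum(rng.random() < p for _ in range(n))  # independent bit flips
        if flips > n // 2:                               # majority vote fails
            errors += 1
    return errors / trials

# more physical bits -> sharply fewer logical errors (physical error p = 0.05)
rates = [logical_error_rate(n, 0.05) for n in (1, 3, 5, 7)]
```

For n = 3 the failure probability is 3p²(1−p) + p³ ≈ 0.007 at p = 0.05, already an order of magnitude below the physical rate; each added pair of qubits buys roughly another such factor.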


2019 ◽  
Author(s):  
Ala Suleiman ◽  
Bashar Hilal ◽  
Phalgun Paila ◽  
Sahir Abdelhadi ◽  
Khalid Alwahedi ◽  
...  

1990 ◽  
Vol 216 ◽  
Author(s):  
Paul A. Clifton ◽  
Paul D. Brown

Abstract: The interface between Hg(1-x)Cd(x)Te (0 ≤ x ≤ 1) and Hg(1-y)Cd(y)Te (0 ≤ y ≤ 1) epitaxial layers of different composition (x ≠ y) is unstable with regard to the intermixing of the Hg and Cd cations within the Group II sublattice. This phenomenon may give rise to long-term stability problems in HgTe-(Hg,Cd)Te superlattices and to composition grading between (Hg,Cd)Te absorber layers and CdTe buffer or passivation layers in epitaxial infrared detectors. In this paper, a novel approach to the inhibition of interdiffusion in these systems is discussed, involving the growth of an intervening ZnTe barrier layer at the heterointerface between two (Hg,Cd)Te layers. Initial results are presented which indicate the effectiveness of this technique in reducing interdiffusion in an experimental heterostructure grown by MOVPE. Some possible applications in a variety of HgTe-based long-wavelength devices are discussed.
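The cation interdiffusion described above can be pictured with an explicit 1-D Fickian diffusion step on a composition profile x(z): an initially abrupt interface between two compositions smears out over time. The grid spacing, diffusivity, and time step below are arbitrary illustrative values, not measured material parameters.

```python
import numpy as np

def diffuse(profile, d_coeff, dt, dz, steps):
    """Explicit finite-difference solution of dx/dt = D * d2x/dz2
    with the two end compositions held fixed."""
    x = profile.astype(float).copy()
    r = d_coeff * dt / dz**2          # explicit-scheme stability needs r <= 0.5
    for _ in range(steps):
        x[1:-1] += r * (x[2:] - 2 * x[1:-1] + x[:-2])
    return x

# abrupt interface: Cd fraction 0 on the left half, 1 on the right half
x0 = np.concatenate([np.zeros(50), np.ones(50)])
x1 = diffuse(x0, d_coeff=1.0, dt=0.2, dz=1.0, steps=200)
```

A barrier layer of the kind proposed in the paper would, in this picture, act as a region of much lower effective D that pins the profile near the interface.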


2013 ◽  
Vol 662 ◽  
pp. 896-901
Author(s):  
Zong Jin Liu ◽  
Yang Yang ◽  
Zheng Fang ◽  
Yan Yan Xu

With the rapid development of wireless communication technology, there has been increasing adoption of mobile advertising, such as location-based advertising (LBA). The extent to which LBA can improve advertising effectiveness is an important topic in wireless communication research. Most studies quantify the long-term impact of advertising with VAR (Vector Autoregressive) models. Compared to the VAR model, however, the VECM (Vector Error Correction Model) is a better method in that it allows one to estimate both a long-term equilibrium relationship and a short-term dynamic error correction process. In this study, we employ a VECM to explore the sales impact of LBA (Location-Based Advertising) and PUA (Pop-up Advertising) in both the short and long term. The developed VECM reveals that LBA's sales impact is more than 2 times as large as PUA's in the short-term dynamics and nearly 6 times larger in the long-term equilibrium. These findings add to the advertising and VECM literatures, and the results can give managers more confidence in applying wireless communication technology to advertising.
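The short-run versus long-run distinction that motivates the VECM here can be made concrete on synthetic data: simulate sales with an immediate (short-run) response to advertising changes and a separate long-run equilibrium level, then recover both effects from the levels and the differences. The coefficients below are invented and are not the paper's estimated model.

```python
import numpy as np

rng = np.random.default_rng(7)
n = 400
ad = np.cumsum(rng.normal(size=n))            # persistent advertising stock
sales = np.empty(n)
sales[0] = 6.0 * ad[0]
for t in range(1, n):
    # long-run equilibrium: sales = 6*ad; short-run response to a change in ad: 2
    eq_gap = sales[t-1] - 6.0 * ad[t-1]
    sales[t] = (sales[t-1] - 0.4 * eq_gap
                + 2.0 * (ad[t] - ad[t-1]) + rng.normal(scale=0.3))

# long-run multiplier from the levels regression (superconsistent under
# cointegration), short-run effect and adjustment speed from the differences
long_run = np.linalg.lstsq(ad[:, None], sales, rcond=None)[0][0]
gap = sales - long_run * ad
X = np.column_stack([gap[:-1], np.diff(ad)])
adj, short_run = np.linalg.lstsq(X, np.diff(sales), rcond=None)[0]
```

The two recovered numbers play the roles the abstract compares: the short-run dynamic coefficient and the long-run equilibrium multiplier can differ substantially for the same advertising channel.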

