The Security of Cryptosystems Based on Error-Correcting Codes

2021 ◽  
Author(s):  
Ahmed Drissi

Quantum computers are distinguished by their enormous storage capacity and relatively high computing speed. Among the cryptosystems of the future, the best known and most studied of those expected to resist attacks by such computers are the cryptosystems based on error-correcting codes. Using problems inspired by the theory of error-correcting codes in the design of cryptographic systems provides an alternative to cryptosystems based on number theory, as well as solutions to their vulnerabilities. Their security rests on the problem of decoding a random code, which is NP-complete. In this chapter, we discuss the cryptographic properties of error-correcting codes, as well as the security of cryptosystems based on code theory.
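For illustration only (not from the chapter), the Python sketch below shows the syndrome-decoding setting on which code-based schemes rest, using a toy [7,4] Hamming code; the matrix and variable names are our own. The point is that decoding is easy for a structured code like this, whereas no efficient decoder is known for a random code.

```python
# Illustrative sketch, not from the chapter: the syndrome-decoding setting
# behind code-based cryptography, shown on a toy [7,4] Hamming code over GF(2).
import numpy as np

# Parity-check matrix H of the [7,4] Hamming code; column k is the binary
# representation of k+1, so a single-bit error is located by its syndrome.
H = np.array([[1, 0, 1, 0, 1, 0, 1],
              [0, 1, 1, 0, 0, 1, 1],
              [0, 0, 0, 1, 1, 1, 1]], dtype=np.uint8)

def syndrome(word):
    """Compute s = H * word^T over GF(2)."""
    return H.dot(word) % 2

codeword = np.zeros(7, dtype=np.uint8)   # the all-zero codeword
received = codeword.copy()
received[4] ^= 1                         # channel flips the bit at index 4

s = syndrome(received)
error_index = int(s[0]) + 2 * int(s[1]) + 4 * int(s[2]) - 1
print("syndrome:", s, "-> error at index", error_index)   # prints index 4
```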

2020 ◽  
Vol 8 ◽  
Author(s):  
Hai-Ping Cheng ◽  
Erik Deumens ◽  
James K. Freericks ◽  
Chenglong Li ◽  
Beverly A. Sanders

Chemistry is considered one of the more promising scientific applications of near-term quantum computing. Recent work on transitioning classical algorithms to quantum computers has led to great strides in improving quantum algorithms and illustrating their quantum advantage. Because of the limitations of near-term quantum computers, the most effective strategies split the work between classical and quantum computers. There is a proven set of methods in computational chemistry and materials physics that uses this same idea of splitting a complex physical system into parts treated at different levels of theory, in order to obtain solutions for the complete physical system when a brute-force solution with a single method is not feasible. These methods are variously known as embedding, multi-scale, and fragment techniques. We review these methods and then propose the embedding approach as a method for describing complex biochemical systems, with the parts not only treated with different levels of theory but also computed with hybrid classical and quantum algorithms. Such strategies are critical if one wants to expand the focus to biochemical molecules that contain active regions that cannot be properly described with traditional algorithms on classical computers. While we do not solve this problem here, we provide an overview of where the field is going to enable such problems to be tackled in the future.
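As a purely illustrative sketch (not taken from the paper), the snippet below shows the generic subtractive, ONIOM-style way of combining two levels of theory for an embedded fragment. The energy functions are hypothetical placeholders; in a hybrid workflow the high-level call could stand in for a quantum-computed fragment energy and the low-level call for a classical method.

```python
# Illustrative sketch, not from the paper: a subtractive, ONIOM-style
# embedding energy. Both energy functions are toy placeholders.

def energy_low(atoms):
    """Cheap stand-in for a low level of theory (e.g. a force field)."""
    return -1.0 * len(atoms)

def energy_high(atoms):
    """Expensive stand-in for a high level of theory (e.g. a correlated
    or quantum-computed method); here it just adds extra 'correlation'."""
    return -1.2 * len(atoms)

def embedded_energy(full_system, fragment):
    """Subtractive embedding: E = E_low(full) + E_high(frag) - E_low(frag)."""
    return energy_low(full_system) + energy_high(fragment) - energy_low(fragment)

environment = ["C"] * 500 + ["N"] * 100   # large biochemical environment
active_site = ["Fe"] + ["N"] * 4          # small region that needs accuracy
print(embedded_energy(environment + active_site, active_site))
```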


2021 ◽  
Author(s):  
Epari Ritesh Patro ◽  
Carlo De Michele

Reservoir sedimentation has a prominent impact on future hydropower performance and is a growing concern for hydropower stakeholders. Sedimentation caused by soil erosion is influenced by various parameters. Reservoir sedimentation is one of the most challenging problems affecting hydroelectric production, since it causes an overall reduction of reservoir capacity that outpaces the annual increase in storage volume and implies a dangerous net loss of energy. The first part of this study examined various Italian reservoirs (50 dams) to determine sedimentation rates and storage-capacity loss based on available bathymetric surveys. The reservoirs studied here had reached an average age of 74 years as of 2019, with the highest observed loss of capacity at 90% and the highest annual sediment yield of 2471 m³/km²/year. Of all the reservoirs studied, 25% had already reached their half-life as of 2019. The second part of this study extended the work to the specific case study of the Ceppo Morelli hydropower plant. The study was carried out to analyse the water-sediment interaction and future sediment load, and to prioritize critical soil-erosion areas using the Soil and Water Assessment Tool (SWAT). The distinguishing feature of this work lies in the possibility of exploiting remote-sensing data (i.e. actual/potential evapotranspiration) to successfully calibrate hydrological models in data-scarce regions. Simulation results indicated that the discharge and sediment load entering the Ceppo Morelli reservoir will decline, with the sediment load declining faster than the discharge under all the future climate scenarios implemented. This analysis will provide a starting point for the management and prioritization of adaptation and remediation policies addressing reservoir sedimentation. These results are part of the RELAID project funded through PRIN-Italy, which aims to integrate updated knowledge on hydrologic, hydraulic, and sedimentation processes to address the water and flood risk management of impounded Italian rivers through a holistic paradigm.

Keywords: reservoir sedimentation; hydropower; hydrological modeling; RELAID; Italy
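To make the quoted statistics concrete, the following sketch (with made-up illustrative numbers, not the study's data) shows how two bathymetric surveys translate into a sedimentation rate, a specific sediment yield in m³/km²/year, and an estimated reservoir half-life.

```python
# Illustrative bookkeeping only; the numbers are invented, not the study's.

initial_capacity_m3 = 5.0e6      # capacity at commissioning [m^3]
surveyed_capacity_m3 = 3.8e6     # capacity at latest bathymetric survey [m^3]
years_between = 60               # years between commissioning and survey
catchment_area_km2 = 180.0       # drainage area feeding the reservoir [km^2]

sediment_volume_m3 = initial_capacity_m3 - surveyed_capacity_m3
sedimentation_rate = sediment_volume_m3 / years_between          # m^3/year
specific_yield = sedimentation_rate / catchment_area_km2         # m^3/km^2/year

capacity_loss_percent = 100.0 * sediment_volume_m3 / initial_capacity_m3
# Half-life: years until 50% of the original capacity is filled, assuming the
# historical sedimentation rate stays constant (a strong simplification).
half_life_years = 0.5 * initial_capacity_m3 / sedimentation_rate

print(f"capacity loss: {capacity_loss_percent:.1f}% after {years_between} years")
print(f"sediment yield: {specific_yield:.0f} m^3/km^2/year")
print(f"estimated half-life: {half_life_years:.0f} years")
```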


Buildings ◽  
2021 ◽  
Vol 11 (8) ◽  
pp. 338
Author(s):  
Joanna Kołata ◽  
Piotr Zierke

Architects are required to have knowledge of current legislation, ergonomics, and the latest technical solutions. In addition, the design process necessitates an appreciation of the quality of space and a high degree of creativity. However, it is a profession that has undergone significant changes in recent years under the pressure exerted by the development of information technology. Designs generated by computer algorithms are becoming such a substantial part of designers' work that some are beginning to question whether they are more the work of computers than of humans. There are also increasing suggestions that software development will eventually lead to a situation where humans in the profession become redundant. This review article presents the computer technologies currently used, being implemented, and planned for use in design, and considers how they affect and will affect the work of architects in the future. It includes the opinions of a wide range of experts on the possibility of computer algorithms replacing architects. The ultimate goal of the article is to attempt to answer the question: will computers eliminate the human factor in the design of the future? It also considers the artificial intelligence or communication skills that computer algorithms would require to achieve this goal. The answers to these questions will contribute not only to determining the future of architecture but will also indicate the current condition of the profession. They will also help us to understand the technologies that are making computers capable of increasingly replacing human professions. Despite differing opinions on the possibility of computer algorithms replacing architects, the conclusions indicate that computers do not currently have the capabilities and skills to achieve this goal. The speed of technological development, especially of technologies such as artificial superintelligence, artificial brains, or quantum computers, allows us to predict that the replacement of the architect by machines will remain unrealistic in the coming decades.


2021 ◽  
Author(s):  
Siyuan Chen ◽  
Peng Zeng ◽  
Kim-Kwang Raymond Choo

Abstract Blind signature is an important cryptographic primitive with widespread applications in secure e-commerce, for example to guarantee participants' anonymity. Existing blind signature schemes are mostly based on number-theoretic hard problems, which have been shown to be solvable with quantum computers. The National Institute of Standards and Technology (NIST) began in 2017 to specify a new standard for digital signatures by selecting one or more additional signature algorithms designed to be secure against attacks carried out using quantum computers. However, none of the third-round candidate algorithms is code-based, despite the potential of code-based signature algorithms in resisting quantum computing attacks. In this paper, we construct a new code-based blind signature (CBBS) scheme as an alternative to traditional number-theoretic schemes. Specifically, we first extend Santoso and Yamaguchi's three-pass identification scheme to a concatenated version (abbreviated as the CSY scheme). Then, we construct our CBBS scheme from the CSY scheme. The security of our CBBS scheme relies on the hardness of the syndrome decoding problem in coding theory, which has been shown to be NP-complete and resistant to quantum attacks. Unlike Blazy et al.'s CBBS scheme, which is based on a zero-knowledge protocol with cheating probability $2/3$, our CBBS scheme is based on a zero-knowledge protocol with cheating probability $1/2$. The lower cheating probability reduces the number of interaction rounds required at the same security level and thus leads to higher efficiency. For example, to achieve security level $2^{-82}$, the signature size in our CBBS scheme is $1.63$ MB compared to $3.1$ MB in Blazy et al.'s scheme.
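The effect of the cheating probability on the round count can be checked with a few lines of Python; this is our own back-of-the-envelope arithmetic, not code from the paper.

```python
# Our arithmetic, not the paper's code: repetitions of a zero-knowledge
# identification protocol needed to push the soundness error below a target.
import math

def rounds_needed(cheating_probability, target_security_bits):
    """Smallest t with cheating_probability**t <= 2**(-target_security_bits)."""
    return math.ceil(target_security_bits / -math.log2(cheating_probability))

target_bits = 82
for p, label in [(2 / 3, "cheating probability 2/3"),
                 (1 / 2, "cheating probability 1/2")]:
    print(f"{label}: {rounds_needed(p, target_bits)} rounds for 2^-{target_bits}")
# 2/3 needs 141 rounds while 1/2 needs 82, which is why the lower cheating
# probability yields fewer transcripts and hence a smaller signature.
```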


2010 ◽  
Vol 10 (1&2) ◽  
pp. 1-16
Author(s):  
C.R. Laumann ◽  
R. Moessner ◽  
A. Scardicchio ◽  
S.L. Sondhi

Alongside the effort underway to build quantum computers, it is important to better understand which classes of problems they will find easy and which will remain intractable even for them. We study random ensembles of the QMA$_1$-complete quantum satisfiability (QSAT) problem introduced by Bravyi \cite{Bravyi:2006p4315}. QSAT appropriately generalizes the NP-complete classical satisfiability (SAT) problem. We show that, as the density of clauses/projectors is varied, the ensembles exhibit quantum phase transitions between satisfiable and unsatisfiable phases. Remarkably, almost all instances of QSAT for \emph{any} hypergraph exhibit the same dimension of the satisfying manifold. This establishes the QSAT decision problem as equivalent to a, potentially new, graph-theoretic problem and shows that the hardest typical instances are likely to be localized in a bounded range of clause density.
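As a small numerical illustration (our own sketch, not from the paper): for a handful of qubits one can place random two-qubit projectors on a chosen hypergraph and read off the dimension of the satisfying subspace as the dimension of the kernel of their sum; rerunning with fresh random projectors on the same hypergraph generically returns the same dimension, which is the property the abstract refers to.

```python
# Illustrative sketch, not from the paper: random 2-local QSAT on a few qubits.
import numpy as np

rng = np.random.default_rng(0)
n = 4            # number of qubits (tiny, so exact diagonalization is easy)
m = 5            # number of clauses/projectors; m/n is the clause density

def apply_two_qubit(op4, state, i, j, n):
    """Apply a 4x4 operator to qubits (i, j) of an n-qubit state vector."""
    psi = state.reshape([2] * n)
    psi = np.moveaxis(psi, (i, j), (0, 1)).reshape(4, -1)
    psi = op4 @ psi
    psi = np.moveaxis(psi.reshape([2, 2] + [2] * (n - 2)), (0, 1), (i, j))
    return psi.reshape(-1)

# Sum of random rank-1 projectors on randomly chosen pairs of qubits.
H = np.zeros((2**n, 2**n), dtype=complex)
pairs = [tuple(rng.choice(n, size=2, replace=False)) for _ in range(m)]
for (i, j) in pairs:
    v = rng.normal(size=4) + 1j * rng.normal(size=4)
    v /= np.linalg.norm(v)
    proj = np.outer(v, v.conj())
    for col in range(2**n):              # build the full matrix column by column
        e = np.zeros(2**n, dtype=complex)
        e[col] = 1.0
        H[:, col] += apply_two_qubit(proj, e, i, j, n)

# The instance is satisfiable iff H has a zero eigenvalue; the number of
# (numerically) zero eigenvalues is the dimension of the satisfying manifold.
eigvals = np.linalg.eigvalsh(H)
print("dimension of satisfying subspace:", int(np.sum(eigvals < 1e-9)))
```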


2014 ◽  
Vol 24 (02) ◽  
pp. 1440001 ◽  
Author(s):  
Max H. Garzon

This is a survey of the origin, current progress, and applications of the so-called CODEWORD DESIGN problem, a major roadblock to the development of analytic models for DNA computing (a massively parallel programming methodology) and DNA self-assembly (a nanofabrication methodology). The problem calls for finding large sets of single DNA strands that do not crosshybridize with themselves or with their complements, and it has been recognized as an important problem in DNA computing, self-assembly, DNA memories, and phylogenetic analyses because of the error-correction and error-prevention properties of such codes. Major recent advances include the development of experimental techniques to search for such codes, as well as a theoretical framework to analyze the problem, despite the fact that it has been proven to be NP-complete using any single concrete metric space to model the Gibbs energy. In this framework, codeword design is reduced to finding large sets of strands maximally separated in DNA spaces, so that the key to finding such sets lies in knowledge of the geometry of these spaces. A general technique has recently been found to embed them in Euclidean spaces in a hybridization-affinity-preserving manner, i.e., in such a way that oligos with high/low hybridization affinity are mapped to neighboring/remote points in a geometric lattice, respectively. This isometric embedding materializes long-held metaphors about codeword design in terms of sphere packing and error-correcting codes, and it leads to designs that are in some cases known to be provably nearly optimal for some oligo sizes. It also leads to upper and lower bounds on the size of optimal codes for oligos up to 32-mers, as well as to infinite families of solutions to CODEWORD DESIGN, based on estimates of the kissing (or contact) number for sphere packings in Euclidean spaces. Conversely, this reduction suggests interesting new algorithms for finding dense sphere-packing solutions in high-dimensional spheres using results for CODEWORD DESIGN previously obtained by experimental or theoretical molecular means, as well as a proof that finding these bounds exactly is NP-complete in general. Finally, some research problems and applications arising from these results are described that might be of interest for further research.
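As a rough illustration (not the survey's method), the sketch below greedily builds a set of DNA strands under a Hamming-distance proxy for cross-hybridization; real CODEWORD DESIGN uses Gibbs-energy-based hybridization measures, but the sphere-packing flavor is the same.

```python
# Illustrative sketch only: greedy codeword selection under a Hamming-distance
# proxy for non-crosshybridization (a simplification of the real problem).
from itertools import product

COMPLEMENT = {"A": "T", "T": "A", "C": "G", "G": "C"}

def reverse_complement(strand):
    return "".join(COMPLEMENT[b] for b in reversed(strand))

def hamming(a, b):
    return sum(x != y for x, y in zip(a, b))

def greedy_code(length=6, min_distance=3):
    """Greedily keep strands that are >= min_distance from every kept strand
    and from the reverse complements of kept strands (and of themselves)."""
    code = []
    for strand in ("".join(p) for p in product("ACGT", repeat=length)):
        far_enough = all(hamming(strand, c) >= min_distance and
                         hamming(strand, reverse_complement(c)) >= min_distance
                         for c in code)
        if far_enough and hamming(strand, reverse_complement(strand)) >= min_distance:
            code.append(strand)
    return code

code = greedy_code()
print(len(code), "codewords, e.g.", code[:5])
```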


2018 ◽  
Vol 5 (4) ◽  
pp. 598-602
Author(s):  
Mu-ming Poo ◽  
Ling Wang

ABSTRACT Quantum computing and quantum computers have attracted much attention from both the academic community and industry in recent years. By exploiting the quantum properties of materials, scientists are aiming to overcome Moore's law of miniaturization and develop novel quantum computers. The concept of quantum computing was first introduced by the distinguished physicist Richard Feynman in 1981. As one of the early pioneers in this field, Turing Award laureate Andrew Chi-Chih Yao made a seminal contribution in developing the theoretical basis for quantum computation in 1993. Since 2011, he has served as the founding director of Tsinghua University's Center for Quantum Information (CQI), which aims to become a world-class research center for quantum computing. In a recent interview with NSR, Yao recounted the history of quantum computing and expressed his view on the future of this field. He suggests that quantum computers could excel in many tasks such as the design of new materials and drugs as well as in the simulation of chemical reactions, but they may not supersede traditional computers in tasks for which traditional computers are already proven to be highly efficient.


Author(s):  
Wei Debao ◽  
Qiao Liyan ◽  
Zhang Peng ◽  
Peng Xiyuan

The lifetime of NAND flash is highly restricted by the bit error rate (BER), which increases exponentially with the number of program/erase (P/E) cycles, while error-correcting codes (ECC) can only provide a limited error-correction capability to tolerate these bit errors. To address this challenge, a novel bad page management (BPM) strategy is proposed to extend the lifetime of NAND flash, based on experimental observations from our hardware-software co-designed experimental platform. The observations indicate that the retention error, caused by charge leakage in memory cells over time, is the dominant type of NAND flash error, and that the BER distribution of retention errors varies distinctly from page to page. The key idea of BPM is to exploit the lifetime potential of each page in a block by introducing fine-grained bad page management instead of coarse-grained bad block management. In addition, to balance lifetime enhancement against storage-capacity degradation, a configurable-threshold bad page management (CT-BPM) strategy is proposed for applications with high storage-capacity demands. The experimental results show that BPM can provide dozens of times (about 35 times for 3x-nm NAND flash) average lifetime extension without additional hardware cost, while incurring at most 5% degradation in write speed.
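The following toy simulation (our own, with made-up parameters rather than the paper's measurements) illustrates why page-level retirement outlasts block-level retirement when per-page retention BER varies.

```python
# Toy model with invented numbers, not the paper's platform: compares retiring
# a whole block at its first bad page versus retiring pages individually.
import random

PAGES_PER_BLOCK = 128
ECC_LIMIT_BER = 4e-3          # assumed max raw BER the ECC can still correct

def page_ber(pe_cycles, page_weakness):
    """Toy retention-BER model: grows exponentially with P/E cycles and
    varies from page to page (the variance that BPM exploits)."""
    return page_weakness * 1e-6 * (1.02 ** pe_cycles)

def lifetime(policy, seed=0):
    rng = random.Random(seed)
    weakness = [rng.uniform(0.5, 5.0) for _ in range(PAGES_PER_BLOCK)]
    cycles = 0
    while True:
        cycles += 1
        bad = [page_ber(cycles, w) > ECC_LIMIT_BER for w in weakness]
        if policy == "bad_block" and any(bad):
            return cycles     # first bad page retires the whole block
        if policy == "bad_page" and all(bad):
            return cycles     # block retires only when every page has gone bad
        # (CT-BPM would instead stop once a configured fraction of pages is bad.)

print("bad-block lifetime:", lifetime("bad_block"), "P/E cycles")
print("bad-page  lifetime:", lifetime("bad_page"), "P/E cycles")
```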

