Information Reconciliation
Recently Published Documents

TOTAL DOCUMENTS: 67 (five years: 22)
H-INDEX: 9 (five years: 3)

Author(s): Maqsood M. Khan, Inam Bari, Omar Khan, Najeeb Ullah, Marina Mondin, ...

Quantum key distribution (QKD) is a cryptographic communication protocol that exploits quantum mechanical properties to achieve provable security against an eavesdropper. The communication is carried out between two terminals using random photon polarization states represented as quantum states. The two terminals are interconnected through disjoint quantum and classical channels. Information reconciliation using delay-controlled joint decoding is performed at the receiving terminal, and its performance is characterized by data and error rates. Achieving low error rates is particularly challenging for schemes based on error-correcting codes with short code lengths. This article addresses the decoding process using ordered statistics decoding for information reconciliation of both short- and medium-length Bose–Chaudhuri–Hocquenghem (BCH) codes over a QKD link. The link's quantum channel is modeled as a binary symmetric quantum depolarization channel, whereas the classical channel is configured with additive white Gaussian noise. Our results demonstrate low bit error rates and reduced decoding complexity compared to other capacity-approaching codes of similar length and configuration.
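After sifting, the quantum channel model above reduces to a binary symmetric channel acting on the key bits. A minimal sketch (illustrative only, not the authors' implementation) of simulating that channel and estimating the quantum bit error rate (QBER):

```python
import random

def bsc(bits, p, rng):
    """Flip each bit independently with probability p (binary symmetric channel)."""
    return [b ^ (rng.random() < p) for b in bits]

def estimate_qber(alice, bob):
    """QBER estimate: fraction of positions where the two sifted keys disagree."""
    return sum(a != b for a, b in zip(alice, bob)) / len(alice)

rng = random.Random(7)
alice_key = [rng.randint(0, 1) for _ in range(100_000)]
bob_key = bsc(alice_key, 0.05, rng)
print(round(estimate_qber(alice_key, bob_key), 3))  # close to the channel flip rate 0.05
```

Reconciliation then has to disclose enough redundancy over the classical channel to correct exactly this error rate, which is what ties the channel model to the code design.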


2021, Vol 20 (3)
Author(s): Bang-Ying Tang, Bo Liu, Wan-Rong Yu, Chun-Qing Wu

Information reconciliation (IR) corrects the errors in sifted keys and ensures the correctness of quantum key distribution (QKD) systems. Polar-code-based IR schemes can achieve high reconciliation efficiency; however, the accompanying high frame error rate decreases the secure key rate of QKD systems. In this article, we propose a Shannon-limit-approaching (SLA) IR scheme, which consists of two phases: a forward reconciliation phase and an acknowledgment reconciliation phase. In the forward reconciliation phase, the sifted key is divided into sub-blocks, each processed with an improved block-checked successive cancellation list decoder for polar codes. Afterward, only the sub-blocks that fail correction undergo the additional acknowledgment reconciliation phase, which decreases the frame error rate of the SLA IR scheme. The experimental results show that the overall failure probability of the SLA IR scheme is decreased to 10⁻⁸ and the efficiency is improved to 1.091 with an IR block length of 128 Mb. Furthermore, the efficiency of the proposed SLA IR scheme reaches 1.055, approaching the Shannon limit, at a quantum bit error rate of 0.02 and an input scale of 1 Gb, a hundred times larger than state-of-the-art implementations of polar-code-based IR schemes.
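The efficiency figures quoted above follow the usual definition: the ratio of information actually disclosed during reconciliation to the Shannon minimum n·h(e), where h is the binary entropy function. A hedged sketch of that computation (the disclosed-bit count m below is a hypothetical value chosen for illustration, not a number from the paper):

```python
from math import log2

def binary_entropy(p):
    """h(p) = -p log2 p - (1-p) log2 (1-p), the Shannon limit per key bit."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * log2(p) - (1 - p) * log2(1 - p)

def reconciliation_efficiency(disclosed_bits, block_length, qber):
    """f = m / (n * h(e)); f = 1 is the Shannon limit, larger means more leakage."""
    return disclosed_bits / (block_length * binary_entropy(qber))

n = 1_000_000   # block length (illustrative)
e = 0.02        # quantum bit error rate
m = 149_230     # hypothetical number of disclosed syndrome bits
print(round(reconciliation_efficiency(m, n, e), 3))  # → 1.055
```

At e = 0.02 the entropy h(e) ≈ 0.1414 bits, so an efficiency of 1.055 means disclosing only about 5.5% more than the theoretical minimum.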


2021, Vol 11 (1)
Author(s): Jianguo Xie, Han Wu, Chao Xia, Peng Ding, Helun Song, ...

Semiconductor superlattice secure key distribution (SSL-SKD) has been experimentally demonstrated as a novel scheme for generating and agreeing on an identical key with unconditional security using only a public channel. Error correction in the information reconciliation procedure is introduced to eliminate the inevitable differences between the analog systems in SSL-SKD. Nevertheless, error correction has proved to be the performance bottleneck of information reconciliation due to its high computational complexity, and it therefore determines the final secure key throughput of SSL-SKD. In this paper, several frequently used error correction codes, including BCH codes, LDPC codes, and polar codes, are optimized separately to raise their performance and make them usable in practice. First, we employ multi-threading to support multi-codeword decoding for BCH and polar codes and to parallelize the update-value calculation for LDPC codes. Additionally, we construct lookup tables, such as logarithm and antilogarithm tables for finite-field computation, to eliminate redundant calculations. Our experimental results reveal that the proposed optimization methods significantly improve the efficiency of SSL-SKD: all three error correction codes reach throughputs on the order of Mbps and provide a secure key rate of at least 99%.
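The logarithm/antilogarithm tables mentioned for finite-field computation can be sketched as follows for GF(2⁸), the field used by typical BCH decoders. The primitive polynomial 0x11D is a common choice and an assumption here, not necessarily the one used in the paper:

```python
# Log/antilog lookup tables for GF(2^8) with primitive polynomial
# x^8 + x^4 + x^3 + x^2 + 1 (0x11D) and generator element 2.
PRIM = 0x11D

exp = [0] * 512   # antilog table, doubled so no mod-255 is needed per multiply
log = [0] * 256
x = 1
for i in range(255):
    exp[i] = x
    log[x] = i
    x <<= 1            # multiply by the generator...
    if x & 0x100:
        x ^= PRIM      # ...and reduce modulo the primitive polynomial
for i in range(255, 512):
    exp[i] = exp[i - 255]

def gf_mul(a, b):
    """Multiply in GF(2^8) with one table lookup instead of shift-and-reduce."""
    if a == 0 or b == 0:
        return 0
    return exp[log[a] + log[b]]
```

Each multiplication becomes two log lookups, an integer addition, and one antilog lookup, which is the redundancy-elimination trade the abstract describes: a few hundred bytes of tables in exchange for dropping the per-multiply polynomial reduction.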


2021, Vol 25 (1), pp. 79-83
Author(s): Evgeniy O. Kiktenko, Aleksei O. Malyshev, Aleksey K. Fedorov

2020, Vol 25 (6), pp. 793-801
Author(s): Maturi Sreerama Murty, Nallamothu Nagamalleswara Rao

Tracking the provenance of Resource Description Framework (RDF) resources is a key capability when building Linked Data systems, which shift the focus toward data integration. Linked Data enables applications to improve by converting legacy data into RDF resources; such data spans bibliographic, geographic, government, and other domains. However, most of these datasets do not track the details and origin of each published resource. In such cases, it is essential for applications to track, store, and disseminate provenance information that reflects the source data and the operations applied to it. We present an RDF provenance-tracking framework: provenance data are captured during the conversion process, managed throughout, and then distributed using the resources' URIs. The proposed design is evaluated on the Harvard Library Database. Experiments were performed on datasets with changes made to the values in the RDF and to the associated provenance details. The results are promising in that the approach lets data publishers produce meaningful provenance records with little time and effort.
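One way to sketch the idea of attaching provenance to converted RDF resources is a named-graph-style store that pairs each graph URI with its triples and a PROV-O-style provenance record. All URIs, predicate names, and field names below are illustrative assumptions, not the paper's design:

```python
from datetime import datetime, timezone

# graph URI -> {"triples": [...], "prov": {...}}; a stand-in for a triple store
store = {}

def publish(graph_uri, triples, source):
    """Record converted triples together with where and when they came from."""
    store[graph_uri] = {
        "triples": list(triples),
        "prov": {
            "wasDerivedFrom": source,  # PROV-O-style predicate (assumed)
            "generatedAtTime": datetime.now(timezone.utc).isoformat(),
        },
    }

publish(
    "http://example.org/graph/record-1",
    [("http://example.org/record-1", "dc:title", "Example Record")],
    source="legacy-catalog.csv",
)
print(store["http://example.org/graph/record-1"]["prov"]["wasDerivedFrom"])
```

Because the provenance record is keyed by the same URI that identifies the graph, any consumer that can dereference the resource can also retrieve its lineage, which is the dissemination mechanism the abstract alludes to.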


Entropy, 2020, Vol 22 (10), pp. 1087
Author(s): Kun Zhang, Xue-Qin Jiang, Yan Feng, Runhe Qiu, Enjian Bai

Due to the rapid development of quantum computing technology, encryption systems based on computational complexity face serious threats. Grounded in the fundamental principles of quantum mechanics, continuous-variable quantum key distribution (CVQKD) offers security guaranteed by physical law and can effectively remove the current encryption systems' dependence on computational complexity. In this paper, we construct spatially coupled (SC) LDPC codes and quasi-cyclic (QC) LDPC codes by adopting the parity-check matrices of the LDPC codes in the Advanced Television Systems Committee (ATSC) 3.0 standard as base matrices, and we introduce these codes for information reconciliation in the CVQKD system to improve reconciliation efficiency, which in turn improves the final secret key rate and transmission distance. Simulation results show that the proposed LDPC codes achieve a reconciliation efficiency higher than 0.96. Moreover, using the proposed LDPC codes for information reconciliation yields a high final secret key rate and a long transmission distance.
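QC-LDPC codes like those in ATSC 3.0 are specified by a small base matrix of circulant shift values that is "lifted" into the full parity-check matrix: each entry s ≥ 0 expands to a Z×Z identity matrix cyclically shifted by s, and each −1 expands to a Z×Z zero block. A minimal sketch with a toy base matrix (the shift values are illustrative, not taken from the ATSC 3.0 tables):

```python
def expand_qc(base, Z):
    """Lift a QC-LDPC base matrix into its full binary parity-check matrix.

    Entry -1 -> Z x Z all-zero block; entry s >= 0 -> identity shifted
    cyclically right by s columns."""
    H = []
    for row in base:
        for i in range(Z):  # i-th row within each block row
            H.append([1 if s >= 0 and c == (i + s) % Z else 0
                      for s in row for c in range(Z)])
    return H

B = [[0, 1, -1, 2],
     [-1, 0, 3, 1]]   # toy base matrix, 2 x 4 blocks
H = expand_qc(B, Z=4)
print(len(H), len(H[0]))  # 8 16
```

The lifting keeps the row/column weights of the base matrix, so decoder memory and routing stay structured regardless of Z, which is why broadcast standards and reconciliation implementations favor the QC form.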


Data fusion is a challenging problem in data integration. The usefulness of data increases when it is joined and combined with other data from various (web) sources. The promise of big data hinges on addressing several big data integration challenges, such as record linkage at scale, real-time data fusion, and integrating the deep web. Although considerable work has been devoted to these problems, there is limited work on creating a uniform, standard record from a group of records referring to the same real-world entity. We refer to this task as record normalization. Such a record representation, coined the normalized record, is important for both front-end and back-end applications. In this paper, we formalize the record normalization problem and present an in-depth analysis of normalization granularity levels and normalization forms. We propose a comprehensive framework for computing the normalized record. The proposed framework includes a suite of record normalization methods, from naive ones, which use only the information gathered from the records themselves, to complex strategies, which globally mine a group of duplicate records when selecting a value for an attribute of the normalized record.
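The naive end of that strategy spectrum can be sketched as follows (a hypothetical illustration, not the paper's implementation): for each attribute, pick the most frequent non-empty value among the duplicate records in a cluster.

```python
from collections import Counter

def normalize_records(duplicates):
    """Naive record normalization: for each attribute, keep the most
    frequent non-empty value observed across the duplicate records."""
    normalized = {}
    attrs = {k for rec in duplicates for k in rec}
    for attr in attrs:
        values = [rec[attr] for rec in duplicates if rec.get(attr)]
        if values:  # skip attributes with no usable value
            normalized[attr] = Counter(values).most_common(1)[0][0]
    return normalized

cluster = [
    {"title": "Informatio Reconciliation", "year": "2021"},   # typo in source record
    {"title": "Information Reconciliation", "year": "2021"},
    {"title": "Information Reconciliation", "year": ""},
]
print(normalize_records(cluster))
```

Majority voting uses only the records themselves; the complex strategies the abstract mentions would additionally mine evidence across clusters (or external sources) before choosing each attribute value.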

