Low-complexity decoding of LDPC codes using reduced-set WBF-based algorithms

Author(s):  
Sadjad Haddadi ◽  
Mahmoud Farhang ◽  
Mostafa Derakhtian

We propose a method to substantially reduce the computational complexity of iterative decoders of low-density parity-check (LDPC) codes that are based on the weighted bit-flipping (WBF) algorithm. In this method, the WBF-based decoders are modified so that the flipping function is calculated only over a reduced set of variable nodes. An explicit expression for the achieved complexity gain is provided, and it is shown that for a code of block length N, the decoding complexity is reduced from O(N²) to O(N). Moreover, we derive an upper bound on the difference in frame error rate between the reduced-set decoders and the original WBF-based decoders, and show that the error performances of the two decoders are essentially the same.
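To make the reduced-set idea concrete, the following Python sketch restricts a classic WBF flipping metric to a reduced candidate set, here assumed to be the variable nodes that participate in at least one unsatisfied check. The metric, the candidate rule and all parameter choices are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np

def reduced_set_wbf_decode(H, y, r, max_iter=50):
    """Illustrative reduced-set WBF sketch (not the paper's exact method).

    H : (M, N) binary parity-check matrix (0/1 integer array)
    y : length-N hard-decision vector (0/1 integers)
    r : length-N channel reliabilities (e.g., magnitudes of received values)
    """
    M, N = H.shape
    x = y.copy()
    # Per-check weight: smallest reliability among the check's variable nodes
    w = np.array([r[H[m].astype(bool)].min() for m in range(M)])

    for _ in range(max_iter):
        s = H @ x % 2                          # syndrome
        if not s.any():
            break                              # valid codeword found
        # Reduced set (assumed rule): only variable nodes involved in at
        # least one unsatisfied check are candidates for flipping.
        unsat = np.flatnonzero(s)
        candidates = np.flatnonzero(H[unsat].sum(axis=0) > 0)
        # Classic WBF flipping function, evaluated only over the reduced set
        E = np.array([np.sum((2 * s[H[:, n].astype(bool)] - 1)
                             * w[H[:, n].astype(bool)]) for n in candidates])
        x[candidates[np.argmax(E)]] ^= 1       # flip the most suspect bit
    return x
```

Because the flipping function is evaluated only on the candidates touched by unsatisfied checks rather than on all N variable nodes, the per-iteration work shrinks, which is the flavour of saving the abstract describes; the paper's exact reduced-set rule and complexity analysis may differ.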

2021 ◽  
Vol 2021 ◽  
pp. 1-11
Author(s):  
Chakir Aqil ◽  
Ismail Akharraz ◽  
Abdelaziz Ahaitouf

In this study, we propose a "New Reliability Ratio Weighted Bit Flipping" (NRRWBF) algorithm for Low-Density Parity-Check (LDPC) codes. This algorithm improves the "Reliability Ratio Weighted Bit Flipping" (RRWBF) algorithm by modifying the reliability ratio. It surpasses RRWBF in performance, achieving a 0.6 dB coding gain at a Bit Error Rate (BER) of 10⁻⁴ over the Additive White Gaussian Noise (AWGN) channel, and offers a significant reduction in decoding complexity. Furthermore, we improve NRRWBF by using the sum of the syndromes as a criterion to avoid infinite loops, which enables more efficient and effective decoding.
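The abstract does not give the modified reliability ratio itself, but the loop-avoidance idea can be sketched: track the sum of the syndrome bits and stop when it no longer improves. The Python below is a minimal sketch under that assumption, with a generic reliability-weighted flipping metric standing in as a placeholder for the actual NRRWBF metric.

```python
import numpy as np

def rrwbf_with_syndrome_sum_stop(H, y, r, max_iter=50, patience=3):
    """Minimal sketch of an RRWBF-style loop with a syndrome-sum
    loop-avoidance criterion (hypothetical reconstruction)."""
    x = y.copy()
    best_x, best_w, stalls = x.copy(), np.inf, 0
    for _ in range(max_iter):
        s = H @ x % 2
        w = int(s.sum())                   # sum of the syndrome bits
        if w == 0:
            return x                       # valid codeword found
        if w < best_w:
            best_x, best_w, stalls = x.copy(), w, 0
        else:
            stalls += 1
            if stalls >= patience:         # syndrome sum stopped improving:
                break                      # assume a flipping loop and stop
        # Placeholder metric: unsatisfied checks weighted by 1/reliability
        E = np.array([s[H[:, n] == 1].sum() / r[n] for n in range(H.shape[1])])
        x[np.argmax(E)] ^= 1               # flip the most suspicious bit
    return best_x
```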


2018 ◽  
Vol 7 (03) ◽  
pp. 23781-23784
Author(s):  
Rajarshini Mishra

Low-density parity-check (LDPC) codes have been shown to have good error-correcting performance, approaching Shannon's limit. Good error-correcting performance enables efficient and reliable communication. However, an LDPC decoding algorithm needs to be executed efficiently to meet the cost, time, power and bandwidth requirements of target applications. Quasi-cyclic low-density parity-check (QC-LDPC) codes are an important subclass of LDPC codes and are known as one of the most effective error-control methods. Quasi-cyclic codes possess some degree of regularity, and many important communication standards, such as DVB-S2 and 802.16e, use these codes. The proposed Optimized Min-Sum decoding algorithm performs very close to Sum-Product decoding while preserving the main features of Min-Sum decoding, namely low complexity and independence from noise-variance estimation errors. The proposed decoder is well suited to VLSI implementation and will be implemented on a Xilinx FPGA family.
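As an illustration of the kind of check-node update an optimized Min-Sum decoder uses, the sketch below implements a normalized min-sum update; the scaling factor of 0.8 is an illustrative assumption, and the paper's specific optimization may differ.

```python
import numpy as np

def min_sum_check_update(llr_in, alpha=0.8):
    """Normalized min-sum check-node update (a common 'optimized' variant).
    The scaling factor alpha = 0.8 is an illustrative assumption.

    llr_in : LLR messages arriving at one check node (length >= 2)
    Returns the extrinsic messages sent back along each edge.
    """
    llr_in = np.asarray(llr_in, dtype=float)
    signs = np.sign(llr_in)
    signs[signs == 0] = 1.0                  # treat zero LLRs as positive
    mags = np.abs(llr_in)
    total_sign = np.prod(signs)

    # Outgoing magnitude on each edge is the minimum over the *other* edges;
    # tracking the two smallest magnitudes gives this in a single pass.
    order = np.argsort(mags)
    min1, min2 = mags[order[0]], mags[order[1]]
    out_mag = np.where(np.arange(len(mags)) == order[0], min2, min1)

    # Sign of the product of the other edges = total sign times own sign
    return alpha * (total_sign * signs) * out_mag
```

Only comparisons, sign products and one scaling are needed, so no noise-variance estimate enters the computation, which is the robustness property noted above.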


Author(s):  
Rana A. Hassan ◽  
John P. Fonseka

Background: Low-density parity-check (LDPC) codes have received significant interest in a variety of communication systems due to their superior performance and reasonable decoding complexity. Methods: A novel collection of punctured codes decoding (CPCD) technique, which treats a code as a collection of its punctured codes, is proposed. Two forms of CPCD are discussed: serial CPCD, which decodes each punctured code serially, and parallel CPCD, which decodes each punctured code in parallel. Results: It is demonstrated that both serial and parallel CPCD have about the same decoding complexity as standard sum-product algorithm (SPA) decoding. It is also demonstrated that while serial CPCD has about the same decoding delay as standard SPA decoding, parallel CPCD can decrease the decoding delay, although at the expense of processing power. Conclusion: Numerical results demonstrate that CPCD can significantly improve the performance, or significantly increase the code rate, of LDPC codes.
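The abstract describes CPCD only structurally, so the following is a very rough, hypothetical skeleton of the serial variant: each punctured code is decoded in turn (here with a placeholder SPA routine) and the refined LLRs are passed along. The parallel variant would launch the per-punctured-code decoders concurrently instead.

```python
import numpy as np

def serial_cpcd(llr, punctured_supports, decode_spa, max_rounds=5):
    """Rough structural sketch of serial CPCD (not the paper's exact scheme).

    llr                : channel LLRs for the full code (NumPy array)
    punctured_supports : list of index arrays, one per punctured code
    decode_spa         : hypothetical callable mapping sub-code LLRs to
                         updated (a posteriori) LLRs
    """
    llr = np.asarray(llr, dtype=float).copy()
    for _ in range(max_rounds):
        for support in punctured_supports:      # serial: one punctured code at a time
            llr[support] = decode_spa(llr[support])
    return (llr < 0).astype(int)                # hard decision (bit 0 mapped to +1)
```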


2013 ◽  
Vol 710 ◽  
pp. 723-726
Author(s):  
Yuan Hua Liu ◽  
Mei Ling Zhang

A novel bit-flipping (BF) algorithm with low complexity for high-throughput decoding of low-density parity-check (LDPC) codes is presented. At each iteration, a novel threshold pattern is used to determine whether each code bit should be flipped, which effectively decreases the flipping error probability. Compared with the weighted BF algorithm and its modifications, the modified BF algorithm has significantly lower complexity and decoding time. Simulations show that the proposed BF algorithm achieves excellent performance and fast convergence while maintaining significantly low complexity, thus facilitating high-throughput decoding.
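A minimal sketch of threshold-driven multi-bit flipping is given below; the fixed threshold and the unsatisfied-check count used as the flipping metric are illustrative assumptions, since the paper's actual threshold pattern is not described in the abstract.

```python
import numpy as np

def threshold_bf_decode(H, y, max_iter=30, theta=2):
    """Sketch of threshold-based multi-bit flipping (illustrative only).

    H : (M, N) binary parity-check matrix, y : length-N hard decisions (0/1)
    theta : assumed flipping threshold on the unsatisfied-check count
    """
    x = y.copy()
    for _ in range(max_iter):
        s = H @ x % 2
        if not s.any():
            break                          # all checks satisfied
        f = s @ H                          # unsatisfied checks per variable node
        # Flip every bit whose count meets the threshold; flipping several
        # bits per iteration is what enables high-throughput decoding.
        flip = f >= theta
        if not flip.any():
            flip = f == f.max()            # fall back to the worst bit(s)
        x[flip] ^= 1
    return x
```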


2020 ◽  
Vol 2 (1) ◽  
pp. 42-49 ◽  
Author(s):  
Dr. Joy Iong Zong Chen

In this paper, the radio access technology (RAT) of the 5G mobile communication standard is analysed for the implementation of several candidate coding schemes. The enhanced mobile broadband (eMBB) scenario defined by the Third Generation Partnership Project (3GPP) for 5G is considered. Factors such as flexibility, computational complexity, bit error rate (BER) and block error rate (BLER) are used to evaluate the coding schemes. To evaluate performance across various applications and services, a suitable set of parameters is provided. The candidate schemes considered are polar codes, low-density parity-check (LDPC) codes and turbo codes. A fair comparison is performed by investigating block lengths and obtaining suitable rates through proper design. BLER/BER performance over an additive white Gaussian noise (AWGN) channel is obtained by simulation for diverse block lengths and code rates. The simulation results show that LDPC performs relatively efficiently across various code rates and block lengths, despite the better performance of polar codes at short block lengths. As an added advantage, LDPC codes also offer relatively low complexity.
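As a sketch of the evaluation setup described above, the following generic Monte-Carlo routine estimates BLER and BER for any candidate scheme over a BPSK-modulated AWGN channel; the `encode` and `decode` callables are placeholders for a polar, LDPC or turbo implementation, and the BPSK/LLR conventions are assumptions rather than the paper's exact simulator.

```python
import numpy as np

def simulate_bler(encode, decode, k, n, ebn0_db, n_frames=1000, rng=None):
    """Generic Monte-Carlo BLER/BER estimation over a BPSK-AWGN channel.
    `encode` maps k message bits to n code bits; `decode` maps n channel
    LLRs back to k estimated message bits (both are placeholder callables)."""
    rng = np.random.default_rng() if rng is None else rng
    rate = k / n
    sigma = np.sqrt(1.0 / (2 * rate * 10 ** (ebn0_db / 10)))   # noise std for Eb/N0
    frame_err = bit_err = 0
    for _ in range(n_frames):
        msg = rng.integers(0, 2, k)
        tx = 1 - 2 * encode(msg)               # BPSK: bit 0 -> +1, bit 1 -> -1
        rx = tx + sigma * rng.normal(size=n)
        est = decode(2 * rx / sigma**2)        # channel LLRs into the decoder
        errors = int(np.count_nonzero(est != msg))
        bit_err += errors
        frame_err += errors > 0
    return frame_err / n_frames, bit_err / (n_frames * k)
```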


2007 ◽  
Vol 17 (01) ◽  
pp. 103-123 ◽  
Author(s):  
JAMES S. PLANK ◽  
MICHAEL G. THOMASON

As peer-to-peer and widely distributed storage systems proliferate, the need to perform efficient erasure coding, instead of replication, is crucial to performance and efficiency. Low-Density Parity-Check (LDPC) codes have arisen as alternatives to standard erasure codes, such as Reed-Solomon codes, trading off vastly improved decoding performance for inefficiencies in the amount of data that must be acquired to perform decoding. The scores of papers written on LDPC codes typically analyze their collective and asymptotic behavior. Unfortunately, their practical application requires the generation and analysis of individual codes for finite systems. This paper attempts to illuminate the practical considerations of LDPC codes for peer-to-peer and distributed storage systems. The three main types of LDPC codes are detailed, and a huge variety of codes are generated, then analyzed using simulation. This analysis focuses on the performance of individual codes for finite systems, and addresses several important heretofore unanswered questions about employing LDPC codes in real-world systems.
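For readers unfamiliar with erasure decoding of LDPC codes, the standard iterative (peeling) decoder is sketched below; it is included only as background for why somewhat more data than the information size may need to be acquired, and it is not the specific code-generation or simulation machinery used in the paper.

```python
import numpy as np

def peeling_erasure_decode(H, symbols, known):
    """Standard iterative (peeling) decoder for an LDPC erasure code.

    H       : (M, N) binary parity-check matrix
    symbols : length-N integer array of symbol values (arbitrary where unknown)
    known   : length-N boolean mask marking the symbols actually acquired
    """
    symbols, known = symbols.copy(), known.copy()
    progress = True
    while progress and not known.all():
        progress = False
        for row in H:
            idx = np.flatnonzero(row)
            missing = idx[~known[idx]]
            if len(missing) == 1:              # a check with exactly one erasure
                j = missing[0]
                # The erased symbol is the XOR of the known symbols in the check
                symbols[j] = np.bitwise_xor.reduce(symbols[idx[idx != j]])
                known[j] = True
                progress = True
    return symbols, known
```

Decoding succeeds only if this peeling process never stalls, which is why LDPC erasure codes typically require acquiring somewhat more than the minimum amount of data, the inefficiency the paper quantifies for individual finite codes.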

