LDPC Code
Recently Published Documents


TOTAL DOCUMENTS: 605 (FIVE YEARS: 111)

H-INDEX: 26 (FIVE YEARS: 4)

Symmetry ◽ 2021 ◽ Vol 13 (12) ◽ pp. 2338
Author(s): Chuntao Wang ◽ Renxin Liang ◽ Shancheng Zhao ◽ Shan Bian ◽ Zhimao Lai

Nowadays, it remains a major challenge to efficiently compress encrypted images. In this paper, we propose a novel encryption-then-compression (ETC) scheme that enhances the performance of lossy compression of encrypted gray images through a heuristic optimization of bitplane allocation. Specifically, in compressing an encrypted image, we take a bitplane as the basic compression unit and formulate the lossy compression task as an optimization problem that maximizes the peak signal-to-noise ratio (PSNR) subject to a given compression ratio. We then develop a heuristic bitplane-allocation strategy that approximately solves this optimization problem by leveraging the asymmetric characteristics of different bitplanes. In particular, an encrypted image is divided into four sub-images. One sub-image is reserved intact, while the most significant bitplanes (MSBs) of the other sub-images are selected successively, followed by the second, third, etc., MSBs until the given compression ratio is met. As there exist clear statistical correlations within a bitplane and between adjacent bitplanes among the first three MSBs, we further use the low-density parity-check (LDPC) code to compress these bitplanes according to the ETC framework. In reconstructing the original image, we first deploy joint LDPC decoding, decryption, and Markov random field (MRF) exploitation to losslessly recover the chosen bitplanes belonging to the first three MSBs, and then apply content-adaptive interpolation to estimate the missing bitplanes and thus the discarded pixels, which is symmetric to the encrypted image compression process. Experimental simulation results show that the proposed scheme achieves desirable visual quality of reconstructed images and remarkably outperforms state-of-the-art ETC methods, indicating its feasibility and effectiveness.
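The greedy bitplane-allocation order described above can be sketched in a few lines. This is an illustrative sketch only, not the authors' implementation: the function name `select_bitplanes`, the reserved-sub-image convention, and the compression-ratio accounting are simplifying assumptions, and the LDPC compression of the selected bitplanes is omitted entirely.

```python
def select_bitplanes(target_ratio, num_subimages=4, num_bitplanes=8):
    """Greedy MSB-first bitplane allocation (illustrative sketch).
    One sub-image is reserved in full; bitplanes of the remaining
    sub-images are then selected MSB-first, round-robin across the
    sub-images, until the target ratio of kept bitplanes is reached."""
    total = num_subimages * num_bitplanes                   # all bitplanes
    kept = [f"sub1:bp{b}" for b in range(num_bitplanes)]    # reserved sub-image
    for b in range(num_bitplanes - 1, -1, -1):              # bp7 is the MSB
        for s in range(2, num_subimages + 1):
            if (len(kept) + 1) / total > target_ratio:
                return kept                                 # budget exhausted
            kept.append(f"sub{s}:bp{b}")
    return kept

planes = select_bitplanes(target_ratio=0.5)
print(len(planes))      # 16 of 32 bitplanes kept at ratio 0.5
```

At a ratio of 0.5 the reserved sub-image accounts for half of the budget, and the MSBs of the other three sub-images fill the rest, matching the MSB-first intuition of the paper.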


Author(s): Mouhcine Razi ◽ Mhammed Benhayoun ◽ Anass Mansouri ◽ Ali Ahaitouf

<span lang="EN-US">For low-density parity-check (LDPC) decoding, hard-decision algorithms are sometimes more suitable than soft-decision ones, particularly in high-throughput and high-speed applications. However, there is a considerable performance gap between these two classes of algorithms, in favor of soft-decision algorithms. To reduce this gap, in this work we introduce two new improved versions of the hard-decision algorithms: the adaptive gradient descent bit-flipping (AGDBF) and the adaptive reliability ratio weighted GDBF (ARRWGDBF). An adaptive weighting and correction factor is introduced in each case to improve the performance of the two algorithms, yielding a substantial bit error rate gain. As a second contribution of this work, a real-time implementation of the proposed solutions on a digital signal processor (DSP) is performed in order to optimize and improve the performance of these new approaches. The results of numerical simulations and the DSP implementation reveal faster convergence, low processing time, and reduced memory consumption compared to soft-decision algorithms. For the irregular LDPC code, our approaches achieve gains of 0.25 and 0.15 dB for the AGDBF and ARRWGDBF algorithms, respectively.</span>
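As an illustration of the family of algorithms being improved, the following is a minimal sketch of the basic, non-adaptive GDBF decoder from which AGDBF and ARRWGDBF are derived; the paper's adaptive weighting and correction factors are not modeled here, and the single-flip rule and toy Hamming-code example are assumptions for illustration.

```python
import numpy as np

def gdbf_decode(H, y, max_iter=50):
    """Basic single-flip gradient descent bit-flipping (GDBF) sketch.
    H: binary parity-check matrix, y: received bipolar soft values."""
    H = np.asarray(H)
    x = np.where(y >= 0, 1.0, -1.0)                  # hard decisions in {+1,-1}
    for _ in range(max_iter):
        # Bipolar syndrome: +1 if a check is satisfied, -1 otherwise.
        s = np.array([np.prod(x[row.astype(bool)]) for row in H])
        if np.all(s > 0):
            break                                     # valid codeword found
        # Inversion function: channel term plus sum of adjacent check values.
        delta = x * y + H.T @ s
        x[np.argmin(delta)] *= -1                     # flip least reliable bit
    return (x < 0).astype(int)                        # map +1 -> 0, -1 -> 1

# (7,4) Hamming code as a toy example; all-zero codeword, one corrupted bit.
H = np.array([[1, 1, 1, 0, 1, 0, 0],
              [1, 1, 0, 1, 0, 1, 0],
              [1, 0, 1, 1, 0, 0, 1]])
y = np.ones(7)
y[2] = -0.8
print(gdbf_decode(H, y))      # -> [0 0 0 0 0 0 0]
```

The adaptive variants replace the fixed inversion function `delta` with weighted and corrected versions, which is where the reported 0.25 and 0.15 dB gains come from.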


2021 ◽ Vol 2078 (1) ◽ pp. 012075
Author(s): Chong-Yue Shi ◽ Hui Li ◽ You-Ling Zhou ◽ Ping Wang ◽ Qian Li ◽ ...

Abstract In this paper, LDPC codes and OFDM technologies are studied, and LDPC is combined with OFDM to simulate and verify the performance. The encoding of LDPC codes, the log-domain BP decoding algorithm, and the principle of OFDM are introduced. Using MATLAB, the OFDM-LDPC system is simulated over different channels, code lengths, and numbers of iterations, and LDPC and OFDM-LDPC are compared under the same conditions. The simulation analysis shows that the performance of the OFDM-LDPC system improves as the code length increases. When the SNR is small, the number of iterations has little effect on the performance of the OFDM-LDPC system; when the SNR is large, the performance improves as the number of iterations increases. Overall, OFDM-LDPC outperforms LDPC alone, in particular in the multipath Rayleigh fading channel.
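The OFDM side of such a simulation reduces to an IFFT plus cyclic prefix at the transmitter and the inverse at the receiver. Below is a minimal noiseless loopback sketch (in Python rather than the MATLAB used in the paper), with BPSK-mapped random bits standing in for the LDPC encoder's output; the FFT size and prefix length are arbitrary illustrative choices.

```python
import numpy as np

def ofdm_modulate(symbols, n_fft=64, cp_len=16):
    """Map one block of frequency-domain symbols to a time-domain
    OFDM symbol with a cyclic prefix prepended."""
    time = np.fft.ifft(symbols, n_fft)
    return np.concatenate([time[-cp_len:], time])

def ofdm_demodulate(rx, n_fft=64, cp_len=16):
    """Strip the cyclic prefix and return to the frequency domain."""
    return np.fft.fft(rx[cp_len:cp_len + n_fft], n_fft)

# BPSK symbols on 64 subcarriers (stand-in for LDPC-coded bits).
rng = np.random.default_rng(0)
bits = rng.integers(0, 2, 64)
tx = ofdm_modulate(1.0 - 2.0 * bits)          # bit 0 -> +1, bit 1 -> -1
rx_bits = (np.real(ofdm_demodulate(tx)) < 0).astype(int)
print(np.array_equal(bits, rx_bits))          # noiseless loopback: True
```

In a full simulation, a channel model (AWGN or multipath Rayleigh fading) would sit between `ofdm_modulate` and `ofdm_demodulate`, and the demodulated soft values would feed the log-domain BP decoder.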


2021
Author(s): Ghasan Ali Hussain

Abstract In mobile communication systems, errors are introduced into the digital signal by fading and interference. Consequently, different techniques are used to improve the system's reliability and enhance the signal's robustness. Channel coding techniques are used to enhance the reliability of 5G wireless communication systems. In the upcoming wireless technologies, LDPC codes are still introduced as an alternative to turbo codes. However, the error floor phenomenon is one of the biggest demerits of using LDPC codes in communication systems that need low error rates. This paper uses RS codes together with LDPC codes in a concatenated code to overcome this demerit. Specifically, a modified concatenated RS/LDPC code is constructed from an outer RS code and an inner LDPC code followed by an interleaver, unlike conventional concatenated codes that place the interleaver between the two codes. The modified concatenated RS/LDPC code is then used to enhance the BER performance of the f-OFDM system. The results show that the proposed concatenated code outperforms both the single codes and the conventional concatenated RS/LDPC code in terms of BER performance, while the proposed system achieves lower OOBE values than the conventional OFDM system. Owing to these features, the resulting system can be introduced as a competitive candidate for 5G wireless communication systems.
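The ordering described above (interleaver appended after both encoders rather than inserted between them) can be illustrated with a toy pipeline. The block interleaver below and the identity stand-ins for the RS and LDPC encoders are assumptions for illustration only; the point is the encoding order, not the codes themselves.

```python
import numpy as np

def block_interleave(bits, depth=8):
    """Row-in, column-out block interleaver. In the modified scheme it is
    applied AFTER both encoders: RS -> LDPC -> interleave."""
    return np.asarray(bits).reshape(depth, -1).T.ravel()

def block_deinterleave(bits, depth=8):
    """Inverse of block_interleave for the receiver side."""
    return np.asarray(bits).reshape(-1, depth).T.ravel()

# Stand-ins for the real encoders (placeholders, not actual codes).
rs_encode = lambda b: b        # outer RS encoder
ldpc_encode = lambda b: b      # inner LDPC encoder

data = np.arange(64) % 2
tx = block_interleave(ldpc_encode(rs_encode(data)))   # modified ordering
print(np.array_equal(block_deinterleave(tx), data))   # round trip: True
```

With real codes, the receiver would deinterleave first, then LDPC-decode, then RS-decode, reversing the transmit chain.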


2021
Author(s): Cong Xie ◽ Jian Yang ◽ Hai Tian ◽ Danfeng Zhao ◽ Tongzhou Han

Author(s): Alireza Hasani ◽ Lukasz Lopacinski ◽ Rolf Kraemer

Abstract Layered decoding (LD) facilitates a partially parallel architecture for performing the belief propagation (BP) algorithm when decoding low-density parity-check (LDPC) codes. Such a schedule has, in general, lower implementation complexity than a fully parallel architecture and a higher convergence rate than both serial and parallel architectures, regardless of the codeword length or code rate. In this paper, we introduce a modified shuffling method which shuffles the rows of the parity-check matrix (PCM) of a quasi-cyclic LDPC (QC-LDPC) code, yielding a PCM in which each layer can be produced by circulating the layer above it one symbol to the right. The proposed shuffling scheme additionally guarantees that the columns of a layer of the shuffled PCM have either zero weight or single weight, a condition that plays a key role in further decreasing LD complexity. We show that due to these two properties, the number of occupied look-up tables (LUTs) on a field-programmable gate array (FPGA) is reduced by about 93% and the consumed on-chip power by nearly 80%, while the bit error rate (BER) performance is maintained. The only drawback of the shuffling is a degradation of decoding throughput, which is negligible for low values of $$E_b/N_0$$ down to a BER of 1e−6.
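The key structural property of the shuffled PCM (each layer equal to the layer above circulated one symbol to the right) can be checked mechanically. The helper below and the toy PCM are illustrative assumptions, not the paper's shuffling construction.

```python
import numpy as np

def layers_are_circulant_shifts(H, layer_rows):
    """Check that each layer of H equals its preceding layer
    circularly shifted one column to the right."""
    layers = [H[i:i + layer_rows] for i in range(0, H.shape[0], layer_rows)]
    return all(np.array_equal(layers[i], np.roll(layers[i - 1], 1, axis=1))
               for i in range(1, len(layers)))

# Toy PCM built to satisfy the property (one row per layer here).
base = np.array([[1, 0, 1, 0]])
H = np.vstack([np.roll(base, k, axis=1) for k in range(4)])
print(layers_are_circulant_shifts(H, layer_rows=1))   # True
```

In hardware terms, this property means each layer's routing can be derived from the previous layer by a fixed one-symbol rotation, which is what enables the reported LUT and power savings.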


2021
Author(s): Debarnab Mitra ◽ Lev Tauz ◽ Lara Dolecek

<div>In blockchain systems, full nodes store the entire blockchain ledger and validate all transactions in the system by operating on the entire ledger. However, for better scalability and decentralization, blockchains also run light nodes that only store a small portion of the ledger. In blockchain systems having a majority of malicious full nodes, light nodes are vulnerable to a data availability (DA) attack, in which a malicious node makes the light nodes accept an invalid block by hiding the invalid portion of the block from the nodes in the system. Recently, a technique based on LDPC codes called Coded Merkle Tree (CMT) was proposed by Yu et al. that enables light nodes to detect a DA attack by randomly requesting/sampling portions of the block from the malicious node. However, light nodes fail to detect a DA attack with high probability if a malicious node hides a small stopping set of the LDPC code. To mitigate this problem, Yu et al. used well-studied techniques to design random LDPC codes with a large minimum stopping set size. Although effective, these codes are not necessarily optimal for this application. In this paper, we demonstrate that a suitable co-design of specialized LDPC codes and the light node sampling strategy can improve the probability of detecting DA attacks. We consider different adversary models based on their computational capability of finding stopping sets in LDPC codes. For a weak adversary model, we devise a new LDPC code construction, termed the entropy-constrained PEG (EC-PEG) algorithm, which concentrates stopping sets on a small group of variable nodes. We demonstrate that the EC-PEG algorithm coupled with a greedy sampling strategy improves the probability of detecting DA attacks. For stronger adversary models, we provide a co-design of a sampling strategy called linear-programming-sampling (LP-sampling) and an LDPC code construction called the linear-programming-constrained PEG (LC-PEG) algorithm. The new co-design demonstrates a higher probability of detecting DA attacks than approaches proposed in earlier literature.</div>
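The core sampling argument can be made concrete: if a malicious node hides a stopping set of size mu out of n coded symbols, a light node that samples s symbols uniformly without replacement misses it with probability C(n-mu, s)/C(n, s). The sketch below uses hypothetical parameter values and models only this baseline uniform sampling, not the paper's greedy or LP-sampling strategies; it illustrates why a larger minimum stopping set size raises the detection probability.

```python
from math import comb

def detection_prob(n, mu, s):
    """Probability that a light node sampling s of n coded symbols
    uniformly at random (without replacement) requests at least one
    symbol of a hidden stopping set of size mu."""
    if s > n - mu:
        return 1.0          # the sample cannot avoid the hidden symbols
    return 1.0 - comb(n - mu, s) / comb(n, s)

# Hypothetical parameters: 256 coded symbols, 30 samples per light node.
print(detection_prob(256, 10, 30))   # small hidden stopping set
print(detection_prob(256, 20, 30))   # larger hidden set -> higher probability
```

Code constructions that enlarge (or, as in EC-PEG, concentrate) the stopping sets effectively increase mu or let the sampler target the vulnerable positions, pushing this detection probability toward 1.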

