List Decoding of Arıkan’s PAC Codes

Entropy, 2021, Vol. 23 (7), pp. 841
Author(s): Hanwen Yao, Arman Fazeli, Alexander Vardy

Polar coding gives rise to the first explicit family of codes that provably achieve capacity with efficient encoding and decoding for a wide range of channels. However, its performance at short blocklengths under standard successive-cancellation decoding is far from optimal. A well-known way to improve the performance of polar codes at short blocklengths is CRC precoding followed by successive-cancellation list decoding. This approach, along with various refinements thereof, has largely remained the state of the art in polar coding since it was introduced in 2011. Recently, Arıkan presented a new polar coding scheme, which he called polarization-adjusted convolutional (PAC) codes. At short blocklengths, such codes offer a dramatic improvement in performance as compared to CRC-aided list decoding of conventional polar codes. PAC codes are based primarily upon the following main ideas: replacing CRC codes with convolutional precoding (under appropriate rate profiling) and replacing list decoding with sequential decoding. One of our primary goals in this paper is to answer the following question: is sequential decoding essential for the superior performance of PAC codes? We show that similar performance can be achieved using list decoding when the list size L is moderately large (say, L⩾128). List decoding has distinct advantages over sequential decoding in certain scenarios, such as low-SNR regimes or situations where the worst-case complexity/latency is the primary constraint. Another objective is to provide some insights into the remarkable performance of PAC codes. We first observe that both sequential decoding and list decoding of PAC codes closely match ML decoding thereof. We then estimate the number of low-weight codewords in PAC codes, and use these estimates to approximate the union bound on their performance. These results indicate that PAC codes are superior to both polar codes and Reed–Muller codes. We also consider random time-varying convolutional precoding for PAC codes, and observe that this scheme achieves the same superior performance with a constraint length as low as ν=2.
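
The PAC encoding pipeline described in the abstract (rate profiling, convolutional precoding, then the polar transform) is straightforward to sketch. The Python fragment below is a minimal illustration under stated assumptions, not the authors' implementation: the information set and the generator polynomial are passed in as parameters, and the toy information set in the usage example is illustrative rather than an optimized rate profile.

```python
import numpy as np

def polar_transform(u):
    # x = u * F^{(tensor)n} over GF(2), with F = [[1, 0], [1, 1]],
    # computed in place with log2(N) stages of XOR butterflies.
    x = u.copy()
    step = 1
    while step < len(x):
        for i in range(0, len(x), 2 * step):
            x[i:i + step] ^= x[i + step:i + 2 * step]
        step *= 2
    return x

def pac_encode(msg_bits, N, info_positions, conv_poly):
    # 1) Rate profiling: scatter the K message bits onto the chosen indices;
    #    all other positions carry zeros.
    v = np.zeros(N, dtype=np.uint8)
    v[np.array(sorted(info_positions))] = msg_bits
    # 2) Convolutional precoding: u_i = sum_j c_j * v_{i-j} (mod 2), with c_0 = 1.
    u = np.zeros(N, dtype=np.uint8)
    for i in range(N):
        for j, c in enumerate(conv_poly):
            if c and i - j >= 0:
                u[i] ^= v[i - j]
    # 3) Polar transform.
    return polar_transform(u)

# Toy usage: N = 8, K = 4, with the convolution c(x) = 1 + x^2 + x^3 + x^5 + x^6
# (octal 133, the generator Arıkan used). The information set {3, 5, 6, 7} is
# an illustrative assumption, not an optimized rate profile.
codeword = pac_encode(np.array([1, 0, 1, 1], dtype=np.uint8), 8,
                      [3, 5, 6, 7], (1, 0, 1, 1, 0, 1, 1))
```

With a time-varying precoder of the kind considered in the paper, `conv_poly` would simply change from index to index; the rest of the pipeline is unchanged.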

2019, Vol. 23 (10), pp. 1757-1760
Author(s): Jiahao Wang, Zhenyu Hu, Ning An, Dunfan Ye

2019, pp. 1-1
Author(s): Xiumin Wang, Ting Wang, Jun Li, Liang Shan, Haiyan Cao, ...

IEEE Access, 2020, Vol. 8, pp. 96955-96962
Author(s): Kyungpil Lee, In-Cheol Park

Entropy, 2019, Vol. 21 (9), pp. 899
Author(s): Xiumin Wang, Jinlong He, Jun Li, Zhuoting Wu, Liang Shan, ...

Although the adaptive successive cancellation list (AD-SCL) algorithm and the segmented-CRC adaptive successive cancellation list (SCAD-SCL) algorithm, both based on the cyclic redundancy check (CRC), can greatly reduce the computational complexity of the successive cancellation list (SCL) algorithm, these two algorithms discard the previous decoding result and re-decode with an increased list size L. When the CRC check fails, these two algorithms therefore waste useful information from the previous decoding attempt. In this paper, a simplified adaptive successive cancellation list (SAD-SCL) algorithm is proposed. Before each re-decoding attempt with an updated list size L, SAD-SCL uses the existing log-likelihood ratio (LLR) information to locate the range of burst-error bits, and re-decoding then starts at the incorrect bit with the smallest index in this range. Moreover, when a segmented information sequence cannot be decoded correctly, the SAD-SCL algorithm falls back to SC decoding for the subsequent segmented information sequences, with decoding performance almost the same as decoding those segments with the AD-SCL algorithm. The simulation results show that the SAD-SCL algorithm has lower computational complexity than AD-SCL and SCAD-SCL, with a negligible loss of performance.
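
As a concrete illustration of the adaptive control flow these algorithms share, the Python skeleton below sketches the AD-SCL outer loop together with the SAD-SCL restart heuristic. It is a minimal sketch under stated assumptions: the SCL decoder is treated as a black box, and the function names, interfaces, and LLR threshold are illustrative assumptions, not the authors' implementation.

```python
def sad_scl_decode(llr, scl_decode, crc_check, L_max=32, llr_threshold=1.0):
    """Sketch of the SAD-SCL control flow (all interfaces are assumptions).

    `scl_decode(llr, L, start)` is an assumed black-box SCL decoder that
    resumes decoding at bit index `start` and returns (best_path_bits,
    per_bit_llrs); a full implementation would scan all L surviving paths
    for a CRC-valid candidate rather than only the best one.
    """
    L, start = 1, 0           # L = 1 is plain successive cancellation
    while L <= L_max:
        bits, bit_llrs = scl_decode(llr, L, start)
        if crc_check(bits):
            return bits       # CRC passed: accept this path
        # SAD-SCL refinement: rather than re-decoding from index 0 as AD-SCL
        # does, locate the suspected burst-error region via low-magnitude
        # LLRs and restart at the smallest suspect index. The threshold is
        # an illustrative assumption, not a value from the paper.
        suspects = [i for i, v in enumerate(bit_llrs) if abs(v) < llr_threshold]
        start = min(suspects, default=0)
        L *= 2                # double the list size, as in AD-SCL
    return None               # decoding failure (a real decoder may fall back to SC)
```

The saving comes from `start`: each retry re-decodes only from the first suspect bit onward instead of repeating the entire block, which is where SAD-SCL sheds complexity relative to AD-SCL and SCAD-SCL.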


2017, Vol. 14 (18), pp. 20170735-20170735
Author(s): Kun Wang, Li Li, Feng Han, Fan Feng, Jun Lin, ...
