
Recurrent Neural Network-Aided BP Decoder Based on Bit-flipping for Polar Codes

By: Guiping Li, Chang Yun and Xiaojie Liu
Open Access | Jun 2025

Figures & Tables

Figure 1.

Polarization Transformation.
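The polarization transformation of Figure 1 can be sketched in a few lines. This is a minimal illustration, assuming Arikan's standard 2x2 kernel F = [[1, 0], [1, 1]] and the Kronecker-power construction, which are not shown in this excerpt:

```python
import numpy as np

def polar_transform_matrix(n):
    """Return the N x N polar generator matrix G_N = F^(kron n) for N = 2^n,
    built from Arikan's 2x2 kernel F = [[1, 0], [1, 1]]."""
    F = np.array([[1, 0], [1, 1]], dtype=int)
    G = np.array([[1]], dtype=int)
    for _ in range(n):
        G = np.kron(G, F)  # each Kronecker step doubles the matrix size
    return G

def polar_encode(u, G):
    """Encode the length-N input vector u as x = uG over GF(2)."""
    return u.dot(G) % 2

# N = 8 example, matching the (8,4) factor graph size of Figure 2
G8 = polar_transform_matrix(3)
u = np.array([0, 1, 0, 1, 0, 0, 1, 1])
x = polar_encode(u, G8)
```

Frozen-bit selection (which of the N positions carry information) is a separate step and is not modeled here.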

Figure 2.

Factor diagram of BP decoding algorithm for (8,4) polar codes.

Figure 3.

Unit factor diagram of BP decoding algorithm for polar codes.

Figure 4.

Binary Tree Structure Diagram (polar (32,16)).

Figure 5.

A complete decoding iteration in a recurrent neural network with polar(4, 2).

Figure 6.

Three types of neurons in network architecture.

Figure 7.

RNN-based bit-flip BP decoder for polar codes.

Figure 8.

BLER performance of different algorithms.

Memory Overhead Analysis

Decoding algorithm | Memory overhead
RNN-BP | N(log2 N + 1) = 448
CS-BF | N(log2 N + 1) + ωTmax = 736
CNN-Tree-MBF | 2IN(log2 N + 1) + Tmax + K + 0.3M ≈ 0.3M
CA-SCL | N + 3NLL = 1592
Proposed | N(log2 N + 1) + ωTmax = 469
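The memory figures in the table can be sanity-checked for the paper's N = 64 setting. A small script; note that the flip-buffer size ωTmax = 21 for the proposed decoder is inferred here from 469 − 448, not stated in this excerpt:

```python
import math

N = 64  # code length of the paper's (64, 32) polar code

def factor_graph_llrs(N):
    """LLR storage for a length-N polar BP factor graph: N(log2 N + 1) values."""
    return N * (int(math.log2(N)) + 1)

rnn_bp = factor_graph_llrs(N)  # 448, matching the RNN-BP row
# The proposed decoder adds a flip-set buffer of size ω·Tmax on top;
# the table implies ω·Tmax = 469 - 448 = 21 (an inference, not a stated value).
proposed = rnn_bp + (469 - 448)
```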

Computational Complexity Analysis

Decoding algorithm | Multiplications
RNN-BP | 2IN log2 N = 3840
CS-BF | 2IN log2 N · Tavg = 144080
CNN-Tree-MBF | 2IN log2 N · Tavg + 9.8M × TCNN
CA-SCL | L(N log2 N + 3N2 + 5K2)12K = 4464
Proposed | 0
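The RNN-BP multiplication count is easy to reproduce. A sketch for the paper's N = 64 setting; the iteration count I = 5 is inferred from 3840 / (2 · 64 · 6) = 5 and is not stated in this excerpt:

```python
import math

N = 64
n = int(math.log2(N))  # 6 for N = 64

# RNN-BP multiplications scale as 2·I·N·log2(N); the table's value 3840
# is consistent with I = 5 BP iterations (an inference, see lead-in).
I = 5
rnn_bp_mults = 2 * I * N * n
```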

Simulation parameters

Set options | Value
Test platform | TensorFlow 1.8.0
Encoding | (64, 32)
CRC generator polynomial | x^6 + x^5 + 1
Training codewords | 7.8×10^4
Testing codewords | 3.4×10^3
Batch_size | 3600
Loss function | Cross entropy
Optimizer | Adam
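The CRC generator polynomial x^6 + x^5 + 1 from the table corresponds to the bit pattern 0b1100001. A minimal illustrative sketch of the remainder computation, assuming a zero initial register and MSB-first processing (the paper's exact CRC conventions are not given in this excerpt):

```python
def crc6(bits, poly=0b1100001):
    """Bitwise CRC remainder for the generator x^6 + x^5 + 1 (0b1100001).
    Zero initial register, MSB-first; an illustrative sketch only."""
    reg = 0
    for b in bits + [0] * 6:   # append 6 zeros: computes msg * x^6 mod g
        reg = (reg << 1) | b
        if reg & (1 << 6):     # reduce whenever the degree reaches 6
            reg ^= poly
    return reg                 # 6-bit remainder

# Appending the 6 CRC bits to the message makes the remainder zero:
msg = [1, 0, 1, 1, 0, 0, 1]
r = crc6(msg)
crc_bits = [(r >> i) & 1 for i in range(5, -1, -1)]
```

In the paper's setting the CRC is attached to the information bits so that the decoder can detect, after each BP attempt, whether bit-flipping is needed.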
Language: English
Page range: 38 - 49
Published on: Jun 13, 2025
In partnership with: Paradigm Publishing Services
Publication frequency: 4 issues per year

© 2025 Guiping Li, Chang Yun, Xiaojie Liu, published by Xi’an Technological University
This work is licensed under the Creative Commons Attribution-NonCommercial-NoDerivatives 4.0 License.