Paper ID | SS-8.2 |
Paper Title | LEARNED DECIMATION FOR NEURAL BELIEF PROPAGATION DECODERS |
Authors | Andreas Buchberger, Christian Häger, Chalmers University of Technology, Sweden; Henry D. Pfister, Duke University, United States; Laurent Schmalen, Karlsruhe Institute of Technology, Germany; Alexandre Graell i Amat, Chalmers University of Technology, Sweden |
Session | SS-8: Near-ML Decoding of Error-correcting Codes: Algorithms and Implementation |
Location | Gather.Town |
Session Time | Wednesday, 09 June, 16:30 - 17:15 |
Presentation Time | Wednesday, 09 June, 16:30 - 17:15 |
Presentation | Poster |
Topic | Special Sessions: Near-ML Decoding of Error-correcting Codes: Algorithms and Implementation |
Abstract |
We introduce a two-stage decimation process to improve the performance of neural belief propagation (NBP) decoding, recently introduced by Nachmani et al., for short low-density parity-check (LDPC) codes. In the first stage, we build a list by iterating between a conventional NBP decoder and guessing the least reliable bit. The second stage iterates between a conventional NBP decoder and learned decimation, where we use a neural network to decide the decimation value for each bit. For a (128,64) LDPC code, the proposed NBP decoder with decimation outperforms NBP decoding by 0.75 dB and performs within 1 dB of maximum-likelihood decoding at a block error rate of 10^(-4). |
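A minimal control-flow sketch of the two-stage process described in the abstract is given below. The functions nbp_decode and decimation_network are placeholders (assumptions), not the authors' trained models; the branching rule for list construction, the one-bit-per-step decimation schedule, and the saturation value are likewise assumptions made only to illustrate the flow. A real implementation would replace the placeholders with trained unrolled NBP and decimation networks.

import numpy as np

SAT = 100.0  # assumed saturation magnitude used to fix (decimate) a bit

def nbp_decode(llr_in, H, num_iters=20):
    # Placeholder for a (neural) belief propagation decoder over the
    # parity-check matrix H: here it simply returns the input LLRs unchanged.
    return llr_in

def decimation_network(llr_j):
    # Placeholder for the learned decimation network: maps a bit's LLR to a
    # saturated decimation value toward its current hard decision.
    return np.sign(llr_j) * SAT

def syndrome_ok(llr, H):
    # True if the hard decision of the LLRs satisfies every parity check.
    hard = (llr < 0).astype(int)
    return not np.any((H @ hard) % 2)

def decode_with_decimation(llr_ch, H, list_size=4, decim_steps=4):
    # Stage 1: build a candidate list by running NBP and branching on
    # (guessing) the least reliable bit until list_size candidates exist.
    candidates = [np.asarray(llr_ch, dtype=float)]
    while len(candidates) < list_size:
        branched = []
        for llr in candidates:
            out = nbp_decode(llr, H)
            j = np.argmin(np.abs(out))           # least reliable position
            plus, minus = out.copy(), out.copy()
            plus[j], minus[j] = SAT, -SAT        # guess both bit values
            branched.extend([plus, minus])
        candidates = branched
    # Stage 2: for each candidate, alternate NBP and learned decimation.
    for llr in candidates:
        for _ in range(decim_steps):
            llr = nbp_decode(llr, H)
            if syndrome_ok(llr, H):
                return (llr < 0).astype(int)     # valid codeword found
            j = np.argmin(np.abs(llr))
            llr[j] = decimation_network(llr[j])  # decimate least reliable bit
    return (llr < 0).astype(int)                 # fall back to last candidate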