2021
DOI: 10.48550/arxiv.2107.04221
Preprint

Neural-Network-Optimized Degree-Specific Weights for LDPC MinSum Decoding

Abstract: Neural Normalized MinSum (N-NMS) decoding delivers better frame error rate (FER) performance on linear block codes than conventional normalized MinSum (NMS) by assigning dynamic multiplicative weights to each check-to-variable message in each iteration. Previous N-NMS efforts have primarily investigated short-length block codes (N < 1000), because the number of N-NMS parameters to be trained is proportional to the number of edges in the parity check matrix and the number of iterations, which imposes an impract…
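The abstract's core mechanism, one trained multiplicative weight per check-to-variable message per iteration, can be sketched as follows. This is a minimal illustration in our own notation (H, v2c, beta are hypothetical names, not the authors' code); it assumes floating-point LLR messages, a dense weight tensor with unused zero entries, and check-node degrees of at least 2.

import numpy as np

def nnms_check_to_variable(H, v2c, beta, it):
    # H    : (m, n) binary parity-check matrix
    # v2c  : (m, n) variable-to-check LLR messages (0 where H == 0)
    # beta : (T, m, n) trained weights, one per edge per iteration, so the
    #        parameter count grows as (#edges) x (#iterations)
    # it   : current iteration index
    c2v = np.zeros_like(v2c)
    for i in range(H.shape[0]):
        cols = np.flatnonzero(H[i])
        for j in cols:
            others = cols[cols != j]              # extrinsic edges only
            sign = np.prod(np.sign(v2c[i, others]))
            mag = np.min(np.abs(v2c[i, others]))
            # An edge- and iteration-specific weight replaces the single
            # scalar factor of conventional NMS.
            c2v[i, j] = beta[it, i, j] * sign * mag
    return c2v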

Cited by 5 publications (7 citation statements)
References 16 publications
“…The weight coefficient sharing algorithm, similar to pruning and compression algorithms, is a low-complexity technique commonly used in neural channel decoding [5], [9], [10]. We apply two typical weight coefficient sharing methods [9], namely temporal weight sharing and spatial weight sharing, to the LDPC decoding algorithm based on model-driven deep learning and propose a low-complexity LDPC neural decoding algorithm.…”
Section: Weight Sharing NNMS+ Algorithm (mentioning)
confidence: 99%
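As a rough illustration of the two sharing schemes named in this statement, the sketch below contrasts the unshared weight tensor with its temporal and spatial collapses; all names and sizes are hypothetical, not taken from [9].

import numpy as np

T, num_edges = 10, 5000                  # hypothetical iteration/edge counts

# Full N-NMS: one weight per edge per iteration -> T * num_edges parameters.
beta_full = np.ones((T, num_edges))

# Temporal sharing: every iteration reuses the same per-edge weight
# -> num_edges parameters.
beta_temporal = np.ones(num_edges)
def weight_temporal(it, e):
    return beta_temporal[e]

# Spatial sharing: all edges within an iteration share one weight
# -> T parameters.
beta_spatial = np.ones(T)
def weight_spatial(it, e):
    return beta_spatial[it]

Either collapse cuts storage from T * num_edges values to num_edges or T values, trading some FER performance for lower complexity.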
“…The authors in [9] proposed a weight-sharing method for the learned BP decoding algorithm; the method can be divided into two types, namely temporal weight sharing and spatial weight sharing. The authors in [10] applied weight sharing to the decoding of long LDPC codes, optimized the parameters according to node degrees, and proposed Neural 2-dimensional Normalized decoders. The authors in [11] used the weight-sharing algorithm in their proposed NNMS decoding algorithm and then quantized the parameters of the neural decoding network using a bit quantization algorithm.…”
Section: Introduction (mentioning)
confidence: 99%
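For the bit quantization attributed to [11], a generic uniform quantizer conveys the idea; the exact scheme in [11] may differ, and quantize_weights is our hypothetical name.

import numpy as np

def quantize_weights(beta, bits=4):
    # Uniformly quantize trained decoder weights to 2**bits levels.
    levels = 2 ** bits
    lo, hi = float(beta.min()), float(beta.max())
    if hi == lo:                          # constant weights: nothing to do
        return beta.copy()
    step = (hi - lo) / (levels - 1)
    codes = np.round((beta - lo) / step)  # integer codes in [0, levels - 1]
    return lo + codes * step              # de-quantized weight values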
“…To address the high decoding complexity caused by the large number of weight parameters, eliminating redundant parameters is crucial. Wang et al. [32] described a neural two-dimensional normalized minimum-sum decoder with a simplified parameter set, which assigns the same weight to a class of similar messages. Lian et al. [33] introduced a parameter-sharing scheme in the weighted belief propagation decoder [27], where certain edges share the same weights to reduce the storage and computational burden.…”
Section: Introduction (mentioning)
confidence: 99%
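The simplified parameter set attributed to Wang et al. [32] can be pictured as a degree-indexed lookup table; the sketch below uses our own hypothetical names, with random placeholders standing in for trained values.

import numpy as np

def degree_based_weights(H, T, seed=0):
    # Every check-to-variable message whose check node has degree d_c and
    # whose variable node has degree d_v reuses beta[t, d_c, d_v] in
    # iteration t.
    dc = H.sum(axis=1)                    # check-node degrees
    dv = H.sum(axis=0)                    # variable-node degrees
    rng = np.random.default_rng(seed)
    beta = rng.random((T, int(dc.max()) + 1, int(dv.max()) + 1))
    def weight(t, i, j):                  # weight for edge (check i, var j)
        return beta[t, dc[i], dv[j]]
    return weight

The trainable parameter count then scales with the number of distinct (d_c, d_v) degree pairs per iteration rather than with the number of edges, which is what makes long codes tractable.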
“…Furthermore, for a class of protograph-based 5G LDPC codes, whose structure is quasi-cyclic, [21], [22] explored applying machine learning techniques with fully shared edge weights to long block codes. In addition, for a class of irregular LDPC codes, [23] proposed assigning and sharing weights according to the code's degree distribution in the NNMS decoder.…”
Section: Introduction (mentioning)
confidence: 99%
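A back-of-the-envelope parameter count (our notation, not from the cited papers) makes the trade-off surveyed in these statements concrete. With I_T iterations, |E| edges, and |D_c|, |D_v| distinct check- and variable-node degrees:

    I_T · |E|  (full N-NMS)   ≥   I_T · |D_c| · |D_v|  (degree-specific [23], [32])   ≥   I_T  (fully shared [21], [22])

For a long code, |E| can run to tens of thousands while |D_c| and |D_v| are typically single digits, so degree-specific sharing sits close to the fully shared end in storage while retaining per-degree flexibility.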