2022
DOI: 10.1109/lcomm.2022.3207506
A Low-Complexity Neural Normalized Min-Sum LDPC Decoding Algorithm Using Tensor-Train Decomposition

Cited by 10 publications (22 citation statements)
References 10 publications
“…In [16], we propose the NNMS+ algorithm, which uses different weight coefficients for different edges based on the NNMS algorithm [3]. In the NNMS decoding algorithm, the message from check node (CN) $c$ to variable node (VN) $v$ at iteration $t$, denoted $u_{c,v}^{t}$, is given as,…”
Section: Preliminaries, A. TT-NNMS+ LDPC Decoding Algorithm (mentioning)
confidence: 99%
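As a rough illustration of the check-node update referred to in this statement, the sketch below computes normalized min-sum (NNMS-style) CN-to-VN messages for one check node. The function name, the NumPy formulation, and the example values are illustrative assumptions rather than the paper's implementation; in NNMS the weight is a single trained coefficient per iteration, whereas the NNMS+ variant cited above assigns a coefficient per edge.

```python
import numpy as np

def nnms_check_node_update(vn_msgs, weight):
    """Normalized min-sum check-node update for one check node (sketch).

    vn_msgs : 1-D array of incoming variable-to-check messages l_{v',c}
              for all VNs v' connected to check node c.
    weight  : trained normalization coefficient (a single scalar per
              iteration in NNMS).

    Returns the outgoing check-to-variable messages u_{c,v}, one per edge,
    computed extrinsically (edge v is excluded from its own sign product
    and minimum).
    """
    vn_msgs = np.asarray(vn_msgs, dtype=float)
    signs = np.sign(vn_msgs)
    signs[signs == 0] = 1.0                      # treat zeros as positive
    mags = np.abs(vn_msgs)

    total_sign = np.prod(signs)
    # Smallest and second-smallest magnitudes let us read off the
    # "minimum over all edges except v" without a per-edge loop over inputs.
    order = np.argsort(mags)
    min1_idx, min1 = order[0], mags[order[0]]
    min2 = mags[order[1]]

    out = np.empty_like(mags)
    for v in range(len(vn_msgs)):
        extrinsic_sign = total_sign * signs[v]   # divide out edge v's sign
        extrinsic_min = min2 if v == min1_idx else min1
        out[v] = weight * extrinsic_sign * extrinsic_min
    return out


# Example: one degree-4 check node with a normalization weight of 0.8
print(nnms_check_node_update([-1.2, 0.4, 2.5, -0.7], weight=0.8))
```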
“…In [16], we modify the calculation of the messages from CNs to VNs at iteration $t$, denoted $u_{c,v}^{t}$, by adding a variable weight vector, denoted as…”
Section: Preliminaries, A. TT-NNMS+ LDPC Decoding Algorithm (mentioning)
confidence: 99%
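To make the per-edge weighting mentioned in this statement concrete, the variant below replaces the single normalization coefficient with one trainable weight per outgoing edge of the check node. This is a sketch under that assumption: the function name, the weight-vector shape, and the example values are illustrative and do not reproduce the paper's exact parameterization (or its tensor-train compression of the weights).

```python
import numpy as np

def weighted_check_node_update(vn_msgs, edge_weights):
    """Per-edge weighted min-sum check-node update (illustrative sketch).

    vn_msgs      : incoming variable-to-check messages for one check node.
    edge_weights : one trainable coefficient per outgoing CN-to-VN edge,
                   in place of the single shared NNMS weight.
    """
    vn_msgs = np.asarray(vn_msgs, dtype=float)
    edge_weights = np.asarray(edge_weights, dtype=float)
    assert vn_msgs.shape == edge_weights.shape

    signs = np.sign(vn_msgs)
    signs[signs == 0] = 1.0
    mags = np.abs(vn_msgs)
    total_sign = np.prod(signs)

    order = np.argsort(mags)
    min1_idx, min1 = order[0], mags[order[0]]
    min2 = mags[order[1]]

    out = np.empty_like(mags)
    for v in range(len(vn_msgs)):
        extrinsic_min = min2 if v == min1_idx else min1
        # Each edge scales its extrinsic message with its own weight.
        out[v] = edge_weights[v] * total_sign * signs[v] * extrinsic_min
    return out


# Example: same degree-4 check node, one weight per outgoing edge
print(weighted_check_node_update([-1.2, 0.4, 2.5, -0.7],
                                 [0.9, 0.8, 0.85, 0.75]))
```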