2019
DOI: 10.1109/tit.2019.2896110

Capacity-Achieving Guessing Random Additive Noise Decoding

Abstract: Guessing Random Additive Noise Decoding (GRAND) can, unusually, decode any forward error correction block code. The original algorithm assumed that the decoder received only hard decision demodulated symbols to inform its decoding. As the incorporation of soft information is known to improve decoding precision, here we introduce Ordered Reliability Bits GRAND, which, for a binary block code of length n, avails of no more than log2(n) bits of code-book-independent quantized soft detection information per received bit to …
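The log2(n) figure in the abstract is consistent with retaining only the reliability rank of each received bit: an index into an ordering of n positions fits in ⌈log2(n)⌉ bits. The snippet below is a minimal illustration of that counting argument; the LLR-based ranking is an assumed stand-in for the paper's quantized soft detection information, not a quotation of its procedure.

```python
import math
import numpy as np

# Illustration of the counting argument: retaining only the reliability
# rank of each of n received bits costs ceil(log2(n)) bits per bit.
# The LLR-based ranking below is an assumed stand-in for soft detection.
n = 128
rng = np.random.default_rng(0)
llrs = rng.normal(loc=2.0, scale=1.5, size=n)   # stand-in soft outputs

ranks = np.argsort(np.argsort(np.abs(llrs)))    # rank 0 = least reliable bit
bits_per_rank = math.ceil(math.log2(n))         # 7 bits when n = 128
print(f"each of the {n} bits carries a rank index of {bits_per_rank} bits")
```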

Cited by 146 publications (141 citation statements) | References 79 publications

“…Code-centric decoders attempt to identify X^n given Y^n by exploiting the structure of a specific codebook. GRAND's universality stems from the observation that if one can determine the effect of the noise, N^n in equation (1), one can deduce X^n, and doing so is more efficient for low or moderate redundancy codes [18]. To identify the noise effect N^n, GRAND generates a series of putative noise sequences E^n, subtracting them in turn from the received bits Y^n and checking whether the resulting Y^n − E^n is a member of the codebook.…”
Section: Guessing Random Additive Noise Decoding
confidence: 99%
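As a concrete illustration of the guessing loop described in the statement above, here is a minimal Python sketch. It assumes a binary linear code specified by a parity-check matrix H, so codebook membership reduces to a syndrome check, and it takes the putative noise sequences E^n as an iterable already ordered from most to least likely. The function name and interface are illustrative, not the paper's implementation.

```python
import numpy as np

def grand_decode(y, H, noise_sequences):
    """Minimal GRAND sketch for a binary linear code.

    y:               received hard-decision word, a length-n 0/1 array
    H:               parity-check matrix of the code over GF(2)
    noise_sequences: iterable of putative noise effects E^n, assumed
                     ordered from most likely to least likely

    Returns the first candidate y - E^n (XOR in the binary case) found
    in the codebook, or None if the guesses are exhausted.
    """
    for e in noise_sequences:
        candidate = y ^ e                       # binary subtraction is XOR
        if not np.any((H @ candidate) % 2):     # zero syndrome <=> codeword
            return candidate
    return None
```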
“…Guessing Random Additive Noise Decoding (GRAND) is a recently proposed class of universal decoding algorithms that challenges those limitations. Both hard- and soft-detection variants [18]–[21] have been developed that, in theory and simulation, offer optimally precise Maximum Likelihood (ML) decoding for any moderate redundancy code. If GRAND algorithms can be translated into efficient hardware, they offer the possibility of decoding any code in a single instantiation, regardless of whether that code presently has only a hard- or soft-detection decoder.…”
Section: Introduction
confidence: 99%
“…GRAND is a recently proposed maximum likelihood (ML) decoding algorithm [9] that is suitable for decoding any short, high-rate code. Unlike conventional decoders, GRAND is a universal decoder; its decoding is not tailored to the structure of a particular code.…”
Section: Guessing Random Additive Noise Decoding
confidence: 99%
“…Therefore, first the error patterns of Hamming weight one are considered, then those of weight two, and so on. To reduce the overall complexity, GRAND with abandonment (GRANDAB) was also proposed in [9]. In that case, if no valid codeword is found after considering all the error patterns with a Hamming weight less than or equal to AB, the decoder declares a failure.…”
Section: Guessing Random Additive Noise Decoding
confidence: 99%
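Below is a sketch of that query order and abandonment rule, under the assumption of a memoryless channel where fewer bit flips are always more likely. The generator is an illustrative companion to the grand_decode sketch above, not the implementation of [9].

```python
from itertools import combinations
import numpy as np

def grandab_noise_sequences(n, ab):
    """Yield length-n binary error patterns in increasing Hamming weight,
    abandoning after weight `ab`.

    On a binary symmetric channel with crossover probability below 1/2,
    increasing weight corresponds to decreasing likelihood, so this
    enumerates guesses from most to least likely before giving up.
    """
    yield np.zeros(n, dtype=int)               # weight 0: noise-free guess
    for w in range(1, ab + 1):                 # weights 1, 2, ..., AB
        for flips in combinations(range(n), w):
            e = np.zeros(n, dtype=int)
            e[list(flips)] = 1
            yield e

# If grand_decode exhausts this generator without finding a codeword,
# the decoder declares a failure, as GRANDAB prescribes:
# result = grand_decode(y, H, grandab_noise_sequences(len(y), ab=3))
```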
“…Fig. 2 plots the FER performance for GRAND-MO and GRANDAB [7] decoding of RLCs of length n = 128. We can observe that, as g decreases, GRAND-MO outperforms the GRANDAB (AB = 3) decoder in FER performance.…”
Section: GRAND Markov Order
confidence: 99%