2022
DOI: 10.48550/arxiv.2203.13552
Preprint

Quantized Guessing Random Additive Noise Decoding

Abstract: We introduce a soft-detection variant of Guessing Random Additive Noise Decoding (GRAND) called Quantized GRAND (QGRAND) that can efficiently decode any moderate-redundancy block code in an algorithm that is suitable for highly parallelized implementation in hardware. QGRAND can avail of any level of quantized soft information, and is shown to provide near maximum likelihood decoding performance when provided with five or more bits of soft information per received bit.
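To illustrate the idea, the following is a minimal sketch of GRAND-style decoding with quantized reliability information. It is not the paper's implementation: the `is_codeword` membership test, the `max_weight` query budget, and the integer `reliabilities` (standing in for quantized per-bit soft values) are all illustrative assumptions. The key point it shows is the query ordering: error patterns whose flipped positions have the lowest total reliability, i.e. the most plausible noise sequences, are tested first.

```python
from itertools import combinations

def grand_decode(y, is_codeword, reliabilities, max_weight=3):
    """Sketch of soft-ordered GRAND decoding (illustrative, not the
    paper's algorithm). Guess error patterns in order of increasing
    total reliability of the flipped bits; return the first candidate
    that passes the codebook membership test."""
    n = len(y)
    # Enumerate flip sets up to the query budget. With all-equal
    # reliabilities this reduces to hard-detection GRAND, which
    # queries in order of increasing Hamming weight.
    patterns = [()]
    for w in range(1, max_weight + 1):
        patterns.extend(combinations(range(n), w))
    # Most likely noise sequences first: smallest summed reliability
    # of the flipped positions (reliabilities here are assumed to be
    # small quantized integers, e.g. |LLR| magnitudes).
    patterns.sort(key=lambda p: sum(reliabilities[i] for i in p))
    for flips in patterns:
        candidate = list(y)
        for i in flips:
            candidate[i] ^= 1  # invert the guessed noise bit
        if is_codeword(candidate):
            return candidate
    return None  # query budget exhausted; abandon decoding
```

As a toy usage, with a single parity-check code (membership test: even parity) and received word `[1, 0, 0, 0]`, the decoder flips the least-reliable bit first and returns `[1, 1, 0, 0]`.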

Cited by 1 publication (1 citation statement)
References 36 publications
“…In general, for a probability of bit flip less than 1/2, in the absence of soft information or further channel knowledge, noise query follows increasing Hamming weights, as in the case of hard-detection GRAND. Adopting the quantized GRAND algorithm that avails of any level of quantized soft information [25] is also promising with coarse pseudo-soft information.…”
Section: Performance Evaluation
confidence: 99%