Nearest neighbor decoding for additive non-Gaussian noise channels (1996)
DOI: 10.1109/18.532892

Cited by 198 publications (190 citation statements): 12 supporting, 177 mentioning, 1 contrasting. Citing publications span 1998–2022.
References 15 publications.

Citation statements (ordered by relevance):
“…We show that the dispersion term depends on the non-Gaussian noise only through its second and fourth moments, thus complementing the capacity result (Lapidoth, 1996), which depends only on the second moment. Furthermore, we characterize the second-order asymptotics of point-to-point codes over K-sender interference networks with non-Gaussian additive noise.…”
Mentioning; confidence: 62%
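For context, the capacity result invoked above can be stated concretely (a sketch of the standard statement; the notation P for input power and σ² for noise power is assumed here, not taken from the excerpt): Lapidoth (1996) shows that, over an additive-noise channel with noise power σ², Gaussian codebooks with nearest-neighbor decoding achieve every rate

    R < (1/2) log(1 + P/σ²),

irrespective of the noise distribution beyond its second moment; the dispersion result quoted above refines this to second order, which is where the fourth moment enters.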
“…Note that in traditional channel-coding analyses [2], [3], the probability of error is averaged only over W and Z^n. Similar to [5], the additional averaging over the codebook C is required here to establish ensemble-tightness results for the two classes of Gaussian codebooks considered in this paper.…”
Section: Point-to-Point Channels: System Model and Definitions
Mentioning; confidence: 99%
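To make the distinction in this excerpt explicit (notation assumed for illustration): for a fixed codebook C, the error probability averaged over the message W and the noise Z^n is

    P_e(C) = Pr[ Ŵ ≠ W | C ],

while the additionally codebook-averaged quantity is

    P̄_e = E_C[ P_e(C) ].

Roughly speaking, an ensemble-tightness result shows that above the stated rate P̄_e does not vanish, so the achievability analysis is tight for these Gaussian ensembles rather than an artifact of averaging.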
“…As shown in [2] and the references therein, the Gaussian distribution is the worst memoryless noise possible given a specified noise power. Lapidoth [10] further showed that irrespective of the noise distribution and even regardless of whether the noise is i.i.d., the capacity assuming i.i.d. Gaussian noise is achievable with nearest-neighbor decoding and no rate above the i.i.d.…”
Section: B. Approximation of Noise in Successive Decoding
Mentioning; confidence: 99%
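A minimal numerical sketch of the decoding rule at issue: nearest-neighbor (minimum Euclidean distance) decoding of a Gaussian codebook under non-Gaussian additive noise. The blocklength, codebook size, and the Laplacian noise used as the stand-in non-Gaussian distribution are illustrative assumptions, not taken from the cited papers.

import numpy as np

rng = np.random.default_rng(seed=0)

n = 128           # blocklength (illustrative)
M = 256           # number of messages; rate = log2(M)/n bits per channel use
P = 1.0           # signal power
noise_var = 0.5   # noise power sigma^2

# Gaussian codebook: entries drawn i.i.d. N(0, P).
codebook = rng.normal(0.0, np.sqrt(P), size=(M, n))

def nearest_neighbor_decode(y, codebook):
    # Minimum Euclidean distance decision; uses no knowledge of the noise law.
    return int(np.argmin(np.sum((codebook - y) ** 2, axis=1)))

# Additive non-Gaussian channel: Laplacian noise with variance noise_var
# (a Laplace(0, b) variable has variance 2*b^2, hence the scale below).
w = int(rng.integers(M))
z = rng.laplace(0.0, np.sqrt(noise_var / 2.0), size=n)
y = codebook[w] + z

w_hat = nearest_neighbor_decode(y, codebook)
print(f"sent {w}, decoded {w_hat}, correct: {w_hat == w}")

The decoder ignores the noise statistics entirely; per the quoted result, this mismatch costs nothing in rate as long as the noise power is matched.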
“…codebooks, C_bicm is also the largest rate that can be transmitted with vanishing error probability [17]. This capacity should be compared with the equivalent quantities on CM and MLC, given in Eqs.…”
Section: Achievable Rates with BICM
Mentioning; confidence: 99%
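For comparison, the BICM achievable rate referenced here is commonly written as a sum of bit-level mutual informations (a standard form, with notation assumed: m label bits per symbol, B_i the i-th label bit, Y the channel output):

    C_bicm = sum_{i=1}^{m} I(B_i; Y),

which never exceeds the coded-modulation rate I(X; Y), since the parallel bit-level decoding discards the dependence among the label bits.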