1967
DOI: 10.1016/s0019-9958(67)91200-4
Lower bounds to error probability for coding on discrete memoryless channels. II

Cited by 266 publications (257 citation statements). References 6 publications.
“…Thus, by Theorem 7, for these values of we have . The full claim follows from the straight-line bound of Shannon, Gallager, and Berlekamp [27]. Remark: We have seen in Lemma 4 that for , it suffices to rely on the simple form of the function , namely, .…”
Section: Theorem 17 · mentioning · confidence: 96%
“…Several important ideas in this problem were suggested in the paper [27]. The nature of the upper bounds is different for low values of and for close to capacity.…”

Section: A. Error Exponents · mentioning · confidence: 99%
“…VI.E] to be equal to the sphere-packing exponent. However, below the critical rate of the channel, the reliability function is in general bounded away [7] from the sphere-packing exponent, and thus the gap between (52) and (53) may grow exponentially with n. Fig. 1 shows different bounds on the error probability for M = 4 messages and a binary symmetric channel (BSC). In this setup, the best code can be obtained explicitly [8], and hence the exact ML decoding error probability can be computed.…”

Section: Connection With Previous Results · mentioning · confidence: 99%
“…Such would be the case, for instance, of random expurgated codes [16,17] and random constant-composition codes [18].…”
Section: Mathematical Model · mentioning · confidence: 99%