2009
DOI: 10.1145/1467247.1467269
Error correction up to the information-theoretic limit

Abstract: Ever since the birth of coding theory almost 60 years ago, researchers have been pursuing the elusive goal of constructing the "best codes," whose encoding introduces the minimum possible redundancy for the level of noise they can correct. In this article, we survey recent progress in list decoding that has led to efficient error-correction schemes with an optimal amount of redundancy, even against worst-case errors caused by a potentially malicious channel. To correct a proportion p (say 20%) of worst-case er…
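The abstract's quantitative claim can be illustrated numerically. A hedged sketch, not taken from the article: for a code of rate R, classical unique decoding corrects up to a (1 − R)/2 fraction of worst-case errors, Guruswami–Sudan list decoding of Reed–Solomon codes reaches the Johnson radius 1 − √R, and the information-theoretic limit surveyed here is 1 − R. The function names below are illustrative.

```python
import math

def unique_radius(rate: float) -> float:
    """Worst-case error fraction correctable by unique decoding."""
    return (1.0 - rate) / 2.0

def johnson_radius(rate: float) -> float:
    """Guruswami-Sudan list-decoding radius for Reed-Solomon codes."""
    return 1.0 - math.sqrt(rate)

def capacity_radius(rate: float) -> float:
    """Information-theoretic limit on the correctable error fraction."""
    return 1.0 - rate

# At rate R = 0.8, the limit is a 20% error fraction, matching the
# "proportion p (say 20%)" example in the abstract.
for R in (0.25, 0.5, 0.8):
    print(f"R={R:.2f}  unique={unique_radius(R):.3f}  "
          f"Johnson={johnson_radius(R):.3f}  limit={capacity_radius(R):.3f}")
```

For every rate 0 < R < 1 these radii are strictly ordered: unique < Johnson < limit, which is why closing the gap to 1 − R required the list-decoding advances the survey describes.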

Cited by 11 publications (7 citation statements)
References 17 publications
“…It is not the purpose of the present paper to provide comprehensive analyses and evaluations of Guruswami–Sudan algorithms; for these we refer to [36]. However, in order to design a decoding mechanism, we describe some experiments that we conducted.…”
Section: Implementation For Multiple Fingerprints
confidence: 99%
“…BEC(p_i). Let us denote this channel by BEC(p). In this setting we would like to analyze the probability that the MAP decoder is unable to decode the i-th bit and then try to get a bound on the probability of error for the block MAP decoder.…”
Section: Channels
confidence: 99%
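The citation statement above concerns the binary erasure channel BEC(p). As a hedged sketch under my own assumptions (this is not code from the citing paper), a BEC(p) erases each transmitted bit independently with probability p, returning `None` for erased positions:

```python
import random

def bec_transmit(bits, p, seed=0):
    """Pass bits through BEC(p): erase each bit independently
    with probability p, leaving the rest unchanged."""
    rng = random.Random(seed)
    return [None if rng.random() < p else b for b in bits]

sent = [1, 0, 1, 1, 0, 0, 1, 0]
received = bec_transmit(sent, p=0.25)
erasures = sum(1 for b in received if b is None)
print("received:", received, "erasures:", erasures)
```

A MAP decoder for such a channel fails on bit i exactly when the received erasure pattern leaves bit i undetermined by the code's constraints, which is the event whose probability the quoted analysis bounds.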
“…While there are means to gain greater storage efficiency without traditional RAID storage subsystems, such as erasure encoding [27], these methods are still relatively novel and therefore have not undergone the rigor of extensive customer use typified by RAID models. If the object store is to be used as the final home for object data, the means of data protection must be solid or it risks data loss.…”
Section: Data Protection
confidence: 99%