2006
DOI: 10.1109/tit.2006.872855
Raptor codes on binary memoryless symmetric channels

Cited by 357 publications (390 citation statements)
References 14 publications
“…For that δ, by means of linear programming (discretizing the segment), we found the distributions Φ with minimum average degree. This method extends naturally to finite-length design, similarly to [3], by setting lower bounds on the expected size of the input ripple [11]. Table I shows four distributions calculated in this way.…”
Section: Construction and Asymptotic Analysis of Incoming Distributions
confidence: 99%
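The LP design step quoted above can be illustrated with a minimal sketch of the classic erasure-channel analogue: minimize the average output degree subject to a discretized ripple condition (1 + ε)·Ω'(x) ≥ −ln(1 − x) on [0, 1 − δ]. The function name and the particular ε, δ, degree cap, and grid size below are illustrative assumptions, not the distributions Φ or the exact constraints of the cited paper.

```python
import numpy as np
from scipy.optimize import linprog

def min_avg_degree_lt(eps=0.05, delta=0.02, d_max=100, n_grid=200):
    """Find an output degree distribution omega_1..omega_{d_max} of minimum
    average degree such that (1 + eps) * Omega'(x) >= -ln(1 - x) on a grid
    in [0, 1 - delta] -- the asymptotic condition that keeps the expected
    ripple positive during LT/peeling decoding on a binary erasure channel."""
    degrees = np.arange(1, d_max + 1)
    xs = np.linspace(0.0, 1.0 - delta, n_grid)
    # One ripple constraint per grid point, rewritten as A_ub @ omega <= b_ub:
    #   -(1 + eps) * sum_d d * omega_d * x^{d-1} <= ln(1 - x)
    A_ub = -(1.0 + eps) * degrees[None, :] * xs[:, None] ** (degrees[None, :] - 1)
    b_ub = np.log(1.0 - xs)
    # Probability normalisation: sum_d omega_d = 1, with omega_d >= 0.
    A_eq = np.ones((1, d_max))
    # Objective: the average degree sum_d d * omega_d.
    return linprog(c=degrees, A_ub=A_ub, b_ub=b_ub,
                   A_eq=A_eq, b_eq=[1.0], bounds=(0, None), method="highs")

res = min_avg_degree_lt()
print(f"minimum average degree: {res.fun:.3f}")
```

The optimal solution concentrates its mass on a soliton-like set of degrees; tightening δ (decoding closer to completion) or shrinking ε (less overhead) drives the required average degree up, which is the trade-off the quoted finite-length design negotiates.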
“…This means that y = BIAWGN_{σ_V}(x), where BIAWGN stands for the binary-input additive white Gaussian noise channel (this is the virtual channel). Note that in this case we have H(X|Y) = 1 - Cap(BIAWGN_{σ_V}), where the capacity of the BIAWGN channel with noise variance σ² [3] is given by…”
Section: BIAWGN Case - Soft Information Decoding
confidence: 99%
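The capacity term Cap(BIAWGN_σ) in the quote can be evaluated numerically via the standard identity for binary-input symmetric channels, H(X|Y) = E[log2(1 + e^{-L})] with L the log-likelihood ratio conditioned on sending +1. A minimal sketch (the function name and the 12σ integration window are my choices; the quote's trailing formula is not reproduced here):

```python
import numpy as np
from scipy.integrate import quad

def biawgn_capacity(sigma):
    """Capacity in bits/channel use of the binary-input AWGN channel with
    noise variance sigma^2, via the BMS-channel identity
        H(X|Y) = E[log2(1 + e^{-L})],  L = 2y / sigma^2,  y ~ N(+1, sigma^2),
    so that Cap(BIAWGN_sigma) = 1 - H(X|Y)."""
    s2 = sigma ** 2

    def integrand(y):
        pdf = np.exp(-(y - 1.0) ** 2 / (2.0 * s2)) / np.sqrt(2.0 * np.pi * s2)
        # logaddexp keeps log(1 + exp(-L)) stable when L is very negative
        return pdf * np.logaddexp(0.0, -2.0 * y / s2) / np.log(2.0)

    h, _ = quad(integrand, 1.0 - 12.0 * sigma, 1.0 + 12.0 * sigma)
    return 1.0 - h
```

As expected for the virtual channel described above, the capacity falls monotonically from 1 toward 0 as the noise standard deviation σ_V grows, so H(X|Y) = 1 - Cap(BIAWGN_{σ_V}) rises accordingly.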