2017
DOI: 10.1109/lcomm.2017.2700303

Performance Analysis for Finite Length LT Codes via Classical Probability Evaluation


Cited by 11 publications (4 citation statements)
References 16 publications
“…The necessary number of encoded packets to recuperate initial information, with K input symbols used in LT encoding, is [22,32]…”
Section: Classification Methods and Decision
confidence: 99%
“…The average number of packets $m_{\mathrm{opt}}$ to decode the message and recuperate initial information is the solution of the following optimization problem: $\arg\max_{m}\ \prod_{i=1}^{m} P\left(C_{p_j}\mid p_i\right),\ j=1,2,\quad \mathrm{s.t.}\ c_i=0$. The necessary number of encoded packets to recuperate initial information, with $K$ input symbols used in LT encoding, is $N=K+\mathcal{O}\!\left(\sqrt{K}\,\log^{2}K\right)$. …”
Section: System Model and Problem Formulation
confidence: 99%
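The overhead expression quoted above can be turned into a quick numeric estimate. Below is a minimal sketch, assuming the reconstructed form N = K + O(√K · log²K); the scaling constant `c` and the use of the natural logarithm are illustrative assumptions, not values given in the excerpt.

```python
import math

def required_packets(K: int, c: float = 1.0) -> int:
    """Estimate N, the number of encoded packets needed to recover K input
    symbols, using the reconstructed form N = K + c * sqrt(K) * log(K)**2.
    The constant c is illustrative only (not specified in the excerpt)."""
    return K + math.ceil(c * math.sqrt(K) * math.log(K) ** 2)

if __name__ == "__main__":
    for K in (100, 1_000, 10_000):
        print(f"K={K:>6}  N≈{required_packets(K)}")
```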
“…The RSD exhibits a significant improvement in performance over the ISD. However, the RSD is not the optimal degree distribution for finite-length LT codes [7]. The Poisson RSD (PRSD) was designed by combining the Poisson distribution (PD) with the RSD to reduce the average overhead and improve coding efficiency for LT codes [8].…”
Section: Introduction
confidence: 99%
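For reference, the Ideal and Robust Soliton distributions mentioned in this excerpt follow Luby's standard definitions. The sketch below implements both; the parameters `c` and `delta` are the usual RSD tuning constants, and their default values are illustrative rather than taken from [7] or [8].

```python
import math

def ideal_soliton(k: int) -> list[float]:
    """Ideal Soliton distribution rho(d) over degrees d = 1..k (index 0 unused)."""
    rho = [0.0] * (k + 1)
    rho[1] = 1.0 / k
    for d in range(2, k + 1):
        rho[d] = 1.0 / (d * (d - 1))
    return rho

def robust_soliton(k: int, c: float = 0.1, delta: float = 0.5) -> list[float]:
    """Robust Soliton distribution mu(d) = (rho(d) + tau(d)) / beta,
    where beta normalizes the sum. c and delta are illustrative defaults."""
    R = c * math.log(k / delta) * math.sqrt(k)
    rho = ideal_soliton(k)
    tau = [0.0] * (k + 1)
    spike = int(round(k / R))          # degree where the RSD places its extra spike
    for d in range(1, min(spike, k + 1)):
        tau[d] = R / (d * k)
    if 1 <= spike <= k:
        tau[spike] = R * math.log(R / delta) / k
    beta = sum(rho[d] + tau[d] for d in range(1, k + 1))
    return [(rho[d] + tau[d]) / beta for d in range(k + 1)]
```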
“…LT codes were originally designed over binary erasure channels (BEC) with two decoding methods, namely belief propagation (BP) and Gaussian elimination (GE). BP is a fast algorithm with complexity $\mathcal{O}(k\log k)$, and its successful-decoding performance is analyzed in [10]. GE is a maximum-likelihood decoding method with complexity $\mathcal{O}(k^{2})$, which was further developed by Kim et al. [11], [12] and Bioglio et al. [13].…”
Section: Introduction
confidence: 99%
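The BP decoder described in this excerpt is the classic peeling procedure over the BEC: release any encoded symbol that has exactly one unresolved neighbor, then XOR the recovered input out of the remaining symbols, and repeat until everything is recovered or no degree-one symbol is left. A minimal sketch follows; the data layout (neighbor sets plus integer XOR values) is an assumption for illustration, not the representation used in the cited works.

```python
def bp_decode(encoded, k):
    """encoded: list of (set_of_input_indices, xor_value) pairs.
    Returns {index: value} for all k inputs if BP succeeds, else None."""
    recovered = {}
    # mutable copies so peeling can shrink neighbor sets in place
    work = [[set(nbrs), val] for nbrs, val in encoded]
    progress = True
    while progress and len(recovered) < k:
        progress = False
        for nbrs, val in work:
            if len(nbrs) == 1:                 # degree-one symbol: release it
                (idx,) = nbrs
                recovered[idx] = val
                nbrs.clear()
                progress = True
                # peel the recovered input out of every remaining encoded symbol
                for other in work:
                    if idx in other[0]:
                        other[0].discard(idx)
                        other[1] ^= val
    return recovered if len(recovered) == k else None

# toy usage: k = 3 input symbols with values 5, 7, 9
enc = [({0}, 5), ({0, 1}, 5 ^ 7), ({1, 2}, 7 ^ 9), ({2}, 9)]
print(bp_decode(enc, 3))   # -> {0: 5, 1: 7, 2: 9}
```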