2016 IEEE 27th Annual International Symposium on Personal, Indoor, and Mobile Radio Communications (PIMRC) 2016
DOI: 10.1109/pimrc.2016.7794915
Abstract: Random Linear Network Coding (RLNC) has been shown to be a technique with several benefits, particularly when applied over wireless mesh networks, since it provides robustness against packet losses. Tunable Sparse Network Coding (TSNC), in turn, is a promising concept that leverages a trade-off between computational complexity and goodput. An optimal density tuning function has not been found yet, due to the lack of a closed-form expression that links density, performance and computational…
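The density trade-off the abstract describes can be illustrated with a minimal sketch of sparse random linear coding. This is an assumption-laden toy, not the paper's scheme: it codes over GF(2) (plain XOR) rather than the larger fields typically used in RLNC, and `density` plays the role of the tunable sparsity parameter in TSNC — lower density means fewer XORs per coded packet (cheaper encoding) but more linearly dependent packets at the receiver.

```python
import random

def encode_sparse(packets, density, rng=random.Random(0)):
    """Produce one coded packet as an XOR of a sparse random subset of
    the source packets (coding over GF(2) for simplicity; real RLNC
    typically uses GF(2^8)).  `density` is the probability that each
    source packet contributes to the combination."""
    coeffs = [1 if rng.random() < density else 0 for _ in packets]
    if not any(coeffs):  # avoid the useless all-zero coefficient vector
        coeffs[rng.randrange(len(packets))] = 1
    coded = bytes(len(packets[0]))
    for c, p in zip(coeffs, packets):
        if c:
            coded = bytes(a ^ b for a, b in zip(coded, p))
    return coeffs, coded
```

With `density=1.0` this degenerates to XOR-ing every packet (dense coding, maximal per-packet cost); with small `density` each coded packet touches only a few sources, which is what makes the decoder's matrix sparse.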

Cited by 10 publications (2 citation statements)
References 18 publications
“…There have been many other sparse variants of random linear network coding, including chunked codes (e.g., [28, 29]), tunable sparse network coding (e.g., [30, 31]), and sliding-window coding (e.g., [32, 33, 34, 35, 36]). While many of these codes can also be applied, BATS codes are more suitable for this distributed computing scenario.…”
Section: BATS-Code-Based Approach
confidence: 99%
“…This, in a programming implementation, still imposes a high memory requirement for efficient random access of sparse matrix elements [21]; otherwise the pivoting speed is significantly sacrificed. In practice, even for moderate sizes of a few hundred, the decoding speed of sparse GE can be unsatisfactory [22].…”
Section: Introduction
confidence: 99%
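The pivoting cost the excerpt refers to comes from Gaussian elimination (GE) at the decoder: each arriving coded packet must be reduced against the pivot rows already collected. A minimal sketch over GF(2) is below; it is an illustration under simplifying assumptions (dense bit-lists rather than the sparse structures whose random-access cost the excerpt discusses, and the function name `gf2_decode` is ours).

```python
def gf2_decode(equations, n):
    """Minimal Gaussian elimination over GF(2).  `equations` is a list of
    (coeff_bits, payload_bytes) pairs for n source packets; returns the n
    decoded payloads, or None if the system is not yet full rank."""
    rows = [(list(c), bytearray(p)) for c, p in equations]
    pivots = [None] * n  # pivots[col] holds the row whose leading 1 is at col
    for c, p in rows:
        # Forward elimination: reduce the row against existing pivots.
        for col in range(n):
            if c[col]:
                if pivots[col] is None:
                    pivots[col] = (c, p)
                    break
                pc, pp = pivots[col]  # XOR out the stored pivot row
                for j in range(n):
                    c[j] ^= pc[j]
                for j in range(len(p)):
                    p[j] ^= pp[j]
    if any(v is None for v in pivots):
        return None  # rank deficient: need more coded packets
    # Back-substitution: clear each pivot column from the rows above it.
    for col in range(n - 1, -1, -1):
        c, p = pivots[col]
        for other in range(col):
            oc, op = pivots[other]
            if oc[col]:
                for j in range(n):
                    oc[j] ^= c[j]
                for j in range(len(p)):
                    op[j] ^= p[j]
    return [bytes(pivots[i][1]) for i in range(n)]
```

Even in this toy form, each incoming row may be XOR-ed against many pivot rows, which is why decoding speed degrades once the generation size reaches a few hundred, as the cited work observes.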