2017
DOI: 10.1109/tcomm.2017.2657621

Markov Chain Model for the Decoding Probability of Sparse Network Coding

Abstract: Random Linear Network Coding (RLNC) has been shown to offer an efficient communication scheme, providing notable robustness against packet losses. However, it suffers from high computational complexity, and several novel approaches that follow the same idea have recently been proposed. One such solution is Tunable Sparse Network Coding (TSNC), where only a few packets are combined in each transmission. The number of data packets to be combined in each transmission can be set from a densi…

Cited by 38 publications (23 citation statements), 2017–2024. References 38 publications (36 reference statements).
“…where α is a constant value that does not depend on the generation size when k ≫ 1, but varies with the Galois field, as derived in [12]. For the binary case, GF(2), α ≈ 1.6, and it becomes almost negligible as the Galois field size increases.…”
Section: A. Random Linear Network Coding (RLNC) (mentioning)
confidence: 92%
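The α ≈ 1.6 figure for GF(2) can be checked numerically: for RLNC with coefficients drawn uniformly from GF(q), the expected number of received coded packets needed to decode a generation of size k is k plus an overhead that converges to a field-dependent constant. The sketch below is an illustration under that standard assumption, not code from the cited paper.

```python
# Expected RLNC reception overhead over GF(q): at rank r, a uniformly random
# coded packet is non-innovative with probability q^(r - k), so on average
# 1 / (1 - q^(r - k)) receptions are needed to gain one degree of freedom.
def expected_packets(k: int, q: int) -> float:
    """Expected number of random coded packets until a k-packet generation decodes."""
    return sum(1.0 / (1.0 - q ** (r - k)) for r in range(k))

k = 64
for q in (2, 4, 16, 256):
    overhead = expected_packets(k, q) - k
    print(f"GF({q}): overhead alpha ~ {overhead:.4f}")
# GF(2) gives roughly 1.6 extra packets; the overhead shrinks towards zero
# as the field size grows, matching the behaviour described above.
```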
“…In [12] we introduced a semi-analytical model that accurately captures the performance of Sparse Network Coding (SNC). It is based on an absorbing Markov process, S, whose states are defined by the rank of the decoding matrix, i.e., the number of useful packets received, and the number of non-zero columns in that matrix (i.e.…”
Section: B. Decoding Complexity (mentioning)
confidence: 99%
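As a rough illustration of the machinery behind such a model, the sketch below assembles a small absorbing Markov chain and uses the fundamental matrix N = (I - Q)^(-1) to obtain the expected number of receptions before absorption (full rank). The chain structure and the transition probability p_advance are placeholders for illustration only; the model in [12] conditions its transitions on both the rank and the number of non-zero columns of the decoding matrix.

```python
import numpy as np

# Generic absorbing Markov chain: transient states are the partial ranks
# 0..k-1 of the decoding matrix, and the absorbing state is full rank k.
# Q holds transient-to-transient transition probabilities; the fundamental
# matrix N = (I - Q)^(-1) counts expected visits, so its row sums give the
# expected number of steps (received packets) until absorption.
k = 4
p_advance = 0.8                      # placeholder: chance a packet is innovative
Q = np.zeros((k, k))
for r in range(k):
    Q[r, r] = 1.0 - p_advance        # packet was not innovative, rank unchanged
    if r + 1 < k:
        Q[r, r + 1] = p_advance      # packet increased the rank by one
N = np.linalg.inv(np.eye(k) - Q)     # fundamental matrix
expected_steps = N.sum(axis=1)       # expected receptions from each starting rank
print(f"Expected receptions from rank 0: {expected_steps[0]:.2f}")
```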
“…For example, Guo et al. 24 built a reliability model using Markov chains and a recursive method to evaluate modular multilevel converters under different redundancy schemes. Garrido et al. 25 presented a semi-analytical model by exploiting an absorbing Markov process. Meanwhile, there are many other methods in the reliability area, such as the subset-simulation-based reliability analysis approach, 26 the evidence-based reliability optimization method, 27 and a fatigue accumulation model based on a probabilistic framework.…”
Section: Overview of Markov Process (mentioning)
confidence: 99%
“…where α is a constant that does not depend on the generation size but does depend on the chosen Galois field, as shown in [14]; in the binary case (GF(2)), α ≈ 1.6, approaching zero as the field size grows. It should be noted that, under RLNC coding schemes, what matters is not which packets have been received, but that enough information packets are received to decode.…”
Section: A. Random Linear Coding (RLNC) (unclassified)