2019 IEEE International Symposium on Information Theory (ISIT)
DOI: 10.1109/isit.2019.8849282

A Local Perspective on the Edge Removal Problem

Abstract: The edge removal problem studies the loss in network coding rates that results when a network communication edge is removed from a given network. It is known, for example, that in networks restricted to linear coding schemes and networks restricted to Abelian group codes, removing an edge e* with capacity R_{e*} reduces the achievable rate on each source by no more than R_{e*}. In this work, we seek to uncover larger families of encoding functions for which the edge removal statement holds. We take a local perspective…
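As a rough formalization of the bound described in the abstract (the network symbol, the number of sources k, and the rate-vector notation below are introduced here for illustration and need not match the paper's own definitions), the edge removal statement for the covered code classes can be sketched as:

\[
  (R_1, \dots, R_k)\ \text{achievable on } \mathcal{N}
  \;\Longrightarrow\;
  (R_1 - R_{e^*}, \dots, R_k - R_{e^*})\ \text{achievable on } \mathcal{N} \setminus \{e^*\},
\]

where e* is the removed edge and R_{e*} its capacity. Per the abstract, this implication is known to hold, for example, when the network is restricted to linear coding schemes or to Abelian group codes; the paper seeks larger families of encoding functions for which it continues to hold.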

Cited by 2 publications (2 citation statements)
References 24 publications (82 reference statements)
“…The opposite direction, representing CWL codes as group codes, was presented (under a slightly different set of definitions) in [15]. The following theorem is given for completeness and is proven in the Appendix.…”
Section: Group Network Code
confidence: 99%
“…Linear codes are shown to be CWL by choosing the corresponding groups to again be subspaces of the source vector space. Further, the network codes defined by (global) CWL functions are group codes [15], [16]. In this work we address the complementary question of whether group codes can be represented operationally through CWL functions.…”
Section: Introduction
confidence: 99%
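As a small illustrative aside (not taken from the cited works, and using a map f, matrix A, and field F_q chosen here purely for illustration), the simplest way to see why linear codes fit a group-code framework is that a linear encoder is automatically a homomorphism of the additive group of the source vector space:

\[
  f : \mathbb{F}_q^{\,n} \to \mathbb{F}_q^{\,m}, \qquad f(x) = Ax
  \;\Longrightarrow\;
  f(x + y) = A(x + y) = Ax + Ay = f(x) + f(y),
\]

so f is a homomorphism of the Abelian groups (F_q^n, +) and (F_q^m, +), and subspaces of F_q^n are in particular subgroups. This is only an elementary sketch of the linear-to-group containment; the precise CWL definitions and the operational representation question are as given in the quoted works.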