2021
DOI: 10.1109/tcomm.2021.3097142
Forward-Aware Information Bottleneck-Based Vector Quantization: Multiterminal Extensions for Parallel and Successive Retrieval

Abstract: Consider the following setup: through a joint design, multiple observations of a remote data source shall be locally compressed before being transmitted via several error-prone, rate-limited forward links to a (distant) processing unit. To address this specific instance of the multiterminal joint source-channel coding problem, this article fully extends the foundational principle of the Information Bottleneck method to obtain purely statistical design approaches, enjoying the Mutual Information as th…

Cited by 4 publications (4 citation statements) · References 50 publications (84 reference statements)
“…Basically, we implement joint source-channel coding of messages conveying the semantic RV, without differentiating between Levels A and B. We formulate the semantic communication design either as an Information Maximization or as an Information Bottleneck (IB) optimization problem [29,30,31]. Although the approach pursued here again leads to an IB problem as in [26], our article introduces a new classification and perspective of semantic communication and different ML-based solution approaches.…”
Section: Main Contributions
confidence: 99%
“…Therefore, we can formulate an optimization problem in which we would like to maximize the relevant information subject to the constraint that the compression rate not exceed a maximum information rate: Problem (15) is an important variation of the InfoMax principle and is called the Information Bottleneck (IB) problem [10,29,40,41]. The IB method, introduced by Tishby et al. [29], has been the subject of intensive research for years and has proven to be a suitable mathematical/information-theoretic framework for solving numerous problems, including in wireless communications [30,31,42,43]. Note that we aim for an encoder that compresses into a compact representation: for discrete RVs by clustering, and for continuous RVs by dimensionality reduction.…”
Section: A Framework For Semantics
confidence: 99%
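The citation statement above describes the standard IB trade-off: maximize the relevant information I(T;Y) while keeping the compression rate I(X;T) below a budget. As a minimal illustrative sketch (this is the classic iterative self-consistent algorithm of Tishby et al. for discrete RVs, not the multiterminal design of the cited article; the function name `iterative_ib` and the toy distribution are my own assumptions):

```python
import numpy as np

def iterative_ib(p_xy, n_clusters, beta, n_iter=200, seed=0):
    """Iterative Information Bottleneck for discrete RVs.

    Given the joint distribution p(x, y), finds a soft encoder p(t|x)
    that (locally) minimizes I(X;T) - beta * I(T;Y).
    """
    rng = np.random.default_rng(seed)
    nx, ny = p_xy.shape
    eps = 1e-12
    p_x = p_xy.sum(axis=1)                       # marginal p(x)
    p_y_given_x = p_xy / (p_x[:, None] + eps)    # conditional p(y|x)

    # random soft assignment p(t|x); each row is a distribution over t
    p_t_given_x = rng.random((nx, n_clusters))
    p_t_given_x /= p_t_given_x.sum(axis=1, keepdims=True)

    for _ in range(n_iter):
        p_t = p_t_given_x.T @ p_x                # cluster marginal p(t)
        # p(y|t) = sum_x p(t|x) p(x) p(y|x) / p(t)   (T - X - Y Markov chain)
        p_y_given_t = (p_t_given_x * p_x[:, None]).T @ p_y_given_x
        p_y_given_t /= p_t[:, None] + eps
        # KL divergence D( p(y|x) || p(y|t) ) for every (x, t) pair
        kl = (p_y_given_x[:, None, :]
              * (np.log(p_y_given_x[:, None, :] + eps)
                 - np.log(p_y_given_t[None, :, :] + eps))).sum(axis=2)
        # self-consistent update: p(t|x) proportional to p(t) * exp(-beta * KL)
        logits = np.log(p_t + eps)[None, :] - beta * kl
        logits -= logits.max(axis=1, keepdims=True)   # numerical stability
        p_t_given_x = np.exp(logits)
        p_t_given_x /= p_t_given_x.sum(axis=1, keepdims=True)
    return p_t_given_x

# toy source: x in {0,1} mostly emits y=0, x in {2,3} mostly emits y=1
p_xy = np.array([[0.23, 0.02],
                 [0.23, 0.02],
                 [0.02, 0.23],
                 [0.02, 0.23]])
encoder = iterative_ib(p_xy, n_clusters=2, beta=20.0)
```

For large beta the update favors preserving I(T;Y) (assignments harden toward clusters that keep the y-predictive structure); for small beta the compression term dominates and clusters merge.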
“…The applications studied in the communications context range from the design of channel output quantizers [9,10,11], through the decoding of low-density parity-check (LDPC) codes [12–31] and polar codes [32,33,34], to entire receiver chains that include channel estimation and detection [35,36,37,38]. Moreover, the information bottleneck method has been applied successfully in joint source–channel coding, forwarding, and relaying applications [39–48] and in distributed sensor networks [49–53]. Related works focusing on inference with data distributed among multiple nodes and on network learning aspects include [54,55,56].…”
Section: Introduction
confidence: 99%