2021 IEEE International Symposium on Information Theory (ISIT)
DOI: 10.1109/isit45174.2021.9517720
Scalable Vector Gaussian Information Bottleneck

Abstract: In the context of statistical learning, the Information Bottleneck method seeks the right balance between accuracy and generalization capability through a suitable tradeoff between compression complexity, measured by minimum description length, and distortion, evaluated under a logarithmic loss measure. In this paper, we study a variation of the problem, called the scalable information bottleneck, in which the encoder outputs multiple descriptions of the observation with increasingly richer features. The model, which is…
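As background (this formulation does not appear on this page and the notation is assumed), the complexity–distortion tradeoff the abstract refers to is commonly written as the classical IB Lagrangian, where complexity is the description rate I(X;U) and relevance under logarithmic loss corresponds to I(U;Y):

```latex
% Classical Information Bottleneck Lagrangian (standard formulation; notation assumed):
% X = observation, Y = target variable, U = compressed representation,
% with the Markov chain U -- X -- Y, and \beta > 0 trading complexity against relevance.
\min_{p(u \mid x)} \; I(X;U) \;-\; \beta \, I(U;Y)
```

The scalable variant studied in the paper extends this single-representation objective to multiple descriptions with increasingly richer features.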

Cited by 2 publications (2 citation statements)
References 20 publications
“…More general frameworks could allow for variations in either the source of information (as is the case in temporal series) or the target variable (as is the case in transfer learning). Frameworks for both these kinds of extensions have already been proposed [46, 78], and it would be interesting to study if, in these cases as well, the specific nature of the IB problem imprints the informationally optimal limits of several-stage processing.…”
Section: Limitations and Future Work
confidence: 99%
“…In [19], the authors propose a variation of the original Information Bottleneck problem, named the Scalable Information Bottleneck, in which multiple compressed representations with increasingly richer features are considered. The work in [20] proposes a solution for a distributed implementation of the IB problem that is suitable for both the discrete and Gaussian cases.…”
confidence: 99%