2006 IEEE Information Theory Workshop
DOI: 10.1109/itw.2006.322888
The Gaussian Many-Help-One Distributed Source Coding Problem

Abstract: Jointly Gaussian memoryless sources $(y_1, \ldots, y_N)$ are observed at N distinct terminals. The goal is to efficiently encode the observations in a distributed fashion so as to enable reconstruction of any one of the observations, say $y_1$, at the decoder subject to a quadratic fidelity criterion. Our main result is a precise characterization of the rate-distortion region when the covariance matrix of the sources satisfies a "tree-structure" condition. In this situation, a natural analog/digital separation …
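For orientation, the setup described in the abstract can be written out formally. The sketch below is assembled from the abstract alone; the block length $n$, the rate symbols $R_i$, and the distortion target $D$ are conventional notation assumed here rather than taken from the paper itself.

\[
\begin{aligned}
&\text{Encoders (one per terminal): } && f_i^{(n)}:\ \mathbb{R}^{n} \to \{1,\dots,2^{nR_i}\}, \qquad i=1,\dots,N,\\
&\text{Decoder: } && \hat{y}_1^{\,n} = g^{(n)}\!\bigl(f_1^{(n)}(y_1^{n}),\dots,f_N^{(n)}(y_N^{n})\bigr),\\
&\text{Quadratic fidelity criterion: } && \limsup_{n\to\infty}\ \frac{1}{n}\sum_{t=1}^{n}\mathbb{E}\bigl[(y_{1,t}-\hat{y}_{1,t})^{2}\bigr]\ \le\ D.
\end{aligned}
\]

The rate-distortion region is then the closure of the set of tuples $(R_1,\dots,R_N,D)$ achievable by such codes; the paper's main result characterizes this region exactly when the source covariance matrix has the stated tree structure.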

Cited by 22 publications (29 citation statements); References 9 publications.
“…Oohama solved the multi-terminal source coding case for the two [10] and L + 1 [11] Gaussian sources, in which only one source needs to be reconstructed with a mean square error, that is, the other L sources are helpers. More recently, Wagner, Tavildar, and Viswanath characterized the region where both sources [12] or L + 1 sources [13] need to be reconstructed at the decoder with a mean square error criterion.…”
Section: Introduction (mentioning, confidence: 99%)
“…This is a strategy for the Gaussian many-help-one source coding problem [1], allowing for the case where the “receiver being helped” cannot provide side information to the decoder. The problem is also similar in structure to the CEO problem [2], but most studies on the CEO problem do not model correlated noise across observers.…”
Section: Introduction (mentioning, confidence: 99%)
“…The problem is also similar in structure to the CEO problem [2], but most studies on the CEO problem do not model correlated noise across observers. The rate-distortion region of this problem is achieved by random coding for some covariance structures [1], but is unknown in general. The rate-distortion region for the case of two helpers is known within bounds [3].…”
Section: Introduction (mentioning, confidence: 99%)
“…Two scenarios are considered: centralized encoding (see Figure 1) and distributed encoding (see Figure 2). It is worth noting that the distributed encoding scenario is closely related to the CEO problem, which has been studied extensively [1–18].…”
Section: Introduction (mentioning, confidence: 99%)