In this paper, we revisit two multi-terminal lossy source coding problems: the lossy source coding problem with side information available at the encoder and one of the two decoders, which we term the Kaspi problem (Kaspi, 1994), and the multiple description coding problem with one semi-deterministic distortion measure, which we refer to as the Fu-Yeung problem (Fu and Yeung, 2002). For the Kaspi problem, we first present the properties of optimal test channels. Subsequently, we generalize the notion of the distortion-tilted information density from the lossy source coding problem to the Kaspi problem and prove a non-asymptotic converse bound using the properties of optimal test channels and the well-defined distortion-tilted information density. Finally, for discrete memoryless sources, we derive refined asymptotics, which include the second-order, large deviations and moderate deviations asymptotics. In the converse proof of the second-order asymptotics, we apply the Berry-Esseen theorem to the derived non-asymptotic converse bound. The achievability proof follows by first proving a type-covering lemma tailored to the Kaspi problem, then properly Taylor expanding the well-defined distortion-tilted information densities, and finally applying the Berry-Esseen theorem. We then generalize the methods used in the Kaspi problem to the Fu-Yeung problem. As a result, we obtain the properties of optimal test channels for the minimum sum-rate function, a non-asymptotic converse bound and refined asymptotics for discrete memoryless sources. Since the successive refinement problem is a special case of the Fu-Yeung problem, as a byproduct we obtain a non-asymptotic converse bound for the successive refinement problem, which is a strict generalization of the non-asymptotic converse bound for successively refinable sources (Zhou, Tan and Motani, 2017).
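To make the flavor of a second-order result concrete, the following is a minimal sketch of the normal approximation from point-to-point lossy source coding (the single-terminal setting that the Kaspi results generalize): the minimum rate at blocklength n and excess-distortion probability ε behaves as R(n, d, ε) ≈ R(d) + √(V(d)/n) Q⁻¹(ε). The sketch assumes a Bernoulli(p) source under Hamming distortion with d < p ≤ 1/2, for which R(d) = h(p) − h(d) and the dispersion V(d) equals the varentropy of the source; the function names are ours, not the paper's.

```python
import math
from statistics import NormalDist

def binary_entropy(p: float) -> float:
    """Binary entropy h(p) in bits."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def second_order_rate(p: float, d: float, n: int, eps: float) -> float:
    """Normal approximation R(d) + sqrt(V(d)/n) * Q^{-1}(eps) for a
    Bernoulli(p) source under Hamming distortion, with d < p <= 1/2.
    (Illustrative sketch of the point-to-point result; the paper derives
    the analogous expansions for the Kaspi and Fu-Yeung problems.)
    """
    rate_distortion = binary_entropy(p) - binary_entropy(d)  # R(d) in bits
    # Dispersion V(d): varentropy of the source, p(1-p) * log2((1-p)/p)^2
    dispersion = p * (1 - p) * (math.log2((1 - p) / p)) ** 2
    q_inv = NormalDist().inv_cdf(1 - eps)  # Q^{-1}(eps)
    return rate_distortion + math.sqrt(dispersion / n) * q_inv
```

For example, with p = 0.4, d = 0.1 and ε = 0.01, the approximation exceeds R(d) by a backoff term that decays as 1/√n, recovering the rate-distortion function in the blocklength limit.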
Index Terms—Lossy source coding, multiple description coding, non-asymptotic converse bound, second-order asymptotics, large deviations, moderate deviations

The authors are with the .sg. Part of this paper will be presented at Globecom 2017 [1], [2].

For discrete memoryless sources, we derive refined asymptotics, i.e., the second-order (cf. [13], [34], [35]), the large deviations (cf. [36], [37]) and the moderate deviations (cf. [38], [39]) asymptotics. The converse proof of the second-order asymptotics follows by applying the Berry-Esseen theorem to the derived non-asymptotic converse bound, and the achievability part relies on a type-covering lemma tailored to the Kaspi problem, proper uses of Taylor expansions, and the properties of the distortion-tilted information density. Since the Kaspi problem generalizes both the lossy source coding problem and the lossy source coding problem with encoder and decoder side information, our second-order asymptotics for the Kaspi problem generalize the results of [13] and [40] for DMSes. We illustrate this point via numerical examples.

We then extend the methods used in the Kaspi problem to the Fu-Yeung problem. First, we present the optimal test channels for the minimum sum-rate function. Subsequently,...