We study error exponents for source coding with side information. Both achievable exponents and converse bounds are obtained for the following two cases: lossless source coding with coded side information (SCCSI) and lossy source coding with full side information (Wyner-Ziv). These results recover and extend several existing results on source-coding error exponents and are tight in some circumstances. Our bounds have a natural interpretation as a two-player game between nature and the code designer, with nature seeking to minimize the exponent and the code designer seeking to maximize it. In the Wyner-Ziv problem, our analysis exposes a tension in the choice of test channel, with the optimal choice balancing two competing error events. The Gaussian and binary-erasure cases are examined in detail.
I. INTRODUCTION

In a typical lossy data compression problem, a source is to be compressed by an encoder at a prescribed rate so that a decoder may reproduce the source to within some desired fidelity (distortion). Sometimes, in addition to the data to be compressed, there is correlated information that can be utilized by a second encoder, which is able to send a separate message to the decoder. We refer to this kind of problem as source coding with side information (SCSI). The setup is depicted in Fig. 1, where a source X is compressed by encoder one at rate R1, and the decoder has access to encoded side information Y, compressed at rate R2 by encoder two, as well as to the compressed version of X from the first encoder.

The SCSI scenario arises in a variety of applications. For example, in video applications [1], X can represent a current frame and Y a separate correlated frame sent from a second encoder. Y can even represent the frame(s) preceding the current frame X in the stream: while the previous frames are certainly available to the encoder, the encoder's coding scheme can be simplified by not making use of this information, leaving the decoder to exploit the interframe dependence. A second example can be found in communication in networks with relays [2]. A source sends a message X to a sink in a network containing a relay. One mode of operation for the relay is "compress and forward", i.e., the relay sends the sink a compressed version of its observation, Y, of the source-sink message. This compressed message can be used by the sink to further aid its decoding. SCSI appears in applications even beyond communication; for example, with minor changes it has been proposed as a model for rate-constrained pattern recognition [3].

For the lossless problem with coded side information (SCCSI) and the lossy problem with full side information (Wyner-Ziv), the "rate region" problem, i.e.,
determining the rates required to meet a given average distortion constraint, is solved. In this paper, we study these two problems from an error-exponent standpoint. Our motivation for doing so is threefold: