2015
DOI: 10.1109/tit.2015.2401004

Extrinsic Jensen–Shannon Divergence: Applications to Variable-Length Coding

Abstract: This paper considers the problem of variable-length coding over a discrete memoryless channel (DMC) with noiseless feedback. The paper provides a stochastic control view of the problem whose solution is analyzed via a newly proposed symmetrized divergence, termed extrinsic Jensen–Shannon (EJS) divergence. It is shown that strictly positive lower bounds on EJS divergence provide non-asymptotic upper bounds on the expected code length. The paper presents strictly positive lower bounds on EJS divergence, …
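
The abstract names the EJS divergence but does not reproduce its definition. As a reading aid, here is a minimal Python sketch of the quantity as it is usually defined for a belief vector over M candidate messages and their conditional channel-output distributions; the function names and the binary-symmetric-channel example are illustrative, not taken from the paper.

```python
import numpy as np

def kl_divergence(p, q):
    """Kullback-Leibler divergence D(p || q) in nats; 0*log(0/q) is taken as 0."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    support = p > 0
    return float(np.sum(p[support] * np.log(p[support] / q[support])))

def ejs_divergence(rho, P):
    """Extrinsic Jensen-Shannon divergence of the conditional output
    distributions P[i] (one row per candidate message) under belief rho:

        EJS(rho; P) = sum_i rho_i * D( P_i || sum_{j != i} rho_j / (1 - rho_i) * P_j )

    Degenerate beliefs (rho_i in {0, 1}) are skipped in this sketch.
    """
    rho = np.asarray(rho, dtype=float)
    P = np.asarray(P, dtype=float)
    mixture = (rho[:, None] * P).sum(axis=0)          # full mixture: sum_j rho_j P_j
    total = 0.0
    for i, w in enumerate(rho):
        if w <= 0.0 or w >= 1.0:
            continue
        extrinsic = (mixture - w * P[i]) / (1.0 - w)  # mixture of the *other* conditionals
        total += w * kl_divergence(P[i], extrinsic)
    return total

# Toy example: two equally likely messages observed through a binary
# symmetric channel with crossover probability 0.1.
P = np.array([[0.9, 0.1],
              [0.1, 0.9]])
rho = np.array([0.5, 0.5])
print(ejs_divergence(rho, P))   # ~1.76 nats
```

With only two messages and a uniform belief, each extrinsic mixture reduces to the opposing conditional, so the value is simply the average of the two directed KL divergences.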

Cited by 60 publications (112 citation statements)
References: 25 publications

“…have been shown to be useful for analyzing adaptive systems [2], [29], [30], [31] with the Bayes' rule dynamics on the posterior distribution. Particularly, the functional average log-likelihood [30] has shown its usefulness in analyzing the behaviour of the posterior in feedback coding systems [28], dynamic spectrum sensing [10], hypothesis testing [32], active learning [32], etc. Here, we review some useful concepts through the context of AoA estimation with sequential beamforming:…”
Section: Appendix B – Average Log-Likelihood and the Extrinsic Jensen–Shannon Divergence
confidence: 99%
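
The excerpt above appeals to the Bayes'-rule dynamics of the posterior in these adaptive systems. For concreteness, the sketch below shows a single posterior update for a finite observation alphabet; the variable names and the toy observation model are illustrative and not taken from the cited works.

```python
import numpy as np

def bayes_update(rho, P, y):
    """One step of the Bayes'-rule posterior dynamics:

        rho_{t+1}(i)  is proportional to  rho_t(i) * P[i, y_t],

    where P[i, y] is the probability that candidate i produces output y.
    """
    rho = np.asarray(rho, dtype=float)
    likelihood = np.asarray(P, dtype=float)[:, y]
    posterior = rho * likelihood
    return posterior / posterior.sum()

# Uniform belief over two candidates, BSC-like observation model,
# output symbol 0 observed: the belief shifts toward candidate 0.
P = np.array([[0.9, 0.1],
              [0.1, 0.9]])
print(bayes_update([0.5, 0.5], P, y=0))   # -> [0.9, 0.1]
```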
“…Fact 2 (Lemma 2 in [30]). The EJS divergence is lower bounded by the Jensen–Shannon (JS) divergence:…”
Section: Appendix B – Average Log-Likelihood and the Extrinsic Jensen–Shannon Divergence
confidence: 99%
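
For readers without [30] at hand, the bound recalled in this excerpt can be written out explicitly. The LaTeX below restates the weighted Jensen–Shannon and extrinsic Jensen–Shannon divergences under what I take to be the standard definitions used in this line of work, together with the quoted inequality; it is a paraphrase, not the verbatim statement of the cited Lemma 2.

```latex
% Belief vector \rho = (\rho_1,\dots,\rho_M) and conditional output
% distributions P_1,\dots,P_M; D(\cdot\,\|\,\cdot) is the KL divergence.
\begin{align}
  JS(\rho; P_1,\dots,P_M)
    &= \sum_{i=1}^{M} \rho_i \, D\!\left( P_i \,\middle\|\, \sum_{j=1}^{M} \rho_j P_j \right), \\
  EJS(\rho; P_1,\dots,P_M)
    &= \sum_{i=1}^{M} \rho_i \, D\!\left( P_i \,\middle\|\, \sum_{j \neq i} \frac{\rho_j}{1-\rho_i}\, P_j \right), \\
  EJS(\rho; P_1,\dots,P_M) &\ge JS(\rho; P_1,\dots,P_M).
\end{align}
```

One way to see the inequality: the full mixture \(\sum_j \rho_j P_j\) is a convex combination of \(P_i\) (weight \(\rho_i\)) and the extrinsic mixture (weight \(1-\rho_i\)), so convexity of the KL divergence in its second argument gives \(D(P_i \,\|\, \sum_j \rho_j P_j) \le (1-\rho_i)\, D(P_i \,\|\, \text{extrinsic mixture})\); summing with weights \(\rho_i\) yields \(JS \le EJS\).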
“…(18). However, we provide an alternative proof that holds for all T ∈ R, which uses a different construction of a submartingale (compared to [26], [27]). We also show rigorously that τ is almost surely finite in Lemma 13, which is essential for the proof of Lemma 1 and of Lemmas 2 and 3 to follow.…”
Section: Achievability Proof
confidence: 99%
“…This is a form of active feedback in the sense that the feedback is telling the transmitter what to transmit, rather than telling the transmitter only whether additional bits from a pre-determined rate-compatible code family are needed. This is a generalization of the ideas of active hypothesis testing [10]. For comparison, we also performed simulations using non-active feedback, in which the additional bits are selected at random.…”
Section: A. Creating a Bit For Incremental Transmission
confidence: 99%