2014 Information Theory and Applications Workshop (ITA) 2014
DOI: 10.1109/ita.2014.6804255
Non-asymptotic and asymptotic analyses on Markov chains in several problems

Abstract: In this paper, we derive non-asymptotic achievability and converse bounds on source coding with side-information and random number generation with side-information. Our bounds are efficiently computable in the sense that the computational complexity does not depend on the block length. We also characterize the asymptotic behaviors of the large deviation regime and the moderate deviation regime by using our bounds, which implies that our bounds are asymptotically tight in those regimes. We also…

Cited by 14 publications (32 citation statements); references 28 publications.
“…Hence, in order to treat the relative entropy D(W‖V) and the relative Rényi entropy D_{1+s}(W‖V) in a unified way, we adopt the definition (3.2) for the relative entropy D(W‖V) instead of the final term of (5.5). Our definition clarifies the relation between the relative entropy D(W‖V) and the relative Rényi entropy D_{1+s}(W‖V), which is helpful when we apply these quantities to simple hypothesis testing [17], random number generation, data compression, and channel coding [18] in Markov chains.…”
Section: Results (mentioning; confidence: 99%)
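The quantities in the statement above can be illustrated numerically. A minimal sketch, assuming the standard Perron-eigenvalue characterization of the relative Rényi entropy between two transition matrices — D_{1+s}(W‖V) = (1/s) log λ_s, where λ_s is the Perron (largest) eigenvalue of the matrix with entries W(x'|x)^{1+s} V(x'|x)^{-s}. The example matrices and the function name are illustrative assumptions, not taken from the paper:

```python
import numpy as np

def renyi_divergence_rate(W, V, s):
    """D_{1+s}(W||V) for transition matrices, assuming the
    Perron-eigenvalue definition: (1/s) * log of the largest
    eigenvalue of the entrywise-tilted matrix W^{1+s} * V^{-s}."""
    tilted = W ** (1 + s) * V ** (-s)          # entrywise tilting
    lam = np.max(np.abs(np.linalg.eigvals(tilted)))  # Perron eigenvalue
    return np.log(lam) / s

# Two irreducible 2-state transition matrices (rows sum to 1);
# chosen only for illustration.
W = np.array([[0.9, 0.1],
              [0.2, 0.8]])
V = np.array([[0.7, 0.3],
              [0.4, 0.6]])

print(renyi_divergence_rate(W, V, 0.5))
```

When W = V the tilted matrix reduces to W itself, whose Perron eigenvalue is 1 (rows sum to 1), so the divergence is 0, as expected; as s → 0 the quantity approaches the relative entropy rate D(W‖V), which is the unification the cited passage refers to.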
“…However, they did not clearly consider the relation with the other conditions in Lemma 4.1. In fact, these equivalence relations are essential for the condition of a generator of an exponential family and also for applications to finite-length evaluations of the tail probability, the error probability in simple hypothesis testing [17], source coding, channel coding, and random number generation [18] in Markov chains. Now, we proceed to the definition of an exponential family for transition matrices.…”
Section: Results (mentioning; confidence: 99%)