2007
DOI: 10.1109/tit.2007.909092
Joint Source–Channel Coding Error Exponent for Discrete Communication Systems With Markovian Memory

Cited by 18 publications (19 citation statements)
References 17 publications
“…(107) Remark 6. Many existing papers [7], [11], [8] discussed the separation scheme, and they focused on the value P_s(e_{s,k,A}, d_{s,A,k}) * P_c(e_{c,A,n}, d_{c,n,A}). However, they did not give a rigorous derivation of this value.…”
Section: A Formulation For Separation Coding
Mentioning confidence: 99%
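For context on how the two error terms of a separation scheme are usually combined, here is a minimal sketch of the standard argument, assuming the source and the channel noise are statistically independent. P_s and P_c are generic stand-ins for the source-coding and channel-coding error probabilities of the two stages; the exact quantity P_s(e_{s,k,A}, d_{s,A,k}) * P_c(e_{c,A,n}, d_{c,n,A}) analyzed in the quoted work may be defined differently.

```latex
% Hedged sketch of the usual separation-scheme bounds (not the derivation
% from the quoted paper), assuming independent source and channel noise.
\begin{align*}
  \Pr[\text{end-to-end error}]
    &\le P_s + P_c
    && \text{(union bound over the two stages)} \\
  \Pr[\text{end-to-end success}]
    &\ge (1 - P_s)(1 - P_c)
    && \text{(the two error events are independent)}
\end{align*}
```

The product form on the second line is one reason analyses of separation schemes naturally study a combination of a source-coding term and a channel-coding term.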
“…To resolve this problem, we often consider channel coding with a message subject to a non-uniform distribution. Such a problem is called joint source-channel coding and has been actively studied by several researchers [12], [10], [11], [6], [9], [8]. As a simple case, we often assume that the message is independent and identically distributed.…”
Section: Introduction
Mentioning confidence: 99%
“…Since related work concerning the finite-length analysis is reviewed in Section 1.1, we only review work related to the asymptotic analysis here. Some studies on Markov chains in the large deviation regime have been reported [40,41,42]. The derivation in [40] used the Markov-type method.…”
Section: Introduction
Mentioning confidence: 99%
“…A drawback of this method is that it involves a term that stems from the number of types, which does not affect the asymptotic analysis but does hurt the finite-length analysis. Our achievability is derived by following an approach similar to that in [41,42], i.e., via the Perron–Frobenius theorem, but our derivation separates the single-shot part from the evaluation of the Rényi entropy and is thus more transparent. Furthermore, the converse part of [41,42] is based on the Shannon–McMillan–Breiman limiting theorem and does not yield finite-length bounds.…”
Section: Introduction
Mentioning confidence: 99%
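The Perron–Frobenius route mentioned above rests on a known closed form: for an irreducible finite-state Markov source with transition matrix P, the Rényi entropy rate of order α ≠ 1 equals (1/(1-α)) log λ_α, where λ_α is the largest (Perron–Frobenius) eigenvalue of the matrix obtained by raising P entrywise to the power α. The sketch below is a minimal numerical illustration of that formula, not code from any of the cited papers.

```python
import numpy as np

def renyi_entropy_rate(P, alpha):
    """Renyi entropy rate (order alpha != 1, in nats) of a stationary,
    irreducible finite-state Markov chain with transition matrix P,
    via the Perron-Frobenius eigenvalue of the entrywise alpha-power of P."""
    assert alpha > 0 and not np.isclose(alpha, 1.0)
    P_alpha = np.power(P, alpha)                 # entrywise tilting P(j|i)^alpha
    lam = max(np.linalg.eigvals(P_alpha).real)   # Perron-Frobenius eigenvalue
    return np.log(lam) / (1.0 - alpha)

# Toy two-state chain; as alpha -> 1 the value approaches the chain's
# Shannon entropy rate, which is one way to sanity-check the formula.
P = np.array([[0.9, 0.1],
              [0.2, 0.8]])
for a in (0.5, 0.99, 2.0):
    print(a, renyi_entropy_rate(P, a))
```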