1956
DOI: 10.1109/tit.1956.1056823
On the Shannon theory of information transmission in the case of continuous signals

Cited by 348 publications (253 citation statements)
References 0 publications
“…Their mutual information, $M(x,y) = \iint p(x,y)\,\log\!\frac{p(x,y)}{p(x)\,p(y)}\,dx\,dy$, can be defined as the upper bound of the mutual information between all discretizations of x and y (Kolmogorov, 1956). Behind this definition lies the crucial fact that when refining the partitions of the sample space used to discretize x and y, the discrete mutual information must increase.…”
Section: B2 A New Information Measure for Continuous Random Variables
confidence: 99%
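The refinement property described in this excerpt is easy to check numerically. Below is a minimal sketch in Python, assuming NumPy is available; the correlated-Gaussian test pair, the bin counts, and the helper `discrete_mi` are illustrative choices, not taken from the cited paper. As the partition is refined, the plug-in discrete mutual information climbs toward the continuous value, which for this pair has the closed form $-\tfrac12\log(1-\rho^2)$.

```python
import numpy as np

def discrete_mi(x, y, bins):
    """Plug-in estimate of I(X;Y) in nats for one partition (histogram) level."""
    pxy, _, _ = np.histogram2d(x, y, bins=bins)
    pxy /= pxy.sum()
    px = pxy.sum(axis=1, keepdims=True)   # marginal of X
    py = pxy.sum(axis=0, keepdims=True)   # marginal of Y
    nz = pxy > 0                          # skip empty cells (0 log 0 = 0)
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

rng = np.random.default_rng(0)
rho, n = 0.8, 200_000
x = rng.standard_normal(n)
y = rho * x + np.sqrt(1 - rho**2) * rng.standard_normal(n)

# Refining the partition can only increase the (population) discrete MI;
# its supremum over all partitions is the continuous mutual information,
# here -0.5 * log(1 - rho^2) ≈ 0.511 nats.
for bins in (2, 4, 8, 16, 32):
    print(f"{bins:>2} bins per axis: {discrete_mi(x, y, bins):.3f} nats")
print(f"closed form:       {-0.5 * np.log(1 - rho**2):.3f} nats")
```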
“…Particularly in communications systems, the Fourier series and Fourier transform [13] are tools to generate or recover signals from cardinal series. The aliasing phenomenon in the Shannon sampling reconstruction procedure has been discussed in numerous papers [14]–[16]. The presence of aliasing in the generation of signals is often associated with truncated cardinal series [14].…”
Section: Aliasing
confidence: 99%
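The aliasing effect this excerpt refers to can be reproduced in a few lines. A minimal sketch, assuming NumPy; the sampling rate and tone frequencies are arbitrary illustrative values. A 9 Hz tone sampled at 10 Hz agrees, sample by sample, with a 1 Hz tone, so a truncated cardinal (sinc) series built from those samples reconstructs the alias rather than the original.

```python
import numpy as np

fs = 10.0                              # sampling rate (Hz), illustrative
t_samp = np.arange(0.0, 2.0, 1 / fs)   # 20 samples over 2 s
f_true, f_alias = 9.0, 1.0             # 9 Hz folds to |fs - 9| = 1 Hz

# The two tones agree at every sampling instant: that is aliasing.
s_true = np.cos(2 * np.pi * f_true * t_samp)
s_alias = np.cos(2 * np.pi * f_alias * t_samp)
assert np.allclose(s_true, s_alias)

def cardinal_series(samples, t_samp, t, fs):
    """Truncated Shannon cardinal (sinc) series from finitely many samples."""
    return sum(s * np.sinc(fs * (t - tk)) for s, tk in zip(samples, t_samp))

t = np.linspace(0.0, 2.0, 1000)
x_rec = cardinal_series(s_true, t_samp, t, fs)
# Away from the truncation edges, the reconstruction follows the 1 Hz alias,
# not the 9 Hz tone that was actually sampled.
mid = (t > 0.5) & (t < 1.5)
print(np.max(np.abs(x_rec[mid] - np.cos(2 * np.pi * f_alias * t[mid]))))
```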
“…where the right side of (4) is the directed information between two sequences of length $n$ defined in (1); and in (5), $t_0 = 0$ by convention, and the mutual information terms between two continuous-time processes, conditioned on a third, are well-defined objects, as developed in [16], [17]. The mutual information of two random variables $U, V$ with continuous alphabets, i.e., $I(U;V)$, is defined as…”
Section: Definition of Directed Information
confidence: 99%
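The quote above is truncated mid-definition; for context, the standard definition it appears to lead into (due to Kolmogorov, and the one used throughout this literature) takes the supremum of the discrete mutual information over all finite quantizations of the two alphabets. A sketch in LaTeX, with the partitions $P = \{P_i\}$ and $Q = \{Q_j\}$ assumed measurable; the exact wording in the citing paper is not shown here.

```latex
% Mutual information for continuous alphabets, via finite quantizations.
% [U]_P denotes the quantization of U by the finite partition P = {P_i}.
\[
  I(U;V)
  = \sup_{P,\,Q} I\big([U]_P ; [V]_Q\big)
  = \sup_{P,\,Q} \sum_{i,j}
      \Pr(U \in P_i,\ V \in Q_j)\,
      \log \frac{\Pr(U \in P_i,\ V \in Q_j)}
                {\Pr(U \in P_i)\,\Pr(V \in Q_j)}
\]
```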
“…$R_e \to 0$ as $P_e^{(T)} \to 0$, the equality in (32) follows from the fact that $X_{t_{i-1}+\Delta}$ is a deterministic function of $M$ and $Y_0^{t_{i-1}}$, the equality in (33) follows from the assumption that $t_i - t_{i-1} < \Delta$, the equality in (35) follows from (14), and the equality in (36) follows from (16). Hence, we obtained that for every $t$…”
Section: T t=0
confidence: 99%