1928
DOI: 10.1002/j.1538-7305.1928.tb01236.x
Transmission of Information

Abstract: Synopsis: A quantitative measure of “information” is developed which is based on physical as contrasted with psychological considerations. How the rate of transmission of this information over a system is limited by the distortion resulting from storage of energy is discussed from the transient viewpoint. The relation between the transient and steady state viewpoints is reviewed. It is shown that when the storage of energy is used to restrict the steady state transmission to a limited range of frequencies the…

Cited by 1,549 publications
(704 citation statements)
References 0 publications
“…Historically, Shannon entropy [2] is the measure of information theoretic cryptography. On the other hand, it is also important to evaluate the cardinality of a set in which a random variable takes values, i.e., Hartley entropy [3]. Furthermore, min-entropy [4] is also considered to be an important quantity in guessing the secret in the context of cryptography.…”
Section: Motivation and Related Work
confidence: 99%
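The three quantities named in the statement above can be sketched numerically. This is a minimal illustration of the standard definitions, not code from the cited papers; the function names are ours:

```python
import math

def shannon_entropy(p):
    """Shannon entropy H(X) = -sum_x p(x) log2 p(x), in bits."""
    return -sum(px * math.log2(px) for px in p if px > 0)

def hartley_entropy(p):
    """Hartley entropy H0(X) = log2 |support(X)|: depends only on
    which values are possible, not on how likely they are."""
    return math.log2(sum(1 for px in p if px > 0))

def min_entropy(p):
    """Min-entropy Hinf(X) = -log2 max_x p(x): the worst-case
    guessing measure used in cryptography."""
    return -math.log2(max(p))

# For any distribution the measures order as Hinf <= H <= H0.
p = [0.5, 0.25, 0.125, 0.125]
print(hartley_entropy(p))  # 2.0
print(shannon_entropy(p))  # 1.75
print(min_entropy(p))      # 1.0
```

For a uniform distribution all three coincide; the gap between them widens as the distribution becomes more biased, which is why they rank secrets differently.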
“…Furthermore, Dodis [6] recently showed that a similar property also holds with respect to min-entropy. Namely, he showed the bound on secret keys, R∞(K) ≥ R∞(M), for symmetric-key encryption with perfect secrecy. Also, Alimomeni and Safavi-Naini [8] introduced the guessing secrecy, formalized by R∞(M) = R∞(M|C), under which they derived the bound R∞(K) ≥ R∞(M), where R∞(·) and R∞(·|·) are the min-entropy and the conditional min-entropy, respectively.…”
Section: Motivation and Related Work
confidence: 99%
“…The measure of nonspecificity can be derived from Hartley information [25], in contrast to some evaluation measures for learning probabilistic networks, which are based on Shannon information [45]. In order to arrive at an efficient algorithm, an approximation for this loss of specificity is derived, which can be computed locally on the maximal cliques of the network.…”
Section: Learning Possibilistic Network From Data
confidence: 99%
“…and can be justified as a generalization of Hartley information [10] to the possibilistic setting [13]. nsp(π) reflects the expected amount of information (measured in bits) that has to be added in order to identify the actual value within the set [π]α of alternatives, assuming a uniform distribution on the set [0, sup(π)] of possibilistic confidence levels α [9].…”
Section: Measures For Learning Possibilistic Network
confidence: 99%
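For a discrete possibility distribution the integral over confidence levels α collapses to a finite sum, since the α-cut [π]α only changes size at the values taken by π. A minimal sketch of this generalized Hartley measure under that reading; this is the standard U-uncertainty formula, not necessarily the cited papers' exact computation:

```python
import math

def nonspecificity(pi):
    """Generalized Hartley measure of a discrete possibility distribution:
    with the degrees sorted descending as pi_1 >= ... >= pi_n and pi_{n+1} = 0,
    nsp(pi) = sum_i (pi_i - pi_{i+1}) * log2(i),
    since the alpha-cut contains exactly i elements for alpha in (pi_{i+1}, pi_i]."""
    vals = sorted(pi, reverse=True) + [0.0]
    return sum((vals[i] - vals[i + 1]) * math.log2(i + 1) for i in range(len(pi)))

# A crisp set of 4 fully possible alternatives recovers Hartley information log2(4).
print(nonspecificity([1.0, 1.0, 1.0, 1.0]))  # 2.0
```

When exactly one alternative is fully possible and the rest are impossible, the measure is 0 bits: nothing needs to be added to identify the actual value.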