1982
DOI: 10.1109/proc.1982.12425
On the rationale of maximum-entropy methods

Cited by 1,421 publications
(829 citation statements)
References 8 publications
“…Leveraging on a celebrated information-theoretic rationale by Jaynes [43], the Shannon entropy of an apparent user profile, modeled as a PMF, may be regarded as a measure of privacy, or more accurately, anonymity. The leading idea is that the method of types [44] from information theory establishes an approximate monotonic relationship between the likelihood of a PMF in a stochastic system and its entropy.…”
Section: Privacy Metric Of Online Activity
confidence: 99%
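The excerpt above treats the Shannon entropy of an apparent user profile, modeled as a PMF, as a measure of anonymity: the closer the profile is to uniform, the higher its entropy and the less it singles a user out. A minimal sketch of that quantity, using hypothetical example profiles and only the standard library:

```python
import math

def shannon_entropy(pmf):
    """Shannon entropy (in bits) of a probability mass function."""
    return -sum(p * math.log2(p) for p in pmf if p > 0)

# Hypothetical apparent profiles over 4 activity categories.
uniform = [0.25, 0.25, 0.25, 0.25]   # maximally anonymous profile
skewed = [0.85, 0.05, 0.05, 0.05]    # activity concentrated in one category

print(shannon_entropy(uniform))  # 2.0 bits, the maximum for 4 categories
print(shannon_entropy(skewed))   # strictly less than 2.0 bits
```

Under this metric, the skewed profile scores lower because its activity pattern is more distinctive, which is exactly the intuition the cited privacy measure builds on.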
“…Thus, we need to concentrate also on the cooperation of the factor order. [15]

C. Measurement of Self-organization Level

If the self-organization level of a system considers more than two levels, then the extended entropies of higher levels will be recursively determined by the extended entropy of lower levels. However, these values will most probably be very different on each level.…”
Section: Extended Entropy Based On Order
confidence: 99%
“…It has a value of zero for two identical distributions and it increases with the increasing dissimilarity between two distributions [48,49]. Cross entropy is asymmetric: CE(P₁, P₂) ≠ CE(P₂, P₁).…”
Section: Cross Entropy Of Accent Models
confidence: 99%
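The properties the excerpt lists (zero for identical distributions, growing with dissimilarity, asymmetric) match the KL divergence D(P₁ ‖ P₂), which some authors call cross entropy. A small sketch of those three properties, with hypothetical example distributions:

```python
import math

def cross_entropy(p, q):
    """KL divergence D(p || q) in bits, the "cross entropy" with the
    properties quoted above. Assumes q_i > 0 wherever p_i > 0."""
    return sum(pi * math.log2(pi / qi) for pi, qi in zip(p, q) if pi > 0)

p1 = [0.7, 0.2, 0.1]
p2 = [0.5, 0.4, 0.1]

print(cross_entropy(p1, p1))  # 0.0: identical distributions
print(cross_entropy(p1, p2))  # positive: distributions differ
print(cross_entropy(p2, p1))  # a different positive value: CE is asymmetric
```

The asymmetry matters in practice: which distribution plays the role of reference model changes the score, so CE(P₁, P₂) and CE(P₂, P₁) must not be used interchangeably.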