2021
DOI: 10.1007/978-3-030-80209-7_50
A Primer on Alpha-Information Theory with Application to Leakage in Secrecy Systems

Abstract: We give an informative review of the notions of Rényi's α-entropy and α-divergence, Arimoto's conditional α-entropy, and Sibson's α-information, with emphasis on the various relations between them. All these generalize Shannon's classical information measures corresponding to α = 1. We present results on data processing inequalities and provide some new generalizations of the classical Fano's inequality for any α > 0. This enables one to use α-information as an information-theoretic metric of leakage in secrecy systems…
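As a quick illustration of the α = 1 limit mentioned in the abstract, here is a minimal Python sketch of Rényi's α-entropy (not code from the paper; the function name and example distribution are ours):

```python
import math

def renyi_entropy(p, alpha):
    """Renyi alpha-entropy of a distribution p, in nats.
    H_alpha(p) = log(sum_i p_i^alpha) / (1 - alpha); the limit alpha -> 1
    recovers Shannon entropy H(p) = -sum_i p_i * log(p_i)."""
    if alpha == 1:
        return -sum(pi * math.log(pi) for pi in p if pi > 0)
    return math.log(sum(pi ** alpha for pi in p if pi > 0)) / (1 - alpha)

p = [0.5, 0.25, 0.25]
shannon = renyi_entropy(p, 1)        # Shannon entropy, ~1.0397 nats
near_one = renyi_entropy(p, 1.0001)  # continuous in alpha at alpha = 1
h_coll = renyi_entropy(p, 2)         # collision entropy, -log(sum p_i^2)
h_max = renyi_entropy(p, 1e-9)       # ~ log |support| = log 3
# H_alpha is non-increasing in alpha, so h_max >= shannon >= h_coll.
```

The monotone ordering in α is one of the relations the review emphasizes; the α → 1 continuity shows why Shannon's measures sit inside the α-family as a special case.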

Cited by 8 publications (1 citation statement). References 13 publications (8 reference statements).
“…(additional knowledge reduces randomness) and then noting that p(x|y, z) = p(x|y) by the Markov property; see, e.g., [7,18] for H_α and [17] for G. Conversely, (30)–(33) can be re-obtained from (34)–(37) as the particular case Z = 0 (any deterministic variable representing zero information).…”
Section: F-Concavity: Knowledge Reduces Randomness and Data Processing (mentioning, confidence: 99%)
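The quoted passage invokes Arimoto's conditional α-entropy and the principle that additional knowledge reduces randomness, i.e. H_α(X|Y) ≤ H_α(X). A minimal numerical sketch of that inequality (ours, not from the paper; Arimoto's definition in the standard closed form is assumed):

```python
import math

def renyi_entropy(p, alpha):
    """Renyi alpha-entropy in nats (alpha != 1)."""
    return math.log(sum(pi ** alpha for pi in p if pi > 0)) / (1 - alpha)

def arimoto_cond_entropy(pxy, alpha):
    """Arimoto conditional alpha-entropy in nats (alpha != 1):
    H_alpha(X|Y) = alpha/(1-alpha) * log( sum_y ( sum_x p(x,y)^alpha )^(1/alpha) ),
    where pxy[y][x] is the joint distribution."""
    inner = sum(
        sum(p ** alpha for p in row if p > 0) ** (1.0 / alpha)
        for row in pxy
    )
    return (alpha / (1 - alpha)) * math.log(inner)

# Correlated joint distribution: rows indexed by y, columns by x.
pxy = [[0.4, 0.1],
       [0.1, 0.4]]
px = [sum(row[x] for row in pxy) for x in range(2)]  # marginal of X = [0.5, 0.5]

alpha = 2.0
h2_x = renyi_entropy(px, alpha)                # = log 2, X alone is uniform
h2_x_given_y = arimoto_cond_entropy(pxy, alpha)
# "Knowledge reduces randomness": H_alpha(X|Y) <= H_alpha(X).
```

Conditioning on a second variable Z only tightens this further, and taking Z deterministic (the "Z = 0" case in the quote) collapses H_α(X|Y, Z) back to H_α(X|Y).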