2017
DOI: 10.3390/e19070328

On Extractable Shared Information

Abstract: We consider the problem of quantifying the information shared by a pair of random variables X_1, X_2 about another variable S. We propose a new measure of shared information, called extractable shared information, that is left monotonic; that is, the information shared about S is bounded from below by the information shared about f(S) for any function f. We show that our measure leads to a new nonnegative decomposition of the mutual information I(S; X_1 X_2) into shared, complementary and unique c…
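A minimal sketch of the quantities named in the abstract, assuming the usual bivariate notation SI, UI, CI for the shared, unique, and complementary terms (these symbols are not given in the excerpt above):

\[
I(S; X_1 X_2) = SI(S; X_1, X_2) + UI(S; X_1 \setminus X_2) + UI(S; X_2 \setminus X_1) + CI(S; X_1, X_2)
\]

\[
\text{left monotonicity:} \quad SI(S; X_1, X_2) \ge SI(f(S); X_1, X_2) \quad \text{for every function } f \text{ of } S.
\]

Nonnegativity of the decomposition means each of the four terms is required to be nonnegative.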

Cited by 27 publications (33 citation statements)
References 13 publications
“…We hope that the examples we present here are useful in developing intuition on how information can be shared among random variables and how it behaves when applying a deterministic function, such as a coarse-graining. Further implications of our examples on information decompositions are discussed in [8]. In the converse direction, information decomposition measures (such as measures of unique information) can be used to study the Blackwell order and deviations from the Blackwell order.…”
Section: Introduction
Mentioning confidence: 92%
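As background for the Blackwell order mentioned in this statement, a commonly used formulation (stated here as an assumption about the setting, not quoted from the cited work): X_1 dominates X_2 with respect to S when the channel from S to X_2 can be obtained by garbling the channel from S to X_1,

\[
X_1 \succeq_S X_2 \quad :\Longleftrightarrow \quad \exists\, \lambda \text{ stochastic}: \; P(X_2 = y \mid S = s) = \sum_{x} \lambda(y \mid x)\, P(X_1 = x \mid S = s) \ \text{ for all } s, y.
\]

Deviations from this order are what the quoted passage proposes to study with information decomposition measures.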
“…This latter approach is considered in detail by the contribution of Rauh et al. [50] in this special issue.…”
Section: Measured Information Modification Versus the Capacity of a M…
Mentioning confidence: 99%
“…Further, the specific redundancy measure proposed in Ref. [10] has been questioned because it can lead to unintuitive results [22], and thus many attempts have been devoted to finding alternative measures [22,25–30] compatible with an extended set of axioms, such as the identity axiom proposed in [22]. Other work has studied in more detail the lattice structure that underpins the PID, indicating the duality between information gain and information loss lattices [12].…”
Section: Preliminaries and State of the Art
Mentioning confidence: 99%
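For context on the axioms mentioned in this statement, a hedged sketch of the identity property usually associated with [22], written with an assumed symbol SI for the redundancy (shared information) measure:

\[
SI\big((X_1, X_2); X_1, X_2\big) = I(X_1; X_2),
\]

i.e., when the target is the joint variable (X_1, X_2), the shared information should equal the mutual information between the two sources. This is one example of the extended set of axioms the passage refers to.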