2015
DOI: 10.3390/e17074644

Quantifying Redundant Information in Predicting a Target Random Variable

Abstract: We consider the problem of defining a measure of redundant information that quantifies how much common information two or more random variables specify about a target random variable. We discuss desired properties of such a measure, and propose new measures possessing some of these properties.

Cited by 26 publications (38 citation statements)
References 17 publications
“…Griffith et al. [13] suggest an alternative measure motivated by zero-error information, which again formulates an optimisation problem (here, maximisation of mutual information) over a family of distributions (here, distributions Q that are a function of each predictor, so that H(Q|X_i) = 0). Griffith and Ho [16] extend this approach by modifying the optimisation constraint to be H(Q|X_i) = H(Q|X_i, Y).…”
Section: Other Redundancy Measures
confidence: 99%
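To make the two quoted constraints concrete, here is a sketch of both optimisation problems in LaTeX, following the quote's own notation (the I_cap labels are my own; X_1,…,X_n are the predictors, Y the target, and Q ranges over random variables jointly distributed with them):

```latex
% Zero-error-information redundancy (Griffith et al. [13]):
% Q must be a deterministic function of every predictor.
I_{\cap}^{\mathrm{zero}} = \max_{Q} \; I(Q;Y)
  \quad \text{subject to } H(Q \mid X_i) = 0 \;\; \forall i

% Griffith and Ho [16] relax the constraint: Q may depend on each
% X_i stochastically, but must carry no information about Y beyond
% X_i, since H(Q|X_i) = H(Q|X_i,Y) is equivalent to I(Q;Y|X_i) = 0.
I_{\cap}^{\mathrm{GH}} = \max_{Q} \; I(Q;Y)
  \quad \text{subject to } H(Q \mid X_i) = H(Q \mid X_i, Y) \;\; \forall i
```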
“…It can therefore overstate the amount of redundancy in a particular set of variables. Several studies have noted this point and suggested alternative approaches [10–16].…”
Section: Introduction
confidence: 99%
“…To do that, we need to specify one of the four variables in the system, typically by providing an expression to calculate either R or S. There are a number of proposals in the literature [28–31], but at the time of writing there is no consensus on any one candidate.…”
Section: Non-negative Decomposition of Multivariate Information
confidence: 99%
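For context, the "four variables" are the redundant (R), synergistic (S), and two unique (U_1, U_2) components of the two-source decomposition. A short LaTeX sketch of the standard consistency equations (notation follows the quotes, not any one paper) shows why a single additional expression suffices:

```latex
% Two-source partial information decomposition:
I(X_1, X_2; Y) = R + U_1 + U_2 + S   % joint information
I(X_1; Y)      = R + U_1             % source 1 alone
I(X_2; Y)      = R + U_2             % source 2 alone
% Three linear equations in four unknowns: the left-hand mutual
% informations are computable from the joint distribution, so
% defining any one of R, S, U_1, U_2 determines the other three.
```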
“…From the equations, it is evident that an additional formula for any single S, U, or R component is required for the computation of all others, since the mutual information values are directly computable from the pdfs. To this end, various methods of computing U [Bertschinger et al.], S [Griffith and Koch; Olbrich et al.], and R [Williams and Beer; Harder et al.; Griffith and Ho] components have been proposed, although there is no universal agreement on the appropriate method. The most common approach is to first compute R, and several proposed measures have been shown to reduce to simply the minimum shared information between the target X_tar and either source, as follows [Barrett]:

R_MMI = min[ I(X_s1; X_tar), I(X_s2; X_tar) ]

where MMI denotes “minimum mutual information.” In a cardiovascular application, Faes et al.…”
Section: Information Partitioning Into Synergistic, Unique, and Redundant Components
confidence: 99%
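Since R_MMI depends only on the two source–target mutual informations, it is easy to compute from a discrete joint distribution. A minimal Python sketch (the function names and the p[x1, x2, y] array layout are assumptions of this example, not taken from the cited papers):

```python
import numpy as np

def mutual_information(pxy):
    """I(X;Y) in bits for a 2-D joint pmf pxy[x, y]."""
    px = pxy.sum(axis=1, keepdims=True)   # marginal pmf of X
    py = pxy.sum(axis=0, keepdims=True)   # marginal pmf of Y
    mask = pxy > 0                        # skip zero cells to avoid log(0)
    return float((pxy[mask] * np.log2(pxy[mask] / (px @ py)[mask])).sum())

def r_mmi(p):
    """Minimum-mutual-information redundancy for a joint pmf p[x1, x2, y]."""
    i1 = mutual_information(p.sum(axis=1))  # I(X_s1; X_tar), marginalise X_s2
    i2 = mutual_information(p.sum(axis=0))  # I(X_s2; X_tar), marginalise X_s1
    return min(i1, i2)
```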
“…[] used this metric to show that heart period, arterial pressure, and respiration flow are intertwined elements that control heart rate together, and that there is higher R between subsystems during fast-paced breathing. However, this formulation for R actually represents a maximum bound for redundancy, since it assumes that all information provided by the weaker source is redundant [Barrett; Griffith and Ho]. The use of R_MMI as a redundancy metric greatly decreases the number of ways in which information can be partitioned into synergistic, unique, and redundant components.…”
Section: Information Partitioning Into Synergistic, Unique, and Redundant Components
confidence: 99%
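The overstatement described in the quote is easy to reproduce with the r_mmi sketch above. In the standard two-bit "copy" example, X1 and X2 are independent fair bits and the target records both, so each source arguably contributes 1 bit of purely unique information, yet R_MMI reports 1 bit of redundancy:

```python
# Y = (X1, X2), encoded as y = 2*x1 + x2, with X1, X2 independent fair bits.
p = np.zeros((2, 2, 4))
for x1 in range(2):
    for x2 in range(2):
        p[x1, x2, 2 * x1 + x2] = 0.25

print(r_mmi(p))  # 1.0 bit, even though the sources share no information
```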