Phys. Rev. E 94, 040301 (2016) · DOI: 10.1103/PhysRevE.94.040301
Pairwise network information and nonlinear correlations

Abstract: Reconstructing the structural connectivity between interacting units from observed activity is a challenge across many different disciplines. The fundamental first step is to establish whether or to what extent the interactions between the units can be considered pairwise and, thus, can be modeled as an interaction network with simple links corresponding to pairwise interactions. In principle this can be determined by comparing the maximum entropy given the bivariate probability distributions to the true joint…
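The comparison the abstract describes, between the entropy of the true joint distribution and the largest entropy compatible with lower-order constraints, can be illustrated with a minimal sketch. The sketch below is a toy under stated assumptions, not the paper's method: it draws a random three-variable binary joint distribution and compares the true joint entropy against the univariate-marginal (independence) maximum entropy. The paper's pairwise version would instead fit a maximum-entropy model to the bivariate marginals (e.g. by iterative proportional fitting), which is omitted here; all names are illustrative.

```python
import numpy as np

def entropy(p):
    """Shannon entropy (in bits) of a probability array, ignoring zeros."""
    p = p[p > 0]
    return -np.sum(p * np.log2(p))

# Hypothetical example: joint distribution of three binary variables,
# stored as a 2x2x2 array normalized to sum to 1.
rng = np.random.default_rng(0)
joint = rng.random((2, 2, 2))
joint /= joint.sum()

# True joint entropy H(X1, X2, X3).
h_joint = entropy(joint.ravel())

# The maximum entropy consistent with the univariate marginals alone is
# the sum of the marginal entropies, attained by the product distribution.
h_indep = sum(
    entropy(joint.sum(axis=tuple(a for a in range(joint.ndim) if a != i)))
    for i in range(joint.ndim)
)

# The gap (the multi-information, or total correlation) measures how much
# structure the independence model misses; a pairwise maximum-entropy
# model fitted to the bivariate marginals would tighten h_indep
# toward h_joint.
print(f"H_joint = {h_joint:.3f} bits, H_indep = {h_indep:.3f} bits")
print(f"multi-information = {h_indep - h_joint:.3f} bits")
```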

Cited by 7 publications (28 citation statements) · References 55 publications
“…The absence of techniques to efficiently calculate the maximum entropy in these cases is conspicuous; conditioning on the univariate entropies alone is equivalent to assuming the variables are independent, a widely used result, but a generalisation to a wider array of information-theoretic terms has not been forthcoming to the best of our knowledge. In ref. 30 we introduced a method to address this issue using the set-theoretic formulation of information theory. Here, we extend our maximum entropy technique to continuous random variables and vastly expand the types of information-theoretic quantities one can condition on, compared to the univariate entropies and bivariate mutual informations discussed in ref.…”
Section: Methods
confidence: 99%
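The equivalence noted in this statement is, as an aside not drawn from either paper's text, the standard subadditivity of Shannon entropy: among all joint distributions with fixed univariate marginals, the product distribution attains the maximum entropy.

```latex
% Subadditivity: the maximum joint entropy consistent with the
% univariate marginals p_i is attained by the product distribution.
\[
  H(X_1,\dots,X_n) \;\le\; \sum_{i=1}^{n} H(X_i),
\]
% with equality if and only if
\[
  p(x_1,\dots,x_n) \;=\; \prod_{i=1}^{n} p_i(x_i),
\]
% i.e. the variables are mutually independent.
```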
“…Here, we extend our maximum entropy technique to continuous random variables and vastly expand the types of information-theoretic quantities one can condition on, compared to the univariate entropies and bivariate mutual informations discussed in ref. 30.…”
Section: Methods
confidence: 99%