2021
DOI: 10.48550/arxiv.2106.05379
Preprint

Threshold-free estimation of entropy from a Pearson matrix

H. Felippe,
A. Viol,
D. B. de Araujo
et al.

Abstract: We address the general problem of how to estimate an entropy given a Pearson correlation matrix. Most methods currently in use inject a degree of arbitrariness due to the thresholding of correlations. Here we propose an entirely objective method of entropy estimation that requires no thresholding. Let R be an N × N Pearson correlation matrix. We define the matrix ρ = R/N and prove that ρ satisfies all the properties required of the density operator. Hence, the von Neumann entropy S = −tr(ρ log ρ) can be direc…
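The construction in the abstract can be sketched in a few lines of NumPy: since tr(R) = N for any Pearson correlation matrix, ρ = R/N has unit trace, and its eigenvalues are non-negative because R is positive semidefinite, so S = −tr(ρ log ρ) reduces to a sum over the eigenvalues of ρ. The function below is a minimal illustration of that idea, not the authors' reference implementation; the tolerance for discarding numerically zero eigenvalues is an assumption.

```python
import numpy as np

def von_neumann_entropy(R):
    """Von Neumann entropy of an N x N Pearson correlation matrix R.

    rho = R / N has unit trace (tr R = N) and is positive semidefinite,
    so it satisfies the properties of a density operator, and
    S = -tr(rho log rho) = -sum_j lambda_j log lambda_j over the
    eigenvalues lambda_j of rho.
    """
    N = R.shape[0]
    rho = R / N
    # Symmetric eigensolver; eigenvalues are real and non-negative.
    lam = np.linalg.eigvalsh(rho)
    # Drop numerically zero eigenvalues (0 log 0 -> 0 by convention).
    lam = lam[lam > 1e-12]
    return -np.sum(lam * np.log(lam))

# Usage: correlation matrix of 10 variables from 500 samples.
X = np.random.default_rng(0).normal(size=(500, 10))
R = np.corrcoef(X, rowvar=False)
S = von_neumann_entropy(R)  # bounded by 0 <= S <= log(10)
```

No threshold on the correlations is applied at any point, which is the stated advantage of the method over thresholding-based entropy estimators.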

Cited by 2 publications (1 citation statement)
References 55 publications (134 reference statements)
“…where N is the number of ROIs in the correlation matrix (N = 33 or 7) and λ_j are the eigenvalues of the ρ matrix (Felippe et al., 2021). The entropy takes its minimum value when all the correlation coefficients are unity and reaches its maximum value (log N) if all the correlations are zero.…”
Section: Whole Brain Variability Properties Of Structural Covariance ...mentioning
confidence: 99%
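The two extremes quoted above can be checked directly: when all correlations are unity, ρ = J/N (J the all-ones matrix) has a single non-zero eigenvalue equal to 1, giving S = 0; when all off-diagonal correlations are zero, ρ = I/N has N eigenvalues equal to 1/N, giving S = log N. A quick numerical check with the citing paper's N = 33 (the choice of N here is just the ROI count mentioned in the quote):

```python
import numpy as np

N = 33

# All correlations unity: R = J, so rho = J/N has eigenvalues (1, 0, ..., 0).
lam = np.linalg.eigvalsh(np.ones((N, N)) / N)
lam = lam[lam > 1e-12]          # discard numerically zero eigenvalues
S_min = -np.sum(lam * np.log(lam))   # -1 * log(1) = 0

# All off-diagonal correlations zero: R = I, so rho = I/N has
# N eigenvalues equal to 1/N.
lam = np.linalg.eigvalsh(np.eye(N) / N)
S_max = -np.sum(lam * np.log(lam))   # -N * (1/N) log(1/N) = log(N)
```

So the entropy ranges over [0, log N], exactly as the citation statement describes.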