2014
DOI: 10.3390/e16074132
Network Decomposition and Complexity Measures: An Information Geometrical Approach

Abstract: We consider the graph representation of a stochastic model with n binary variables and develop an information-theoretical framework to measure the degree of statistical association between subsystems, as well as the associations represented by each edge of the graph. In addition, we introduce novel complexity measures with respect to system decomposability, based on the geometric product of the Kullback-Leibler (KL) divergence. The novel complexity measures satisfy the …
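The abstract's idea of quantifying statistical association with KL divergence can be illustrated by the multi-information: the KL divergence from a joint distribution over n binary variables to the product of its marginals. This is only an illustrative sketch, not the paper's exact construction; the function names and the example distribution are ours.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """Kullback-Leibler divergence D(p || q) for discrete distributions,
    skipping zero-probability states of p."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    mask = p > 0
    return float(np.sum(p[mask] * np.log(p[mask] / (q[mask] + eps))))

def multi_information(joint):
    """D(joint || product of marginals) for a joint of shape (2,)*n.

    Measures the total statistical association among the n binary
    variables; it is zero iff the variables are independent.
    """
    n = joint.ndim
    marginals = [joint.sum(axis=tuple(j for j in range(n) if j != i))
                 for i in range(n)]
    independent = marginals[0]
    for m in marginals[1:]:
        independent = np.multiply.outer(independent, m)
    return kl_divergence(joint.ravel(), independent.ravel())

# Two correlated binary variables: X = Y with probability 0.9
joint = np.array([[0.45, 0.05],
                  [0.05, 0.45]])
print(round(multi_information(joint), 4))  # ≈ 0.3681 nats
```

For independent variables the joint equals the product of marginals and the measure vanishes, matching the paper's use of KL divergence as a gauge of decomposability.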

Cited by 7 publications (16 citation statements)
References 35 publications (40 reference statements)
“…For simplicity, we consider the integration of two databases X_N and X_M, with sizes N and M ∈ N, X_N := {µ_i(x) | x ∈ X, i = 1, …, N} and X_M := {µ_i(x) | x ∈ X, i = 1, …, M}, respectively. A joint distribution between subsets of X_N and X_M needs to be determined with respect to common parameters in order to obtain an integrated database including the calculation of up to (N + M)-th order of commonality, such as order-wise correlations [32]. Exhaustive computing follows the argument in Section 4.2, giving the extension of Theorem 6: Theorem 7.…”
Section: Big Data Integration (mentioning, confidence: 99%)
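The excerpt above describes determining a joint distribution between two databases through their common parameters. As a toy analogue (not the paper's information-geometric construction), two joint tables sharing a variable Y can be glued under a conditional-independence assumption, p(x, y, z) = p(x|y) p(y) p(z|y); the function name and example tables are ours.

```python
import numpy as np

def integrate_via_common_variable(p_xy, p_yz):
    """Combine p(x, y) and p(y, z), which share variable Y, into p(x, y, z)
    assuming X and Z are conditionally independent given Y.

    Assumes every value of Y has positive probability and that both
    tables imply the same marginal p(y).
    """
    p_y = p_xy.sum(axis=0)                             # marginal p(y)
    p_x_given_y = p_xy / p_y                           # columns: p(x|y)
    p_z_given_y = p_yz / p_yz.sum(axis=1, keepdims=True)  # rows: p(z|y)
    # Result indexed [x, y, z]
    return (p_x_given_y[:, :, None]
            * p_y[None, :, None]
            * p_z_given_y[None, :, :])

p_xy = np.array([[0.4, 0.1],
                 [0.1, 0.4]])
p_yz = np.array([[0.3, 0.2],
                 [0.2, 0.3]])
p_xyz = integrate_via_common_variable(p_xy, p_yz)
print(p_xyz.sum())  # a valid joint: sums to 1.0
```

Marginalizing the result over Z recovers p(x, y) and over X recovers p(y, z), so the gluing is consistent with both input tables.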
“…TDC represents statistical dependencies between two complexity measures in response to a given inter-subjective objective measurement. While significant matching between two commonality orders assures the reproducibility based on the coincidence of observation with these measures, non-significance can also be used to quantify complementarity of different evaluations [32].…”
Section: Theorem 4, Statistical Test on the Degree of Coincidence (TDC) (mentioning, confidence: 99%)
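The TDC excerpt concerns testing whether two complexity measures agree significantly on the same system. As a hedged illustration only (the paper's actual test statistic may differ), one generic way to assess such agreement is a nonparametric permutation test on the correlation between two measure series:

```python
import numpy as np

rng = np.random.default_rng(0)

def permutation_test(a, b, n_perm=5000):
    """Two-sided permutation test on the Pearson correlation between
    two measure series a and b. Returns (r_observed, p_value)."""
    a = np.asarray(a, dtype=float)
    b = np.asarray(b, dtype=float)
    r_obs = np.corrcoef(a, b)[0, 1]
    # Null distribution: correlation after shuffling one series
    perm_r = np.array([np.corrcoef(a, rng.permutation(b))[0, 1]
                       for _ in range(n_perm)])
    p_value = float(np.mean(np.abs(perm_r) >= abs(r_obs)))
    return float(r_obs), p_value
```

A significant result supports reproducibility of the two evaluations, while non-significance (in the spirit of the excerpt) can itself be read as evidence that the measures capture complementary aspects of the system.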