2018
DOI: 10.1371/journal.pone.0198253
From patterned response dependency to structured covariate dependency: Entropy based categorical-pattern-matching

Abstract: Data generated from a system of interest typically consists of measurements on many covariate features and possibly multiple response features across all subjects in a designated ensemble. Such data is naturally represented by one response-matrix against one covariate-matrix. A matrix lattice is an advantageous platform for simultaneously accommodating heterogeneous data types: continuous, discrete and categorical, and exploring hidden dependency among/between features and subjects. After each feature being in…
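To make the framing concrete, here is a minimal sketch of the response-matrix-against-covariate-matrix layout the abstract describes; the array names, shapes, and values are illustrative assumptions, not taken from the paper.

```python
import numpy as np

# Hypothetical ensemble sizes; illustrative only, not from the paper.
n_subjects, n_covariates, n_responses = 100, 8, 2
rng = np.random.default_rng(0)

# Covariate-matrix: continuous measurements; response-matrix: categorical.
covariate_matrix = rng.normal(size=(n_subjects, n_covariates))
response_matrix = rng.integers(0, 3, size=(n_subjects, n_responses))

# The two matrices share the subject (row) axis, which is what lets
# dependency be explored between response and covariate features.
assert covariate_matrix.shape[0] == response_matrix.shape[0]
```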


Cited by 13 publications (24 citation statements) · References 18 publications
“…For a comprehensive description of this step, including a presentation of mutual entropy, we refer the reader to Fushing et al [26]. We note that entropy is a quantitative measure of "disorder", or randomness of a thermodynamic system.…”
Section: Step 2: Identification of the Groups of Synergistic Features
confidence: 99%
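As a concrete reference for the entropy-as-disorder remark above, the sketch below computes Shannon entropy for a categorical sample; it is an illustrative calculation, not the implementation of Fushing et al [26].

```python
import numpy as np

def shannon_entropy(labels):
    """Shannon entropy (in bits) of a categorical sample."""
    _, counts = np.unique(labels, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

# Entropy quantifies disorder: a constant sequence has entropy 0,
# while a uniform mix over 4 categories reaches the maximum log2(4) = 2.
print(shannon_entropy(["a"] * 8))                # 0 bits
print(shannon_entropy(["a", "b", "c", "d"] * 2)) # 2.0 bits
```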
“…The procedure is then repeated over all features and the values given for the gaps are then chosen so that the overall scale of the digital codes over the 69 characteristics of the wines is from 1 to 10. Once the digital code is established, a distance between two features j and k is computed by comparing the clustering of the wines that they produce, using mutual entropy as a distance measure (see Method Section and [26] for how the mutual entropy is computed). Using this distance, the 69 characteristics are then clustered using the DCG method [22,23].…”
Section: California vs. Argentinian Malbec Wines
confidence: 99%
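A minimal sketch of this feature-to-feature distance, assuming each feature's digital code assigns every wine a categorical label; the symmetrized conditional entropy H(X|Y) + H(Y|X) used here is one plausible reading of the mutual-entropy measure, not necessarily the exact formula of [26].

```python
import numpy as np

def conditional_entropy(x, y):
    """H(X | Y) in bits for two aligned categorical label vectors."""
    x, y = np.asarray(x), np.asarray(y)
    n = len(x)
    joint = {}
    for xi, yi in zip(x, y):
        joint[(xi, yi)] = joint.get((xi, yi), 0) + 1
    h = 0.0
    for (xi, yi), c in joint.items():
        p_xy = c / n                 # joint probability p(x, y)
        p_y = np.mean(y == yi)       # marginal probability p(y)
        h -= p_xy * np.log2(p_xy / p_y)
    return h

def mutual_conditional_entropy(x, y):
    """Symmetrized conditional entropy used as a feature-to-feature distance."""
    return conditional_entropy(x, y) + conditional_entropy(y, x)

# Digital codes (on the 1-10 scale) of two hypothetical wine features
# over six wines; values are invented for illustration.
code_j = [1, 1, 5, 5, 9, 9]
code_k = [2, 2, 6, 6, 6, 10]
print(mutual_conditional_entropy(code_j, code_k))
```

Near-zero values mean the two features cluster the wines almost identically, so the distance is small and the features land in the same synergistic group.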
“…To bring out these synergistic groups of pixels, we apply mutual conditional entropy from Combinatorial Information Theory (see [17]) to measure the degree of synergistic association between two pixels' calcium intensity time series. Basically, in order to compute mutual conditional entropy, one possibly-gapped histogram (see [14]) is derived for each of the two time series, excluding their ictal periods.…”
Section: What Are Information Contents?
confidence: 99%
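A hedged sketch of that pipeline: simple equal-width bins stand in for the possibly-gapped histograms of [14], the traces and parameters are simulated, and the association measure reuses the mutual_conditional_entropy helper defined in the previous sketch.

```python
import numpy as np

def histogram_codes(series, bins=5):
    """Discretize a time series into categorical bin labels.
    Equal-width bins here stand in for the possibly-gapped
    histograms of [14]; an illustrative simplification."""
    edges = np.histogram_bin_edges(series, bins=bins)
    return np.digitize(series, edges[1:-1])  # labels in 0..bins-1

rng = np.random.default_rng(1)
# Two simulated calcium-intensity traces (ictal periods assumed excluded):
pixel_a = rng.normal(size=200).cumsum()
pixel_b = pixel_a + rng.normal(scale=2.0, size=200)  # synergistic partner

codes_a = histogram_codes(pixel_a)
codes_b = histogram_codes(pixel_b)
# Smaller values indicate stronger synergistic association between pixels.
print(mutual_conditional_entropy(codes_a, codes_b))
```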
“…To address this issue, we calculate the mutual conditional entropy among five clustering compositions, each of which corresponds to one 7-cluster tree level of one computed HC-tree, pertaining to five inter-ictal periods. That is, each HC-tree renders one categorical variable with 7 categories in each inter-ictal period (see [17]).…”
Section: Stability of Spatial-Communities Across Inter-ictal Periods
confidence: 99%
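The stability check can be sketched as a pairwise matrix of mutual conditional entropies over the five 7-category labelings; all data below are simulated stand-ins, and mutual_conditional_entropy is again the helper from the earlier sketch.

```python
import numpy as np

rng = np.random.default_rng(2)
n_pixels, n_periods, n_clusters = 300, 5, 7

# Simulated 7-cluster labelings, one per inter-ictal period, standing in
# for the 7-cluster tree level of each period's HC-tree: each labeling is
# a noisy copy of a common community structure (10% of pixels reshuffled).
base = rng.integers(0, n_clusters, size=n_pixels)
labelings = [np.where(rng.random(n_pixels) < 0.1,
                      rng.integers(0, n_clusters, size=n_pixels),
                      base)
             for _ in range(n_periods)]

# Pairwise mutual conditional entropies; values near 0 indicate that the
# spatial-community structure is stable across inter-ictal periods.
stability = np.array([[mutual_conditional_entropy(labelings[i], labelings[j])
                       for j in range(n_periods)]
                      for i in range(n_periods)])
print(np.round(stability, 3))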