2020
DOI: 10.3390/e22020218

An Information-Theoretic Measure for Balance Assessment in Comparative Clinical Studies

Abstract: Limitations of statistics currently used to assess balance in observation samples include their insensitivity to shape discrepancies and their dependence upon sample size. The Jensen-Shannon divergence (JSD) is an alternative approach to quantifying the lack of balance among treatment groups that does not have these limitations. The JSD is an information-theoretic statistic derived from relative entropy, with three specific advantages relative to using standardized difference scores. First, it is applicable to…
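The abstract frames the JSD as a relative-entropy-based alternative to standardized differences for judging covariate balance between treatment groups. Purely as a rough illustration of that general idea, not the authors' estimator, a minimal Python sketch of the JSD between two binned covariate distributions might look like the following; the histogram binning, the base-2 logarithm, and all function names are assumptions.

```python
import numpy as np

def jensen_shannon_divergence(p, q, base=2.0):
    """JSD between two discrete distributions on the same support:
    JSD(p, q) = 0.5 * KL(p || m) + 0.5 * KL(q || m), with m = (p + q) / 2.
    Bounded by [0, 1] when base-2 logarithms are used."""
    p = np.asarray(p, dtype=float)
    q = np.asarray(q, dtype=float)
    p, q = p / p.sum(), q / q.sum()          # normalize counts to probabilities
    m = 0.5 * (p + q)                        # common (mixture) distribution

    def kl(a, b):
        mask = a > 0                         # 0 * log(0 / x) contributes 0
        return np.sum(a[mask] * (np.log(a[mask] / b[mask]) / np.log(base)))

    return 0.5 * kl(p, m) + 0.5 * kl(q, m)

# Hypothetical use: balance of one covariate across treated vs. control groups,
# histogrammed on a shared set of bins (the bin choice is an assumption).
rng = np.random.default_rng(0)
treated = rng.normal(50, 10, size=300)       # simulated covariate values
control = rng.normal(55, 12, size=300)
bins = np.histogram_bin_edges(np.concatenate([treated, control]), bins=20)
p, _ = np.histogram(treated, bins=bins)
q, _ = np.histogram(control, bins=bins)
print(f"JSD = {jensen_shannon_divergence(p, q):.3f}")  # 0 = identical distributions
```

A value of 0 would indicate perfectly balanced (identical) covariate distributions, while values approaching 1 in base 2 indicate nearly disjoint ones.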

Cited by 5 publications (6 citation statements)
References 8 publications
“…To further examine the deleterious effect of the identified missense variants on CDH23 and ARSB, several in silico tools, namely Sorting Intolerant from Tolerant (SIFT) (17), Polymorphism Phenotyping v2 (PolyPhen-2.0) (18), Fathmm (19), and Mutation Taster (20), were used. Multiple sequence alignment (MSA) was performed to determine the conservation of the amino acids at the missense positions p.Arg159Cys in ARSB and p.Arg1746Gln in CDH23, and Jensen-Shannon divergence (JSD) scores were also calculated (21). Amino acid sequences of the CDH23 and ARSB proteins from Homo sapiens (human), Mus musculus (house mouse), Rattus norvegicus (Norway rat), Callorhinchus milii (elephant shark), Bos taurus (cattle), Danio rerio (zebrafish), Felis catus (domestic cat), Pan troglodytes (chimpanzee), and Vulpes vulpes (red fox) were retrieved from the National Center for Biotechnology Information (NCBI) RefSeq database and aligned using Clustal Omega (22).…”
Section: In Silico Analysis (mentioning)
confidence: 99%
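The quoted methods section uses JSD scores to grade how conserved an aligned residue position is. Purely as a hypothetical sketch of that idea (not the cited scoring tool), one alignment column can be compared against a background amino-acid distribution; the uniform background, the single-column string input, and the function name below are assumptions, and published conservation scorers typically use BLOSUM-derived background frequencies and window smoothing instead.

```python
import numpy as np
from collections import Counter

AMINO_ACIDS = "ACDEFGHIKLMNPQRSTVWY"

def column_jsd_conservation(column, background=None):
    """Hypothetical conservation score for one alignment column:
    Jensen-Shannon divergence between the column's amino-acid frequencies
    and a background distribution (uniform here by assumption)."""
    counts = Counter(aa for aa in column.upper() if aa in AMINO_ACIDS)
    total = sum(counts.values())
    p = np.array([counts.get(aa, 0) / total for aa in AMINO_ACIDS])
    q = (np.full(len(AMINO_ACIDS), 1.0 / len(AMINO_ACIDS))
         if background is None else np.asarray(background, dtype=float))
    m = 0.5 * (p + q)                        # common (mixture) distribution

    def kl(a, b):
        mask = a > 0                         # skip zero-probability residues
        return float(np.sum(a[mask] * np.log2(a[mask] / b[mask])))

    return 0.5 * kl(p, m) + 0.5 * kl(q, m)   # in [0, 1]; higher = more conserved

# One column taken across nine aligned orthologous sequences (toy example).
print(column_jsd_conservation("RRRRRRRRQ"))
```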
“…Although this definition is arbitrary, it precisely defines a tight distribution with minimal variation in $\kappa_{gi}$ that appropriately scales with the magnitude of curvature in a partition. For distributions $f_1(X)$ and $f_2(X)$, where the mixture $\tfrac{1}{2}\,[f_1(X) + f_2(X)]$ is the common distribution and $\int g(x)\,\ln\frac{g(x)}{f(x)}\,dx$ is the relative entropy, which captures the divergence from the given probability distribution $f(X)$ to the reference distribution $g(X)$ [77, 78]. The discrete form is calculated.…”
Section: Demographic Information (mentioning)
confidence: 99%
“…$\int g(x)\,\ln\frac{g(x)}{f(x)}\,dx$ is the relative entropy, which captures the divergence from the given probability distribution $f(X)$ to the reference distribution $g(X)$ [77, 78]. The discrete form $D(g(X)\,\|\,f(X)) = \sum_{x \in X} g(x)\,\ln\frac{g(x)}{f(x)}$ is calculated.…”
Section: A5 Jensen-Shannon Divergence Of Within-Partition Gaussian Cu... (mentioning)
confidence: 99%
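The snippet above quotes the relative entropy and its discrete form $D(g(X)\,\|\,f(X))$. A literal transcription of that discrete sum, plus the mixture-based Jensen-Shannon construction the surrounding section refers to, might look like the following sketch; the natural logarithm matches the quoted definition, but the function names and the equal-weight mixture are assumptions about the cited paper's implementation.

```python
import numpy as np

def relative_entropy(g, f):
    """Discrete relative entropy D(g || f) = sum_x g(x) * ln(g(x) / f(x)).
    Terms with g(x) = 0 contribute 0; f must be positive wherever g is."""
    g = np.asarray(g, dtype=float)
    f = np.asarray(f, dtype=float)
    mask = g > 0
    return float(np.sum(g[mask] * np.log(g[mask] / f[mask])))

def jensen_shannon(f1, f2):
    """JSD built from the relative entropy above, using the common
    (mixture) distribution g = (f1 + f2) / 2 as the reference."""
    f1 = np.asarray(f1, dtype=float)
    f2 = np.asarray(f2, dtype=float)
    g = 0.5 * (f1 + f2)
    return 0.5 * relative_entropy(f1, g) + 0.5 * relative_entropy(f2, g)

print(relative_entropy([0.5, 0.5], [0.9, 0.1]))   # asymmetric KL divergence
print(jensen_shannon([0.5, 0.5], [0.9, 0.1]))     # symmetric and always finite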
“…where $I$ represents the Kullback-Leibler (KL) information divergence against independence and is a measure of the likely dependence between the component variables [38], [39]. The contrast is between the joint distribution of the random variables and the product of their marginal distributions: under the hypothesis that the variables are independently distributed the two coincide, so the extent to which the joint distribution departs from that product indicates the degree of dependence. Hence, this is known as a measure of multicollinearity or interdependence between the variables.…”
Section: Information Measure Of Dependence In High-Dimensions (mentioning)
confidence: 99%
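Read this way, the quoted measure is the KL divergence of the joint distribution from the product of its marginals (i.e., mutual information), used as a dependence or multicollinearity indicator. A minimal sketch under that reading, for two discretized variables, could be the following; the histogram discretization, the bin count, and the function name are assumptions.

```python
import numpy as np

def mutual_information(x, y, bins=10):
    """KL divergence of the empirical joint distribution of (x, y) from the
    product of its marginals -- the 'information divergence against
    independence'. Returns nats; 0 means the binned variables look independent."""
    joint, _, _ = np.histogram2d(x, y, bins=bins)
    pxy = joint / joint.sum()                 # empirical joint distribution
    px = pxy.sum(axis=1, keepdims=True)       # marginal of x
    py = pxy.sum(axis=0, keepdims=True)       # marginal of y
    prod = px * py                            # distribution under independence
    mask = pxy > 0                            # zero cells contribute 0
    return float(np.sum(pxy[mask] * np.log(pxy[mask] / prod[mask])))

# Two correlated predictors: the larger the value, the stronger the
# interdependence (multicollinearity) signalled by the measure.
rng = np.random.default_rng(1)
x = rng.normal(size=1000)
y = 0.8 * x + 0.6 * rng.normal(size=1000)
print(f"I(x; y) = {mutual_information(x, y):.3f} nats")
```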