2003
DOI: 10.1055/s-0038-1634358
Mutual Information as an Index of Diagnostic Test Performance

Abstract: Objectives: This paper demonstrates that diagnostic test performance can be quantified as the average amount of information the test result (R) provides about the disease state (D). Methods: A fundamental concept of information theory, mutual information, is directly applicable to this problem. This statistic quantifies the amount of information that one random variable contains about another random variable. Prior to performing a diagnostic test, R and D are random variables. Hence…
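The quantity the abstract describes can be made concrete with a short sketch. The Python snippet below (a minimal illustration, not the paper's own code; the sensitivity, specificity, and prevalence figures are invented example values) computes the mutual information I(D;R) between a binary disease state D and a binary test result R from the joint distribution those three parameters imply.

```python
import math

def mutual_information(sensitivity, specificity, prevalence):
    """Average information (in bits) a binary test result R provides
    about a binary disease state D:
    I(D;R) = sum over (d, r) of p(d,r) * log2(p(d,r) / (p(d) * p(r)))."""
    # Joint probabilities p(D, R).
    p = {
        ("D+", "R+"): prevalence * sensitivity,
        ("D+", "R-"): prevalence * (1 - sensitivity),
        ("D-", "R+"): (1 - prevalence) * (1 - specificity),
        ("D-", "R-"): (1 - prevalence) * specificity,
    }
    # Marginals p(D) and p(R).
    p_d = {"D+": prevalence, "D-": 1 - prevalence}
    p_r = {
        "R+": p[("D+", "R+")] + p[("D-", "R+")],
        "R-": p[("D+", "R-")] + p[("D-", "R-")],
    }
    return sum(
        p_dr * math.log2(p_dr / (p_d[d] * p_r[r]))
        for (d, r), p_dr in p.items()
        if p_dr > 0
    )

# Hypothetical test: 90% sensitive, 80% specific, 10% prevalence.
print(f"I(D;R) = {mutual_information(0.9, 0.8, 0.1):.3f} bits")  # ~0.145 bits
```

A perfectly uninformative test (sensitivity + specificity = 1) yields I(D;R) = 0, and a perfect test yields I(D;R) = H(D), the full pre-test uncertainty.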

Cited by 26 publications (37 citation statements) | References 17 publications (23 reference statements)
“…In the life sciences, Shannon entropy has been used to measure cellular diversity (12,13) and phylogenetic variation (14), and to model molecular interactions (15). This concept has previously been applied to single-analyte laboratory testing by Rudolph (16-18) and, more recently, by Benish (19-23) and Vollmer (24). However, their approaches have not been widely adopted or disseminated and, in particular, have not been applied to NGS.…”
mentioning
confidence: 98%
“…When the base of the logarithm in Equation (1) is two, the unit of measurement is bits (binary digits) (3,4). Shannon entropy can be used to quantify diagnostic uncertainty.…”
Section: Methods
mentioning
confidence: 99%
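For context on this excerpt: the pre-test diagnostic uncertainty is the Shannon entropy of the disease state, H(D) = -Σ_d p(d) log₂ p(d), measured in bits when the logarithm is base two. A minimal sketch (the 10% prevalence is an invented example value):

```python
import math

def shannon_entropy(probs):
    """H = -sum(p * log2(p)), in bits; zero-probability terms contribute 0."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Pre-test uncertainty about a binary disease state at 10% prevalence.
prevalence = 0.1
print(f"H(D) = {shannon_entropy([prevalence, 1 - prevalence]):.3f} bits")
# ~0.469 bits; a 50% prevalence would give the maximum of 1 bit.
```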
“…The difference between the a priori uncertainty and the expected value of the a posteriori uncertainty gives the gain in information provided by the diagnostic test. It is referred to as the diagnostic information or IC of the diagnostic test (4)(5)(6). This difference is also called mutual information.…”
Section: Methods
mentioning
confidence: 99%
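The relationship this excerpt describes, I(D;R) = H(D) − E_R[H(D|R)], can be verified numerically. This sketch (reusing the invented 0.9/0.8/0.1 test parameters from above) computes the expected post-test entropy over both possible results and subtracts it from the pre-test entropy; the result matches the mutual information computed directly from the joint distribution.

```python
import math

def entropy(probs):
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Hypothetical binary test: sensitivity 0.9, specificity 0.8, prevalence 0.1.
sens, spec, prev = 0.9, 0.8, 0.1

# Marginal probability of each test result.
p_pos = prev * sens + (1 - prev) * (1 - spec)
p_neg = 1 - p_pos

# Post-test disease probabilities (Bayes' theorem).
p_d_given_pos = prev * sens / p_pos
p_d_given_neg = prev * (1 - sens) / p_neg

# A priori uncertainty minus expected a posteriori uncertainty.
pre_test = entropy([prev, 1 - prev])
expected_post_test = (p_pos * entropy([p_d_given_pos, 1 - p_d_given_pos])
                      + p_neg * entropy([p_d_given_neg, 1 - p_d_given_neg]))
print(f"I(D;R) = {pre_test - expected_post_test:.3f} bits")  # ~0.145 bits
```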
“…In this study, we use a measure of predictive accuracy for this domain called "Mutual Information" (MI), which is a measure of the information that one variable provides about the other [13,14]. MI does not exhibit the limitations of PPV and logistic regression because it can quantify not only the relationship between descriptors and breast cancer risk, but also the relationship between imaging observation features (including all descriptors belonging to that feature) and risk.…”
Section: Introduction
mentioning
confidence: 99%
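For empirical data like the imaging descriptors this excerpt mentions, MI can be estimated directly from paired categorical labels rather than from an assumed distribution. A usage sketch with scikit-learn (the descriptor and outcome arrays are fabricated toy data; `mutual_info_score` returns MI in nats, so dividing by ln 2 converts to bits):

```python
import math
from sklearn.metrics import mutual_info_score

# Toy data: a categorical imaging descriptor paired with a binary outcome
# (values invented purely to illustrate the call).
descriptor = ["spiculated", "round", "round", "spiculated", "oval", "round"]
malignant = [1, 0, 0, 1, 0, 1]

mi_nats = mutual_info_score(descriptor, malignant)
print(f"MI = {mi_nats / math.log(2):.3f} bits")
```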