2016
DOI: 10.14209/jcis.2016.26
On the minimum probability of classification error through effective cardinality comparison

Abstract: This work proposes a method for estimating a lower (Bayesian) bound on the classification error rate in two-class problems. This lower bound is typically inferred through specific classifier structures. By contrast, the proposed approach is based on a "collision" (quadratic) entropy estimator, deployed in the very pragmatic form of simple coincidence counters. To properly introduce this new approach, we first discuss the concept of sets' effective cardinality in view of basic concepts of probability and se…
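The abstract's idea of a collision-entropy estimator built from "simple coincidence counters" can be sketched as follows. This is not the paper's implementation; it is a minimal illustration of the standard coincidence-counting estimate of Rényi order-2 (collision) entropy, H2 = -log2(Σ p_i²), where the collision probability Σ p_i² is estimated by the fraction of unordered sample pairs whose values coincide. The function name is my own.

```python
import math
from itertools import combinations

def collision_entropy_estimate(samples):
    """Estimate the collision (Renyi order-2) entropy H2 = -log2(sum p_i^2).

    The collision probability sum p_i^2 is estimated by the fraction of
    unordered sample pairs (x_i, x_j), i < j, whose values coincide:
    each pair is an independent "coincidence counter" in expectation.
    """
    pairs = list(combinations(samples, 2))
    coincidences = sum(1 for a, b in pairs if a == b)
    p_collision = coincidences / len(pairs)  # estimate of sum p_i^2
    return -math.log2(p_collision)

# Toy sample from a fair 4-ary source: true H2 = log2(4) = 2 bits.
print(collision_entropy_estimate([0, 1, 2, 3] * 25))
```

For this balanced toy sample the estimate lands close to the true 2 bits; with random i.i.d. samples the pairwise coincidence fraction is an unbiased estimator of the collision probability.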

Cited by 1 publication (3 citation statements) | References 16 publications (48 reference statements)
“…These connections to Shannon's seminal work are further discussed in the Appendix. Notice that, as in [18], accuracies can alternatively be thought of in terms of entropy, through the concept of effective cardinality, as illustrated in the Appendix. Accordingly, any classification/detection problem with a given accuracy, A, is analogous to another problem of finding a single target element in a chimeric set of C = 1/A equally likely ones, where C is the effective cardinality of the chimeric set, and H = log₂(C) is the corresponding entropy, or its average information content, in bits.…”
Section: Measuring Information in Analogy Tests
confidence: 99%
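The quoted relation — effective cardinality C = 1/A and entropy H = log₂(C) — can be checked with a tiny worked example. The function names below are my own, added only to illustrate the mapping stated in the citation.

```python
import math

def effective_cardinality(accuracy):
    # C = 1/A: the size of a chimeric set of equally likely elements in
    # which finding the single target is as hard as the original problem.
    return 1.0 / accuracy

def entropy_bits(accuracy):
    # H = log2(C): the corresponding average information content in bits.
    return math.log2(effective_cardinality(accuracy))

# Accuracy 0.25 is equivalent to guessing among C = 4 equally likely
# elements, i.e. H = 2 bits of uncertainty.
print(effective_cardinality(0.25))  # -> 4.0
print(entropy_bits(0.25))           # -> 2.0
```

Note the limiting cases behave as expected: A = 1 gives C = 1 and H = 0 bits (no uncertainty), while smaller accuracies give larger effective sets and higher entropy.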
“…Entropy measures reflect uncertainties about w_b, and are theoretically related to accuracies [18]. Indeed, average accuracies can be expressed as…”
Section: Measuring Information in Analogy Tests
confidence: 99%