2013 IEEE Information Theory Workshop (ITW)
DOI: 10.1109/itw.2013.6691302

Novel tight classification error bounds under mismatch conditions based on f-Divergence

Abstract: By default, statistical classification/multiple hypothesis testing is faced with the model mismatch introduced by replacing the true distributions in Bayes decision rule by model distributions estimated on training samples. Although a large number of statistical measures exist w.r.t. the mismatch introduced, these works rarely relate to the mismatch in accuracy, i.e. the difference between model error and Bayes error. In this work, the accuracy mismatch between the ideal Bayes decision rule/Bayes test and a…
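To make the notion of accuracy mismatch concrete, here is a minimal numerical sketch (not taken from the paper): it plugs estimated Gaussian class-conditional models into the Bayes decision rule, compares the resulting error on data drawn from the true distributions with the Bayes error, and also reports a per-class f-divergence (KL) between the true and the mismatched models. All distributions and parameter values are illustrative assumptions.

```python
# Minimal sketch (not from the paper): a two-class Bayes test where the true
# class-conditional densities are replaced by mismatched model densities.
# All distributions and parameter values are illustrative assumptions.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
priors = np.array([0.5, 0.5])

# True class-conditional densities p(x|k) and mismatched estimates q(x|k).
true_models = [norm(loc=0.0, scale=1.0), norm(loc=2.0, scale=1.0)]
est_models = [norm(loc=0.3, scale=1.2), norm(loc=1.7, scale=0.9)]

def decide(x, models):
    """MAP decision rule: argmax_k prior_k * density_k(x)."""
    scores = np.stack([p * m.pdf(x) for p, m in zip(priors, models)])
    return np.argmax(scores, axis=0)

def error_rate(decision_models, n=200_000):
    """Monte Carlo error of a decision rule on data from the TRUE distributions."""
    labels = rng.choice(2, size=n, p=priors)
    x = np.where(labels == 0,
                 true_models[0].rvs(n, random_state=rng),
                 true_models[1].rvs(n, random_state=rng))
    return np.mean(decide(x, decision_models) != labels)

def kl_gauss(p, q):
    """KL divergence (an f-divergence) between two univariate Gaussians, in nats."""
    (m1, s1), (m2, s2) = (p.mean(), p.std()), (q.mean(), q.std())
    return np.log(s2 / s1) + (s1**2 + (m1 - m2)**2) / (2 * s2**2) - 0.5

bayes_error = error_rate(true_models)  # ideal Bayes test
model_error = error_rate(est_models)   # plug-in test with mismatched models
print(f"Bayes error       ~ {bayes_error:.4f}")
print(f"Model error       ~ {model_error:.4f}")
print(f"Accuracy mismatch ~ {model_error - bayes_error:.4f}")  # >= 0: Bayes rule is optimal
print("Per-class KL(true || model):",
      [round(kl_gauss(t, e), 4) for t, e in zip(true_models, est_models)])
```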

Cited by 6 publications (19 citation statements)
References 8 publications
“…This characterization leads to sufficient conditions for perfect classification in the low-noise regime, that can be expressed via the principal angles associated with the subspaces of the true or mismatched model as well as the dimension of the overlap between such subspaces. In contrast, the bound proposed in [10] does not exhibit a phase transition in the low-noise regime. Furthermore, the bound in [10] is a function of the differences between the true and the mismatched models of each class and neglects the interplay between the mismatched models of different classes. On the other hand, the proposed bound also captures the interplay between the geometry of the mismatched models of different classes.…”
Section: Introduction (mentioning)
Confidence: 64%
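For intuition about the subspace quantities mentioned in the excerpt above, here is a small sketch using toy random bases (an assumption for illustration, not the cited papers' models): it computes the principal angles between a "true" and a "mismatched" signal subspace and the dimension of their exact overlap.

```python
# Toy sketch (assumed random bases, not from the cited papers): principal
# angles between a "true" and a "mismatched" subspace, and their overlap.
import numpy as np
from scipy.linalg import subspace_angles

rng = np.random.default_rng(1)

# Orthonormal basis of a 3-dimensional "true" subspace of R^10.
U_true = np.linalg.qr(rng.standard_normal((10, 3)))[0]

# Mismatched subspace: perturb the basis but keep one direction exactly,
# so the two subspaces share (at least) a one-dimensional overlap.
U_mis = U_true + 0.1 * rng.standard_normal((10, 3))
U_mis[:, 0] = U_true[:, 0]
U_mis = np.linalg.qr(U_mis)[0]

angles = subspace_angles(U_true, U_mis)      # radians, largest first
overlap_dim = int(np.sum(angles < 1e-6))     # directions shared (angle ~ 0)
print("principal angles (deg):", np.round(np.degrees(angles), 2))
print("overlap dimension:", overlap_dim)
```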
“…In particular, the work in [23] is closely related to our work in the sense that it also establishes bounds to the error probability in the presence of mismatch. The bounds presented in [23] are more general since they do not assume a particular form of probability density functions. Our work, on the other hand, leverages the assumption that signals are contained in linear subspaces in order to derive an upper bound that sharply predicts the presence or absence of an error floor.…”
Section: A Related Work (mentioning)
Confidence: 83%
“…The bounds in [23] fail to capture the presence or absence of an error floor when specialized to the proposed signal model.…”
Section: A Related Work (mentioning)
Confidence: 99%