Proceedings of the 2004 IEEE Computer Society Conference on Computer Vision and Pattern Recognition, 2004. CVPR 2004.
DOI: 10.1109/cvpr.2004.1315209
Invariant operators, small samples, and the bias-variance dilemma

Abstract

Cited by 4 publications (2 citation statements)
References 13 publications (19 reference statements)
“…The trick is to find a change of basis that makes our computation simple. Note that an invertible transformation in the feature space does not change the Bayes rate [30], and therefore we can compute P B (E) in the transformed space.…”
Section: Appendix Amentioning
confidence: 99%
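The invariance claim in the excerpt above can be made precise. The following is a sketch of the standard change-of-variables argument (not taken from the paper), assuming the transformation T of the feature space is smooth and invertible:

```latex
% Bayes error in the original feature space:
P_B(E) = \int \Bigl(1 - \max_c P(c \mid x)\Bigr)\, p_X(x)\, dx .

% Substitute x = T^{-1}(y). Since T is a bijection, the class
% posteriors are preserved, P(c \mid y) = P(c \mid T^{-1}(y)), and
% the density transforms as
% p_Y(y) = p_X\bigl(T^{-1}(y)\bigr)\,\bigl|\det J_{T^{-1}}(y)\bigr| , so
P_B(E) = \int \Bigl(1 - \max_c P\bigl(c \mid T^{-1}(y)\bigr)\Bigr)\, p_Y(y)\, dy .
```

The Bayes error computed in the transformed space therefore equals the original one, which is what licenses computing P_B(E) after a convenient change of basis.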
“…With some abuse of language, we will say that a classifier is Bayesian if it is generated by an algorithm that is asymptotically unbiased at each x. […] much better (in terms of misclassification rate) than texture, while the opposite is true for the data set “Trees”. We used non-normalized (r, g, b) color features in our experiments (see [30] for a comparison of normalized and non-normalized colors in terms of misclassification rate). For texture, we used standard Gabor features [23].…”
Section: Visual Featuresmentioning
confidence: 99%
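The excerpt contrasts non-normalized (r, g, b) features with normalized colors. A minimal sketch of the two feature variants, assuming per-pixel chromaticity normalization r/(r+g+b) as the "normalized" option (the function name and interface are illustrative, not from the paper):

```python
import numpy as np

def rgb_features(pixels, normalize=False):
    """Per-pixel color features.

    pixels : (N, 3) array of (r, g, b) values.
    If normalize is True, return chromaticity coordinates
    r/(r+g+b), g/(r+g+b), b/(r+g+b), which discard overall
    intensity; otherwise return the raw (r, g, b) values,
    as used in the experiments quoted above.
    """
    pixels = np.asarray(pixels, dtype=float)
    if not normalize:
        return pixels
    s = pixels.sum(axis=1, keepdims=True)
    s[s == 0] = 1.0  # avoid division by zero on black pixels
    return pixels / s

# Two pixels with the same chromaticity but different intensity:
px = [[100, 50, 50], [200, 100, 100]]
raw = rgb_features(px)                   # distinct raw features
norm = rgb_features(px, normalize=True)  # identical after normalization
```

Normalization trades away intensity information for a degree of illumination invariance, which is exactly the kind of invariant-operator trade-off, bias reduced at the cost of discarding features, that the paper's bias-variance analysis addresses.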