2017
DOI: 10.1016/j.jcss.2017.06.003
Lp-norm Sauer–Shelah lemma for margin multi-category classifiers

Abstract: In the framework of agnostic learning, one of the main open problems of the theory of multi-category pattern classification is the characterization of the way the complexity varies with the number C of categories. More precisely, if the classifier is characterized only through minimal learnability hypotheses, then the optimal dependency on C that an upper bound on the probability of error should exhibit is unknown. We consider margin classifiers. They are based on classes of vector-valued functions with one co…

Cited by 14 publications (50 citation statements). References 30 publications.
“…Realizing it at the level of a Rademacher complexity, a linear dependency on C was obtained in [8]. In this paper, as in [9], we showed that postponing it to the level of metric entropy, this dependency can be improved to a sublinear one. The case that remains to be studied is a decomposition at the level of a combinatorial dimension, more precisely, at that of the fat-shattering dimension.…”
Section: Discussion (supporting)
confidence: 54%
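An illustrative sketch of the two decomposition levels contrasted in this statement (generic notation introduced here, not that of the cited papers; constants, margin operators and exact rates are omitted):

% Illustrative sketch only. F denotes a class of margin classifiers with
% C component classes F_1, ..., F_C, R_m an empirical Rademacher
% complexity, and N(epsilon, ., d) a covering number.
% Decomposing at the level of the Rademacher complexity, a structural
% inequality of the form
\[
  R_m(\mathcal{F}) \;\lesssim\; \sum_{k=1}^{C} R_m(\mathcal{F}_k)
\]
% propagates into the risk bound with a factor that is linear in C.
% Postponing the decomposition to the level of the metric entropy, the
% logarithms of the covering numbers add up,
\[
  \ln \mathcal{N}(\epsilon, \mathcal{F}, d)
  \;\lesssim\; \sum_{k=1}^{C} \ln \mathcal{N}(c\,\epsilon, \mathcal{F}_k, d_k),
\]
% and the square root taken inside a chaining (Dudley-type) integral,
\[
  R_m(\mathcal{F}) \;\lesssim\;
  \frac{1}{\sqrt{m}} \int_{0}^{\gamma} \sqrt{\ln \mathcal{N}(\epsilon, \mathcal{F}, d)}\; d\epsilon,
\]
% turns the linear dependency on C into a sublinear one.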
“…Finally, a combinatorial bound gives an estimate on the metric entropy in terms of the combinatorial dimension. The metric entropy bound used in [9] is the L2-norm one of [12], which in this paper we generalized to Lp-norms with integer p > 2. This generalization resulted in an improved dependency on the number C of categories compared to [9] without worsening the dependency on the sample size m nor the one on the margin parameter γ.…”
Section: Discussion (mentioning)
confidence: 99%
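To fix ideas on the shape such a combinatorial bound takes, here is a hedged sketch of a generic Lp-norm Sauer–Shelah type estimate (K, c and the exponent alpha are unspecified placeholders; the exact statements are those of [12] and of the paper, not reproduced here):

% Illustrative sketch only: the epsilon-metric entropy of the class F,
% measured in an empirical Lp (pseudo-)metric on a sample of size m, is
% controlled by the fat-shattering dimension d_F taken at a scale
% proportional to epsilon.
\[
  \ln \mathcal{N}_p(\epsilon, \mathcal{F}, m)
  \;\le\;
  K \, d_{\mathcal{F}}(c\,\epsilon)\,
  \ln^{\alpha}\!\left( \frac{m}{d_{\mathcal{F}}(c\,\epsilon)\,\epsilon} \right).
\]
% Plugging such an estimate into a metric-entropy based risk bound is
% what makes it possible to perform the decomposition over the C
% categories at the metric-entropy level rather than at the level of
% the Rademacher complexity.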