2017 12th International Workshop on Self-Organizing Maps and Learning Vector Quantization, Clustering and Data Visualization (WSOM 2017)
DOI: 10.1109/wsom.2017.8020034

Rademacher complexity of margin multi-category classifiers

Abstract: One of the main open problems of the theory of margin multi-category pattern classification is the characterization of the optimal dependence of the confidence interval of a guaranteed risk on the three basic parameters: the sample size m, the number C of categories, and the scale parameter γ. This is especially the case when working under minimal learnability hypotheses. The starting point is a basic supremum inequality whose capacity measure depends on the choice of the margin loss function. Then, tra…
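For context, a guaranteed risk of the kind the abstract refers to typically takes the following form. This is an illustrative sketch in the standard Rademacher-complexity style (binary, Bartlett–Mendelson form), not the paper's multi-category bound:

```latex
% With probability at least 1 - \delta over an m-sample, for every f in the class F:
% true risk <= empirical margin risk + capacity term + confidence term.
\[
  R(f) \;\le\; R_{\gamma,m}(f)
  \;+\; \frac{2}{\gamma}\,\mathcal{R}_m(\mathcal{F})
  \;+\; \sqrt{\frac{\ln(1/\delta)}{2m}},
\]
```

where $\mathcal{R}_m(\mathcal{F})$ is the Rademacher complexity of $\mathcal{F}$ and $R_{\gamma,m}(f)$ is the empirical risk at margin $\gamma$. The open problem described above concerns the optimal joint dependence of the capacity term on m, C, and γ in the multi-category setting.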

Cited by 18 publications (44 citation statements) · References 15 publications
“…The bound of Theorem 5.2 compares well with the margin-based bound from [11]. It varies as 1/√m, while that of [11] has an additional √(ln m) factor.…”
Section: Generalization Error Bounds
confidence: 67%
“…The dependence on C is, however, √C, whereas in [11] it is √(ln 2C). The bounds are not directly comparable because ours concerns width and that of [11] involves margin, but, for fixed C, it is notable that the m-dependence of our bound is better. The dependence of Theorem 5.2 on the width parameter γ is, in general, similar to the dependence of [11] on the margin parameter γ, as both grow like √N_γ, where N_γ is the covering number of the underlying real-valued discriminant function class. The advantage of the bound of Theorem 5.2 is that it is expressed in terms of the covering number of the actual metric space, which, in some problems, such as when the metric space is finite [5], can be efficiently estimated.…”
Section: Generalization Error Bounds
confidence: 68%
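The rate comparison made in these excerpts can be summarized schematically. This is a sketch assembled from the dependences stated in the citation statements (√C vs √(ln 2C) in C, an extra √(ln m) factor in m, and √N_γ growth in both); it is not an exact statement of either theorem, and constants are suppressed:

```latex
% Schematic capacity terms: width-based bound (Theorem 5.2 of the citing work)
% versus the margin-based bound of [11], with N_gamma a covering number.
\[
  \text{Thm.~5.2:}\quad O\!\left(\sqrt{\frac{C\,N_\gamma}{m}}\right)
  \qquad \text{vs.} \qquad
  \text{[11]:}\quad O\!\left(\sqrt{\frac{N_\gamma\,\ln(2C)\,\ln m}{m}}\right)
\]
```

For fixed C and γ, the left-hand rate is smaller by the √(ln m) factor, which is the m-dependence advantage the excerpt points out.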