2015
DOI: 10.1016/j.neunet.2015.02.006
Local Rademacher Complexity: Sharper risk bounds with and without unlabeled samples

Cited by 33 publications
(49 citation statements)
References 21 publications
“…In this section, we will show the three main results in the context of the complexity‐based methods: The first approaches deal with the problem of finite‐sized sets of rules: the Union Bound method (Bonferroni; Vapnik), which takes into account the whole set of rules, and the shell bound method (Langford & McAllester), which takes into account just the rules with small empirical error; The second approaches are based on the seminal work of V. N. Vapnik and A. Chervonenkis and deal with infinite‐sized sets of rules in the particular case of binary classification: the VC theory (Vapnik), which takes into account the whole set of rules, and the local VC theory (Oneto, Anguita et al), which takes into account just the rules with small empirical error. Extensions to the general SL framework have been proposed over the years (Bartlett, Kulkarni, & Posner; Shawe‐Taylor et al; Vapnik; Zhou), but were overly complicated and eventually made obsolete by the Rademacher complexity theory; The last approach is the Rademacher complexity theory, which deals with infinite‐sized sets of rules and the general SL framework: the global Rademacher complexity theory (Bartlett & Mendelson; Koltchinskii; Oneto et al; Oneto, Ghio et al), which takes into account the whole set of rules, and the local Rademacher complexity theory (Bartlett et al; Bartlett, Bousquet, & Mendelson; Koltchinskii; Lugosi & Wegkamp; Oneto, Ghio et al), which takes into account just the rules with small empirical error. …”
Section: Complexity‐based Methods
confidence: 99%
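The global Rademacher complexity named in the excerpt above is simple to estimate empirically when the set of rules is finite: draw random signs, correlate them with the rules' outputs on the sample, and take the supremum. A minimal Monte Carlo sketch, assuming a finite class of real-valued rules (the function name and trial count are illustrative, not from the cited works):

```python
import numpy as np

def empirical_rademacher(F, X, n_trials=200, rng=None):
    """Monte Carlo estimate of the empirical Rademacher complexity
    of a finite set of rules F on a sample X: the expected supremum,
    over the class, of the correlation with random +/-1 signs."""
    rng = np.random.default_rng(rng)
    n = len(X)
    # Precompute the n x |F| matrix of rule outputs on the sample.
    outputs = np.array([[f(x) for f in F] for x in X])
    total = 0.0
    for _ in range(n_trials):
        sigma = rng.choice([-1.0, 1.0], size=n)   # Rademacher signs
        # sup over the finite class of the signed empirical average
        total += np.max(sigma @ outputs) / n
    return total / n_trials
```

A class containing only the constant-zero rule has complexity exactly zero, which is a handy sanity check; richer classes yield strictly positive estimates.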
“…As an example, data‐dependent versions of the VC theory have been developed in the studies of Boucheron et al and Shawe‐Taylor et al. In recent years, researchers have also succeeded in developing local data‐dependent complexity measures (Bartlett et al; Bartlett, Bousquet, & Mendelson; Blanchard & Massart; Cortes, Kloft, & Mohri; Koltchinskii; Oneto, Anguita et al; Oneto, Ghio et al). Local measures improve over global ones thanks to their ability to take into account only those rules of the rules class that are most likely to be chosen by the learning procedure, that is, the models with small error.…”
Section: Complexity‐based Methods
confidence: 99%
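The localization step these measures rely on can be illustrated concretely: restrict the class to the rules with empirical error below a chosen radius, and evaluate complexity only over that subset. A toy sketch for a finite class under 0-1 loss; the function name and the fixed radius are assumptions for illustration, not the cited constructions:

```python
import numpy as np

def local_subclass(F, X, y, radius):
    """Return the rules in F whose empirical 0-1 error on the
    labelled sample (X, y) is at most `radius` -- the subset that
    local complexity measures are computed over."""
    kept = []
    for f in F:
        preds = np.array([f(x) for x in X])
        if np.mean(preds != y) <= radius:
            kept.append(f)
    return kept
```

Any complexity measure evaluated on `local_subclass(F, X, y, r)` instead of the whole of `F` gives its "local" counterpart in this finite-class toy setting, which is why local bounds can be sharper: the discarded high-error rules no longer inflate the supremum.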
“…The classification errors yielded with each part are then averaged, and the parameterisation with the least classification error is selected [52]. Although there are studies developing methods to determine these parameters (e.g., [53,54]), cross-validation is still the approach adopted by the majority of data analysts [51,52].…”
Section: Free-parameter Tuning
confidence: 99%
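The k-fold selection procedure described in the excerpt above can be sketched as follows. `train_fn`, `error_fn`, and the parameter grid are placeholders the caller supplies; the names are assumptions for illustration, not from the cited studies:

```python
import numpy as np

def kfold_select(train_fn, error_fn, X, y, params, k=5, rng=None):
    """Pick the parameterisation with the least average classification
    error over k folds (plain k-fold cross-validation)."""
    rng = np.random.default_rng(rng)
    idx = rng.permutation(len(X))
    folds = np.array_split(idx, k)        # k roughly equal parts
    best_param, best_err = None, np.inf
    for p in params:
        errs = []
        for i in range(k):
            val = folds[i]                # held-out part
            tr = np.concatenate([folds[j] for j in range(k) if j != i])
            model = train_fn(X[tr], y[tr], p)
            errs.append(error_fn(model, X[val], y[val]))
        mean_err = float(np.mean(errs))   # average over the k parts
        if mean_err < best_err:
            best_param, best_err = p, mean_err
    return best_param, best_err
```

Each candidate parameterisation is trained k times on k-1 parts and scored on the remaining part, exactly the averaging step the excerpt describes before the least-error parameterisation is selected.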
“…We demonstrate the power of our complexity bounds by applying them to derive effective generalization error bounds. Global complexity measures bound the deviation of empirical errors from the true errors simultaneously over the whole class, while the quantity of primary importance is only that deviation for the particular function picked by the learning algorithm, which may be far from reaching this supremum [2,16,26]. Therefore, an analysis based on a global complexity would give a rather conservative estimate.…”
confidence: 99%