2006
DOI: 10.1007/s10994-006-7679-y
Model selection by bootstrap penalization for classification

Abstract: We consider the binary classification problem. Given an i.i.d. sample drawn from the distribution of an X × {0, 1}-valued random pair, we propose to estimate the so-called Bayes classifier by minimizing the sum of the empirical classification error and a penalty term based on Efron's or i.i.d. weighted bootstrap samples of the data. We obtain exponential inequalities for such bootstrap-type penalties, which allow us to derive non-asymptotic properties for the corresponding estimators. In particular, we prove t…
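As context for the penalty the abstract describes, the following is a minimal sketch of bootstrap-penalized model selection, assuming a finite family of decision-stump classifiers and Efron's multinomial bootstrap weights. All function names and the toy data are invented for illustration, and the penalty shown is only the natural resampled analogue of the ideal penalty, not the paper's exact construction (which involves specific constants and weight normalizations).

```python
# Hedged sketch of bootstrap-penalized model selection for binary
# classification, in the spirit of the abstract above. The model family
# (decision stumps on nested feature sets) and the penalty form are
# illustrative assumptions, not the paper's precise definitions.
import numpy as np

rng = np.random.default_rng(0)

def model_predictions(X, n_features_used, n_thresholds=20):
    """Model S_m: decision stumps 1{x_j > s} on the first m features."""
    preds = []
    for j in range(n_features_used):
        thresholds = np.quantile(X[:, j], np.linspace(0.05, 0.95, n_thresholds))
        preds.append((X[:, j][None, :] > thresholds[:, None]).astype(int))
    return np.vstack(preds)                          # (candidates, n)

def bootstrap_penalty(losses, n_boot=200):
    """Efron-bootstrap estimate of E sup_{t in S_m} (P_n - P_n^W) gamma(t),
    the resampled analogue of sup_t (P - P_n) gamma(t).

    losses: (candidates, n) array of 0/1 classification losses."""
    n = losses.shape[1]
    emp = losses.mean(axis=1)                        # P_n gamma(t)
    sups = np.empty(n_boot)
    for b in range(n_boot):
        W = rng.multinomial(n, np.full(n, 1.0 / n))  # Efron weights, mean 1
        boot = losses @ W / n                        # P_n^W gamma(t)
        sups[b] = np.max(emp - boot)
    return sups.mean()

# Toy data: only the first of d features is informative.
n, d = 200, 5
X = rng.normal(size=(n, d))
y = (X[:, 0] + 0.3 * rng.normal(size=n) > 0).astype(int)

best = None
for m in range(1, d + 1):                            # nested models S_1, ..., S_d
    preds = model_predictions(X, m)
    losses = (preds != y[None, :]).astype(float)     # gamma(t, (X_i, Y_i))
    crit = losses.mean(axis=1).min() + bootstrap_penalty(losses)
    if best is None or crit < best[0]:
        best = (crit, m)
print("selected model size:", best[1])
```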

Cited by 18 publications (25 citation statements)
References 38 publications
“…Second, the global Rademacher complexities were introduced in order to obtain theoretically validated model selection procedures in classification [45,17]. They are resampling estimates of $\mathrm{pen}_{\mathrm{id},g}(m) := \sup_{t \in S_m} \left\{ (P - P_n)\gamma(t) \right\} \ge (P - P_n)\gamma\left(s_m(P_n)\right) = \mathrm{pen}_{\mathrm{id}}(m)$, with Rademacher weights; more recently, Fromont [34] generalized global Rademacher complexities to a wide family of exchangeable resampling weights and obtained non-asymptotic oracle inequalities. Nevertheless, global complexities (that is, estimates of $\mathrm{pen}_{\mathrm{id},g}$) are too large compared to $\mathrm{pen}_{\mathrm{id}}$, so they cannot achieve fast rates of estimation when the margin condition [53] holds.…”
Section: Related Penalties For Classification
Citation type: mentioning (confidence: 99%)
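To make the quoted definition concrete, here is a minimal numpy sketch of the global Rademacher complexity as a Monte Carlo resampling estimate of $\mathrm{pen}_{\mathrm{id},g}(m)$, assuming the model $S_m$ is represented by a finite matrix of 0/1 losses; the function name and setup are illustrative, not taken from the cited works.

```python
# Minimal sketch: global Rademacher complexity of a finite model,
# i.e. a Monte Carlo resampling estimate of
#   pen_{id,g}(m) = sup_{t in S_m} (P - P_n) gamma(t),
# with (P - P_n) replaced by the Rademacher-weighted empirical process.
import numpy as np

rng = np.random.default_rng(0)

def global_rademacher_complexity(losses, n_draws=500):
    """E_sigma sup_t (1/n) sum_i sigma_i * gamma(t, (X_i, Y_i)).

    losses: (candidates, n) array; sigma_i are i.i.d. +/-1 weights."""
    n = losses.shape[1]
    sups = np.empty(n_draws)
    for b in range(n_draws):
        sigma = rng.choice([-1.0, 1.0], size=n)   # Rademacher weights
        sups[b] = np.max(losses @ sigma / n)
    return sups.mean()

# Toy usage: 50 candidate classifiers on 200 points, 0/1 losses.
losses = rng.integers(0, 2, size=(50, 200)).astype(float)
print(round(global_rademacher_complexity(losses), 4))
```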
“…When $\sigma_{\min} = 0$ in $(\mathbf{An})$, Lemma 8 below proves that $(\mathbf{A_Q})$ also holds with $D_0 = L_{(\mathbf{Bg})}$. Therefore, using (33), $(\mathbf{A_{m,\ell}})$ holds with the same $D_0$.…”
Section: No Uniform Lower Bound On the Noise-level
Citation type: mentioning (confidence: 99%)
“…Models can also be selected by means of regularization methods, that is, methods that penalize models according to their number of parameters (Alpaydin, 2004; Fromont, 2007). Generally, Bayesian learning techniques use knowledge of the prior probability distributions to assign lower probabilities to more complicated models.…”
Section: Introduction
Citation type: mentioning (confidence: 99%)
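As a generic illustration of penalization by the number of parameters (a sketch of the idea only, not the construction of either cited work), the following selects a polynomial degree with a BIC-style penalty that grows with the parameter count; the data-generating setup and names are invented for the example.

```python
# Hedged illustration: select among nested polynomial models by
# penalizing the residual fit with a term proportional to the
# number of parameters (BIC-style).
import numpy as np

rng = np.random.default_rng(1)
n = 100
x = rng.uniform(-1, 1, n)
y = 1.0 + 2.0 * x - 1.5 * x**2 + 0.2 * rng.normal(size=n)  # true degree 2

def bic(degree):
    coeffs = np.polyfit(x, y, degree)
    rss = np.sum((np.polyval(coeffs, x) - y) ** 2)
    k = degree + 1                        # number of parameters
    return n * np.log(rss / n) + k * np.log(n)

best_degree = min(range(1, 8), key=bic)
print("selected degree:", best_degree)    # typically 2
```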
“…Method 1 above is closely related to the Rademacher complexity approach in learning theory, and our results in this direction are heavily inspired by the work of Fromont [Fro04], who studies general resampling schemes in a learning-theoretic setting. It may also be seen as a generalization of cross-validation methods.…”
Section: The $1 - \alpha$ Quantile of the Distribution of $\phi(Y^{[W - \overline{W}]})$ Condi…
Citation type: mentioning (confidence: 99%)
“…2. When Y is not assumed to be symmetric and $\overline{W} = 1$ a.s., Proposition 2 in [Fro04] shows that (9) holds with $\mathbb{E}(W_1 - \overline{W})_+$ instead of $A_W$. Therefore, the symmetry of the sample allows us to get a tighter result (for instance, twice as sharp with Efron or Random hold-out (q) weights).…”
Section: Comparison In Expectation
Citation type: mentioning (confidence: 99%)
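For concreteness: with Efron weights, $W_1$ is marginally $\mathrm{Binomial}(n, 1/n)$ and $\overline{W} = 1$ a.s., so, using $(k-1)_+ = (k-1) + \mathbf{1}_{\{k=0\}}$ and $\mathbb{E}W_1 = 1$, one gets $\mathbb{E}(W_1 - \overline{W})_+ = \mathbb{P}(W_1 = 0) = (1 - 1/n)^n \to e^{-1}$. Below is a quick numerical check of this identity; the constant $A_W$ itself is defined in the citing paper and is not reproduced here.

```python
# Hedged numerical check (assumed setup: Efron bootstrap weights, so
# W_1 ~ Binomial(n, 1/n) marginally and the mean weight is 1 a.s.).
import numpy as np

rng = np.random.default_rng(0)
n = 1000
W1 = rng.binomial(n, 1.0 / n, size=200_000)   # marginal draws of W_1
mc = np.maximum(W1 - 1.0, 0.0).mean()         # Monte Carlo E(W_1 - 1)_+
exact = (1.0 - 1.0 / n) ** n                  # closed form = P(W_1 = 0)
print(f"Monte Carlo: {mc:.4f}  exact: {exact:.4f}  1/e: {np.exp(-1):.4f}")
```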