2012
DOI: 10.1007/978-3-642-33460-3_26

Boosting Nearest Neighbors for the Efficient Estimation of Posteriors

Abstract: It is an admitted fact that mainstream boosting algorithms like AdaBoost do not perform well to estimate class conditional probabilities. In this paper, we analyze, in the light of this problem, a recent algorithm, unn, which leverages nearest neighbors while minimizing a convex loss. Our contribution is threefold. First, we show that there exists a subclass of surrogate losses, elsewhere called balanced, whose minimization brings simple and statistically efficient estimators for Bayes posteriors. Second, we s…
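To fix intuition for what "leveraging nearest neighbors" means, here is a minimal sketch of a UNN-style prediction rule: the score h_c(x) for class c is a vote over the k nearest training examples, each weighted by a leveraging coefficient learned by boosting. The function name, the per-example `alphas`, and the plain Euclidean metric are illustrative assumptions, not the paper's exact formulation.

```python
import numpy as np

def leveraged_knn_score(x, X_train, y_train, alphas, k=5):
    """Hypothetical UNN-style scorer: h_c(x) is a weighted vote over the
    k nearest neighbors of x, where alphas[j] is the leveraging
    coefficient that boosting assigned to training example j.
    Labels y_train are in {-1, +1} for the class c of interest."""
    d = np.linalg.norm(X_train - x, axis=1)   # distances to all training points
    nn = np.argsort(d)[:k]                    # indices of the k nearest
    return float(np.sum(alphas[nn] * y_train[nn]))
```

In plain k-NN all coefficients are equal; boosting replaces that uniform vote with learned, example-specific weights.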

Cited by 5 publications (9 citation statements)
References 18 publications (53 reference statements)
“…Suppose we pick δ_{e(t)} as in (10), that is, δ_{e(t)} = 2(1 − ε) η(c, e(t)) (H*_ψ φ_{n_{e(t)}})^{-1}, for ε ∈ (0, 1). This choice yields:…”
Section: GNNB
confidence: 99%
“…A previous approach in our line of works is algorithm UNN (for "Universal Nearest Neighbors"), which brings boosting guarantees for nearly all strictly convex differentiable surrogates relevant to classification [9], [5], [6]. For a wide subset of surrogates, it yields simple and efficient estimators of posteriors [10].…”
Section: Introduction
confidence: 99%
“…We report the estimator's formal definition for each loss function of unn in Table 2. The theoretical approach for deriving p̂_c(x) from h_c(x) is fully given in [5]. We used one private and five public datasets, belonging to image classification problems of different biomedical domains.…”
Section: Universal Nearest Neighbours
confidence: 99%
“…Reliabilities of the unn implementations tested, measured in terms of posterior probabilities p̂_c(x), are computed as reported in Table 2. For further details, the interested reader may refer to [5]. To estimate the posterior probabilities for SVM we use the method presented in [13].…”
Section: Classifier Reliability
confidence: 99%