2015
DOI: 10.1109/tpami.2014.2307877
Gentle Nearest Neighbors Boosting over Proper Scoring Rules

Abstract: Tailoring nearest neighbors algorithms to boosting is an important problem. Recent papers study an approach, UNN, which provably minimizes particular convex surrogates under weak assumptions. However, numerical issues make it necessary to experimentally tweak parts of the UNN algorithm, at the possible expense of the algorithm's convergence and performance. In this paper, we propose a lightweight alternative algorithm optimizing proper scoring rules from a very broad set, and establish formal converge…
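As a quick illustration of the "proper scoring rules" the abstract refers to: a scoring rule is proper when, for a true event probability q, the expected score is minimized by honestly reporting p = q. The sketch below checks this numerically for the Brier and logarithmic scores over a probability grid; it is an illustrative demo, not the paper's UNN algorithm or its alternative.

```python
import math

def brier(p, y):
    """Brier score for predicted probability p of the binary event y."""
    return (y - p) ** 2

def log_loss(p, y):
    """Logarithmic score (negative log-likelihood) for probability p."""
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

def expected_score(rule, q, p):
    """Expected score when the true probability is q and we report p."""
    return q * rule(p, 1) + (1 - q) * rule(p, 0)

# Properness: over a grid of candidate reports, the expected score
# of both rules is minimized at the honest report p = q.
q = 0.7
candidates = [i / 100 for i in range(1, 100)]
best_brier = min(candidates, key=lambda p: expected_score(brier, q, p))
best_log = min(candidates, key=lambda p: expected_score(log_loss, q, p))
print(best_brier, best_log)  # both are 0.7
```

Both minimizers coincide with q, which is exactly the property that makes these losses usable as drop-in surrogates in the broad family the paper optimizes over.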

Cited by 6 publications (4 citation statements)
References 32 publications
“…Some exemplary (relatively) recent applications of generally non-separable (ordinary/classical) Bregman distances appear e.g. in Jiao et al [206], Varshney & Varshney [324], Hu et al [325], Nock et al [326], Raskutti & Mukherjee [327], Wang et al [328], He et al [329], Li et al [330], Harremoës [331], Xu et al [332], Halder [333], Zhang et al [334], Shao et al [335], Tembine [336], Brécheteau et al [337], Lin et al [338], Yuan et al [339], Azizan et al [340], Dytso et al [341], Gruzdeva & Ushakov [342], Song et al [343], Tan & Zhang [245], Yu et al [344], Capó et al [345], Chen et al [346], Fernández-Rodriguez [347], Hayashi [348], Li & Ralescu [349], Xiong et al [350], Liu et al [351].…”
Section: A Further Divergences and Friendsmentioning
confidence: 99%
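For readers unfamiliar with the Bregman distances mentioned in the statement above: given a convex generator f, the Bregman distance is D_f(x, y) = f(x) − f(y) − ⟨∇f(y), x − y⟩. The minimal sketch below (generators and test points are illustrative choices, not taken from any of the cited works) shows how the squared Euclidean distance and the Kullback-Leibler divergence both arise as special cases.

```python
import math

def bregman(grad_f, f, x, y):
    """Bregman distance D_f(x, y) = f(x) - f(y) - <grad f(y), x - y>."""
    return f(x) - f(y) - sum(g * (xi - yi) for g, xi, yi in zip(grad_f(y), x, y))

# Generator f(x) = ||x||^2 recovers the squared Euclidean distance.
sq = lambda v: sum(c * c for c in v)
grad_sq = lambda v: [2 * c for c in v]

# Generator f(p) = sum_i p_i log p_i (negative Shannon entropy) recovers
# the Kullback-Leibler divergence on probability vectors.
negent = lambda p: sum(c * math.log(c) for c in p)
grad_negent = lambda p: [math.log(c) + 1 for c in p]

x, y = [0.2, 0.8], [0.5, 0.5]
print(bregman(grad_sq, sq, x, y))          # squared Euclidean ||x - y||^2 = 0.18
print(bregman(grad_negent, negent, x, y))  # KL(x || y)
```

Different generators thus yield different geometries under one formula, which is why Bregman distances appear across such a wide range of the applications listed above.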
“…Although there are many machine learning methods for feature selection such as LASSO [ 14 , 15 ], Discriminant analysis [ 16 ], Proximal methods [ 17 , 18 ] and Boosting [ 19 , 20 ], here we compare our novel Primal-Dual method for Classification with Rejection (PD-CR) to the state of the art PLS-DA and Random Forests and SVM classification methods frequently used in metabolomics studies.…”
Section: Introductionmentioning
confidence: 99%
“…Although there are many machine learning methods for feature selection such as support vector machines (SVM) [11], LASSO [12,13], Discriminant analysis [14], Proximal methods [15,16], Boosting [17,18], we compare here our novel Primal-Dual method for Classification with Rejection (PD-CR) to the state of the art PLS-DA and Random Forests classification methods used in metabolomics studies, available with the popular Metaboanalyst 5.0 (www.metaboanalyst.ca) [19].…”
Section: Introductionmentioning
confidence: 99%