Proceedings of the 19th ACM International Conference on Information and Knowledge Management 2010
DOI: 10.1145/1871437.1871758
BagBoo

Abstract: In this paper, we introduce a novel machine learning approach for regression, based on the idea of combining bagging and boosting, that we call BagBoo. Our BagBoo model borrows its high accuracy potential from Friedman's gradient boosting [2] and its high efficiency and scalability through parallelism from Breiman's bagging [1]. We run empirical evaluations on large-scale Web ranking data and demonstrate that BagBoo not only shows superior relevance to standalone bagging or boosting, but also outperforms mo…
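The abstract describes BagBoo as bagging applied over gradient-boosted ensembles: train a boosted model on each bootstrap resample, then average their predictions. The sketch below illustrates that structure in minimal pure Python with 1-D regression stumps and a squared loss; all function names, parameters, and the toy data are illustrative assumptions, not taken from the paper.

```python
import random

def fit_stump(xs, ys):
    # 1-D regression stump: best threshold split minimizing squared error
    best = None
    for t in sorted(set(xs)):
        left = [y for x, y in zip(xs, ys) if x <= t]
        right = [y for x, y in zip(xs, ys) if x > t]
        if not left or not right:
            continue
        lm, rm = sum(left) / len(left), sum(right) / len(right)
        err = sum((y - lm) ** 2 for y in left) + sum((y - rm) ** 2 for y in right)
        if best is None or err < best[0]:
            best = (err, t, lm, rm)
    if best is None:                      # degenerate case: constant predictor
        m = sum(ys) / len(ys)
        return lambda x: m
    _, t, lm, rm = best
    return lambda x: lm if x <= t else rm

def fit_boosted(xs, ys, n_rounds=20, lr=0.3):
    # gradient boosting for squared loss: each stump fits current residuals
    pred = [0.0] * len(xs)
    stumps = []
    for _ in range(n_rounds):
        resid = [y - p for y, p in zip(ys, pred)]
        s = fit_stump(xs, resid)
        stumps.append(s)
        pred = [p + lr * s(x) for p, x in zip(pred, xs)]
    return lambda x: sum(lr * s(x) for s in stumps)

def fit_bagboo(xs, ys, n_bags=10, seed=0):
    # BagBoo-style combination: bootstrap -> boost -> average
    rng = random.Random(seed)
    n = len(xs)
    models = []
    for _ in range(n_bags):
        idx = [rng.randrange(n) for _ in range(n)]
        models.append(fit_boosted([xs[i] for i in idx], [ys[i] for i in idx]))
    return lambda x: sum(m(x) for m in models) / len(models)

# toy data: a step function; the bagged-boosted model should predict
# near 0 below the step and near 1 above it
xs = [i / 20 for i in range(40)]
ys = [1.0 if x > 1.0 else 0.0 for x in xs]
model = fit_bagboo(xs, ys)
print(model(0.2), model(1.8))
```

Because each boosted ensemble trains on an independent bootstrap sample, the outer loop is embarrassingly parallel, which is the efficiency argument the abstract attributes to the bagging side of the combination.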

Cited by 19 publications (1 citation statement)
References 6 publications
“…Despite being different approaches, boosting and bagging are not incompatible; in fact, they can be used jointly. However, there is no unique solution for how to combine them. In most cases, the combination consists of bagging of boosting ensembles or boosting of bagging ensembles [14,15]. Even though they give good results, in the majority of cases there is no mathematical interpretation of the learning representation.…”
mentioning
confidence: 99%