2012
DOI: 10.1109/tkde.2011.208
Scalable and Parallel Boosting with MapReduce

Cited by 84 publications (44 citation statements)
References 25 publications
“…In order to obtain the exact solution, all nodes in the cluster are required to evaluate the potential split points found by all other nodes. Palit and Reddy [98] utilize the MapReduce framework to develop two parallel boosting algorithms, AdaBoost.PL and LogitBoost.PL, which are competitive with their corresponding serial versions in terms of predictive performance. These algorithms require only a single MapReduce cycle to complete.…”
Section: Scaling Up Decision Forests Methods
confidence: 99%
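The excerpt above describes the key structural idea: each computing node boosts its own data partition independently (the map phase), and a single reduce step merges the per-node ensembles, so only one MapReduce cycle is needed. The following is a minimal single-machine sketch of that structure, not the paper's Hadoop implementation: it uses simplified AdaBoost with 1-D decision stumps, and the reduce step simply pools all workers' weighted weak learners into one ensemble (the actual merging strategy in AdaBoost.PL is an assumption abstracted away here).

```python
import math

def train_stump(X, y, w):
    """Find the (error, threshold, polarity) stump minimizing weighted error."""
    best = None
    for thr in sorted(set(X)):
        for pol in (1, -1):
            err = sum(wi for xi, yi, wi in zip(X, y, w)
                      if (pol if xi >= thr else -pol) != yi)
            if best is None or err < best[0]:
                best = (err, thr, pol)
    return best

def adaboost(X, y, rounds=5):
    """Standard AdaBoost on one data partition; returns weighted stumps."""
    n = len(X)
    w = [1.0 / n] * n
    learners = []
    for _ in range(rounds):
        err, thr, pol = train_stump(X, y, w)
        err = max(err, 1e-10)                      # avoid log(0)
        alpha = 0.5 * math.log((1 - err) / err)    # weak-learner weight
        learners.append((alpha, thr, pol))
        for i in range(n):                         # reweight examples
            pred = pol if X[i] >= thr else -pol
            w[i] *= math.exp(-alpha * y[i] * pred)
        s = sum(w)
        w = [wi / s for wi in w]
    return learners

def predict(learners, x):
    """Weighted vote of all weak learners in the merged ensemble."""
    score = sum(a * (p if x >= t else -p) for a, t, p in learners)
    return 1 if score >= 0 else -1

def map_phase(partitions, rounds=5):
    # each "worker" boosts only its own partition (run in parallel on a cluster)
    return [adaboost(X, y, rounds) for X, y in partitions]

def reduce_phase(per_worker_ensembles):
    # single reduce step: pool every worker's weak learners into one ensemble
    return [h for ens in per_worker_ensembles for h in ens]

# example: two "workers" each boost their own partition, then merge once
parts = [([0, 1, 10, 11], [-1, -1, 1, 1]),
         ([2, 3, 12, 13], [-1, -1, 1, 1])]
ensemble = reduce_phase(map_phase(parts, rounds=3))
```

Because training happens entirely inside the map phase and merging is a single pass, the whole procedure fits in one MapReduce cycle, which is the property the citing papers highlight.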
“…Furthermore, several MapReduce implementations have been proposed for different classification algorithms, such as cost-sensitive fuzzy rule-based systems for imbalanced classification, 17 ensembles of classifiers 18,19 or Support Vector Machines, 20 to name a few.…”
Section: MapReduce Programming Model
confidence: 99%
“…Two parallel boosting algorithms, ADABOOST.PL and LOGITBOOST.PL, were proposed by Palit et al. [9], which facilitate the simultaneous participation of multiple computing nodes in constructing a boosted ensemble classifier. These algorithms achieve a significant speedup while remaining competitive with the corresponding serial versions in terms of generalization performance.…”
Section: Related Work
confidence: 99%