2012
DOI: 10.2747/1548-1603.49.5.623
An Evaluation of Bagging, Boosting, and Random Forests for Land-Cover Classification in Cape Cod, Massachusetts, USA

Cited by 184 publications (97 citation statements)
References 58 publications
“…Furthermore, the utilization of non-parametric classification algorithms has proven to be an effective tool for mapping land cover using a high number of input variables. Ensemble classification trees such as Random Forest are iterative and have been found to produce higher mapping accuracies and more stable classification results than both parametric classifiers and non-iterative classification trees [36][37][38].…”
Section: Introduction
confidence: 99%
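The contrast drawn above, between a single classification tree and an ensemble of trees, can be sketched as follows. This is a minimal illustration using scikit-learn and synthetic data standing in for multi-band pixel features; the dataset, parameters, and variable names are assumptions, not taken from the cited studies.

```python
# Compare a single decision tree with a Random Forest ensemble on
# synthetic multi-class data (a stand-in for land-cover pixel features).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

# Synthetic 4-class problem with 20 input variables.
X, y = make_classification(n_samples=1000, n_features=20, n_informative=10,
                           n_classes=4, random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# One non-iterative tree vs. an ensemble of 200 bootstrapped trees.
tree = DecisionTreeClassifier(random_state=0).fit(X_tr, y_tr)
forest = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

acc_tree = tree.score(X_te, y_te)
acc_forest = forest.score(X_te, y_te)
```

On data like this the ensemble typically outperforms the single tree, consistent with the claim in the excerpt, though the margin depends on the dataset.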
“…In addition, random forest provides the relative importance of each variable by permuting it in the out-of-bag data. Because of these strengths, random forest has proven robust in various remote sensing applications [61][62][63][64][65][66][67][68].…”
Section: Machine Learning Algorithms For Lead Detection
confidence: 99%
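The out-of-bag permutation importance mentioned above can be approximated as follows. A caveat on the assumptions: scikit-learn's built-in `feature_importances_` is impurity-based, and its `permutation_importance` permutes features on a held-out set rather than on the out-of-bag samples, so this is a close analogue of Breiman's OOB measure, not the exact procedure; the data are synthetic.

```python
# Variable importance from a Random Forest via permutation on held-out data
# (approximating the out-of-bag permutation measure described in the text).
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.inspection import permutation_importance
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=500, n_features=8, n_informative=4,
                           random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)

# oob_score=True also gives an internal accuracy estimate from the
# out-of-bag samples, without a separate validation set.
rf = RandomForestClassifier(n_estimators=100, oob_score=True,
                            random_state=0).fit(X_tr, y_tr)

# Permute each feature 10 times and measure the mean drop in score.
result = permutation_importance(rf, X_te, y_te, n_repeats=10, random_state=0)
ranking = result.importances_mean.argsort()[::-1]  # most important first
```

Features whose permutation causes the largest score drop rank highest, which is what makes the measure useful for pruning large remote-sensing input stacks.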
“…Popular MCS combination techniques, which are iterative and convergent in nature, include bagging, boosting, random forests, and AdaBoost [46,[52][53][54][55][56]. To obtain more base classifiers with differences, Ghimire and Rogan [54] performed land use/cover classification in a heterogeneous landscape in Massachusetts by comparing three combining techniques, i.e., bagging, boosting, and random forests, with a decision tree base classifier, and their results showed that the MCS performed better than the single decision tree classifier.…”
Section: Introduction
confidence: 99%
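The three-way comparison described in the excerpt, bagging, boosting, and random forests, each built on decision trees, can be sketched like this. The dataset and hyperparameters are illustrative assumptions; `BaggingClassifier` and `AdaBoostClassifier` use a decision tree as their default base learner, matching the setup described.

```python
# Compare three multiple-classifier-system combination techniques,
# all using decision trees as base classifiers, via 5-fold cross-validation.
from sklearn.datasets import make_classification
from sklearn.ensemble import (AdaBoostClassifier, BaggingClassifier,
                              RandomForestClassifier)
from sklearn.model_selection import cross_val_score

X, y = make_classification(n_samples=600, n_features=15, n_informative=8,
                           random_state=1)

models = {
    "bagging": BaggingClassifier(n_estimators=50, random_state=1),
    "boosting": AdaBoostClassifier(n_estimators=50, random_state=1),
    "random_forest": RandomForestClassifier(n_estimators=50, random_state=1),
}
scores = {name: cross_val_score(m, X, y, cv=5).mean()
          for name, m in models.items()}
```

Comparing mean cross-validated accuracies of the three ensembles against each other (and against a lone tree) mirrors the experimental design of the comparison described above.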
“…To obtain more base classifiers with differences, Ghimire and Rogan [54] performed land use/cover classification in a heterogeneous landscape in Massachusetts by comparing three combining techniques, i.e., bagging, boosting, and random forests, with a decision tree base classifier, and their results showed that the MCS performed better than the single decision tree classifier. Based on SVM, Khosravi and Beigi [55] used bagging and AdaBoost to construct an MCS for classifying a hyperspectral dataset, and their work showed a high capability of MCS for high-dimensional data.…”
Section: Introduction
confidence: 99%