2020
DOI: 10.3390/rs12121973

Meta-XGBoost for Hyperspectral Image Classification Using Extended MSER-Guided Morphological Profiles

Abstract: To investigate the performance of extreme gradient boosting (XGBoost) in remote sensing image classification tasks, XGBoost was first introduced and comparatively investigated for the spectral-spatial classification of hyperspectral imagery using the extended maximally stable extreme-region-guided morphological profiles (EMSER_MPs) proposed in this study. To overcome the potential issues of XGBoost, meta-XGBoost was proposed as an ensemble XGBoost method with classification and regression tree (CART), dropout-…
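As context for the booster variants the truncated abstract alludes to, the sketch below trains plain XGBoost with its 'gbtree' (CART-style trees) and 'dart' (dropout over the tree ensemble) boosters. This is not the authors' meta-XGBoost ensemble; the synthetic data and every parameter value are illustrative assumptions.

```python
# Minimal sketch (not the authors' meta-XGBoost): plain XGBoost trained with
# the 'gbtree' (CART-style trees) and 'dart' (dropout over trees) boosters.
# Synthetic data and all parameter values are illustrative assumptions.
import numpy as np
import xgboost as xgb

rng = np.random.default_rng(0)
X = rng.normal(size=(200, 10))        # stand-in for per-pixel hyperspectral features
y = rng.integers(0, 3, size=200)      # stand-in for land-cover class labels

dtrain = xgb.DMatrix(X, label=y)
base = {"objective": "multi:softmax", "num_class": 3, "max_depth": 6}

for booster in ("gbtree", "dart"):
    model = xgb.train(dict(base, booster=booster), dtrain, num_boost_round=50)
    preds = model.predict(dtrain)     # class indices under multi:softmax
    print(booster, "training accuracy:", float((preds == y).mean()))
```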

Cited by 77 publications (49 citation statements). References 44 publications.
“…XGBoost is a boosting algorithm proposed by Chen et al. in 2016 based on the GBDT and RF approaches [57]. Compared with GBDT, XGBoost improves multithreaded processing, the classifier, and the optimization function [58]. It also has the following advantages [59,60]:…”
Section: Extreme Gradient Boosting
Mentioning confidence: 99%
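As a minimal illustration of two improvements named in the statement above, multithreaded processing and the regularized optimization function, the sketch below sets XGBoost's 'nthread' and 'lambda'/'alpha' parameters. The synthetic data and all values are assumptions, not the cited studies' setup.

```python
# Minimal sketch of two improvements named above: multithreaded training
# ('nthread') and the regularized objective ('lambda' L2 / 'alpha' L1 terms).
# Synthetic data and all parameter values are assumptions for illustration.
import numpy as np
import xgboost as xgb

rng = np.random.default_rng(1)
X = rng.normal(size=(500, 20))
y = rng.integers(0, 2, size=500)

params = {
    "objective": "binary:logistic",
    "nthread": 4,      # build trees in parallel across 4 CPU threads
    "lambda": 1.0,     # L2 penalty on leaf weights in the objective
    "alpha": 0.1,      # L1 penalty on leaf weights
    "max_depth": 6,
}
booster = xgb.train(params, xgb.DMatrix(X, label=y), num_boost_round=100)
```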
“…On the other hand, Chen and Guestrin [61] recently proposed XGB to improve the performance of DT. This model is also an ensemble learning method over DTs, and it has emerged as the top model in various machine learning comparison studies [62,63]. The difference from RF is that XGB is based on the gradient boosting method.…”
Section: DT-Based Models
Mentioning confidence: 99%
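To make the RF-versus-XGB distinction in that statement concrete, here is a minimal sketch fitting both DT ensembles on the same data: RF averages independently grown trees, while XGB adds trees sequentially, each fit to the current gradients. The data and settings are illustrative assumptions.

```python
# Minimal sketch (synthetic data; settings are assumptions) contrasting the
# two DT ensembles discussed above: RF bags/averages independently grown
# trees; XGBoost boosts trees sequentially against the current gradients.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from xgboost import XGBClassifier

rng = np.random.default_rng(2)
X = rng.normal(size=(400, 12))
y = rng.integers(0, 2, size=400)

rf = RandomForestClassifier(n_estimators=100, random_state=0).fit(X, y)
xgb_clf = XGBClassifier(n_estimators=100, max_depth=6).fit(X, y)
print("RF training accuracy:     ", rf.score(X, y))
print("XGBoost training accuracy:", xgb_clf.score(X, y))
```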
“…From Figure 3, it can be seen that when 'lambda' = 2, the average precision of CV stays at a lower level and remains unchanged over the interval 'max_depth' = [2,4]; over [4,6] it increases gradually, reaching its maximum at 'max_depth' = 6. When 'lambda' = 3, the average precision of CV stays at a lower level and remains unchanged over the interval 'max_depth' = [2,6].…”
Section: A. Learning and Prediagnosis of XGBoost
Mentioning confidence: 99%
“…When 'lambda' = 3, the average precision of CV stays at a lower level and remains unchanged over the interval 'max_depth' = [2,6]. When 'lambda' = 4, the average precision of CV declines gradually over the interval 'max_depth' = [2,4] and then stays at this lower level over [4,6], reaching its minimum at 'max_depth' = 6.…”
Section: A. Learning and Prediagnosis of XGBoost
Mentioning confidence: 99%
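Below is a minimal sketch of the kind of cross-validated sweep over 'lambda' and 'max_depth' described in these statements, using xgboost's built-in xgb.cv. The synthetic data and the grid are assumptions, and CV error (reported here as accuracy) stands in for the cited study's average-precision metric.

```python
# Minimal sketch: cross-validated grid sweep over 'lambda' and 'max_depth'
# with xgb.cv. Synthetic data and grid values are illustrative assumptions;
# CV accuracy is used as a proxy for the study's average precision.
import numpy as np
import xgboost as xgb

rng = np.random.default_rng(3)
X = rng.normal(size=(300, 15))
y = rng.integers(0, 2, size=300)
dtrain = xgb.DMatrix(X, label=y)

for lam in (2, 3, 4):
    for depth in (2, 4, 6):
        params = {"objective": "binary:logistic",
                  "lambda": lam, "max_depth": depth}
        cv = xgb.cv(params, dtrain, num_boost_round=50, nfold=5,
                    metrics="error", seed=0)
        acc = 1.0 - cv["test-error-mean"].iloc[-1]   # last boosting round
        print(f"lambda={lam} max_depth={depth} CV accuracy={acc:.3f}")
```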