2020
DOI: 10.1016/j.jag.2020.102051

An evaluation of Guided Regularized Random Forest for classification and regression tasks in remote sensing

Cited by 81 publications (43 citation statements)
References: 47 publications
“…In general, the overall accuracies achieved were above 90% for all scenarios (combined and selected variables using GRRF). However, the higher accuracies of all combined scenarios (C1-C8) were treated with caution because of the effects of multicollinearity, which can lead to overfitting [58]. We thus placed more emphasis on the performance of the GRRF-selected scenarios (C1selected-C8selected), which reduce dimensionality and the expected multicollinearity by retaining the most relevant variables for the analysis.…”
Section: Discussion (mentioning)
confidence: 99%
“…To counter the limitations of RRF, GRRF assigns a penalty coefficient to each feature through the gamma value (γ ∈ (0, 1)), which controls the weight of the normalized importance. This ensures that the most relevant variables are retained while still ensuring accurate classification [58]. Thus, in this study γ = 0.7 was used with the "CoefReg" function in the "caret" package in R software to select the most important variables [59].…”
Section: Predictor Variables Selection and Classification (mentioning)
confidence: 99%
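
The GRRF selection step described in the excerpt above can be sketched in R with the RRF package. Note that the quoted study refers to a "CoefReg" function in the "caret" package, while the coefReg argument is, to our knowledge, part of the RRF package; the sketch below is therefore an illustration of the technique under that assumption, with a predictor data frame X and a factor of class labels Y as assumed inputs, not the authors' exact code:

    # Minimal GRRF feature-selection sketch (RRF package); X and Y are assumed inputs.
    library(RRF)

    # 1. Ordinary random forest (flagReg = 0) to obtain variable importance.
    rf <- RRF(X, Y, flagReg = 0)
    imp <- rf$importance[, "MeanDecreaseGini"]
    impNorm <- imp / max(imp)                     # normalized importance in [0, 1]

    # 2. Guided regularization: gamma weights the normalized importance
    #    (gamma = 0.7 as in the quoted study).
    gamma <- 0.7
    coefReg <- (1 - gamma) + gamma * impNorm

    # 3. GRRF run (flagReg = 1); feaSet holds the indices of the selected features.
    grrf <- RRF(X, Y, flagReg = 1, coefReg = coefReg)
    selected <- colnames(X)[grrf$feaSet]
    print(selected)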
“…It was necessary to reduce the number of features used to train and apply the classifiers to ensure timely execution of those classifiers in GEE. To conduct feature reduction, we applied a Guided Regularized Random Forest (GRRF [39]), which is a feature selection algorithm based on Random Forest [40]. Then, we applied a Support Vector Machine [41] classifier with a Gaussian kernel to the top ten most accurate subsets of features returned by GRRF.…”
Section: Data Pre-processing (mentioning)
confidence: 99%
“…This approach selected the most informative feature for our classification task among similar features, e.g., NDVI and EVI. GRRF does not require a prior parameter setup and ensures the selection of non-redundant and informative features, improving the classification accuracy when a Random Forest algorithm is applied afterwards, although a small negative impact on accuracy has been reported for regression tasks [39]. On the other hand, SVM is a robust and widely applied machine learning algorithm that generalizes well when the number of input features is large and the training data are limited [41,42].…”
Section: Data Pre-processing (mentioning)
confidence: 99%
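
The workflow in this excerpt (GRRF feature reduction followed by a Gaussian-kernel SVM) can be illustrated in R with the e1071 package. This is a minimal sketch reusing the assumed X, Y, and selected objects from the GRRF sketch above; the cost parameter is left at its default, whereas in practice it would be tuned:

    # Minimal Gaussian-kernel SVM sketch on the GRRF-selected predictors (e1071 package).
    library(e1071)

    Xsel <- X[, selected, drop = FALSE]          # keep only the GRRF-selected features
    svm_model <- svm(x = Xsel, y = Y, kernel = "radial", cost = 1)

    pred <- predict(svm_model, Xsel)
    table(Predicted = pred, Reference = Y)       # confusion matrix on the training data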
“…Precision, sensitivity, specificity, G-Mean = sqrt(Sensitivity × Specificity), F1-score, the area under the ROC curve (AUC), and other parameters were used together as predictive evaluation indicators [22]. In Table 3 and Figure 4, seven classification models are selected for comparison, covering probability, tree, linear, ensemble and neural network models. This comprehensively reflects the performance of the research objects across the different classification models; in this paper, the ensemble model has the best and most stable effect.…”
Section: Comprehensive Evaluation Index (mentioning)
confidence: 99%
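
The evaluation indicators listed in this excerpt follow directly from a binary confusion matrix. A minimal base-R sketch with made-up counts is shown below (AUC is omitted because it requires predicted probabilities rather than hard labels):

    # Illustrative metrics from a binary confusion matrix; the counts are made up.
    TP <- 85; FP <- 10; FN <- 15; TN <- 90

    precision   <- TP / (TP + FP)
    sensitivity <- TP / (TP + FN)                # recall / true positive rate
    specificity <- TN / (TN + FP)                # true negative rate
    g_mean      <- sqrt(sensitivity * specificity)
    f1_score    <- 2 * precision * sensitivity / (precision + sensitivity)

    round(c(precision = precision, sensitivity = sensitivity,
            specificity = specificity, G_mean = g_mean, F1 = f1_score), 3)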