2020
DOI: 10.3390/ijgi9090507
Evaluating Variable Selection and Machine Learning Algorithms for Estimating Forest Heights by Combining Lidar and Hyperspectral Data

Abstract: Machine learning has been employed for various mapping and modeling tasks using input variables from different sources of remote sensing data. For feature selection involving data of high spatial and spectral dimensionality, various methods have been developed and incorporated into the machine learning framework to ensure an efficient and optimal computational process. This research aims to assess the accuracy of various feature selection and machine learning methods for estimating forest height using AISA (airb…

Cited by 31 publications (19 citation statements) · References 83 publications
“…One was the random forest algorithm (RF), an ensemble machine learning method that combines multiple decision trees, each built from a bootstrapped sample of the data and considering a random subset of variables at each split. The other algorithm used was the extreme gradient boosting tree (XGBTree), a regularized form of gradient boosting known for its efficiency and high performance [44]. The algorithm builds trees in sequence, each predicting the residual of the previous prediction, then prunes and adds to the existing trees according to adjustable hyperparameters.…”
Section: Machine Learning Algorithms
Mentioning confidence: 99%
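The contrast between the two ensemble strategies quoted above (bagging in RF, sequential residual fitting in boosting) can be sketched with scikit-learn's stock estimators; `GradientBoostingRegressor` stands in for the regularized XGBTree, since the `xgboost` package is not assumed here, and the forest-height data are synthetic.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor, GradientBoostingRegressor
from sklearn.model_selection import train_test_split

# Synthetic stand-in for lidar/hyperspectral predictors and forest height (m).
rng = np.random.default_rng(42)
X = rng.uniform(0, 1, (400, 5))
y = 20 * X[:, 0] + 10 * np.sin(3 * X[:, 1]) + rng.normal(0, 1.0, 400)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.2, random_state=0)

# Bagging: each tree sees a bootstrap sample and a random feature subset per split.
rf = RandomForestRegressor(n_estimators=200, max_features="sqrt", random_state=0)
rf.fit(X_tr, y_tr)

# Boosting: trees are added in sequence, each one fitting the residuals
# left by the ensemble built so far.
gb = GradientBoostingRegressor(n_estimators=200, learning_rate=0.1,
                               max_depth=3, random_state=0)
gb.fit(X_tr, y_tr)

print(round(rf.score(X_te, y_te), 2), round(gb.score(X_te, y_te), 2))
```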
“…An additional implementation of XGB, the XGBDart, is also utilized in this study to provide further insight, as the dart base learner removes trees (dropout) during each round of boosting, allowing more control over potential overfitting. The models were chosen for their known high performance with nonlinear relationships, their predictive ability, and their wide use in ecological models and forest applications at fine, regional, and global scales, e.g., [44][45][46][47][48].…”
Section: Machine Learning Algorithms
Mentioning confidence: 99%
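The dart idea (randomly dropping previously fitted trees during a boosting round) can be illustrated directly with plain decision trees. This is a deliberately simplified sketch: it omits the contribution re-normalization the real XGBDart booster applies after each dropout round, and all data and parameter values are invented for illustration.

```python
import numpy as np
from sklearn.tree import DecisionTreeRegressor

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, (300, 3))
y = 15 * X[:, 0] + 5 * X[:, 1] + rng.normal(0, 0.5, 300)

trees, lr, drop_rate = [], 0.3, 0.2
for _ in range(50):
    # DART-style dropout: ignore a random subset of existing trees this round,
    # so later trees cannot over-specialize on the full ensemble's residuals.
    kept = [t for t in trees if rng.uniform() > drop_rate]
    current = sum(lr * t.predict(X) for t in kept) if kept else np.zeros(len(y))
    tree = DecisionTreeRegressor(max_depth=3, random_state=0)
    tree.fit(X, y - current)          # fit the residual of the thinned ensemble
    trees.append(tree)

pred = sum(lr * t.predict(X) for t in trees)
```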
“…The samples collected for constructing the machine learning model in Figure 4 were divided into a training dataset (80%) and a test dataset (20%). In addition, prior to the analysis, a variable selection step was conducted to filter out insignificant variables using Boruta [33], which reduces a high-dimensional variable set to only the important variables [34]. Hyperparameter tuning was also conducted, using grid search for RF and random search for ET and XGB, since those algorithms require more parameters to be tuned.…”
Section: Machine Learning Classification
Mentioning confidence: 99%
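The shadow-feature test at the heart of Boruta can be sketched without the `boruta` package itself: shuffle each predictor to break its link to the response, refit, and keep only the real features whose importance beats the best shadow. Real Boruta iterates this with a statistical test; the single pass below, together with the 80/20 split and grid search mirroring the procedure quoted above, uses illustrative data and thresholds only.

```python
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import GridSearchCV, train_test_split

rng = np.random.default_rng(1)
X = rng.normal(size=(300, 6))
y = 3 * X[:, 0] + 2 * X[:, 1] + rng.normal(0, 0.3, 300)  # only 2 real signals

# Boruta-style screening: append independently shuffled "shadow" copies
# of every column, then compare importances of real vs shadow features.
shadows = rng.permuted(X, axis=0)
rf = RandomForestRegressor(n_estimators=300, random_state=0)
rf.fit(np.hstack([X, shadows]), y)
real_imp = rf.feature_importances_[:6]
shadow_max = rf.feature_importances_[6:].max()
selected = np.where(real_imp > shadow_max)[0]   # keep features beating noise

# 80/20 split and grid search on the retained variables.
X_tr, X_te, y_tr, y_te = train_test_split(X[:, selected], y,
                                          test_size=0.2, random_state=0)
grid = GridSearchCV(RandomForestRegressor(random_state=0),
                    {"n_estimators": [100, 300], "max_depth": [None, 5]},
                    cv=3).fit(X_tr, y_tr)
print(selected, round(grid.score(X_te, y_te), 2))
```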
“…The RF, SVR-linear and SVR-RBF algorithms were selected to predict forest AGB with GLAS data in this study. RF is a tree-based ensemble algorithm and is regarded as one of the best machine learning algorithms to estimate forest AGB due to its high predictive accuracy and high computation speed [66,67]. However, RF tends to overfit noisy regression problems [68].…”
Section: Performances of AGB Modeling Algorithms
Mentioning confidence: 99%
“…SVR, due to its excellent performance even with limited training samples, is also widely used in remote sensing fields [69]. In most studies, SVR refers to SVR with the RBF kernel (SVR-RBF) [67,70]. The RBF kernel was chosen for the SVR algorithm because it has been shown to be effective for forest parameter retrieval [24].…”
Section: Performances of AGB Modeling Algorithms
Mentioning confidence: 99%
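The kernel choice discussed above can be illustrated with scikit-learn's `SVR`: on a nonlinear response the RBF kernel recovers structure that a linear kernel cannot. The data are synthetic and the hyperparameters illustrative, not those of the cited studies.

```python
import numpy as np
from sklearn.svm import SVR

rng = np.random.default_rng(7)
X = rng.uniform(-3, 3, (300, 1))
y = 5 * np.sin(X[:, 0]) + rng.normal(0, 0.2, 300)   # nonlinear response

svr_lin = SVR(kernel="linear", C=10).fit(X, y)
svr_rbf = SVR(kernel="rbf", C=10, gamma="scale").fit(X, y)

# The RBF kernel implicitly maps samples into a high-dimensional feature
# space, so the fitted function can track the sinusoid that a linear
# decision function cannot.
print(round(svr_lin.score(X, y), 2), round(svr_rbf.score(X, y), 2))
```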