2022
DOI: 10.1080/10106049.2022.2076928
An interpretable model for the susceptibility of rainfall-induced shallow landslides based on SHAP and XGBoost

Cited by 81 publications (32 citation statements)
References 78 publications
“…Based on the training and validation datasets, the model successfully distinguishes the landslide-prone areas within the study area. Our results support previous studies showing that coupled models can significantly reduce overfitting and noise problems in the modeling process [13, 23, 54, 57, 58, 59, 60, 61, 62]. The novelty of our method lies in combining statistical models with machine learning models, which compensates for the poor performance of any single model.…”
Section: Discussion (supporting)
confidence: 91%
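The "coupled model" described in the statement above pairs a statistical method with a machine-learning classifier. The snippet below is a purely illustrative sketch of one common form of such coupling in landslide-susceptibility work, not the cited authors' exact method: a frequency-ratio value (statistical step) is computed for each class of a conditioning factor and then fed to an XGBoost classifier (machine-learning step). The synthetic factor, labels, and library choices (numpy, xgboost) are assumptions for illustration only.

```python
import numpy as np
import xgboost as xgb

# Toy conditioning factor (e.g. a lithology class per mapping unit) -- synthetic data
rng = np.random.default_rng(0)
factor = rng.integers(0, 4, size=500)                      # class labels 0..3
landslide = (rng.random(500) < 0.2 + 0.15 * (factor == 3)).astype(int)

# Statistical step: frequency ratio per class =
#   (share of landslides falling in the class) / (share of area covered by the class)
fr = {}
for c in np.unique(factor):
    in_class = factor == c
    fr[c] = (landslide[in_class].sum() / landslide.sum()) / in_class.mean()

# Coupling step: replace the raw class label with its frequency ratio and
# use it as a continuous predictor in the machine-learning model
X = np.array([fr[c] for c in factor]).reshape(-1, 1)
model = xgb.XGBClassifier(n_estimators=100, max_depth=2)
model.fit(X, landslide)
```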
“…The core idea of SHAP is to calculate the marginal contribution of each feature to the model output and then interpret the “black box” model at both global and local levels. During model training or testing, a prediction is generated for each sample, and the SHAP values attribute that prediction to the individual features of the sample (41). In the figure, the horizontal axis represents the SHAP values (the distribution of feature effects on the model output) and the vertical axis represents the features ranked by the sum of their SHAP values across the 400 samples.…”
Section: Discussion (mentioning)
confidence: 99%
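As context for the statement above, the following minimal sketch shows how per-feature SHAP values can be obtained for an XGBoost classifier and summarized globally (mean |SHAP| per feature) and locally (one sample's attribution). It assumes the shap, xgboost, scikit-learn, and numpy libraries and uses synthetic data; the feature set is hypothetical and does not reproduce the cited study's figure.

```python
import numpy as np
import shap
import xgboost as xgb
from sklearn.model_selection import train_test_split

# Synthetic stand-ins for landslide-conditioning factors (slope, rainfall, etc.)
rng = np.random.default_rng(0)
X = rng.normal(size=(400, 5))
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=400) > 0).astype(int)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = xgb.XGBClassifier(n_estimators=200, max_depth=3, learning_rate=0.1)
model.fit(X_train, y_train)

# TreeExplainer computes each feature's marginal contribution (SHAP value)
explainer = shap.TreeExplainer(model)
shap_values = explainer.shap_values(X_test)        # shape: (n_samples, n_features)

# Local interpretation: how each feature pushed one prediction up or down
print("SHAP values, first test sample:", shap_values[0])

# Global interpretation: rank features by mean absolute SHAP value
print("Global ranking (mean |SHAP|):", np.abs(shap_values).mean(axis=0))
```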
“…Zhou et al. [140] propose a novel interpretable model based on the Shapley additive explanation (SHAP) and XGBoost to interpret landslide susceptibility evaluation at global and local levels. The proposed model achieved an accuracy of 0.75 and an AUC of 0.83 on the test set.…”
Section: Author Year Ensemble Methods (mentioning)
confidence: 99%
“…It was developed to provide a more general approach to probabilistic prediction. XGBoost [129,140,141] belongs to the gradient boosting family and is an effective supervised classification model. It is a preferred ensemble technique mainly because of its ability to limit over-fitting through regularization, shrinkage, and subsampling.…”
Section: Ensemble Techniques (mentioning)
confidence: 99%
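To make the reported metrics concrete, the sketch below trains an XGBoost classifier on synthetic binary (landslide / non-landslide) data and computes accuracy and AUC on a held-out test set, mirroring the kind of evaluation cited above. The data, hyper-parameters, and split are illustrative assumptions only; the subsample, colsample_bytree, and regularization settings are the main knobs XGBoost offers for limiting over-fitting.

```python
import numpy as np
import xgboost as xgb
from sklearn.metrics import accuracy_score, roc_auc_score
from sklearn.model_selection import train_test_split

# Synthetic landslide / non-landslide samples for illustration only
rng = np.random.default_rng(1)
X = rng.normal(size=(600, 8))
y = (X[:, 0] - X[:, 2] + rng.normal(scale=1.0, size=600) > 0).astype(int)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=1)

# Gradient-boosted trees; subsampling and regularization help control over-fitting
model = xgb.XGBClassifier(
    n_estimators=300,
    max_depth=4,
    learning_rate=0.05,
    subsample=0.8,
    colsample_bytree=0.8,
    reg_lambda=1.0,
)
model.fit(X_train, y_train)

pred = model.predict(X_test)
proba = model.predict_proba(X_test)[:, 1]
print("accuracy:", accuracy_score(y_test, pred))
print("AUC:", roc_auc_score(y_test, proba))
```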