2021
DOI: 10.5194/gmd-14-1493-2021
Using Shapley additive explanations to interpret extreme gradient boosting predictions of grassland degradation in Xilingol, China

Abstract: Machine learning (ML) and data-driven approaches are increasingly used in many research areas. Extreme gradient boosting (XGBoost) is a tree boosting method that has evolved into a state-of-the-art approach for many ML challenges. However, it has rarely been used in simulations of land use change so far. Xilingol, a typical region for research on serious grassland degradation and its drivers, was selected as a case study to test whether XGBoost can provide alternative insights that conventional land-…

Cited by 39 publications (12 citation statements)
References 82 publications
“…The feature importance and the relationship between the features and the prediction were analyzed for the best-performing model, XGBoost. Although XGBoost provides built-in functions to evaluate the importance of the input features, these functions have the disadvantage of producing a different feature ranking on each run due to random components in the algorithm. The SHAP method, which explains machine learning models on the basis of Shapley values, not only overcomes this drawback but also provides information about the relationship between the input features and the target value.…”
Section: Methods (mentioning, confidence: 99%)
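The statements above rest on the Shapley value: each feature's attribution is its average marginal contribution over all subsets of the other features. As a minimal, self-contained sketch of that computation — using a toy payoff function with hypothetical feature names, not the paper's XGBoost model or the `shap` package — the exact formula can be written as:

```python
from itertools import combinations
from math import factorial

def shapley_values(features, v):
    """Exact Shapley values for a payoff function v defined on feature subsets."""
    n = len(features)
    phi = {}
    for i in features:
        others = [f for f in features if f != i]
        total = 0.0
        for k in range(n):
            for S in combinations(others, k):
                # Weight |S|! (n - |S| - 1)! / n! times the marginal contribution of i.
                w = factorial(k) * factorial(n - k - 1) / factorial(n)
                total += w * (v(frozenset(S) | {i}) - v(frozenset(S)))
        phi[i] = total
    return phi

# Toy payoff: additive effects plus one interaction between "slope" and "rain".
def v(S):
    score = 0.0
    if "slope" in S:
        score += 2.0
    if "rain" in S:
        score += 1.0
    if {"slope", "rain"} <= S:
        score += 0.5  # interaction, split evenly between the two features
    return score

phi = shapley_values(["slope", "rain", "soil"], v)
# Efficiency property: the attributions sum to v(all features) - v(empty set) = 3.5
```

Libraries such as `shap` approximate this sum efficiently for tree ensembles (TreeSHAP), since exact enumeration is exponential in the number of features; the exhaustive version above is only feasible for a handful of features.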
“…Although XGBoost provides built-in functions to evaluate the importance of the input features, these functions have the disadvantage of producing a different feature ranking on each run due to random components in the algorithm.[30] The SHAP method, which explains machine learning models on the basis of Shapley values, not only overcomes this drawback but also provides information about the relationship between the input features and the target value. Therefore, the SHAP method was employed to obtain the main factors influencing water flux and to explore the relationship between the input features and water flux in the FO process.…”
Section: Hyperparameter Optimization with a Genetic Algorithm (GA) (mentioning, confidence: 99%)
“…Although the LSTM model can predict vegetation response reflected by ET, it does not explain how the 25 input features contributed to the ET prediction. Here we use a game-theoretic approach, SHapley Additive exPlanations (SHAP), to interpret the LSTM model's output and to analyze relationships hidden in the black-box model (Batunacun et al., 2021; Vega García and Aznarte, 2020). Simply put, an input feature with a larger SHAP value contributes relatively more to a higher output, allowing us to identify possible drivers of ET reduction or tree die-off during droughts.…”
Section: Interpretable Deep-Learning Model for Water-Stress Prediction (mentioning, confidence: 99%)
“…In this sense, teachers' role in the educational process can be contextualized through several characteristics and/or attributes such as good communication, creativity, organization, commitment, planning, and content knowledge, as well as personal attributes like age, gender, and beliefs. All of these attributes interact with environmental factors [19], i.e., workplace infrastructure, the number of students in each classroom, the number of schools taught in by each teacher, and the quality of teacher training, which influence teachers' daily practice as manifested in student learning processes. In addition, with regard to teachers' knowledge in the field of information technology, research [20] has shown that teachers' digital competence (TDC) is an important condition for the effective integration of technologies in education.…”
Section: Figure 1 Brazilian Basic Education Development Index (IDEB) ... (mentioning, confidence: 99%)