2020 IEEE International Conference on Big Data (Big Data)
DOI: 10.1109/bigdata50022.2020.9378034
Evaluating MASHAP as a faster alternative to LIME for model-agnostic machine learning interpretability

Cited by 7 publications (3 citation statements)
References 8 publications
“…In 2017, Lundberg et al. proposed the SHAP method, which replaced LIME's scheme of weighting samples by their proximity to the original instance with weights derived from coalitions in the Shapley value estimation. In 2020, Messalas et al. proposed the MASHAP method [30]. It first builds a global surrogate of the model under study, then passes this surrogate to the Tree SHAP method in place of the original model, which generates the explanation.…”
Section: Related Work
confidence: 99%
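The two-step pipeline described in the statement above can be sketched in plain Python. Everything here is a toy stand-in, not the paper's implementation: the black-box function, the background data, and the greedy regression tree (MASHAP fits an XGBoost surrogate) are all hypothetical, and brute-force coalition enumeration replaces TreeSHAP's polynomial-time tree traversal, which is exact only because the toy has just two features.

```python
import itertools
import math
import random

# Hypothetical black box: the explainer may only query its predictions.
def black_box(x):
    return 3.0 * x[0] + 2.0 * x[0] * x[1]

random.seed(0)
background = [[random.uniform(-1, 1), random.uniform(-1, 1)] for _ in range(300)]

# Step 1 -- fit a global surrogate on the black box's outputs.  MASHAP uses
# an XGBoost surrogate; a tiny greedy regression tree stands in here so the
# sketch stays dependency-free.
def fit_tree(rows, targets, depth):
    mean = sum(targets) / len(targets)
    if depth == 0 or len(rows) < 8:
        return ("leaf", mean)
    best = None
    for f in range(2):
        for t in sorted(set(r[f] for r in rows))[1::10]:
            left = [y for r, y in zip(rows, targets) if r[f] < t]
            right = [y for r, y in zip(rows, targets) if r[f] >= t]
            if not left or not right:
                continue
            ml, mr = sum(left) / len(left), sum(right) / len(right)
            sse = (sum((y - ml) ** 2 for y in left)
                   + sum((y - mr) ** 2 for y in right))
            if best is None or sse < best[0]:
                best = (sse, f, t)
    if best is None:
        return ("leaf", mean)
    _, f, t = best
    lo = [(r, y) for r, y in zip(rows, targets) if r[f] < t]
    hi = [(r, y) for r, y in zip(rows, targets) if r[f] >= t]
    return ("split", f, t,
            fit_tree([r for r, _ in lo], [y for _, y in lo], depth - 1),
            fit_tree([r for r, _ in hi], [y for _, y in hi], depth - 1))

def predict(node, x):
    while node[0] == "split":
        _, f, t, left, right = node
        node = left if x[f] < t else right
    return node[1]

surrogate = fit_tree(background, [black_box(b) for b in background], depth=4)

# Step 2 -- Shapley values on the surrogate.  TreeSHAP computes these in
# polynomial time by exploiting the tree structure; with only 2 features,
# brute-force coalition enumeration is exact and short.
def shapley(x, n=2):
    def value(S):  # interventional value E[f(x_S, X_rest)] over background
        return sum(predict(surrogate,
                           [x[i] if i in S else b[i] for i in range(n)])
                   for b in background) / len(background)
    phi = []
    for i in range(n):
        others = [j for j in range(n) if j != i]
        contrib = 0.0
        for k in range(n):
            for S in itertools.combinations(others, k):
                w = math.factorial(k) * math.factorial(n - k - 1) / math.factorial(n)
                contrib += w * (value(set(S) | {i}) - value(set(S)))
        phi.append(contrib)
    return phi

x0 = [0.5, -0.3]
phi = shapley(x0)
base = sum(predict(surrogate, b) for b in background) / len(background)
print("phi =", phi)  # attributions explain the surrogate, not the black box itself
```

Note the design point the statement highlights: the Shapley computation only ever touches the surrogate, so the expensive model-agnostic sampling of KernelSHAP/LIME is paid once, at surrogate-fitting time. By the efficiency axiom the attributions sum to the surrogate's prediction minus its expected output over the background.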
“…LIME is a model-agnostic explanation approach that explains any machine learning model by fitting a local, interpretable surrogate model around the prediction of interest [41].…”
Section: Local Interpretable Model-agnostic Explanations (LIME)
confidence: 99%
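The mechanism in the statement above — perturb around the instance, weight perturbations by proximity, fit a simple local model — can be sketched for a one-feature toy. The black-box function, kernel width, and sample counts below are all assumptions for illustration, not the `lime` library's actual API.

```python
import math
import random

random.seed(1)

def black_box(x):          # opaque model; only predictions are available
    return x * x

x0 = 1.0                   # instance to explain
# Perturb around the instance of interest and query the black box.
xs = [x0 + random.gauss(0, 0.5) for _ in range(500)]
ys = [black_box(x) for x in xs]
# Proximity kernel: perturbations near x0 get higher weight.
ws = [math.exp(-((x - x0) ** 2) / (2 * 0.2 ** 2)) for x in xs]

# Weighted least-squares fit of a local linear explanation y ~ a + b*x.
W = sum(ws)
xb = sum(w * x for w, x in zip(ws, xs)) / W
yb = sum(w * y for w, y in zip(ws, ys)) / W
b = (sum(w * (x - xb) * (y - yb) for w, x, y in zip(ws, xs, ys))
     / sum(w * (x - xb) ** 2 for w, x in zip(ws, xs)))
a = yb - b * xb
print("local slope =", b)  # roughly the derivative of x^2 at x0, i.e. ~2
```

The fitted slope is the "explanation": it says how the black box responds to the feature near this particular instance, even though globally the model is nonlinear.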
“…However, no algorithmic improvements have been made. In [23,24], MASHAP was proposed to compute SHAP values for arbitrary models efficiently: an arbitrary model is first approximated by a surrogate XGBoost model, and TreeSHAP is then applied to this surrogate to calculate SHAP values. The most closely related work on improving the computational efficiency of TreeSHAP, to our knowledge, is [25], where the authors presented GPUTreeShap, a GPU implementation of the TreeSHAP algorithm.…”
Section: Related Work
confidence: 99%