SurvSHAP(t): Time-dependent explanations of machine learning survival models
2023 | DOI: 10.1016/j.knosys.2022.110234

Cited by 40 publications (29 citation statements)
References 41 publications
“…59 In comparison to the existing explainable methods for survival prediction models using LIME, 18,19,35 the presented approach allows for obtaining time-specific explanations and does not rely on a restrictive proportional hazards explanation model. Compared to SurvSHAP(t), 20 we also extend an existing explainer for classification to capture time-dependent effects in a survival model. However, we take advantage of LIME's fast estimation and its additive interpretability on the probability scale.…”
Section: Discussion
Citation type: mentioning (confidence: 99%)
“…SurvSHAP(t) is a SHAP-based explainer that allows for predictors to have a time-varying explanation. 20 The methodology extends SHAP to survival models by estimating an attribution for each predictor at each time point. Thus, each time-specific SHAP value describes the time-dependent influence of the predictor on the prediction.…”
Section: Explainable Machine Learning and Survival
Citation type: mentioning (confidence: 99%)
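The mechanism described in this statement can be sketched directly: for a chosen observation, Shapley attributions are estimated not for a single predicted value but for the predicted survival function evaluated on a grid of time points, yielding one attribution curve per predictor. The snippet below is a minimal sketch of that idea, not the authors' released implementation; the model choice (scikit-survival's RandomSurvivalForest on the WHAS500 data), the permutation-sampling estimator, and names such as `survshap_t` and `n_perm` are assumptions made for this illustration.

```python
# Minimal sketch of the SurvSHAP(t) idea: time-dependent Shapley attributions
# for a survival model's predicted survival function S(t | x).
# Assumptions: scikit-survival is installed; the function name `survshap_t`,
# the permutation-sampling estimator, and the WHAS500 example are illustrative.
import numpy as np
from sksurv.datasets import load_whas500
from sksurv.ensemble import RandomSurvivalForest

X, y = load_whas500()
X = X.select_dtypes("number").iloc[:, :5]    # keep a few numeric predictors
model = RandomSurvivalForest(n_estimators=50, random_state=0).fit(X, y)

def survshap_t(model, X_background, x_explained, n_perm=25, seed=0):
    """Approximate phi_j(t): one Shapley attribution curve per predictor."""
    rng = np.random.default_rng(seed)
    times = model.unique_times_              # time grid on which S(t) is evaluated
    p = X_background.shape[1]
    phi = np.zeros((p, len(times)))
    for _ in range(n_perm):
        order = rng.permutation(p)
        # start from a random background row ("no predictors revealed yet")
        z = X_background.iloc[[rng.integers(len(X_background))]].copy()
        prev = model.predict_survival_function(z, return_array=True)[0]
        for j in order:
            z.iloc[0, j] = x_explained.iloc[j]   # reveal predictor j
            cur = model.predict_survival_function(z, return_array=True)[0]
            phi[j] += cur - prev                 # marginal contribution at every t
            prev = cur
    return times, phi / n_perm

times, phi = survshap_t(model, X, X.iloc[0])
print(phi.shape)   # (n_predictors, n_time_points): one attribution curve each
```

Summing the returned curves over predictors at each time point approximately recovers the difference between the explained survival curve and the average background curve, mirroring the additivity property of SHAP on the time axis.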
“…Shapley values explain the extent to which each variable affects the model output relative to the baseline average. We used SurvSHAP(t) (26), which is capable of providing model explanations in the form of a survival function rather than a single point or aggregation (27), to make time-dependent explanations for our models.…”
Section: Methods
Citation type: mentioning (confidence: 99%)