2022
DOI: 10.1007/978-3-031-14923-8_1

Using Case-Based Reasoning for Capturing Expert Knowledge on Explanation Methods

Abstract: Model-agnostic methods in Explainable AI (XAI) isolate the explanation system from the AI model architecture, typically Machine Learning or black-box models. Existing XAI libraries offer a good number of explanation methods, which are reusable across domains and models with different parameter choices. However, it is not clear which explainer is best suited to a given situation, domain, AI model, and set of user preferences. Choosing a proper explanation method is a complex decision-making process in itself…
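The selection process described in the abstract can be illustrated as a simple case-based retrieval: past cases record which explanation method worked well for a given model, domain, and user profile, and a new query retrieves the most similar case. This is a minimal sketch under assumed attribute names; it is not the authors' actual implementation.

```python
# Hypothetical sketch of CBR-based explainer selection. The Case
# attributes and the similarity measure are illustrative assumptions,
# not the method described in the paper.
from dataclasses import dataclass

@dataclass
class Case:
    model_type: str      # e.g. "neural_network", "random_forest"
    domain: str          # e.g. "medical", "finance"
    user_profile: str    # e.g. "clinician", "data_scientist"
    explainer: str       # explanation method that worked well

def similarity(query: Case, case: Case) -> float:
    """Fraction of query attributes that match the stored case."""
    attrs = ["model_type", "domain", "user_profile"]
    matches = sum(getattr(query, a) == getattr(case, a) for a in attrs)
    return matches / len(attrs)

def retrieve_explainer(query: Case, case_base: list[Case]) -> str:
    """Return the explainer of the most similar past case."""
    best = max(case_base, key=lambda c: similarity(query, c))
    return best.explainer

case_base = [
    Case("neural_network", "medical", "clinician", "LIME"),
    Case("random_forest", "finance", "data_scientist", "SHAP"),
    Case("neural_network", "vision", "end_user", "Grad-CAM"),
]

query = Case("neural_network", "medical", "clinician", explainer="")
print(retrieve_explainer(query, case_base))  # → LIME
```

A real CBR system would use weighted, per-attribute similarity measures and would retain the outcome of each recommendation as a new case.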



Cited by 8 publications (4 citation statements). References 19 publications (19 reference statements).
“…Developing user profiles and providing tailored explanations to each type of user will facilitate understandability [6]. To customize explanations for each user, Darias et al. [75] developed a method using Case-Based Reasoning to select the best explanation method for each situation based on the AI model, domain, and user preferences. Customized explanations enable users to learn more from the explanations and gain a better understanding of the CPS.…”
Section: Customizable Output for Different Users
confidence: 99%
“…These challenges can be overcome by using simple linear models, which can provide accuracy with greater transparency, and by using visualization tools that help users understand the model and the basis of the AI decisions. With the help of XAI methods such as Local Interpretable Model-agnostic Explanations (LIME), which provides human-understandable terminology, clinicians can understand how the decision-making process works in the model [7].…”
Section: Challenges and Considerations
confidence: 99%
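The core idea behind LIME mentioned in the statement above can be sketched in a few lines: perturb the instance of interest, query the black-box model, and fit a proximity-weighted linear surrogate whose coefficients act as local feature importances. This is a minimal illustration of the technique, not the `lime` package itself; the black-box function is an assumed stand-in.

```python
# Minimal sketch of LIME's local-surrogate idea (illustrative only;
# the real `lime` package provides a full implementation).
import numpy as np

rng = np.random.default_rng(0)

def black_box(X):
    """A stand-in nonlinear model (assumption, for demonstration)."""
    return np.sin(X[:, 0]) + X[:, 1] ** 2

def lime_like_weights(x0, n_samples=500, width=0.5):
    # 1) Perturb around the instance of interest
    Z = x0 + rng.normal(scale=width, size=(n_samples, x0.size))
    # 2) Query the black box on the perturbations
    y = black_box(Z)
    # 3) Weight samples by proximity to x0 (Gaussian kernel)
    w = np.exp(-np.sum((Z - x0) ** 2, axis=1) / (2 * width ** 2))
    # 4) Fit a weighted linear model (the local surrogate)
    A = np.hstack([Z, np.ones((n_samples, 1))])  # add intercept column
    sw = np.sqrt(w)[:, None]
    coef, *_ = np.linalg.lstsq(A * sw, y * sw.ravel(), rcond=None)
    return coef[:-1]  # local feature importances (intercept dropped)

x0 = np.array([0.0, 1.0])
print(lime_like_weights(x0))  # roughly [1, 2], the local gradient at x0
```

The surrogate's coefficients approximate the black box's local gradient, which is what makes the explanation interpretable even when the underlying model is not.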
“…(3) Interpretation-oriented retrieval: Explaining CBR and justifying recommendations or solutions is often important, especially in the field of medical decision-making, in which explaining the cause and correctness of the search results can provide compelling support (McSherry 2005; Doyle et al. 2004; Darias et al. 2022).…”
Section: Case Retrieval
confidence: 99%