2022
DOI: 10.5281/zenodo.6375784

What to explain when explaining is difficult. An interdisciplinary primer on XAI and meaningful information in automated decision-making


Cited by 2 publications (2 citation statements)
References: 0 publications

“…XAI is thus not merely a technical challenge. There are always social components that affect how explanations are designed, e.g., for whom the explanations are made, what specific needs the users have, all the way to the question of what educational offerings are necessary to be able to understand the explanations given (37, 38).…”
Section: Discussion
confidence: 99%
“…This does not necessarily have to include complete disclosure of the entire technical functioning of the system (Asghari et al., 2021). AI systems treat explanations as static products, calculated once and for all. In reality, an adequate explanation is more of a process than a product, requiring dynamic, iterative refinement among multiple agents.…”
Section: Comments
confidence: 99%