2024
DOI: 10.1111/bjop.12714

Explanation strategies in humans versus current explainable artificial intelligence: Insights from image classification

Ruoxi Qi, Yueyuan Zheng, Yi Yang, et al.

Abstract: Explainable AI (XAI) methods provide explanations of AI models, but our understanding of how they compare with human explanations remains limited. Here, we examined human participants' attention strategies when classifying images and when explaining how they classified the images through eye-tracking, and compared their attention strategies with saliency-based explanations from current XAI methods. We found that humans adopted more explorative attention strategies for the explanation task than the classification…
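The saliency-based XAI explanations referenced in the abstract attribute a classifier's decision to individual image pixels. As a hedged illustration (the abstract does not name which methods were used), the sketch below computes a vanilla-gradient saliency map with PyTorch, one common method in this family; the ResNet-18 model, example image file, and preprocessing pipeline are assumptions made only to keep the example runnable, not the paper's setup.

```python
# A minimal sketch of a gradient-based saliency map ("vanilla gradients"),
# one common family of saliency-based XAI methods for image classifiers.
# The model, image file, and preprocessing are illustrative assumptions;
# the paper does not specify this particular implementation.
import torch
from torchvision import models, transforms
from PIL import Image

model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)
model.eval()

preprocess = transforms.Compose([
    transforms.Resize(256),
    transforms.CenterCrop(224),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

image = Image.open("example.jpg").convert("RGB")  # hypothetical input image
x = preprocess(image).unsqueeze(0).requires_grad_(True)

logits = model(x)
target = logits.argmax(dim=1).item()
logits[0, target].backward()  # gradient of top-class score w.r.t. pixels

# Saliency: largest absolute gradient across colour channels at each pixel;
# high values mark pixels the decision is most sensitive to. These maps are
# what the study compares against human eye-tracking attention patterns.
saliency = x.grad.abs().max(dim=1).values.squeeze(0)  # shape: (224, 224)
```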

Citations: 0 · References: 75 publications