2023 IEEE/CVF Winter Conference on Applications of Computer Vision (WACV)
DOI: 10.1109/wacv56688.2023.00147

A Protocol for Evaluating Model Interpretation Methods from Visual Explanations

Abstract: With the continuous development of Convolutional Neural Networks (CNNs), there is a growing need to understand the representations they internally encode. The task of studying these encoded representations is referred to as model interpretation. Efforts along this direction, despite having proved effective, suffer from two weaknesses. First, the feedback they provide has low semanticity, which leads to subjective visualizations. Second, there is no unified protocol for the qua…

Cited by 1 publication (1 citation statement)
References: 65 publications (355 reference statements)
“…To bridge this gap, Nauta et al [206] introduced a technique to explain the prototypes derived from ProtoPNet-like models. Furthermore, the interpretability of these prototypes has recently been explored through human-centered analyses [210], [211], [212], [213].…”
Section: Prototypes as Decision Rules
Citation type: mentioning
Confidence: 99%