2021
DOI: 10.48550/arxiv.2106.02566
Preprint

BR-NPA: A Non-Parametric High-Resolution Attention Model to improve the Interpretability of Attention

Cited by 2 publications (1 citation statement) | References 0 publications
“…The last attention model included in this study is called Bilinear Representative Non-Parametric Attention (BR-NPA) [13] and proposes to generate attention maps to guide the model spatially without any dedicated parameter, contrary to the models mentioned above which feature either a convolution attention module or parametric prototypes.…”
Section: Saliency Map Generation Approaches
confidence: 99%
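To make the idea in the citation statement concrete, below is a minimal sketch of parameter-free spatial attention: the attention map is derived directly from the backbone's feature activations, with no dedicated convolutional attention module or learned prototypes. This is an illustrative simplification, not the exact BR-NPA formulation; the norm-based scoring and softmax pooling are assumptions made for the example.

```python
# Minimal sketch of non-parametric spatial attention (illustrative only;
# the scoring and pooling choices are assumptions, not BR-NPA's exact method).
import torch

def nonparametric_attention(features: torch.Tensor) -> torch.Tensor:
    """features: (B, C, H, W) feature maps from a backbone CNN.
    Returns attention-pooled features of shape (B, C)."""
    b, c, h, w = features.shape
    flat = features.view(b, c, h * w)                # (B, C, HW)
    # Attention score per location: L2 norm over channels -- no parameters.
    scores = flat.norm(dim=1)                        # (B, HW)
    attn = torch.softmax(scores, dim=-1)             # (B, HW)
    # Weighted pooling of features by the attention map.
    pooled = torch.einsum('bcl,bl->bc', flat, attn)  # (B, C)
    return pooled

# Usage (hypothetical backbone): pooled = nonparametric_attention(backbone(images))
```

The key point the citing paper highlights is that such attention maps guide the model spatially without adding any trainable attention parameters, in contrast to convolutional attention modules or parametric prototypes.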