2022
DOI: 10.1007/s10489-022-04301-w
Making attention mechanisms more robust and interpretable with virtual adversarial training

Cited by 4 publications (1 citation statement)
References 29 publications
“…The calculation of the attention mechanism (Kitada et al 2023) is split into two steps: firstly, the attention distribution is computed over all inputs X, and then, based on the result, the weighted average of the input information is calculated.…”
Section: Attention Mechanism
confidence: 99%