2021
DOI: 10.48550/arxiv.2102.05628
Preprint

On the Regularity of Attention

James Vuckovic,
Aristide Baratin,
Remi Tachet des Combes

Abstract: Attention is a powerful component of modern neural networks across a wide variety of domains. In this paper, we seek to quantify the regularity (i.e. the amount of smoothness) of the attention operation. To accomplish this goal, we propose a new mathematical framework that uses measure theory and integral operators to model attention. We show that this framework is consistent with the usual definition, and that it captures the essential properties of attention. Then we use this framework to prove that, on comp…

Cited by 1 publication (1 citation statement) | References 17 publications
“…The self-attention mechanism (1) acts on sets {x_i}_i where the ordering of the elements does not matter. An equivalent way to model such invariant architectures is to consider them as acting on probability measures or point clouds of varying cardinality (De Bie et al., 2019; Vuckovic et al., 2021; Zweig and Bruna, 2021). Specifically, a collection of points (x_i)_{1≤i≤n}, where x_i ∈ ℝ^d, can also be seen as a discrete measure on ℝ^d:…”
Section: Background and Related Work
Confidence: 99%
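The property the citing passage relies on can be checked numerically: self-attention treats its inputs as a set, so permuting the input points permutes the outputs in the same way. Below is a minimal numpy sketch (not code from the paper; the projection matrices `Wq`, `Wk`, `Wv` and the function name are illustrative assumptions) of single-head self-attention over a point cloud, together with a check of this permutation equivariance.

```python
import numpy as np

def self_attention(X, Wq, Wk, Wv):
    # Single-head self-attention over a set of n points (rows of X).
    # Illustrative sketch, not the paper's implementation.
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])
    # Row-wise softmax (numerically stabilized).
    weights = np.exp(scores - scores.max(axis=1, keepdims=True))
    weights /= weights.sum(axis=1, keepdims=True)
    return weights @ V

rng = np.random.default_rng(0)
n, d = 5, 4
X = rng.normal(size=(n, d))                 # a point cloud of n points in R^d
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))

perm = rng.permutation(n)
out = self_attention(X, Wq, Wk, Wv)
out_perm = self_attention(X[perm], Wq, Wk, Wv)

# Reordering the points reorders the outputs identically, so the map is
# well defined on the unordered set {x_i}_i, i.e. on the discrete measure
# (1/n) * sum_i delta_{x_i}.
assert np.allclose(out[perm], out_perm)
```

Because the output depends only on the multiset of rows of `X`, the same computation can be read as acting on the empirical measure (1/n) Σᵢ δ_{x_i}, which is the viewpoint the cited works adopt.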