Proceedings of the 61st Annual Meeting of the Association for Computational Linguistics (Volume 2: Short Papers) 2023
DOI: 10.18653/v1/2023.acl-short.161
Randomized Positional Encodings Boost Length Generalization of Transformers

Cited by 3 publications (1 citation statement). References 0 publications.
“…By introducing labeled tokens (class markers), the Transformer network can be directly used for image classification tasks. Anian Ruoss et al. [18] proposed a new family of positional encodings to address this challenge.…”

Section: Related Work (mentioning), confidence: 99%
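The positional encoding family cited above works by decoupling position indices from actual token positions: during training, each sequence is assigned a sorted random subset of indices drawn from a range much larger than the training lengths, so the model encounters large position values even on short inputs. A minimal sketch of this sampling step, with the function name and the `max_len` parameter chosen here for illustration:

```python
import numpy as np

def randomized_positions(seq_len, max_len=2048, rng=None):
    """Sample sorted position indices from an enlarged range.

    Instead of the standard indices 0..seq_len-1, draw seq_len
    distinct positions uniformly from [0, max_len) and sort them.
    The sorted subset preserves relative order while exposing the
    model to position values beyond the training sequence lengths.
    """
    rng = rng or np.random.default_rng()
    positions = rng.choice(max_len, size=seq_len, replace=False)
    return np.sort(positions)

# Example: 8 tokens, positions drawn from a range of 128.
pos = randomized_positions(8, max_len=128, rng=np.random.default_rng(0))
```

At evaluation time, longer test sequences can reuse the same range, so no position index is out of distribution relative to training.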