Abstract: In this note we examine the autoregressive generalization of the FNet algorithm, in which the self-attention layers of the standard Transformer architecture are substituted with a trivial sparse-uniform sampling procedure based on Fourier transforms. Using the WikiText-103 benchmark, we demonstrate that FNetAR retains state-of-the-art performance (25.8 ppl) on the task of causal language modeling compared to a Transformer-XL baseline (24.2 ppl) with only half the number of self-attention layers, thus providing furth…
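For orientation, a minimal sketch of the kind of substitution the abstract describes is given below: a Transformer-style block in which the self-attention sub-layer is replaced by parameter-free Fourier token mixing, as in the original FNet. The causal (autoregressive) modification specific to FNetAR is not detailed in the abstract, so only the standard FNet mixing sub-layer is shown; the class names, layer sizes, and block layout here are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch only (not the FNetAR authors' code): an FNet-style block
# where the self-attention sub-layer is replaced by a 2D discrete Fourier
# transform over the sequence and hidden dimensions, keeping the real part.
import torch
import torch.nn as nn


class FourierMixing(nn.Module):
    """Parameter-free token mixing via a 2D DFT, as in FNet (non-causal)."""

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (batch, seq_len, hidden); FFT over hidden, then over sequence.
        return torch.fft.fft(torch.fft.fft(x, dim=-1), dim=-2).real


class FourierBlock(nn.Module):
    """Transformer block with the attention sub-layer swapped for mixing."""

    def __init__(self, hidden: int, ffn: int):
        super().__init__()
        self.mixing = FourierMixing()
        self.norm1 = nn.LayerNorm(hidden)
        self.mlp = nn.Sequential(
            nn.Linear(hidden, ffn), nn.GELU(), nn.Linear(ffn, hidden)
        )
        self.norm2 = nn.LayerNorm(hidden)

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        x = self.norm1(x + self.mixing(x))  # mixing sub-layer + residual
        x = self.norm2(x + self.mlp(x))     # position-wise feed-forward + residual
        return x


if __name__ == "__main__":
    block = FourierBlock(hidden=512, ffn=2048)
    tokens = torch.randn(2, 128, 512)       # (batch, seq_len, hidden)
    print(block(tokens).shape)              # torch.Size([2, 128, 512])
```

In the hybrid configuration reported above, roughly half of the layers would remain standard self-attention (as in Transformer-XL) while the rest use a mixing sub-layer of this kind.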