2022
DOI: 10.48550/arxiv.2201.12740
Preprint
FEDformer: Frequency Enhanced Decomposed Transformer for Long-term Series Forecasting

Abstract: Although Transformer-based methods have significantly improved state-of-the-art results for long-term series forecasting, they are not only computationally expensive but more importantly, are unable to capture the global view of time series (e.g. overall trend). To address these problems, we propose to combine Transformer with the seasonal-trend decomposition method, in which the decomposition method captures the global profile of time series while Transformers capture more detailed structures. To further enha…

Cited by 22 publications (26 citation statements) | References 11 publications
“…FEDformer [5], being a Transformer-based model, computes attention coefficients in the frequency domain in order to represent point-wise interactions. Currently, FEDformer is the best model for long-term prediction, setting aside processing efficiency.…”
Section: Results
confidence: 99%
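The statement above describes computing attention coefficients in the frequency domain. As a rough illustration only (not the actual FEDformer implementation, whose block structure and mode selection differ), the idea of keeping a random subset of Fourier modes and forming attention-like weights there can be sketched as:

```python
import numpy as np

def frequency_enhanced_mix(q, k, v, n_modes=4, seed=0):
    """Toy sketch: mix queries/keys/values in the frequency domain.

    q, k, v: arrays of shape (seq_len, d_model). A random subset of
    Fourier modes is kept, an attention-like coefficient is formed per
    mode, and the result is mapped back to the time domain. This is a
    simplified illustration, not the FEDformer implementation.
    """
    seq_len, d = q.shape
    qf = np.fft.rfft(q, axis=0)  # (seq_len//2 + 1, d), complex spectra
    kf = np.fft.rfft(k, axis=0)
    vf = np.fft.rfft(v, axis=0)

    rng = np.random.default_rng(seed)
    n_freq = qf.shape[0]
    modes = rng.choice(n_freq, size=min(n_modes, n_freq), replace=False)

    out_f = np.zeros_like(qf)
    for m in modes:
        # per-mode attention-like coefficient between query and key spectra
        score = qf[m] @ np.conj(kf[m])  # complex scalar
        out_f[m] = score * vf[m] / d    # weight the values at this mode

    # back to the time domain; modes not selected contribute nothing
    return np.fft.irfft(out_f, n=seq_len, axis=0)

x = np.random.default_rng(1).standard_normal((16, 8))
y = frequency_enhanced_mix(x, x, x)
print(y.shape)  # (16, 8)
```

Because only a handful of modes are retained, the per-layer cost grows with the number of kept modes rather than quadratically with sequence length, which is the efficiency argument behind frequency-enhanced attention.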
“…Thus far, the enhanced Transformer-based models for long-term forecasting include FEDformer [5], Autoformer [22], Informer [23], Reformer [24], LST [25], LogTrans [26], etc. These models primarily improve the attention computation, thereby increasing accuracy and reducing time complexity.…”
Section: Related Work
confidence: 99%
“…Although Transformer-based methods significantly improve the state of the art in long-term series forecasting, they are not only computationally expensive but, more importantly, fail to capture a global view of the time series (e.g., the overall trend). To address these issues, FEDformer [45] combines the Transformer with seasonal-trend decomposition methods [46]: the decomposition captures the global profile of the time series, while the Transformer captures the more detailed structures. The main architecture is an encoder-decoder structure with several innovations.…”
Section: Methods
confidence: 99%
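The seasonal-trend decomposition referred to above can be sketched with a simple moving-average split, in the spirit of the decomposition blocks used in Autoformer/FEDformer-style models (a minimal illustration; the models apply this inside the network, not as standalone preprocessing):

```python
import numpy as np

def series_decomp(x, kernel_size=5):
    """Seasonal-trend decomposition by moving average (sketch).

    x: 1-D array. The trend is a replicate-padded moving average that
    captures the global profile; the seasonal part is the residual.
    """
    pad = (kernel_size - 1) // 2
    # replicate-pad both ends so the trend has the same length as x
    padded = np.concatenate([np.repeat(x[0], pad), x, np.repeat(x[-1], pad)])
    kernel = np.ones(kernel_size) / kernel_size
    trend = np.convolve(padded, kernel, mode="valid")
    seasonal = x - trend
    return seasonal, trend

t = np.arange(100)
x = 0.05 * t + np.sin(2 * np.pi * t / 10)  # linear trend + seasonality
seasonal, trend = series_decomp(x, kernel_size=11)
print(trend.shape, seasonal.shape)  # (100,) (100,)
```

By construction the two components sum back to the original series; the trend carries the slowly varying global profile, while the seasonal residual keeps the detailed periodic structure for the attention layers to model.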
“…For quite a long time, RNN-based models have played an important role in the development of time series forecasting. Recently, a great number of Transformer-based models [6][7][8][9] have been proposed, which apply the self-attention mechanism to distill useful semantic information from time series. However, there remains doubt as to whether Transformer-like structures are suitable for time series forecasting.…”
Section: Introduction
confidence: 99%