Network self attention for forecasting time series (2022)
DOI: 10.1016/j.asoc.2022.109092

Cited by 29 publications (6 citation statements)
References: 27 publications
“…The self-attention mechanism helped extract important information by re-weighting different channels, addressing the varying significance of channels and samples during feature extraction. Yuntong Hu et al. [24] proposed a new time series prediction model that learns similarity scores based on network self-attention. This approach improved prediction accuracy and robustness.…”
Section: Proposed Methods (A. MGTS-Attention)
Citation type: mentioning; confidence: 99%
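The cited paper's exact network self-attention is not reproduced here; as a minimal illustrative sketch of the general idea (the function name, shapes, and random projection matrices below are assumptions, not the authors' formulation), scaled dot-product self-attention computes pairwise similarity scores between time steps and uses them to re-weight the sequence:

```python
import numpy as np

def self_attention(x, w_q, w_k, w_v):
    """Scaled dot-product self-attention over a time series.

    x:             (T, d) sequence of T time steps with d channels
    w_q, w_k, w_v: (d, d) projection matrices (random stand-ins for learned weights)
    """
    q, k, v = x @ w_q, x @ w_k, x @ w_v              # project to queries/keys/values
    scores = q @ k.T / np.sqrt(k.shape[-1])          # pairwise similarity scores (T, T)
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax over key positions
    return weights @ v                               # similarity-weighted combination

rng = np.random.default_rng(0)
T, d = 24, 8                                         # e.g. 24 hourly observations, 8 channels
x = rng.standard_normal((T, d))
w_q, w_k, w_v = (rng.standard_normal((d, d)) * d**-0.5 for _ in range(3))
out = self_attention(x, w_q, w_k, w_v)               # (24, 8) context-enriched sequence
```

The (T, T) softmax matrix is the similarity-score map: each output time step becomes a weighted mixture of the whole series, with the weights expressing how relevant every other position is.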
“…Inspired by the attention mechanism of the human brain, the attention mechanism selectively weights critical information within large amounts of data to improve neural network efficiency. It has found extensive application in domains including sentiment analysis [22,23], image segmentation [24,25], intelligent recommendation [26,27], and time series forecasting [28-30]. The self-attention mechanism, a prominent component of the Transformer model developed by the Google team [31], is a variant of the attention mechanism.…”
Section: The Attention Mechanism
Citation type: mentioning; confidence: 99%
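For reference, the scaled dot-product attention at the core of the Transformer [31] is, for queries $Q$, keys $K$, values $V$, and key dimension $d_k$:

$$\operatorname{Attention}(Q, K, V) = \operatorname{softmax}\!\left(\frac{QK^{\top}}{\sqrt{d_k}}\right)V$$

In the self-attention variant, $Q$, $K$, and $V$ are all linear projections of the same input sequence.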
“…MHA is based on the self-attention mechanism [40], which obtains contextual information by computing the correlations between different positions of an input sequence. Because such correlations can take many forms and admit many definitions, focusing on a single form of correlation is often insufficient; multiple forms of correlation must be attended to at once.…”
Section: Multihead Attention
Citation type: mentioning; confidence: 99%
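As a minimal sketch of this idea (the layer sizes and tensor shapes below are illustrative assumptions, not drawn from the citing paper), PyTorch's stock nn.MultiheadAttention runs several attention heads in parallel, letting each head specialize in a different form of correlation:

```python
import torch
import torch.nn as nn

T, d_model, n_heads = 24, 32, 4      # sequence length, embedding size, parallel heads
x = torch.randn(1, T, d_model)       # (batch, time, features) with batch_first=True

# Each of the 4 heads attends over the sequence with its own learned projections,
# so different heads can capture different forms of correlation between positions.
mha = nn.MultiheadAttention(embed_dim=d_model, num_heads=n_heads, batch_first=True)
out, attn_weights = mha(x, x, x)     # self-attention: query = key = value = x

print(out.shape)                     # torch.Size([1, 24, 32]) context-enriched sequence
print(attn_weights.shape)            # torch.Size([1, 24, 24]) head-averaged attention map
```

Passing the same tensor as query, key, and value makes this self-attention; by default the returned attention weights are averaged across heads, while the per-head outputs are concatenated and projected back to d_model.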