2020 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn48605.2020.9207178
Gated Res2Net for Multivariate Time Series Analysis

Cited by 7 publications (10 citation statements) · References 29 publications
“…
• Res2Net [53]: a CNN backbone that uses group convolution and hierarchical residual-like connections between convolutional filter groups to achieve multi-scale receptive fields.
• GRes2Net [31]: incorporates gates into Res2Net; the gates' values are calculated with a different method from ours, additionally taking into account the original feature map before it is divided into groups.
• Res2Net+SE: combines Res2Net with a Squeeze-and-Excitation (SE) block [52] to leverage the effectiveness of attention modules.
…”
Section: Baseline Methods
confidence: 99%
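The hierarchical residual-like group connections and gating described above can be sketched as follows. This is a minimal NumPy illustration, not the authors' implementation: the random 1x1 projection stands in for a learned convolution, and the scalar gate formula (combining the previous group's output with the current input slice) is only a loose, hypothetical rendering of the GRes2Net idea.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gated_res2net_block(x, num_groups=4, seed=0):
    """Sketch of Res2Net-style hierarchical groups with multiplicative gates.

    x: (channels, length) feature map, channels divisible by num_groups.
    The per-group "convolution" is simulated by a random projection, and
    the gate value is an illustrative stand-in for a learned gate.
    """
    rng = np.random.default_rng(seed)
    groups = np.split(x, num_groups, axis=0)   # split channels into groups
    gc = groups[0].shape[0]                    # channels per group
    outputs = [groups[0]]                      # first group passes through unchanged
    prev = groups[0]
    for g in groups[1:]:
        w = rng.standard_normal((gc, gc)) / np.sqrt(gc)  # stand-in for a conv filter
        # gate computed from the previous group's output and the current input slice
        gate = sigmoid(prev.mean() + g.mean())
        h = w @ (g + gate * prev)              # gated hierarchical residual connection
        outputs.append(h)
        prev = h
    return np.concatenate(outputs, axis=0)     # same shape as the input
```

Because each group receives the (gated) output of the previous one, later groups see progressively larger receptive fields, which is the multi-scale effect Res2Net is built around.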
“…Commonly used DL architectures include Recurrent Neural Networks (RNNs), Gated Recurrent Units (GRUs) [18], Convolutional Neural Networks (CNNs), Long Short-Term Memory (LSTM) [17], and Transformers [16]. Recent studies rely heavily on CNNs to overcome the efficiency and scalability issues of recurrent models (e.g., RNN, LSTM, and GRU) [30][31][32].…”
Section: Multivariate Time Series Classification
confidence: 99%
“…In [14], the authors develop a unified architecture combining Res2Net with a gating mechanism, called GRes2Net, for analysing time series data with multiple time-dependent variables. The gating mechanism controls the flow of feature maps from the prior step into the next input.…”
Section: Literature Review
confidence: 99%