2021 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn52387.2021.9533426
Self-Supervised Pre-training for Time Series Classification

Cited by 13 publications (9 citation statements)
References 6 publications
“…Hence, pre-training models in CV [35,36,13] and NLP [9,14,37] are not directly applicable due to data modality mismatch, and the results leave room for improvement [30,38,39]. Shi et al. [11] developed the only model to date that is explicitly designed for self-supervised time series pre-training. The model captures local and global temporal patterns, but it is not convincing why the designed pretext task can capture generalizable representations.…”
Section: Related Work (mentioning)
confidence: 99%
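The statement above critiques whether a time-series pretext task yields generalizable representations. For context, below is a minimal sketch of one common self-supervised pretext task for time series, masked-segment reconstruction. It illustrates the general idea only and is not the TS-SD design of Shi et al. [11]; the encoder, the `mask_segments` helper, and all hyperparameters are hypothetical.

```python
# Minimal sketch of a generic self-supervised pretext task for time series:
# masked-segment reconstruction. Illustration only, NOT the TS-SD design
# from Shi et al. [11]; all names and hyperparameters are hypothetical.
import torch
import torch.nn as nn

class Conv1dEncoder(nn.Module):
    """Tiny 1-D conv net mapping (batch, 1, length) -> same-shape reconstruction."""
    def __init__(self, hidden=64):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv1d(1, hidden, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.Conv1d(hidden, hidden, kernel_size=7, padding=3),
            nn.ReLU(),
            nn.Conv1d(hidden, 1, kernel_size=7, padding=3),  # reconstruction head
        )

    def forward(self, x):
        return self.net(x)

def mask_segments(x, num_segments=4, seg_len=16):
    """Zero out random contiguous segments; return masked input and boolean mask."""
    masked = x.clone()
    mask = torch.zeros_like(x, dtype=torch.bool)
    for b in range(x.size(0)):
        for _ in range(num_segments):
            start = torch.randint(0, x.size(-1) - seg_len, (1,)).item()
            masked[b, :, start:start + seg_len] = 0.0
            mask[b, :, start:start + seg_len] = True
    return masked, mask

# One pre-training step: reconstruct masked segments from the visible context.
model = Conv1dEncoder()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
x = torch.randn(32, 1, 128)             # toy batch of univariate series
masked_x, mask = mask_segments(x)
recon = model(masked_x)
loss = ((recon - x)[mask] ** 2).mean()  # MSE only on the masked positions
optimizer.zero_grad()
loss.backward()
optimizer.step()
```

The encoder trained this way can then be fine-tuned on a labeled classification task; the critique in the citation statement is precisely about whether features learned from such a pretext objective transfer beyond the pre-training data.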
“…We consider 8 baseline methods, including 6 state-of-the-art methods: TS-SD [11], TS2Vec [46], CLOCS [40], Mixing-up [17], TS-TCC [47], and SimCLR [39]. TS2Vec, TS-TCC, and SimCLR are designed for representation learning on a single dataset (not across datasets); we adapt them to fit our setting so that the results are comparable.…”
Section: Implementation and Technical Details (mentioning)
confidence: 99%
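The statement above notes that contrastive baselines such as SimCLR [39] were built for single-dataset representation learning and had to be adapted. As background, here is a minimal sketch of a SimCLR-style contrastive step on time series: two augmented views of each series are pulled together under the NT-Xent loss. The augmentations (`jitter`, `scaling`) and all names are illustrative assumptions, not the citing paper's actual adaptation.

```python
# Minimal sketch of a SimCLR-style contrastive step adapted to time series.
# Augmentations and names here are illustrative assumptions, not the exact
# setup used by the citing paper or by SimCLR [39] on images.
import torch
import torch.nn.functional as F

def jitter(x, sigma=0.1):
    """Additive Gaussian noise, a common time-series augmentation."""
    return x + sigma * torch.randn_like(x)

def scaling(x, sigma=0.2):
    """Random amplitude scaling per series."""
    factor = 1.0 + sigma * torch.randn(x.size(0), 1, 1)
    return x * factor

def nt_xent(z1, z2, temperature=0.5):
    """NT-Xent loss over paired embeddings z1, z2 of shape (batch, dim)."""
    n = z1.size(0)
    z = F.normalize(torch.cat([z1, z2], dim=0), dim=1)  # (2n, dim), unit norm
    sim = z @ z.t() / temperature                       # cosine similarities
    sim.fill_diagonal_(float('-inf'))                   # exclude self-pairs
    # Positive for row i is i+n (first half) or i-n (second half).
    targets = torch.cat([torch.arange(n, 2 * n), torch.arange(0, n)])
    return F.cross_entropy(sim, targets)

# Usage: encode two augmented views of the same batch and pull them together.
encoder = torch.nn.Sequential(torch.nn.Flatten(), torch.nn.Linear(128, 64))
x = torch.randn(32, 1, 128)                 # toy batch of univariate series
z1, z2 = encoder(jitter(x)), encoder(scaling(x))
loss = nt_xent(z1, z2)
loss.backward()
```

Adapting such a method to the cross-dataset setting, as the citing paper describes, mainly means drawing the batch from multiple source datasets rather than one, while the contrastive objective itself is unchanged.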