2022
DOI: 10.1609/aaai.v36i8.20881
TS2Vec: Towards Universal Representation of Time Series

Abstract: This paper presents TS2Vec, a universal framework for learning representations of time series at an arbitrary semantic level. Unlike existing methods, TS2Vec performs contrastive learning in a hierarchical way over augmented context views, which enables a robust contextual representation for each timestamp. Furthermore, to obtain the representation of an arbitrary sub-sequence in the time series, we can apply a simple aggregation over the representations of corresponding timestamps. We conduct extensive experi…
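The abstract's "simple aggregation" over timestamp representations can be illustrated with a minimal sketch. This assumes hypothetical per-timestamp vectors (here plain lists) already produced by an encoder, and uses element-wise max pooling over a window as the aggregation; the variable names and toy values are illustrative, not from the paper's implementation.

```python
# Minimal sketch (hypothetical names and data): given TS2Vec-style
# per-timestamp representations, the representation of any
# sub-sequence [start, end) is obtained by a simple aggregation,
# illustrated here as element-wise max pooling over the window.

def subsequence_repr(timestamp_reprs, start, end):
    """Max-pool per-timestamp vectors over the window [start, end)."""
    window = timestamp_reprs[start:end]
    dims = len(window[0])
    return [max(vec[i] for vec in window) for i in range(dims)]

# Toy per-timestamp representations: 4 timestamps, 3 dimensions each.
reprs = [
    [0.1, -0.5, 2.0],
    [0.4,  0.0, 1.0],
    [0.2,  0.3, 0.5],
    [0.9, -0.1, 0.0],
]

subsequence_repr(reprs, 1, 3)  # representation of timestamps 1..2
subsequence_repr(reprs, 0, 4)  # whole-series ("instance") representation
```

The same pooling yields a representation at any granularity — a single timestamp, a window, or the full series — which is what makes the learned representations reusable across tasks.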


Cited by 239 publications (164 citation statements)
References 25 publications
“…We compare Conformer to other baselines in terms of MSE and MAE under the multivariate time-series forecasting setting, and the results are reported in Table II. The baselines include Autoformer [13], Informer [15], Reformer [12], LogTrans [14], LSTNet [1], GRU [21], and TS2VEC [49]…”
Section: B Prediction Results of Multivariate LTTF
confidence: 99%
“…This property is particularly valuable because clinical data is often massive and unlabeled, and qualified clinicians rarely have the chance to withdraw from clinical work. With the advent of unsupervised learning, a new paradigm of pre-training has also emerged and has been applied in EEG research (Kostas et al, 2021 ; Yue et al, 2022 ; Zhang et al, 2022 ). This new paradigm inspired us to see if pre-trained models trained on large-scale EEG datasets can be used to generate SEEG signals after a fine-tuning stage.…”
Section: Discussion
confidence: 99%
“…Time series representation is a challenging problem, and, as a consequence, the characteristics extracted from the original measurements, defined in this work as the baseline, can be suboptimal. To overcome this issue, we investigate the performance of TS2Vec (Yue et al 2022) for this task. Specifically, we explore two Transfer Learning (TF) approaches to train TS2Vec models: i) using the GesturePebbleZ2 and the ECG200 datasets from the UCR time series classification archive (Dau et al 2019) to generate a time series representation of the individual raw triaxial accelerometer measurements and the normalised heart rate information, respectively (results encoded as TF in our experiments); and ii) using the corresponding modalities and analogous representations of the harAGE dataset (Mallol-Ragolta et al 2021) to generate time series representations of the characteristics extracted from the original measurements (results encoded as harAGE in our experiments).…”
Section: Data Preparation
confidence: 99%
“…Herein, we investigate the performance of binary end-to-end models targeting the detection of workers' perceived fatigue levels mono- and multimodally. To extract embedded representations from the available modalities, we explore the use of TS2Vec, a universal framework for learning time series representations (Yue et al 2022). The current study is part of a broader system which aims to provide personalised recommendations that support the employment, safety, and health of aging workers in occupational contexts (Mallol-Ragolta et al 2022).…”
Section: Introduction
confidence: 99%