2017 IEEE 33rd International Conference on Data Engineering (ICDE) 2017
DOI: 10.1109/icde.2017.142
Time Series Classification by Sequence Learning in All-Subsequence Space

Abstract: Existing approaches to time series classification can be grouped into shape-based (numeric) and structure-based (symbolic). Shape-based techniques use the raw numeric time series with Euclidean or Dynamic Time Warping distance and a 1-Nearest Neighbor classifier. They are accurate, but computationally intensive. Structure-based methods discretize the raw data into symbolic representations, then extract features for classifiers. Recent symbolic methods have outperformed numeric ones regarding both accu…

Cited by 37 publications (34 citation statements) | References 26 publications (76 reference statements)
“…SAX can also be combined with a sliding window of length l, usually done to process longer time series (Figure 2). Our previous study (Nguyen et al., 2017) also found that the sliding window technique has a positive impact on the classification accuracy, arguably because it can capture a finer description of the time series. The procedure to transform a time series to a SAX representation with a sliding window can be summarised in Algorithm 1.…”
Section: Symbolic Representation of Time Series
confidence: 91%
“…One solution for this issue is to search for the optimal parameters, either by a naive grid search or a more complex optimization algorithm (e.g., DIRECT as in SAX-VSM (Senin and Malinchik, 2013)). In our previous work, we mitigated this issue by introducing a new algorithm that can learn discriminative sub-words from a SAX word-based representation (Nguyen et al., 2017). However, while it was competitive at the time of publication, that algorithm has since fallen behind the most recent state of the art (e.g., WEASEL) in terms of accuracy.…”
Section: Sequence Learner with Multiple Symbolic Representations of Time Series
confidence: 99%
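The naive grid search mentioned in the statement above can be sketched as follows. Here `evaluate` is a hypothetical callback (e.g., cross-validated accuracy of a classifier built on the SAX representation), and the parameter ranges are illustrative, not those used in the cited papers.

```python
from itertools import product

def grid_search_sax(evaluate, window_lens, word_lens, alphabet_sizes):
    """Exhaustively score every (window, word, alphabet) combination."""
    best_params, best_score = None, float("-inf")
    for l, w, a in product(window_lens, word_lens, alphabet_sizes):
        if w > l:  # a SAX word cannot be longer than its window
            continue
        score = evaluate(l, w, a)
        if score > best_score:
            best_params, best_score = (l, w, a), score
    return best_params, best_score
```

The cost grows with the product of the three ranges, which is exactly the sensitivity to parameters that methods like DIRECT, or learning discriminative sub-words directly, aim to avoid.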