Proceedings of the 23rd International Conference on Machine Learning (ICML '06), 2006
DOI: 10.1145/1143844.1143974

Fast time series classification using numerosity reduction

Abstract: Many algorithms have been proposed for the problem of time series classification. However, it is clear that one-nearest-neighbor with Dynamic Time Warping (DTW) distance is exceptionally difficult to beat. This approach has one weakness, however: it is computationally too demanding for many real-time applications. One way to mitigate this problem is to speed up the DTW calculations. Nonetheless, there is a limit to how much this can help. In this work, we propose an additional technique, numerosity reduction, t…
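To make the abstract concrete, below is a minimal Python sketch of a one-nearest-neighbor classifier under DTW distance, paired with a deliberately naive numerosity-reduction step (random subsampling of the training set). The function names, the Sakoe-Chiba window parameter, and the random subsampling are illustrative assumptions for this sketch; the paper itself proposes a ranking-based reduction, not random sampling.

```python
import numpy as np

def dtw_distance(a, b, window=None):
    """DTW distance via dynamic programming, with an optional Sakoe-Chiba
    band of half-width `window` to limit how far the warping path may stray."""
    n, m = len(a), len(b)
    w = max(window, abs(n - m)) if window is not None else max(n, m)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(max(1, i - w), min(m, i + w) + 1):
            cost = (a[i - 1] - b[j - 1]) ** 2
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return float(np.sqrt(D[n, m]))

def one_nn_dtw(query, train_X, train_y, window=None):
    """Label `query` with the class of its nearest training series under DTW."""
    dists = [dtw_distance(query, x, window) for x in train_X]
    return train_y[int(np.argmin(dists))]

def random_numerosity_reduction(train_X, train_y, keep_fraction=0.5, seed=0):
    """Naive numerosity reduction: keep a random fraction of the training set.
    (Illustrative only; the paper ranks instances instead of sampling at random.)"""
    rng = np.random.default_rng(seed)
    k = max(1, int(keep_fraction * len(train_X)))
    idx = rng.choice(len(train_X), size=k, replace=False)
    return [train_X[i] for i in idx], [train_y[i] for i in idx]
```

Classifying one query costs a DTW computation per training series, so shrinking the training set (whether by random subsampling as above or by a smarter ranking) reduces classification time in direct proportion.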

Cited by 427 publications (294 citation statements)
References 18 publications
“…This approach reduces the size of the training set by selecting the best representative instances and using only them during the classification of new instances. Due to its advantages, instance selection has been explored for time-series classification [20].…”
Section: Introduction (mentioning)
confidence: 99%
“…For the above reasons, the proposed approach is denoted as Instance Selection based on Graph-coverage and Hubness for Time-series (INSIGHT). INSIGHT is evaluated experimentally with a collection of 37 publicly available time series classification data sets and is compared against FastAWARD [20], a state-of-the-art instance selection method for time series classification. We show that INSIGHT substantially outperforms FastAWARD both in terms of classification accuracy and execution time for performing the selection of instances.…”
Section: Introduction (mentioning)
confidence: 99%
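The two statements above both concern instance selection for 1-NN time series classification. As a rough illustration only (not the INSIGHT or FastAWARD algorithms they discuss), the sketch below scores each training series by how often it acts as a correct nearest neighbour for other series of the same class and keeps the top-scoring ones; the scoring rule and the keep_k parameter are assumptions made for this example.

```python
import numpy as np

def select_instances_by_nn_score(train_X, train_y, dist_fn, keep_k):
    """Toy instance selection: score each series by how often it is the nearest
    neighbour of another series from the same class, then keep the top keep_k.
    Loosely inspired by hubness-based selection; not INSIGHT or FastAWARD."""
    n = len(train_X)
    D = np.full((n, n), np.inf)          # pairwise distances, inf on the diagonal
    for i in range(n):
        for j in range(i + 1, n):
            D[i, j] = D[j, i] = dist_fn(train_X[i], train_X[j])
    scores = np.zeros(n)
    for i in range(n):
        nn = int(np.argmin(D[i]))        # nearest neighbour of instance i
        if train_y[nn] == train_y[i]:    # reward neighbours that classify i correctly
            scores[nn] += 1
    keep = np.argsort(-scores)[:keep_k]
    return [train_X[i] for i in keep], [train_y[i] for i in keep]
```

With the dtw_distance helper from the sketch after the abstract, this could be called as select_instances_by_nn_score(train_X, train_y, dtw_distance, keep_k=50), where keep_k=50 is an arbitrary illustrative value.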
“…Dynamic Time Warping: Despite the major effort spent on building accurate time series classifiers, the nearest neighbor classifier combined with a similarity technique called Dynamic Time Warping (DTW) has still been reported to produce more accurate results [15]. DTW overcomes the drawbacks of other methods because it can detect pattern variations such as translations/shifts, size differences, and deformations.…”
Section: Time Series Classification (mentioning)
confidence: 99%
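As a quick check of the shift-robustness claim in the statement above, the snippet below compares Euclidean distance with DTW on a sine wave and a time-shifted copy of it, reusing the dtw_distance helper defined in the sketch after the abstract; the 0.5-radian shift and 100-sample length are arbitrary choices for illustration.

```python
import numpy as np

# Reuses dtw_distance from the sketch after the abstract.
t = np.linspace(0, 2 * np.pi, 100)
x = np.sin(t)            # reference pattern
y = np.sin(t - 0.5)      # the same pattern, shifted in time

print("Euclidean:", np.linalg.norm(x - y))   # penalises the shift sample by sample
print("DTW:      ", dtw_distance(x, y))      # warping absorbs most of the shift
```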
“…end for
9: end for
10: svmModel ← svm.train(Strain, θ)
11: return svmModel
- DTW-NN: Characterized as a baseline in time series classification that has been reported to achieve hard-to-beat classification accuracy [15]. The relative performance of our method compared to DTW-NN will give hints as to whether a refined maximum margin is competitive or not.…”
Section: Algorithm 4 (LearnModel) (mentioning)
confidence: 99%
“…In particular, the problem of time-series classification and prediction is attracting a lot of attention among researchers. One of the most successful and popular classes of methods for classification and prediction is kernel-based methods such as support vector machines (SVM) [26,12,35,25]. Despite their popularity, there seem to be only a handful of kernels designed for time series.…”
Section: Introduction (mentioning)
confidence: 99%
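One common workaround for the scarcity of time-series kernels noted in the statement above is to plug a DTW dissimilarity into a Gaussian kernel and hand the resulting Gram matrix to an SVM as a precomputed kernel. The sketch below (again reusing dtw_distance from the earlier sketch) illustrates this; note that the construction is a heuristic, the resulting matrix is not guaranteed to be positive semi-definite, and it is not one of the kernels cited in [26,12,35,25].

```python
import numpy as np
from sklearn.svm import SVC

def gaussian_dtw_gram(rows, cols, gamma=0.1):
    """Gram matrix G[i, j] = exp(-gamma * dtw_distance(rows[i], cols[j]) ** 2).
    Heuristic only: not guaranteed to be a valid (positive semi-definite) kernel."""
    G = np.zeros((len(rows), len(cols)))
    for i, a in enumerate(rows):
        for j, b in enumerate(cols):
            G[i, j] = np.exp(-gamma * dtw_distance(a, b) ** 2)  # dtw_distance from the earlier sketch
    return G

# Hypothetical usage with scikit-learn's precomputed-kernel interface:
# G_train = gaussian_dtw_gram(train_X, train_X)
# G_test  = gaussian_dtw_gram(test_X, train_X)
# clf = SVC(kernel="precomputed").fit(G_train, train_y)
# predictions = clf.predict(G_test)
```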