2011
DOI: 10.1007/978-3-642-21222-2_31

Fusion of Similarity Measures for Time Series Classification

Abstract: Time series classification, due to its applications in various domains, is one of the most important data-driven decision tasks of artificial intelligence. Recent results show that the simple nearest neighbor method with an appropriate distance measure performs surprisingly well, outperforming many state-of-the-art methods. This suggests that the choice of distance measure is crucial for time series classification. In this paper we briefly review the most important distance measures in the literature…

Cited by 23 publications (19 citation statements)
References 17 publications
“…We calculated the edit distance between the trajectories and obtained 45 values, one for each pair of trajectories. To prove the validity of the edit distance method, the DTW (dynamic time warping) [40] method was chosen for comparison. The key feature of DTW is that it allows shifts and elongations while it compares two time series, which is a common method for time-series data [41], and can be used for trajectory data.…”
Section: Distance Measurement of Taxi Trajectory Data
confidence: 99%
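The "shifts and elongations" that the quoted statement attributes to DTW come from its dynamic program: each step may advance one sequence, the other, or both. A minimal sketch of the classic unconstrained DTW distance for univariate series (no warping-window constraint; the function name and absolute-difference local cost are illustrative choices, not taken from the cited works):

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic time warping distance between two 1-D sequences.

    Full O(len(a) * len(b)) dynamic program with no warping-window
    constraint; local cost is the absolute difference of the samples.
    """
    n, m = len(a), len(b)
    cost = np.full((n + 1, m + 1), np.inf)
    cost[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            d = abs(a[i - 1] - b[j - 1])
            cost[i, j] = d + min(cost[i - 1, j],      # elongate b (repeat its sample)
                                 cost[i, j - 1],      # elongate a
                                 cost[i - 1, j - 1])  # match / shift both
    return cost[n, m]
```

Because of the elongation moves, a series and a "stretched" copy of it (e.g. `[1, 2, 3]` vs. `[1, 2, 2, 3]`) are at DTW distance 0, which is exactly why DTW is robust to the local time distortions mentioned in the quote.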
“…The algorithms are general, in the sense that they can be applied to any kind of data, provided that an appropriate distance measure between the instances of the dataset is available. In the case of EEG data, we use multivariate DTW as distance measure as described in [36]. As in our case instances are EEG signals, we will mostly use the term EEG signal instead of instance while describing hubness-aware classifiers.…”
Section: Hubness-aware Classifiers
confidence: 99%
“…We compared these algorithms to k-NN. Both in the case of k-NN and the hubness-aware classifiers, we used multivariate DTW as distance measure as described in [36]. We set k = 10 for the hubness-aware classifiers.…”
Section: Experimental Settings
confidence: 99%
“…Then, we calculate the distance of the remaining (i.e., non-selected) EEG signals from the selected ones using DTW. For the description of how to calculate DTW on multivariate time series we refer to [6]. Subsequently, the distances are used as real-valued features: the distance of the signal x from the first selected signal will be the first feature of x, the distance of x from the second selected signal will be the second feature of x, etc.…”
Section: Projecting EEG Signals Into a Vector Space
confidence: 99%
“…As EEG signals are time series, we consider the classification of EEG signals as a time-series classification problem, for which the k nearest-neighbor (k-NN) method using dynamic time warping (DTW) as distance measure was reported to be competitive, if not superior, to many state-of-the-art time-series classifiers, such as neural networks or hidden Markov models, see e.g. [6], [8] and the references therein. Furthermore, in their recent work, Chen et al [8] gave theoretical guarantees for the performance of nearest neighbor-like time-series classifiers.…”
Section: Introduction
confidence: 99%