2019
DOI: 10.1007/s10618-019-00619-1

Deep learning for time series classification: a review

Abstract: Time Series Classification (TSC) is an important and challenging problem in data mining. With the increasing availability of time series data, hundreds of TSC algorithms have been proposed. Among these methods, only a few have considered Deep Neural Networks (DNNs) to perform this task. This is surprising as deep learning has seen very successful applications in recent years. DNNs have indeed revolutionized the field of computer vision, especially with the advent of novel deeper architectures such as Residual a…

Cited by 2,073 publications (885 citation statements)
References 96 publications
“…The choice of using convolutional neural networks (CNNs) within the Keras framework (Chollet et al, 2015) for doing the parameter estimation was made out of convenience. Other machine learning techniques, see Ismail Fawaz et al (2018) for a recent review, could likely have done as well, or even better. Further, the architecture of the CNNs was not optimised in any systematic way.…”
Section: Discussion
confidence: 99%
“…Two types of pooling have received most of the attention in the image-analysis literature: 1) local max-pooling [72] and 2) global average pooling [73]. For time series, global average pooling seems to have been more successful [9,37]. We want to see here whether these previous results generalize to time series classification.…”
Section: Are Local and Global Temporal Pooling Layers Important?
confidence: 91%
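The two pooling strategies contrasted in the excerpt can be placed side by side in the following sketch. The layer sizes and the helper name make_classifier are illustrative assumptions, not code from the cited works.

```python
# Sketch contrasting local max-pooling with global average pooling in a 1D CNN.
from tensorflow import keras
from tensorflow.keras import layers

def make_classifier(n_timesteps, n_channels, n_classes, pooling="global"):
    inputs = keras.Input(shape=(n_timesteps, n_channels))
    x = layers.Conv1D(64, kernel_size=8, padding="same", activation="relu")(inputs)
    if pooling == "local":
        # Local max-pooling: halves the temporal dimension after the first block.
        x = layers.MaxPooling1D(pool_size=2)(x)
        x = layers.Conv1D(128, kernel_size=5, padding="same", activation="relu")(x)
        x = layers.Flatten()(x)
    else:
        # Global average pooling: collapses each feature map to a single value,
        # the option reported as more successful for time series.
        x = layers.Conv1D(128, kernel_size=5, padding="same", activation="relu")(x)
        x = layers.GlobalAveragePooling1D()(x)
    outputs = layers.Dense(n_classes, activation="softmax")(x)
    return keras.Model(inputs, outputs)

# Example: a 5-class classifier on univariate series of length 128.
model = make_classifier(128, 1, 5, pooling="global")
```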
“…Table 4 displays the OA values as a function of reach for TempCNN. We study five filter sizes f = {3, 5, 9, 17, 33}, corresponding to a reach of 2, 4, 8, 16, and 32 days, respectively. Table 4 shows that the maximum OA is reached for a reach of 8 days, with a similar OA for 4 days.…”
Section: Influence of the Filter Size
confidence: 99%
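The quoted filter sizes and reaches are consistent with a reach of kernel_size - 1 time steps, with one time step per day. The sketch below spells out that mapping under this assumption; the Conv1D block is illustrative and is not the TempCNN implementation.

```python
# Sketch of how the quoted filter sizes map to temporal reach.
# The relation reach = kernel_size - 1 is inferred from the numbers in the excerpt.
from tensorflow.keras import layers

filter_sizes = [3, 5, 9, 17, 33]
for f in filter_sizes:
    reach = f - 1  # matches the quoted reaches of 2, 4, 8, 16, and 32 days
    conv = layers.Conv1D(filters=64, kernel_size=f, padding="same", activation="relu")
    print(f"kernel_size={f:2d} -> reach of {reach} days, layer: {conv.name}")
```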
“…The basic units of calculation in ANNs are called neurons, which are connected via weighted inputs that resemble synapses. These biologically inspired models have the proven capability of learning from data, which has accelerated the data-driven discovery revolution over the last decade [25][26][27][28][29].…”
Section: B. Neural Network Model
confidence: 99%
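For readers unfamiliar with the terminology in the excerpt above, the minimal sketch below shows a single neuron as a weighted sum of inputs plus a bias, passed through a nonlinearity; all values are arbitrary and purely illustrative.

```python
# A single artificial neuron: weighted inputs, a bias, and an activation.
import numpy as np

x = np.array([0.5, -1.2, 3.0])   # inputs, one per incoming "synapse"
w = np.array([0.8, 0.1, -0.4])   # learned connection weights
b = 0.2                          # bias term

z = np.dot(w, x) + b             # weighted input to the neuron
output = max(0.0, z)             # ReLU activation, one common choice
print(output)
```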