2017
DOI: 10.1016/j.procs.2017.09.066

Time Series Classification using Deep Learning for Process Planning: A Case from the Process Industry

Cited by 92 publications (43 citation statements)
References 16 publications

Citation statements (ordered by relevance):
“…• Quality inspection [6,51] • Fault diagnostic (detection, identification, estimation of magnitudes) •…”
Section: Deep Learning
Citation type: mentioning
Confidence: 99%
“…Predictive analytics for defect prognosis [52,53] • Condition monitoring [54,55] • Service or operation planning [56,57] For example, Nijat et al [6] implemented stacked LSTM to extract features from time series data, which switches the intensive hand-crafted feature extraction way to a more automatic and intelligent way. Their network can also be used in process planning, monitoring the semi-finished products quality and determining the next process step.…”
Section: Deep Learning
Citation type: mentioning
Confidence: 99%
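As a rough illustration of the stacked-LSTM approach described in the statement above, the sketch below builds a small recurrent classifier that learns features directly from a multivariate time series. PyTorch, the two-layer depth, and all dimensions are assumptions made for illustration, not values taken from the cited paper.

# Minimal sketch of a stacked-LSTM time-series classifier (PyTorch assumed;
# sizes and class count are illustrative placeholders).
import torch
import torch.nn as nn

class StackedLSTMClassifier(nn.Module):
    def __init__(self, n_features=8, hidden_size=64, n_layers=2, n_classes=3):
        super().__init__()
        # Stacked LSTM: several recurrent layers learn features from the raw
        # sensor sequence, replacing hand-crafted feature extraction.
        self.lstm = nn.LSTM(input_size=n_features,
                            hidden_size=hidden_size,
                            num_layers=n_layers,
                            batch_first=True)
        self.head = nn.Linear(hidden_size, n_classes)

    def forward(self, x):
        # x: (batch, time_steps, n_features)
        _, (h_n, _) = self.lstm(x)
        # h_n[-1] is the top layer's final hidden state: a fixed-length
        # learned feature vector summarising the whole series.
        return self.head(h_n[-1])

# Example: classify a batch of 16 series, each 200 steps of 8 sensor channels.
logits = StackedLSTMClassifier()(torch.randn(16, 200, 8))
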
“…Faced with the correlated relationship in a long time-step span, the standard RNNs structure is awkward with regards to the gradient problem. There are some important branches of RNNs that appear to solve the long-term problem, such as LSTM [5,20,21] and Gated Recurrent Unit (GRU) [32] structures. However, these redesigned complex structures require specific training algorithms.…”
Section: Gradient Problem
Citation type: mentioning
Confidence: 99%
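The gated architectures this statement refers to (LSTM and GRU) can be used as drop-in replacements for a plain recurrent layer; the gating is what lets gradients survive long time-step spans. A minimal sketch, assuming PyTorch and placeholder sizes:

import torch
import torch.nn as nn

x = torch.randn(4, 500, 8)                  # batch of long sequences: 500 time steps

vanilla = nn.RNN(8, 32, batch_first=True)   # plain RNN: prone to vanishing/exploding gradients
lstm    = nn.LSTM(8, 32, batch_first=True)  # input/forget/output gates
gru     = nn.GRU(8, 32, batch_first=True)   # update/reset gates, fewer parameters

for name, net in [("RNN", vanilla), ("LSTM", lstm), ("GRU", gru)]:
    out, _ = net(x)
    print(name, out.shape)                  # all produce (4, 500, 32); they differ in how
                                            # well gradient signal propagates across 500 steps
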
“…Actually, the original process is very brittle because it has little feasible access to further information over a long period of time. In recent years, some intricate techniques have led to impressive results, such as gradient clipping [18], modifying activation function [19], long short-term memory (LSTM) [20,21], and identity initialization [22]. However, few research efforts have attempted flexible RNNs designs in the area of time series.…”
Section: Introduction
Citation type: mentioning
Confidence: 99%
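Of the stabilisation techniques listed in this statement, gradient clipping is the simplest to show in code. A minimal sketch of one clipped training step, assuming PyTorch and a placeholder model and data:

import torch
import torch.nn as nn

model = nn.LSTM(8, 32, batch_first=True)
opt = torch.optim.Adam(model.parameters(), lr=1e-3)

x = torch.randn(4, 200, 8)                  # dummy input sequences
target = torch.randn(4, 200, 32)            # dummy regression target

out, _ = model(x)
loss = nn.functional.mse_loss(out, target)

opt.zero_grad()
loss.backward()
# Rescale gradients so their global norm is at most 1.0, preventing the
# exploding-gradient updates that destabilise RNN training.
torch.nn.utils.clip_grad_norm_(model.parameters(), max_norm=1.0)
opt.step()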