2022
DOI: 10.1021/acsomega.2c01108

Nonlinear Dynamic Soft Sensor Development with a Supervised Hybrid CNN-LSTM Network for Industrial Processes

Abstract: A soft sensor is a key component when a real-time measurement is unavailable for industrial processes. Recently, soft sensor models based on deep-learning techniques have been successfully applied to complex industrial processes with nonlinear and dynamic characteristics. However, the conventional deep-learning-based methods cannot guarantee that the quality-relevant features are included in the hidden states when the modeling samples are limited. To address this issue, a supervised hybrid network based on a d…

Cited by 11 publications (10 citation statements)
References 42 publications (68 reference statements)
“…LSTM is a special and popular RNN, which is mainly used to solve the problem of gradient vanishing and gradient explosion during long sequence training [ 63 ]. Compared with other neural networks, LSTM is better at processing data with sequence changes, such as speech signals [ 64 ]. In our study, spectral data were regarded as data of sequence changes, and the LSTM model as shown in Figure S14 was constructed (the basic unit of LSTM can be seen in Figure S15 [ 65 ]).…”
Section: Results (mentioning; confidence: 99%)
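The quoted statement attributes LSTM's robustness on long sequences to its gated design. A minimal scalar LSTM cell (illustrative weights only, not the cited paper's network) shows the gate structure and the additive cell-state update:

```python
import math

def sigmoid(x):
    return 1.0 / (1.0 + math.exp(-x))

def lstm_step(x, h_prev, c_prev, W):
    """One LSTM cell step with scalar input and state (illustrative sizes).

    W maps each gate name to a (w_x, w_h, b) weight triple:
    'i' input gate, 'f' forget gate, 'o' output gate, 'g' candidate.
    """
    def gate(name, act):
        w_x, w_h, b = W[name]
        return act(w_x * x + w_h * h_prev + b)

    i = gate('i', sigmoid)      # how much new information to write
    f = gate('f', sigmoid)      # how much old cell state to keep
    o = gate('o', sigmoid)      # how much cell state to expose
    g = gate('g', math.tanh)    # candidate cell content
    c = f * c_prev + i * g      # additive cell-state update
    h = o * math.tanh(c)        # hidden state passed to the next step
    return h, c

# run a short sequence through the cell (arbitrary demo weights)
W = {k: (0.5, 0.5, 0.0) for k in ('i', 'f', 'o', 'g')}
h, c = 0.0, 0.0
for x in [1.0, -1.0, 0.5]:
    h, c = lstm_step(x, h, c, W)
```

The additive form of the cell update (`c = f * c_prev + i * g`) is what lets gradients flow through many steps without the repeated squashing that plagues a plain RNN.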
“…CNN is a deep neural network with strong feature extraction ability. Its main characteristic is the use of a shared parameter filter to scan the previous feature graph, which can significantly reduce the size of the parameter space. The convolution layer extracts local features of input data through convolution operation.…”
Section: Proposed Methodology (mentioning; confidence: 99%)
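The weight sharing described above can be sketched with a plain 1D convolution: the same kernel is reused at every position, so the parameter count is the kernel length regardless of the input length (a toy example, not the cited model):

```python
def conv1d(signal, kernel):
    """Valid 1D convolution (cross-correlation convention): the same kernel
    weights are slid across the whole input, so only len(kernel) parameters
    are needed no matter how long the signal is -- the shared-filter scan
    the quoted text describes."""
    k = len(kernel)
    return [sum(kernel[j] * signal[i + j] for j in range(k))
            for i in range(len(signal) - k + 1)]

# a [1, -1] kernel acts as a local difference (edge) detector
feature_map = conv1d([0, 0, 1, 1, 0], [1, -1])  # -> [0, -1, 0, 1]
```

A dense layer mapping the same 5-sample input to a 4-sample output would need 20 weights; the shared filter needs 2.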
“…32 Due to the problem of gradient vanishing and exploding associated with RNN, an advanced architecture of RNN called long short-term memory (LSTM) was designed. 32,36,37 LSTM has good efficiency in predicting time series because of its unique advantage in solving gradient problems (vanishing and exploding). Also, the term (long short-term memory) makes it able to deal with long sequences and predict time series more accurately with more parameters compared to RNN.…”
(mentioning; confidence: 99%)
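The gradient problem the quote refers to can be made concrete with a scalar tanh RNN: the gradient of the final state with respect to the initial state is a product of one factor per time step, which shrinks toward zero or blows up as the sequence grows (an illustrative sketch with arbitrary weights, not the cited paper's setup):

```python
import math

def bptt_gradient_factor(w, steps, x=0.1):
    """Scalar tanh RNN h_t = tanh(w * h_{t-1} + x). The gradient of h_T
    with respect to h_0 is the product of per-step factors w * (1 - h_t**2)
    (the chain rule through tanh); returns that product."""
    h, grad = 0.0, 1.0
    for _ in range(steps):
        h = math.tanh(w * h + x)
        grad *= w * (1.0 - h * h)
    return grad

# |w| < 1: each factor is below 1, so the product vanishes with depth
vanishing = bptt_gradient_factor(w=0.5, steps=50)

# with zero input the state stays at 0, tanh is in its linear regime,
# and each factor is exactly w -- so |w| > 1 makes the product explode
exploding = bptt_gradient_factor(w=1.5, steps=50, x=0.0)
```

The LSTM's gated, additive cell update was designed precisely so that this per-step factor can stay close to 1 over long sequences.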
“…Deep learning is a branch of artificial intelligence, and it is one of the most popular methods used in image processing, natural language processing, and others. It has also been widely used in the field of analytical chemistry due to its algorithms that extract intrinsic information from data by using simple and non-linear units to transform and extract essential features. The recurrent neural network (RNN) is one of the most important deep learning algorithms and is widely used in time series processing, besides its use in speech recognition and natural language processing. Due to the problem of gradient vanishing and exploding associated with RNN, an advanced architecture of RNN called long short-term memory (LSTM) was designed. LSTM has good efficiency in predicting time series because of its unique advantage in solving gradient problems (vanishing and exploding). Also, the term (long short-term memory) makes it able to deal with long sequences and predict time series more accurately with more parameters compared to RNN. LSTM depends on the previous input, but in fact, the outputs also depend on the later inputs, which increases the number of inputs available to the model.…”
(mentioning; confidence: 99%)