2016
DOI: 10.1002/cjce.22568
Locality preserving based data regression and its application for soft sensor modelling

Abstract: A new local‐based data regression technique named locality preserving regression (LPR) is developed and applied for soft sensor modelling in the present study. By taking into consideration the local variation obtained by locality preserving projections, the LPR algorithm is employed to construct a soft sensor model and applied to an industrial case. Furthermore, to deal with the time‐varying behaviour of the process variables, just‐in‐time learning is also integrated to regularly update the soft sensor…
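The just‐in‐time learning scheme mentioned in the abstract defers model building until a query arrives: a fresh local model is fitted on the historical samples most similar to the query, which naturally tracks time‐varying behaviour. The sketch below is an illustrative, generic just‐in‐time local linear regression (the function name `jit_predict` and all parameters are assumptions, not the paper's actual LPR algorithm):

```python
import numpy as np

def jit_predict(X_hist, y_hist, x_query, k=20):
    """Just-in-time (lazy) local regression: for each query, fit a fresh
    linear model on the k most similar historical samples."""
    # Euclidean distance from the query to every historical sample
    d = np.linalg.norm(X_hist - x_query, axis=1)
    idx = np.argsort(d)[:k]                 # indices of the k nearest samples
    Xk = np.c_[np.ones(k), X_hist[idx]]     # add an intercept column
    w, *_ = np.linalg.lstsq(Xk, y_hist[idx], rcond=None)
    return np.r_[1.0, x_query] @ w          # local linear prediction

rng = np.random.default_rng(0)
X = rng.random((200, 2))
y = X @ np.array([1.0, 2.0])                # noiseless linear ground truth
pred = jit_predict(X, y, np.array([0.5, 0.5]), k=30)
```

Because the model is discarded after each prediction, no explicit recursive update of global parameters is needed; the "update" is implicit in re-selecting neighbours from the growing historical database.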

Cited by 7 publications (6 citation statements) · References 49 publications
“…[12] In our study, we set n as the minimum number of sequentially-ordered feature values whose cumulative value is greater than 99.99% of the sum of all feature values. The kernel width parameter B of the kernel function can be calculated according to Equations (23) and (24), where B = 80 m in the KPCA, DKPCA, and LDKPCA models. Regarding the selection of principal components, the CPV is calculated according to Equation (19), and the number of principal components whose CPV exceeds 90% of the overall eigenvalues is selected.…”
Section: Fault Detection Results
confidence: 99%
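The cumulative percent variance (CPV) criterion described above retains the smallest number of principal components whose eigenvalues together exceed a fixed fraction (here 90%) of the total variance. A minimal sketch of that selection rule, assuming plain PCA on mean-centred data (the helper name `n_components_by_cpv` is hypothetical):

```python
import numpy as np

def n_components_by_cpv(X, cpv_threshold=0.90):
    """Return the number of principal components whose cumulative
    percent variance (CPV) first reaches the threshold."""
    Xc = X - X.mean(axis=0)                   # mean-centre each variable
    cov = np.cov(Xc, rowvar=False)
    eigvals = np.linalg.eigvalsh(cov)[::-1]   # eigenvalues, descending
    cpv = np.cumsum(eigvals) / eigvals.sum()  # cumulative variance ratio
    return int(np.searchsorted(cpv, cpv_threshold) + 1)

rng = np.random.default_rng(0)
t = rng.normal(size=500)
# one dominant direction plus small orthogonal noise
X = np.c_[10.0 * t, 0.1 * rng.normal(size=500)]
n_pc = n_components_by_cpv(X, 0.90)
```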
“…However, the local structure information in industrial process data is extremely valuable for feature extraction and data mining. In recent years, dimensionality reduction methods based on manifold learning have developed rapidly, and a large number of nonlinear algorithms, such as Laplacian eigenmaps (LE), [ 21 ] locally linear embedding (LLE), [ 22 ] isometric feature mapping (ISOMAP), [ 23 ] and locality preserving projections (LPP), [ 24,25 ] have been proposed and proven powerful for process monitoring, fault detection, and diagnosis. Among them, Jiang et al [ 26 ] proposed a semi‐supervised machine fault monitoring method based on LE, and Huang et al [ 27 ] applied the LE method to the fault diagnosis of a rolling bearing.…”
Section: Introduction
confidence: 99%
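Of the methods listed, LPP is the one underpinning the reviewed paper's LPR: it seeks a linear projection that keeps neighbouring samples close, by solving a generalized eigenproblem built from a k-nearest-neighbour graph Laplacian. A minimal sketch following the standard He–Niyogi formulation (function name `lpp` and the heat-kernel parameterization are illustrative assumptions, not the paper's exact implementation):

```python
import numpy as np
from scipy.linalg import eigh

def lpp(X, n_components=2, k=5, t=1.0):
    """Locality preserving projections: a linear map that keeps
    neighbouring samples close after projection."""
    n = X.shape[0]
    # pairwise squared distances and k-NN heat-kernel affinity weights
    d2 = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=-1)
    W = np.zeros((n, n))
    nbrs = np.argsort(d2, axis=1)[:, 1:k + 1]   # skip self at index 0
    for i in range(n):
        W[i, nbrs[i]] = np.exp(-d2[i, nbrs[i]] / t)
    W = np.maximum(W, W.T)                      # symmetrise the graph
    D = np.diag(W.sum(axis=1))
    L = D - W                                   # graph Laplacian
    # generalized eigenproblem: X^T L X a = lambda X^T D X a
    A = X.T @ L @ X
    B = X.T @ D @ X + 1e-9 * np.eye(X.shape[1])  # regularize for stability
    _, vecs = eigh(A, B)                         # ascending eigenvalues
    return vecs[:, :n_components]                # smallest-eigenvalue directions

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 4))
P = lpp(X, n_components=2)                       # 4-by-2 projection matrix
```

Projecting new data is then just `X_new @ P`, which is what makes LPP (unlike LE or LLE) applicable out-of-sample and hence attractive as a building block for regression.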
“…The complete list of variables is given in Table 1. 9,50 The TE process has one normal operating condition and 21 faulty operating conditions. 51 The information about these faults is summarized in Table 2.…”
Section: TE Process
confidence: 99%
“…Therefore, a large number of scholars have carried out a series of studies and applications of data-driven modelling methods. Among them, the most widely used are multivariate statistical regression methods, such as Gaussian process regression (GPR) and partial least squares regression [17][18][19]. Because these methods are simple, they are highly practicable, but they are also prone to errors when dealing with complex data, especially data contaminated with impurities.…”
Section: Introduction
confidence: 99%