2017
DOI: 10.1002/cjce.22962
Online incipient fault diagnosis based on Kullback‐Leibler divergence and recursive principle component analysis

Abstract: Fault detection and isolation (FDI) methods based on the principal component analysis (PCA) model have produced a large body of theoretical studies and applications, especially for complex, high-dimensional processes. However, Hotelling's T2, the most commonly used statistical distance, can fail to detect small shifts such as a sensor incipient fault with a low fault-to-noise ratio (FNR). Although an incipient fault develops slowly, it cannot be ignored and must be detected early e…

Cited by 20 publications (6 citation statements) · References 28 publications
“…The Kullback-Leibler Divergence (KLD), or the relative entropy, is a well-known probabilistic tool that has proven its worth in machine learning, neuroscience, pattern recognition [25], and anomaly detection [26,27]. KLD has already proved its efficiency for detecting incipient faults in several applications [28,29].…”
Section: Features Extraction For Fault Classificationmentioning
confidence: 99%
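For two Gaussian distributions, the KLD mentioned above has a closed form, which makes its sensitivity to small shifts easy to demonstrate. The sketch below is purely illustrative (the 0.2 offset, sample sizes, and function names are assumptions, not taken from the cited works): it compares a healthy reference signal with a copy carrying a small incipient offset.

```python
import numpy as np

def kld_gaussian(mu0, var0, mu1, var1):
    """Closed-form Kullback-Leibler divergence KL(N0 || N1)
    between two univariate Gaussian distributions."""
    return 0.5 * (np.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

rng = np.random.default_rng(0)
healthy = rng.normal(0.0, 1.0, 5000)   # fault-free reference signal
faulty = healthy + 0.2                 # incipient fault: small offset, low FNR

# Fit Gaussians to both signals and measure their divergence
kl = kld_gaussian(healthy.mean(), healthy.var(),
                  faulty.mean(), faulty.var())
print(kl)  # small but clearly non-zero divergence flags the drift
```

Even though a 0.2-sigma mean shift is hard to see in the raw samples, the divergence is strictly positive, which is the property the quoted works exploit for incipient fault detection.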
“…The principal component analysis (PCA), neural network, and support vector machine (SVM) represent the typical data-driven tools for sensor fault diagnosis. PCA methods [7]-[11] can be used to reduce data dimensionality and extract feature vectors. The PCA-fused algorithms can therefore provide solutions to feature extraction and fault identification for sensors.…”
Section: Introductionmentioning
confidence: 99%
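The dimension-reduction and monitoring role of PCA described above can be sketched in a few lines. This is a generic PCA/T2 outline under standard assumptions (centered data, SVD-based loadings), not the specific recursive algorithm of the indexed paper; all names are illustrative.

```python
import numpy as np

def fit_pca(X, n_components):
    """Fit a PCA model: center the data and keep the leading
    principal directions obtained from the SVD."""
    mean = X.mean(axis=0)
    U, s, Vt = np.linalg.svd(X - mean, full_matrices=False)
    P = Vt[:n_components].T                        # loading matrix (d x k)
    var = (s[:n_components] ** 2) / (len(X) - 1)   # score variances
    return mean, P, var

def hotelling_t2(x, mean, P, var):
    """Hotelling's T2 statistic of one sample in the retained subspace."""
    t = (x - mean) @ P           # project onto principal components
    return float(np.sum(t ** 2 / var))

# Synthetic 6-dimensional process driven by 2 latent variables
rng = np.random.default_rng(1)
latent = rng.normal(size=(500, 2))
X = latent @ rng.normal(size=(2, 6)) + 0.05 * rng.normal(size=(500, 6))

mean, P, var = fit_pca(X, 2)
print(hotelling_t2(X[0], mean, P, var))
```

Under normal operation the T2 values of training samples average close to the number of retained components, so an alarm threshold can be set from that in-control distribution; as the abstract notes, this statistic tends to miss shifts that are small relative to the noise.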
“…In addition, from the viewpoint of the probability density function (PDF), the Kullback-Leibler divergence-PCA method was implemented by Harmouche et al because of its high sensitivity to incipient faults. Chen et al and Chai et al also discussed the combination of KLD and PCA for incipient fault detection. Considering the extraction of early fault features from process noise, a two-step fault detection method was proposed by Ge et al, which applies wavelet analysis and optimal parity space analysis to distill the characteristics of small-amplitude changes.…”
Section: Introductionmentioning
confidence: 99%
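The KLD-on-PCA idea quoted above — comparing the distribution of a principal-component score in a current window against its fault-free reference — can be outlined as follows. This is a minimal sketch under a Gaussian-score assumption with an artificial linear drift; window lengths, the drift profile, and all names are illustrative, not the authors' exact recursive scheme.

```python
import numpy as np

def kld_gaussian(mu0, var0, mu1, var1):
    """KL(N0 || N1) between two univariate Gaussians (closed form)."""
    return 0.5 * (np.log(var1 / var0) + (var0 + (mu0 - mu1) ** 2) / var1 - 1.0)

def kld_on_score(score_ref, score_win):
    """KLD between the reference distribution of one principal-component
    score and its distribution over the current monitoring window."""
    return kld_gaussian(score_ref.mean(), score_ref.var(),
                        score_win.mean(), score_win.var())

rng = np.random.default_rng(2)
ref = rng.normal(0.0, 1.0, 2000)         # fault-free score history
drift = rng.normal(0.0, 1.0, 2000)
drift += np.linspace(0.0, 0.5, 2000)     # slowly developing incipient offset

early = kld_on_score(ref, drift[:200])   # fault barely present yet
late = kld_on_score(ref, drift[-200:])   # fault has developed
print(early, late)
```

The divergence grows as the incipient fault develops, which is why the quoted works report higher sensitivity than a raw T2 test for slowly drifting, low-FNR faults.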