2021
DOI: 10.3389/fenrg.2021.755649

Incipient Chiller Fault Diagnosis Using an Optimized Least Squares Support Vector Machine With Gravitational Search Algorithm

Abstract: Operational faults in centrifugal chillers will lead to high energy consumption, poor indoor thermal comfort, and low operational safety, and thus it is of significance to detect and diagnose the anomalies timely and effectively, especially for those at their incipient stages. The least squares support vector machine (LSSVM) has been regarded as an effective algorithm for multiclass classification. One of the most difficult issues in LSSVM is parameter tuning. Therefore, this paper reports a development of a g…
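The core idea stated in the abstract is to train an LSSVM classifier whose two hyperparameters (the regularization weight and the RBF kernel width) are tuned by a gravitational search algorithm (GSA). The sketch below is a minimal, illustrative reconstruction of that idea and not the authors' implementation: the parameter names gamma and sigma, the search bounds, the simplified GSA update rules, and the synthetic stand-in data are all assumptions made for the example; the paper's preprocessing, multiclass handling, and full GSA details are not reproduced.

```python
import numpy as np

# --- Minimal LSSVM with an RBF kernel (binary labels in {-1, +1}) ---

def rbf_kernel(A, B, sigma):
    """Gaussian kernel matrix between the rows of A and B."""
    d2 = (np.sum(A**2, axis=1)[:, None]
          + np.sum(B**2, axis=1)[None, :]
          - 2.0 * A @ B.T)
    return np.exp(-d2 / (2.0 * sigma**2))

def lssvm_train(X, y, gamma, sigma):
    """Solve the LSSVM dual system [[0, y^T], [y, Omega + I/gamma]] [b; alpha] = [0; 1]."""
    n = len(y)
    Omega = np.outer(y, y) * rbf_kernel(X, X, sigma)
    A = np.zeros((n + 1, n + 1))
    A[0, 1:] = y
    A[1:, 0] = y
    A[1:, 1:] = Omega + np.eye(n) / gamma
    rhs = np.concatenate(([0.0], np.ones(n)))
    sol = np.linalg.solve(A, rhs)
    return sol[0], sol[1:]  # bias b, support values alpha

def lssvm_predict(X_tr, y_tr, b, alpha, sigma, X_te):
    """Decision rule: sign( sum_i alpha_i * y_i * K(x, x_i) + b )."""
    K = rbf_kernel(X_te, X_tr, sigma)
    return np.sign(K @ (alpha * y_tr) + b)

# --- Simplified gravitational search over (log10 gamma, log10 sigma) ---

def gsa_minimize(fitness, lo, hi, n_agents=8, n_iter=25, G0=100.0, seed=0):
    """Toy GSA: fitter agents get larger masses and attract the others;
    the gravitational constant G decays over the iterations."""
    rng = np.random.default_rng(seed)
    lo, hi = np.asarray(lo, float), np.asarray(hi, float)
    pos = rng.uniform(lo, hi, size=(n_agents, len(lo)))
    vel = np.zeros_like(pos)
    best_x, best_f = None, np.inf
    for t in range(n_iter):
        f = np.array([fitness(p) for p in pos])
        if f.min() < best_f:
            best_f, best_x = f.min(), pos[f.argmin()].copy()
        m = (f.max() - f) / (f.max() - f.min() + 1e-12)  # best agent has mass 1
        M = m / (m.sum() + 1e-12)
        G = G0 * np.exp(-20.0 * t / n_iter)
        acc = np.zeros_like(pos)
        for i in range(n_agents):
            for j in range(n_agents):
                if i == j:
                    continue
                diff = pos[j] - pos[i]
                acc[i] += rng.random() * G * M[j] * diff / (np.linalg.norm(diff) + 1e-12)
        vel = rng.random(pos.shape) * vel + acc
        pos = np.clip(pos + vel, lo, hi)
    return best_x, best_f

# --- Usage on synthetic data (a stand-in for real chiller features / fault labels) ---
rng = np.random.default_rng(1)
X = rng.normal(size=(120, 4))
y = np.where(X[:, 0] + 0.5 * X[:, 1] > 0, 1.0, -1.0)
X_tr, y_tr, X_va, y_va = X[:80], y[:80], X[80:], y[80:]

def val_error(params):
    gamma, sigma = 10.0 ** params[0], 10.0 ** params[1]
    b, alpha = lssvm_train(X_tr, y_tr, gamma, sigma)
    pred = lssvm_predict(X_tr, y_tr, b, alpha, sigma, X_va)
    return np.mean(pred != y_va)

best, err = gsa_minimize(val_error, lo=[-2, -1], hi=[3, 1])
print(f"best (log10 gamma, log10 sigma) = {best}, validation error = {err:.3f}")
```

In practice the synthetic arrays would be replaced by measured chiller features and fault labels, and the validation error by the cross-validated diagnosis error the optimizer is meant to minimize.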

Cited by: 11 publications (2 citation statements)
References: 44 publications
“…As depicted in Figure 11, the Relief-RFECV feature subset achieves higher accuracy than other feature sets in the 1DCNN-BIGRU, KNN, RF, and SVM fault models, thus confirming the superiority of the feature selection algorithm. In addition, the Relief-RFECV-SVM method was compared with other proposed fault diagnosis methods (Bayesian network merged distance rejection (DR-BN) [40], kernel principal component analysis-least squares support vector machine-gravitational search algorithm (KPCA-LSSVM-GSA) [41], random forest-global sensitivity analysis-cascade feature cleaning and supplement (RF-GSA-CFCS) [21], one-dimensional convolutional neural network-bidirectional gated recurrent unit (1DCNN-BIGRU) [42], and 1DCNN [18]). As shown in Figure 12 and Table 11, the proposed method achieved higher fault…”
Section: Analysis of Comparative Results (mentioning)
confidence: 99%
“…As for algorithm selection in model construction, in recent years, based on the large amount of data generated by power-system operation and the development of artificial intelligence algorithms, traditional line-loss calculation methods have gradually given way to intelligent processing algorithms represented by artificial neural networks (Zhang et al., 2018). The least squares support vector machine (LSSVM) can be used for both classification and prediction (Chen et al., 2021; Xia et al., 2021). The support vector regression (SVR) improves on the LSSVM and significantly increases the speed of operation (Liu et al., 2019).…”
Section: Literature Review (mentioning)
confidence: 99%