2021
DOI: 10.1109/access.2021.3135283

Fault Diagnosis of Oil-Immersed Power Transformer Based on Difference-Mutation Brain Storm Optimized Catboost Model

Abstract: To address the low accuracy of power transformer fault diagnosis, this study proposed a transformer fault diagnosis method based on a DBSO-CatBoost model. Building on data feature extraction, the method adopted the DBSO (Difference-mutation Brain Storm Optimization) algorithm to optimize the CatBoost model and diagnose faults. First, for data preprocessing, the ratio method was introduced to add features to the original data, and the SHAP (Shapley Additive Explanations) method was applied for feature extraction, an…
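
The abstract names three concrete preprocessing and modeling steps: ratio-derived features, SHAP-based feature extraction, and a CatBoost classifier optimized by DBSO. As a rough illustration of the first two steps (not the paper's code), the Python sketch below derives a few commonly used DGA gas ratios and ranks features by mean absolute SHAP value; the column names, the choice of ratios, the model settings, and the use of CatBoost's built-in SHAP computation are all assumptions.

```python
# A minimal sketch (not the paper's implementation) of the preprocessing the
# abstract describes: add gas-ratio features, then rank all features by mean
# |SHAP value|. Column names, the choice of ratios, the model settings, and
# the use of CatBoost's built-in SHAP computation are illustrative assumptions.
import numpy as np
import pandas as pd
from catboost import CatBoostClassifier, Pool

def add_ratio_features(df: pd.DataFrame) -> pd.DataFrame:
    """Append commonly used DGA gas ratios to the raw gas concentrations."""
    eps = 1e-9  # guard against division by zero
    out = df.copy()
    out["CH4/H2"] = df["CH4"] / (df["H2"] + eps)
    out["C2H2/C2H4"] = df["C2H2"] / (df["C2H4"] + eps)
    out["C2H4/C2H6"] = df["C2H4"] / (df["C2H6"] + eps)
    return out

def shap_importance(X: pd.DataFrame, y) -> pd.Series:
    """Rank features by mean |SHAP value| of a fitted CatBoost classifier."""
    model = CatBoostClassifier(iterations=300, depth=6, verbose=False)
    model.fit(X, y)
    sv = model.get_feature_importance(Pool(X, label=y), type="ShapValues")
    sv = np.abs(sv[..., :-1])  # drop the trailing expected-value column
    # Average over samples (and over classes too, in the multiclass case).
    imp = sv.mean(axis=tuple(range(sv.ndim - 1)))
    return pd.Series(imp, index=X.columns).sort_values(ascending=False)
```

Features with near-zero mean |SHAP| would then be candidates for removal before the DBSO-optimized model is trained.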

Cited by 17 publications (9 citation statements)
References 41 publications
“…In order to make the model more convincing, the GA-XGBoost diagnostic model proposed in Ref. 36 and the PSO-BiLSTM diagnostic model proposed in Ref.…”
Section: Results (mentioning)
Confidence: 99%
“…For the iterations, too small a value can lead to underfitting, resulting in inadequate model resolution, while too large a value can lead to overfitting, reducing the generalization ability of the model. In addition, the choice of depth is also important, as a wrong choice can affect the learning ability and classification capability of the model [41]. Therefore, in this study, an optimization algorithm was chosen to optimize the above four parameters to improve the performance of the classification model.…”
Section: Results (mentioning)
Confidence: 99%
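
As a concrete illustration of this tuning problem, the sketch below cross-validates a CatBoost classifier over iterations and depth, plus learning_rate and l2_leaf_reg as assumed third and fourth parameters (the exact four tuned in the citing study are not stated here). A plain grid search stands in for the optimization algorithm; the grid values are assumptions.

```python
# A minimal sketch of the tuning idea above: cross-validate CatBoost over
# iterations and depth (plus two assumed extra parameters), so that both
# under- and overfitting show up as lower validation accuracy.
from catboost import CatBoostClassifier
from sklearn.model_selection import GridSearchCV

param_grid = {
    "iterations": [100, 300, 600],   # too few -> underfit, too many -> overfit
    "depth": [4, 6, 8, 10],          # governs learning/classification capacity
    "learning_rate": [0.03, 0.1],    # assumed third tuned parameter
    "l2_leaf_reg": [1, 3, 9],        # assumed fourth tuned parameter
}

search = GridSearchCV(
    CatBoostClassifier(verbose=False),
    param_grid,
    cv=5,
    scoring="accuracy",
)
# search.fit(X_train, y_train)   # X_train, y_train: your DGA features/labels
# print(search.best_params_, search.best_score_)
```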
“…XGBoost was chosen because of its new regularization method that resists overfitting, thereby making the model more robust [31,50]. CatBoost's improved generalization, as well as its ability to capture high-order dependencies, makes it a viable candidate as well [51]. Last but not least, LightGBM is an attractive method to use as a classifier because of its capability in handling large-scale data.…”
Section: Discussion (mentioning)
Confidence: 99%
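
A cross-validated comparison like the one this discussion implies can be sketched as follows; the synthetic data, default model settings, and accuracy metric are placeholders, not the cited study's setup.

```python
# A minimal sketch comparing the three boosted-tree classifiers discussed
# above under identical cross-validation. Synthetic data and default model
# settings are placeholders for the cited study's actual setup.
from catboost import CatBoostClassifier
from lightgbm import LGBMClassifier
from sklearn.datasets import make_classification
from sklearn.model_selection import cross_val_score
from xgboost import XGBClassifier

X, y = make_classification(n_samples=500, n_features=12, n_informative=6,
                           n_classes=3, random_state=0)

models = {
    "XGBoost": XGBClassifier(),                     # regularized boosting
    "CatBoost": CatBoostClassifier(verbose=False),  # ordered boosting
    "LightGBM": LGBMClassifier(),                   # histogram-based, scales well
}

for name, model in models.items():
    scores = cross_val_score(model, X, y, cv=5, scoring="accuracy")
    print(f"{name}: {scores.mean():.3f} +/- {scores.std():.3f}")
```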