2021
DOI: 10.1049/gtd2.12073

Efficient CNN‐XGBoost technique for classification of power transformer internal faults against various abnormal conditions

Abstract: To increase the classification accuracy of a protection scheme for power transformer, an effective convolution neural network (CNN) extreme gradient boosting (XGBoost) combination is proposed in this work. Data generated from various test cases are fed to one‐dimensional CNN for high‐level feature extraction. After that, an efficient classifier tool XGBoost is used to properly discriminate different transformer internal faults against outside abnormalities. A portion of an Indian power system is considered and…

Cited by 29 publications (9 citation statements)
References 32 publications
“…Table 4 (continued), columns Target / Reference / Method / Advantage: [91] Generation and identification model with GAN: generates samples to train the identification model. [92] Identification and correction method for drilling data: corrects the abnormal drilling data. [93] Extreme gradient boosting framework: …”
Section: Table 4 (Continued)
Confidence: 99%
“…To improve the classification accuracy of the power-transformer protection scheme, Raichura et al. [93] designed an extreme gradient boosting framework to distinguish external faults from internal anomalies. Moreover, a convolutional neural network (CNN) was employed to classify the faults.…”
Section: Generate the Substation Samples
Confidence: 99%
“…XGBoost is a machine learning algorithm based on boosting ensemble learning. It takes a series of tree models as base classifiers and recombines them with different weights to synthesise a strong classifier, realising the complementary advantages of the individual classifiers [22,23]. Compared with the Gradient Boosting Decision Tree (GBDT), the XGBoost algorithm performs a second-order Taylor expansion of the objective function and adds a regularisation term, which effectively reduces the risk of overfitting and improves both the accuracy and the running speed of the algorithm.…”
Section: Principle of XGBoost
Confidence: 99%
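The passage above refers to XGBoost's second-order (Newton) objective: with first-order gradients G and Hessian sums H per leaf, the regularised optimal leaf weight is w* = -G / (H + λ), and split gain uses the same ratio on each side. A from-scratch numpy sketch of this idea on depth-1 trees follows; the helper names (`fit_stump`, `boost`), the toy data, and the hyperparameters are illustrative assumptions, not the cited implementation:

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def fit_stump(x, g, h, lam=1.0):
    """Best threshold on a 1-D feature under the second-order objective:
    gain = 0.5*(GL^2/(HL+lam) + GR^2/(HR+lam) - G^2/(H+lam));
    returns (threshold, left leaf weight, right leaf weight)."""
    order = np.argsort(x)
    xs, gs, hs = x[order], g[order], h[order]
    G, H = gs.sum(), hs.sum()
    GL = HL = 0.0
    best, best_gain = None, -np.inf
    for i in range(len(xs) - 1):
        GL += gs[i]; HL += hs[i]
        GR, HR = G - GL, H - HL
        gain = 0.5 * (GL**2/(HL+lam) + GR**2/(HR+lam) - G**2/(H+lam))
        if gain > best_gain and xs[i] < xs[i + 1]:
            best_gain = gain
            # regularised Newton-step leaf weights w* = -G/(H+lam)
            best = ((xs[i] + xs[i+1]) / 2, -GL/(HL+lam), -GR/(HR+lam))
    return best

def boost(x, y, rounds=20, lr=0.3, lam=1.0):
    """Newton boosting for binary log-loss on decision stumps."""
    F = np.zeros_like(y, dtype=float)       # raw additive scores
    for _ in range(rounds):
        p = sigmoid(F)
        g = p - y                           # first-order gradient
        h = p * (1.0 - p)                   # second-order (Hessian) term
        t, wl, wr = fit_stump(x, g, h, lam)
        F += lr * np.where(x < t, wl, wr)
    return F

# toy separable data: class is the sign of the feature
rng = np.random.default_rng(0)
x = rng.normal(size=200)
y = (x > 0).astype(float)
F = boost(x, y)
acc = ((sigmoid(F) > 0.5) == y).mean()
print(acc)
```

The λ term in both the gain and the leaf weights is the "regularisation term" the quoted text mentions: it shrinks leaf weights and penalises splits, which is what curbs overfitting relative to plain GBDT.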
“…The main question concerning the threshold is how an appropriate value is chosen; most researchers have selected it according to the worst-case study. In this research we use no threshold value; classification and detection are instead performed by comparing the similarities between the query data and the training dataset. Other research projects such as [37–39] have also used ML for fault detection and classification, but the main contribution of this paper and the proposed algorithm is a new insight into an integrated ML system architecture, where a combination of two or more different ML models through a new system architecture can provide salient capabilities, as can be seen in [40]. We consider different ML models for detection and classification and connect them with an appropriate property, which results in ensemble learning and a more accurate protection model.…”
Section: Simulation Studies and Analysis
Confidence: 99%