2022
DOI: 10.1109/access.2022.3215532

RFE Based Feature Selection and KNNOR Based Data Balancing for Electricity Theft Detection Using BiLSTM-LogitBoost Stacking Ensemble Model

Abstract: Obtaining outstanding electricity theft detection (ETD) performance in the realm of advanced metering infrastructure (AMI) and smart grids (SGs) is quite difficult due to various issues. These issues include the limited availability of theft data compared to benign data, the neglect of dimensionality reduction, the use of standalone (single) electricity theft detectors, etc. They lead classification techniques to low accuracy, low precision, low F1 score, and overfitting problems. For these reasons,…
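The title pairs recursive feature elimination (RFE) for dimensionality reduction with KNNOR-based balancing ahead of the BiLSTM-LogitBoost stacking ensemble. As a rough illustration of the RFE step only, the sketch below uses scikit-learn's RFE on a synthetic stand-in for smart-meter data; the base estimator and the number of retained features are assumptions, not the paper's settings.

```python
# Minimal sketch of RFE-based feature selection (not the paper's exact setup).
# X stands in for a (customers x consumption-features) matrix, y for binary
# theft labels; both are synthetic placeholders here.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import RFE

X, y = make_classification(n_samples=1000, n_features=50, n_informative=10,
                           weights=[0.9, 0.1], random_state=0)

# RFE repeatedly fits the estimator and drops the weakest features each round.
selector = RFE(estimator=RandomForestClassifier(n_estimators=100, random_state=0),
               n_features_to_select=20,  # assumed value, not from the paper
               step=5)
X_reduced = selector.fit_transform(X, y)
print("features kept:", selector.support_.sum())  # -> 20
```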

Cited by 12 publications (9 citation statements) | References 67 publications

Citation statements (ordered by relevance):
“…To counter this, various methods are used to balance the data, ensuring that the models can accurately detect instances of theft. Oversampling the minority class increases the number of theft cases in the dataset by either duplicating existing cases or creating new, synthetic examples; a popular technique for generating these synthetic examples is SMOTE (Synthetic Minority Over-sampling Technique), which creates new instances by interpolating between actual examples [33]. Undersampling the majority class, conversely, reduces the number of legitimate usage instances to match the number of theft cases, though it may result in the loss of some valuable data [34].…”
Section: SGCC Dataset and Exploration (mentioning)
confidence: 99%
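The quoted passage contrasts SMOTE oversampling with random undersampling; the paper itself uses KNNOR, a KNN-based oversampler, which is not shown here. A minimal sketch of the two techniques described in the quote, assuming the imbalanced-learn package and placeholder arrays for consumption features and theft labels:

```python
# Illustration of the two balancing strategies described above
# (SMOTE oversampling vs. random undersampling). Requires imbalanced-learn;
# X and y are synthetic placeholders, not the SGCC dataset.
from collections import Counter
from sklearn.datasets import make_classification
from imblearn.over_sampling import SMOTE
from imblearn.under_sampling import RandomUnderSampler

X, y = make_classification(n_samples=2000, n_features=20,
                           weights=[0.95, 0.05], random_state=0)
print("original:", Counter(y))

# Oversampling: synthesize new minority (theft) samples by interpolating
# between existing minority samples.
X_over, y_over = SMOTE(random_state=0).fit_resample(X, y)
print("after SMOTE:", Counter(y_over))

# Undersampling: discard majority (benign) samples down to the minority count.
X_under, y_under = RandomUnderSampler(random_state=0).fit_resample(X, y)
print("after undersampling:", Counter(y_under))
```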
“…Accuracy, precision, area under the receiver operating characteristic curve (AUC) score, F1 score, and recall are the performance metrics utilized in this paper to assess the performance of the proposed model [19]. These measures are the ones most frequently used in the ETD literature [46]-[49]. To calculate the aforementioned metrics, we need to calculate the confusion matrix.…”
Section: Performance Evaluation Measures (mentioning)
confidence: 99%
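A brief sketch of how those five metrics can be computed with scikit-learn, using placeholder label arrays: the confusion matrix yields TP, FP, FN, and TN, from which accuracy, precision, recall, and F1 follow, while AUC is computed from predicted probabilities rather than hard labels.

```python
# Computing the metrics mentioned in the quote from a confusion matrix.
# y_true / y_pred / y_score are placeholder arrays for illustration only.
import numpy as np
from sklearn.metrics import (confusion_matrix, accuracy_score, precision_score,
                             recall_score, f1_score, roc_auc_score)

y_true  = np.array([0, 0, 0, 1, 1, 0, 1, 0, 1, 0])            # 1 = theft
y_pred  = np.array([0, 0, 1, 1, 0, 0, 1, 0, 1, 0])            # hard predictions
y_score = np.array([.1, .2, .6, .9, .4, .3, .8, .2, .7, .1])  # predicted P(theft)

tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
print(f"TP={tp} FP={fp} FN={fn} TN={tn}")

print("accuracy :", accuracy_score(y_true, y_pred))    # (TP+TN) / all
print("precision:", precision_score(y_true, y_pred))   # TP / (TP+FP)
print("recall   :", recall_score(y_true, y_pred))      # TP / (TP+FN)
print("f1       :", f1_score(y_true, y_pred))          # harmonic mean of P and R
print("auc      :", roc_auc_score(y_true, y_score))    # uses scores, not labels
```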
“…When hyperparameters are tuned manually rather than through an automated procedure, the ML algorithm's performance suffers in terms of computational overhead, precision, F1 score, and accuracy [19]. In this study, we develop a novel electricity theft detector called SSA-GCAE-CSLSTM that makes use of users' EC history to address issues with existing electricity theft classification techniques. The detector is based on a gated recurrent unit (GRU), a convolutional autoencoder (CAE), salp swarm algorithm (SSA) optimization, and a cost-sensitive (class-weighted) long short-term memory network (CSLSTM).…”
Section: Introduction (mentioning)
confidence: 99%
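On the quote's opening point, automated hyperparameter search can replace manual tuning. A minimal sketch using scikit-learn's RandomizedSearchCV with a class-weighted random forest; the estimator and search space are illustrative assumptions, not the SSA-based optimization or the CSLSTM used in the citing work.

```python
# Automated hyperparameter tuning combined with class weighting.
# The estimator and search space are illustrative placeholders; the cited
# work optimizes a CSLSTM with salp swarm optimization, not reproduced here.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import RandomizedSearchCV

X, y = make_classification(n_samples=2000, n_features=30,
                           weights=[0.9, 0.1], random_state=0)

param_distributions = {
    "n_estimators": [100, 200, 400],
    "max_depth": [None, 10, 20],
    "min_samples_leaf": [1, 2, 4],
}

# F1 is used as the selection criterion because of the class imbalance.
search = RandomizedSearchCV(
    RandomForestClassifier(class_weight="balanced", random_state=0),
    param_distributions=param_distributions,
    n_iter=10, scoring="f1", cv=3, random_state=0)
search.fit(X, y)
print(search.best_params_, round(search.best_score_, 3))
```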
“…Among these, electricity theft is responsible for a substantial share of the loss [2]. The illegal use of energy, in any of its various forms, is referred to as stealing electricity, also known as electricity theft [3]. The principal cause of energy theft is tapping, which accounts for almost 80% of total non-technical losses (NTLs) [1].…”
Section: Introduction (mentioning)
confidence: 99%