2022
DOI: 10.1155/2022/7111034
An Information Entropy Embedding Feature Selection Based on Genetic Algorithm

Abstract: Feature selection is vital for reducing information redundancy and for handling the failure of basic classification approaches on massive datasets with too many features. To improve classification accuracy and decrease time complexity, an algorithm combining an intelligent-optimization genetic algorithm with information-entropy-based weight distribution, called EEGA, is proposed. The information entropy of features is defined as the population label in the GA rather than the direct iteration of indi…
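The abstract describes seeding a genetic algorithm with entropy-derived feature weights rather than purely random individuals. A minimal sketch of that idea, assuming a plain GA bitstring encoding (the function and parameter names here are illustrative assumptions, not the authors' code):

```python
# Hypothetical sketch: rank features by Shannon entropy, then bias the
# initial GA population toward the lower-entropy (lower-weight-cost)
# features instead of sampling bitstrings uniformly at random.
import random
from collections import Counter
from math import log2

def feature_entropy(column):
    """Shannon entropy H = -sum(p * log2 p) over one feature's observed values."""
    n = len(column)
    return -sum((c / n) * log2(c / n) for c in Counter(column).values())

def seed_population(data, pop_size, k):
    """Seed GA individuals (0/1 masks selecting k features),
    drawn from the 2*k best-ranked features by entropy."""
    entropies = [feature_entropy(col) for col in zip(*data)]
    ranked = sorted(range(len(entropies)), key=lambda i: entropies[i])
    population = []
    for _ in range(pop_size):
        chosen = set(random.sample(ranked[: 2 * k], k))
        population.append([1 if i in chosen else 0 for i in range(len(entropies))])
    return population
```

From here a standard GA loop (selection, crossover, mutation against a classifier-accuracy fitness) would evolve these entropy-seeded masks; the paper's exact weighting scheme is not recoverable from the truncated abstract.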

Cited by 3 publications (2 citation statements) · References 11 publications
“… 66 The larger the entropy of an attribute, the less information it contributes, because its assigned weight is lower (the process of decreasing that entropy is more costly). To calculate the entropy of a feature, the uncertainty function c is first defined as 67: c(p) = log2(1/p), where p denotes the associated probability. The information entropy E(C) is then obtained by weighting the uncertainty function by its probability: E(C) = Σ_{c∈C} p(c) · log2(1/p(c)), where C denotes the set of all possible events.…”
Section: Methods
confidence: 99%
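The two quoted definitions can be sketched directly; this is a minimal illustration of standard Shannon entropy as the quote defines it, not the citing paper's implementation:

```python
# Uncertainty of an outcome with probability p: c(p) = log2(1/p).
# Entropy E(C): the probability-weighted sum of c(p) over all outcomes.
from math import log2

def uncertainty(p):
    """Self-information c(p) = log2(1/p) of an outcome with probability p."""
    return log2(1.0 / p)

def entropy(probabilities):
    """E(C) = sum of p * c(p); zero-probability events contribute nothing."""
    return sum(p * uncertainty(p) for p in probabilities if p > 0)
```

For a uniform two-outcome distribution this gives 1 bit, matching the intuition that higher-entropy attributes receive lower weight in the scheme described above.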
“…Security and Communication Networks has retracted the article titled "An Information Entropy Embedding Feature Selection Based on Genetic Algorithm" [1] due to concerns that the peer review process has been compromised. Following an investigation conducted by the Hindawi Research Integrity team [2], significant concerns were identified with the peer reviewers assigned to this article; the investigation concluded that the peer review process was compromised.…”
confidence: 99%