2020
DOI: 10.1155/2020/8860044
Statistical Analysis of the Performance of Rank Fusion Methods Applied to a Homogeneous Ensemble Feature Ranking

Abstract: Feature ranking, a subcategory of feature selection, is an essential preprocessing technique that ranks all features of a dataset so that the most important features convey the most information. Ensemble learning has two advantages. First, it rests on the assumption that combining the outputs of different models can lead to a better outcome than the output of any individual model. Second, scalability is an intrinsic characteristic that is crucial for coping with large-scale datasets. In this p…

Cited by 3 publications (6 citation statements) | References 67 publications
“…The DEIM framework can utilize an arbitrary feature ranking algorithm, such as Fisher, and Gini Index [21], as the base learner in phase three. Nevertheless, algorithms that belong to the filter category would be better than others because they have lower computational costs and more generality [22].…”
Section: Base Feature Ranking Methods
confidence: 99%
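The base learners named in the quote above (Fisher score, Gini Index) are filter-style rankers that score each feature from class statistics alone. As a rough illustration of how such a ranker works, here is a minimal Fisher-score sketch in Python; the function name and toy data are my own and are not taken from the cited DEIM framework:

```python
import numpy as np

def fisher_score(X, y):
    """Fisher score per feature: between-class scatter divided by
    within-class scatter; higher scores mark more discriminative features."""
    classes = np.unique(y)
    mu = X.mean(axis=0)                      # overall mean per feature
    num = np.zeros(X.shape[1])
    den = np.zeros(X.shape[1])
    for c in classes:
        Xc = X[y == c]
        num += len(Xc) * (Xc.mean(axis=0) - mu) ** 2   # between-class scatter
        den += len(Xc) * Xc.var(axis=0)                # within-class scatter
    return num / den

# Toy data: feature 0 separates the classes, feature 1 is pure noise.
X = np.array([[0.0, 1.0], [0.1, 0.0], [1.0, 1.0], [1.1, 0.0]])
y = np.array([0, 0, 1, 1])
scores = fisher_score(X, y)
ranking = np.argsort(-scores)   # feature indices, best first
```

Because the score depends only on simple per-class statistics, it needs no learning model, which is exactly why the quote argues filter methods are cheaper and more general.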
“…This capability leads to lower execution time than the traditional version, whereas the outcome was not destroyed. Moreover, the author's other work investigated the various rank combination methods in an ensemble feature selection approach [21]. Their proposed method focused on scalability and combining feature rankings, whereas coping with imbalanced datasets was not investigated.…”
Section: Related Work
confidence: 99%
“…The noticeable point is that computing the individual entropy and joint entropy would be feasible by generating the joint value histogram vector. Therefore, if the joint value histogram of two features X_i and X_j is represented as a square matrix F ∈ N^(b×b) such that f_{i,j} is equal to the frequency of joint values (x_i, x_j), the joint and individual entropies will be computed based on equations (11) to (13). Therefore, acquiring the joint value histogram of two features, the MI measure between them can be computed in a single pass.…”
Section: A Similarity Matrix In Theory
confidence: 99%
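The single-pass mutual-information computation described in the quote above can be sketched as follows. This is an illustrative reconstruction, not the authors' code: it assumes two features already discretized into b integer bins, builds the b×b joint value histogram in one pass over the data, and derives all three entropies from that one histogram:

```python
import numpy as np

def mutual_information(x, y, b):
    """MI between two discretized features (integer bins 0..b-1),
    computed from a single b x b joint value histogram."""
    # One pass over the data: f[i, j] counts occurrences of (x=i, y=j).
    F = np.zeros((b, b))
    np.add.at(F, (x, y), 1)
    p_xy = F / F.sum()        # joint distribution
    p_x = p_xy.sum(axis=1)    # marginal of x (row sums)
    p_y = p_xy.sum(axis=0)    # marginal of y (column sums)

    def H(p):
        p = p[p > 0]          # skip empty cells to avoid log(0)
        return -np.sum(p * np.log2(p))

    # MI = H(X) + H(Y) - H(X, Y), all from the same histogram.
    return H(p_x) + H(p_y) - H(p_xy.ravel())

x = np.array([0, 0, 1, 1])
y = np.array([0, 1, 0, 1])
print(mutual_information(x, y, 2))  # → 0.0 (x tells nothing about y)
print(mutual_information(x, x, 2))  # → 1.0 (identical features share full entropy)
```

Since the histogram is the only quantity that touches the raw data, the MI for any feature pair costs one scan plus O(b²) arithmetic, which matches the single-pass claim in the quote.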
“…According to dependency on a classification model, FS algorithms can be categorized into three groups: wrapper methods [8], [9], embedded methods [10], and filter methods [11]. Filter methods only rest on data's statistical properties, and since they are independent of any learning model, they prevent incurring a high computational cost and provide more generality than two other categories [12].…”
Section: Introduction
confidence: 99%