2020
DOI: 10.35940/ijrte.e4993.038620

Measuring Success of Heterogeneous Ensemble Filter Feature Selection Models

Noureldien A. Noureldien,
Einas A. Mohammed

Abstract: One problem in utilizing ensemble feature selection models in machine learning is that there is no guarantee that an ensemble model will improve classification performance. This implies that different ensemble models have different success probabilities, i.e. different probabilities of improving the performance of machine learning. This paper introduces the concept of success probability for heterogeneous ensemble models and states the definitions, notations, and algorithms necessary…
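The abstract concerns heterogeneous ensembles of filter feature selection methods, i.e. combining the outputs of several different filters into a single feature subset. As a purely illustrative sketch (not the paper's success-probability algorithm), the Python snippet below aggregates three standard filters by mean rank; the choice of filters, dataset, and top_k are assumptions made for the example.

import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.feature_selection import chi2, f_classif, mutual_info_classif

# Toy dataset; all features are non-negative, as chi2 requires.
X, y = load_breast_cancer(return_X_y=True)

# Heterogeneous ensemble: three different filter scorers (assumed choices).
filters = {
    "chi2": lambda X, y: chi2(X, y)[0],
    "anova_f": lambda X, y: f_classif(X, y)[0],
    "mutual_info": lambda X, y: mutual_info_classif(X, y, random_state=0),
}

# Convert each filter's scores into ranks (0 = most relevant) and average them.
all_ranks = []
for name, score_fn in filters.items():
    scores = score_fn(X, y)
    order = np.argsort(-scores)            # feature indices, best first
    ranks = np.empty_like(order)
    ranks[order] = np.arange(order.size)   # rank position of each feature
    all_ranks.append(ranks)

mean_rank = np.mean(all_ranks, axis=0)

top_k = 10                                 # assumed subset size
selected = np.argsort(mean_rank)[:top_k]
print("Selected feature indices:", selected)

Whether such an ensemble actually beats its best individual filter is exactly the question the paper frames as the ensemble's success probability.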

Cited by 1 publication (2 citation statements)
References 15 publications
“…Ensemble Feature Selection Techniques. Recently, researchers have paid more attention to ensemble feature selection techniques, proposing frameworks that combine multiple feature selection methods and aggregate their several outcomes into a single one; in machine learning, this combination is called ensemble learning [3][2][26]. However, the researchers assumed that using such a technique will provide more accurate and stable results than those produced by a single feature selection method, as it generates and aggregates different perspectives about the relevant features [3][32]. Compared to single-based learning, the authors in the literature emphasised that ensemble learning is a good tool for discovering hidden knowledge related to the important features.…”
Section: (iii) Sample Weighting (mentioning)
confidence: 99%
“…However, the heterogeneous ensemble technique is a good approach for evaluating the individual-based selectors' strengths and weaknesses [36]. Recent studies using this approach to tackle the stability issue can be found in [32][36][31].…”
Section: (i) Data Diversity (Homogeneous Approach) (mentioning)
confidence: 99%