2019
DOI: 10.1007/s00521-019-04082-3

Ensemble feature selection for high-dimensional data: a stability analysis across multiple domains

Abstract: Selecting a subset of relevant features is crucial to the analysis of high-dimensional datasets coming from a number of application domains, such as biomedical data, document and image analysis. Since no single selection algorithm seems to be capable of ensuring optimal results in terms of both predictive performance and stability (i.e. robustness to changes in the input data), researchers have increasingly explored the effectiveness of "ensemble" approaches involving the combination of different selectors. …
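The abstract's two central ideas, combining several selectors and measuring stability as robustness to changes in the input data, can be illustrated with a short sketch. This is a minimal illustration, not the paper's actual protocol: the scikit-learn filters used (f_classif, mutual_info_classif, chi2), the mean-rank aggregation, and the Jaccard-based stability estimate over bootstrap resamples are all assumptions made for the example.

# Minimal sketch (assumed details, not the paper's protocol): a heterogeneous
# ensemble that aggregates rankings from several filter selectors, with a
# simple Jaccard-based stability estimate over bootstrap resamples.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import chi2, f_classif, mutual_info_classif
from sklearn.preprocessing import MinMaxScaler

def ensemble_select(X, y, score_funcs, k):
    """Mean-rank aggregation across selectors; returns the top-k feature indices."""
    ranks = []
    for score in score_funcs:
        s = score(X, y)
        s = s[0] if isinstance(s, tuple) else s      # f_classif/chi2 return (scores, p-values)
        ranks.append(np.argsort(np.argsort(-s)))     # rank 0 = most relevant feature
    mean_rank = np.vstack(ranks).mean(axis=0)
    return set(np.argsort(mean_rank)[:k])

def stability(subsets):
    """Average pairwise Jaccard similarity between selected feature subsets."""
    sims = [len(a & b) / len(a | b)
            for i, a in enumerate(subsets) for b in subsets[i + 1:]]
    return float(np.mean(sims))

X, y = make_classification(n_samples=200, n_features=500, n_informative=15, random_state=0)
X = MinMaxScaler().fit_transform(X)                  # chi2 requires non-negative inputs
selectors = [f_classif, mutual_info_classif, chi2]

rng = np.random.default_rng(0)
subsets = []
for _ in range(10):                                  # perturb the data to probe stability
    idx = rng.choice(len(X), size=len(X), replace=True)
    subsets.append(ensemble_select(X[idx], y[idx], selectors, k=20))
print("mean Jaccard stability:", round(stability(subsets), 3))

A stability value near 1 means the ensemble keeps picking the same features when the training records change, which is the robustness property the abstract refers to.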

Cited by 162 publications (153 citation statements)
References 64 publications
“…the weak learners, which are trained and combined together in order to solve specific problems. Indeed, this strategy relies on the idea that such a combination of several single (weak) models, in particular if associated with a proper feature selection step [16], can lead to an improvement of the final accuracy.…”
Section: Ensemble Learning
confidence: 99%
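A tiny sketch of the idea in this excerpt, assuming scikit-learn as the toolkit: shallow decision trees act as the weak learners and are combined by bagging, with a univariate feature selection step placed before the ensemble. The synthetic data, the choice of SelectKBest with f_classif, and all parameter values are illustrative assumptions, not components reported in the cited work.

# Sketch only (assumed components): weak learners combined by bagging,
# preceded by a filter feature-selection step in a single pipeline.
from sklearn.datasets import make_classification
from sklearn.ensemble import BaggingClassifier
from sklearn.feature_selection import SelectKBest, f_classif
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=300, n_features=200, n_informative=10, random_state=0)

model = make_pipeline(
    SelectKBest(f_classif, k=20),                           # proper feature selection step
    BaggingClassifier(DecisionTreeClassifier(max_depth=2),  # shallow trees as weak learners
                      n_estimators=25, random_state=0),
)
print("5-fold CV accuracy:", cross_val_score(model, X, y, cv=5).mean().round(3))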
“…Since then, much research work on ensemble feature selection has been published. We can categorize most of the published work into four classes: (i) work that focuses on proving that using ensemble filter feature selection models can improve the performance of machine learning rather than using single feature selection methods [ [18], and (iv) research work that focuses on providing reviews and surveys for existing approaches and applications of ensemble filter feature selection models [19].…”
Section: Related Work
confidence: 99%
“…Experimental evaluations indicated that their proposed ensemble model is an efficient method, and it outperforms individual filter-based feature selection methods on sentiment classification. Similarly, an empirical exploration of the effectiveness of the homogeneous ensemble approach has been studied [18]. To construct the model, a single feature selection algorithm is applied to several diversified datasets derived from the original set of records.…”
Section: Related Work
confidence: 99%
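To make the construction described in that excerpt concrete, here is a minimal sketch of a homogeneous ensemble under assumed details: a single selector (the ANOVA F-score from scikit-learn) is applied to several bootstrap-diversified versions of the data, and the per-run rankings are aggregated by summed rank. The data, the selector, and the aggregation rule are assumptions for illustration only.

# Sketch (assumed details): one feature selection algorithm applied to several
# diversified datasets derived from the original records, then rank-aggregated.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.feature_selection import f_classif

X, y = make_classification(n_samples=200, n_features=300, n_informative=10, random_state=1)
rng = np.random.default_rng(1)

rank_sum = np.zeros(X.shape[1])
for _ in range(20):                                        # 20 bootstrap-diversified datasets
    idx = rng.choice(len(X), size=len(X), replace=True)
    scores, _ = f_classif(X[idx], y[idx])                  # the same selector on each dataset
    rank_sum += np.argsort(np.argsort(-scores))            # rank 0 = most relevant

top_k = np.argsort(rank_sum)[:15]                          # aggregate: best summed rank wins
print("selected feature indices:", sorted(top_k.tolist()))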
“…genes to reduce the number of features and dimensions. [10][11][12] A strength of our work is that we consider ML as a powerful advanced statistics tool performing heavy statistical analyses that people themselves cannot do. As a result, we gave all the data corresponding to the WES as feature inputs to the ML at once and it returned almost perfect results quickly and precisely.…”
Section: Feature Selection
confidence: 99%