2019
DOI: 10.9781/ijimai.2018.06.005
A Diversity-Accuracy Measure for Homogenous Ensemble Selection

Abstract: Several selection methods in the literature are essentially based on an evaluation function that determines whether a model M contributes positively to the performance of the whole ensemble. In this paper, we propose a method called DIversity and ACcuracy for Ensemble Selection (DIACES), using an evaluation function based on both diversity and accuracy. The method is applied to homogenous ensembles composed of C4.5 decision trees and follows a hill-climbing strategy. This allows selecting ensembles with…
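The abstract describes a greedy hill-climbing selection over an ensemble, scored on both diversity and accuracy. The paper's actual evaluation function is not reproduced here; the sketch below uses a hypothetical weighted sum of mean member accuracy and mean pairwise disagreement (`ensemble_score` with weight `alpha`) purely to illustrate the selection loop, not the DIACES measure itself.

```python
def accuracy(preds, labels):
    # Fraction of correct predictions for one model.
    return sum(p == y for p, y in zip(preds, labels)) / len(labels)

def disagreement(preds_a, preds_b):
    # Pairwise disagreement rate: a simple diversity proxy.
    return sum(a != b for a, b in zip(preds_a, preds_b)) / len(preds_a)

def ensemble_score(subset, labels, alpha=0.5):
    # Hypothetical evaluation function: weighted sum of mean member
    # accuracy and mean pairwise diversity (not the paper's measure).
    acc = sum(accuracy(p, labels) for p in subset) / len(subset)
    if len(subset) < 2:
        return acc
    pairs = [(a, b) for i, a in enumerate(subset) for b in subset[i + 1:]]
    div = sum(disagreement(a, b) for a, b in pairs) / len(pairs)
    return alpha * acc + (1 - alpha) * div

def hill_climb_select(pool, labels):
    # Greedy forward hill climbing: start from the single most accurate
    # model, then add a candidate only if it improves the combined score.
    best = max(pool, key=lambda p: accuracy(p, labels))
    selected = [best]
    remaining = [p for p in pool if p is not best]
    improved = True
    while improved and remaining:
        improved = False
        current = ensemble_score(selected, labels)
        for cand in list(remaining):
            if ensemble_score(selected + [cand], labels) > current:
                selected.append(cand)
                remaining.remove(cand)
                improved = True
                break
    return selected
```

With forward selection of this kind, a DIACES-style method trades off a candidate's individual accuracy against how differently it errs from the models already selected.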

Cited by 8 publications (4 citation statements)
References 29 publications
“…In this study, the base learner of the ensemble algorithms was the regression tree model, and thus, the ensemble was homogeneous. Since the diversity of the base learners has a great effect on the performance of an ensemble algorithm, heterogeneous ensemble algorithms (e.g., stacking) that integrate different base learners could further improve the accuracy of estimated AGB, and this technique will be considered in future studies [104][105][106].…”
Section: Discussion
confidence: 99%
“…The RFR method uses an ensemble of decision trees, which usually vote or are averaged to obtain the final result (Zouggar and Adla, 2019; Rastgou et al., 2020). RFR does not easily fall into overfitting and is more robust to noise than other methods, owing to the randomness it introduces.…”
Section: RFR Methods for Estimating Winter Wheat Yield
confidence: 99%
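The excerpt above notes that RFR aggregates an ensemble of decision trees by voting or averaging. A minimal sketch of that aggregation, with `fit_stump` standing in as a hypothetical randomized one-split "tree" trained on a bootstrap sample:

```python
import random

def fit_stump(xs, ys, rng):
    # A tiny randomized regression "tree": one threshold split chosen
    # from a bootstrap sample, predicting the mean of each side.
    idx = [rng.randrange(len(xs)) for _ in xs]           # bootstrap resample
    bx, by = [xs[i] for i in idx], [ys[i] for i in idx]
    t = rng.choice(bx)                                   # random split point
    left = [y for x, y in zip(bx, by) if x <= t] or by   # guard empty side
    right = [y for x, y in zip(bx, by) if x > t] or by
    lm, rm = sum(left) / len(left), sum(right) / len(right)
    return lambda x: lm if x <= t else rm

def forest_predict(trees, x):
    # RFR-style aggregation: average the individual trees' predictions.
    return sum(t(x) for t in trees) / len(trees)
```

Each stump is a weak, high-variance learner; averaging many of them smooths that variance out, which is the robustness the excerpt refers to.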
“…The decision tree is the most popular machine-learning-based classification algorithm, first introduced by Quinlan (Quinlan, 1993). It is a tree data structure in which each non-leaf/internal node symbolises a test on the variables, each external/leaf node contains the class label, and each branch shows the result of the test (Kaur and Gosain, 2018; Zouggar and Adla, 2019). As shown in Figure 3, the decision tree divides the whole dataset into mutually exclusive spaces.…”
Section: Decision Tree
confidence: 99%
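The structure described in the excerpt (internal nodes as variable tests, branches as test outcomes, leaves as class labels) can be sketched with a small hand-built tree; the `outlook` and `windy` variables and their labels are invented for illustration:

```python
# Internal nodes hold a test on a variable; leaves hold a class label;
# each branch is one outcome of the test (hypothetical toy tree).
TREE = {
    "test": ("outlook", "sunny"),        # test: is outlook == "sunny"?
    "yes": {"label": "play"},            # external (leaf) node
    "no": {
        "test": ("windy", True),
        "yes": {"label": "stay home"},
        "no": {"label": "play"},
    },
}

def classify(node, sample):
    # Walk from the root, following the branch that matches each test's
    # outcome, until an external (leaf) node supplies the class label.
    while "label" not in node:
        var, value = node["test"]
        node = node["yes"] if sample[var] == value else node["no"]
    return node["label"]
```

Every path from root to leaf corresponds to one of the mutually exclusive regions of the input space mentioned in the excerpt.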