2021
DOI: 10.1016/j.asoc.2021.107212
Balancing accuracy and diversity in ensemble learning using a two-phase artificial bee colony approach

Cited by 16 publications (8 citation statements)
References 41 publications
“…As well as these advances in the machine learning area, which link well into meta-heuristic approaches, there is also scope for research into the combination of meta-heuristics with different optimisation algorithm approaches, leading to an overall more effective algorithm. In terms of algorithms research, the success of ensemble methods within the machine learning area (Abuassba et al, 2021; Shiue et al, 2021) indicates the potential of combining algorithms and models, with their diverse strengths and weaknesses, in optimisation applications (Han et al, 2020; Tóth et al, 2020; Ye et al, 2021).…”
Section: Discussion (mentioning)
confidence: 99%
“…For example, the KNN algorithm adapts well to data with unknown distributions, RF learns better from data with missing features and imbalanced samples, and deep learning models can better capture nonlinear relationships in the data. Therefore, according to the characteristics of the different algorithms, this paper selects seven commonly used algorithms as base learners: KNN, SVM, RF, and GBDT, as used in the article [6], plus the naive Bayes classifier (NBC) [9], linear discriminant analysis (LDA), and adaptive boosting (AdaBoost) [10]. The training and validation sets are split 5:5, and multiple differentiated sample combinations are generated by cross-validation.…”
Section: Building the Base Learner Selection Standard (mentioning)
confidence: 99%
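The base learner pool and the subset generation described in this statement can be sketched concretely. The following is a minimal sketch using scikit-learn with default hyperparameters (the cited paper does not specify them); make_splits is a hypothetical helper that reproduces the 5:5 split followed by cross-validation subset generation.

```python
from sklearn.neighbors import KNeighborsClassifier
from sklearn.svm import SVC
from sklearn.ensemble import (AdaBoostClassifier, GradientBoostingClassifier,
                              RandomForestClassifier)
from sklearn.naive_bayes import GaussianNB
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import KFold, train_test_split

# The seven base learners named in the quoted passage (hyperparameters are
# scikit-learn defaults; the paper does not specify them).
base_learners = {
    "KNN": KNeighborsClassifier(),
    "SVM": SVC(probability=True),
    "RF": RandomForestClassifier(),
    "GBDT": GradientBoostingClassifier(),
    "NBC": GaussianNB(),
    "LDA": LinearDiscriminantAnalysis(),
    "AdaBoost": AdaBoostClassifier(),
}

def make_splits(X, y, n_folds=5, seed=0):
    """Hypothetical helper: 5:5 train/validation split, then K-fold over the
    training half to generate differentiated sample combinations.
    Assumes X and y are NumPy arrays."""
    X_tr, X_val, y_tr, y_val = train_test_split(
        X, y, test_size=0.5, random_state=seed)
    kf = KFold(n_splits=n_folds, shuffle=True, random_state=seed)
    for train_idx, _ in kf.split(X_tr):
        yield X_tr[train_idx], y_tr[train_idx], X_val, y_val
```

Each yielded combination trains every base learner on a different sample of the training half while validating on the same held-out half, which is one way to realize the "differentiated sample combinations" the statement describes.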
“…Therefore, in the selective ensemble process, the main task is to determine an appropriate selection strategy, learner performance measures, and metrics for identifying the base learners, and subsets of base learners, that perform well. This paper proposes a genetic-algorithm-based method for determining the base learner subset [10]: the accuracy vector acc, time cost matrix t, and diversity metric q obtained after training the 7 base learners are stored as vectors ri (i = 1, 2, ..., 7), which form a matrix r = [r1, r2, ..., r7] representing the operating parameters of the different base learners. The selection vector of the base learners is then defined as sl = [1, 0, ..., 1], where a 1 in sl indicates that the corresponding base learner is selected to construct the ensemble learning model and a 0 means that it is not.…”
Section: Base Learner Selection Based on a Genetic Algorithm (mentioning)
confidence: 99%
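To make the selection mechanism concrete, here is a hypothetical sketch of how the binary selection vector sl could be evolved with a simple genetic algorithm. The fitness weighting of accuracy, time cost, and diversity below is illustrative, not the paper's actual formulation, and the operators (truncation selection, one-point crossover, bit-flip mutation) are standard textbook choices.

```python
import numpy as np

rng = np.random.default_rng(0)
N_LEARNERS = 7

def fitness(sl, acc, t, q, w=(1.0, 0.1, 0.5)):
    """Illustrative fitness: reward mean accuracy and diversity of the
    selected subset, penalize total time cost. Weights w are assumptions."""
    if sl.sum() == 0:
        return -np.inf  # an empty ensemble is invalid
    idx = sl.astype(bool)
    return w[0] * acc[idx].mean() - w[1] * t[idx].sum() + w[2] * q[idx].mean()

def ga_select(acc, t, q, pop_size=20, gens=50, p_mut=0.1):
    """Evolve binary selection vectors sl over the 7 base learners."""
    pop = rng.integers(0, 2, size=(pop_size, N_LEARNERS))
    for _ in range(gens):
        scores = np.array([fitness(ind, acc, t, q) for ind in pop])
        # truncation selection: keep the top half as parents
        parents = pop[np.argsort(scores)[-pop_size // 2:]]
        children = []
        for _ in range(pop_size - len(parents)):
            # one-point crossover between two random parents
            a, b = parents[rng.integers(len(parents), size=2)]
            cut = rng.integers(1, N_LEARNERS)
            child = np.concatenate([a[:cut], b[cut:]])
            # bit-flip mutation
            flip = rng.random(N_LEARNERS) < p_mut
            child[flip] ^= 1
            children.append(child)
        pop = np.vstack([parents, children])
    return pop[np.argmax([fitness(ind, acc, t, q) for ind in pop])]
```

Running ga_select on the measured acc, t, and q vectors returns a binary mask such as [1, 0, ..., 1], matching the sl encoding in the quoted passage.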
“…After training classifiers on these subsets, their predictions are combined by majority voting. AdaBoost maintains a set of weights over the original training set and adjusts these weights after each classifier is trained (Shiue et al, 2021). This article selects five representative techniques to balance the data: over-sampling, under-sampling, cost-sensitive learning, bagging, and AdaBoost.…”
Section: Framework for Extracting Product/Service Improvement Ideas (mentioning)
confidence: 99%
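The contrast between bagging's bootstrap-plus-majority-vote scheme and AdaBoost's example reweighting can be shown in a few lines of scikit-learn; the toy dataset and parameters below are assumptions for demonstration, not the cited article's setup.

```python
from sklearn.datasets import make_classification
from sklearn.ensemble import AdaBoostClassifier, BaggingClassifier
from sklearn.tree import DecisionTreeClassifier

# Imbalanced toy data (roughly 90% / 10% class split); purely illustrative.
X, y = make_classification(n_samples=500, weights=[0.9, 0.1], random_state=0)

# Bagging: each tree is trained on a bootstrap subset of the data and the
# ensemble predicts by majority voting, as described above.
bagging = BaggingClassifier(DecisionTreeClassifier(), n_estimators=25,
                            random_state=0).fit(X, y)

# AdaBoost: keeps a weight per training example and increases the weights of
# examples the previous classifier misclassified before training the next one.
boosting = AdaBoostClassifier(n_estimators=25, random_state=0).fit(X, y)

print("bagging:", bagging.score(X, y), "adaboost:", boosting.score(X, y))
```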