2021
DOI: 10.1007/s10489-020-02106-3
Cost-sensitive probability for weighted voting in an ensemble model for multi-class classification problems

Abstract: Ensemble learning is an approach that combines various types of classification models and can improve the predictive performance of its component models. However, the effectiveness of the combination typically depends on the diversity and accuracy of the predictions of the ensemble members, and the problem of multi-class data is still encountered. In the proposed approach, cost-sensitive learning was implemented to evaluate the prediction accuracy for each class, which was used to construct a cost-…
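The abstract describes weighting base classifiers for voting according to per-class prediction accuracy. The snippet below is a minimal sketch of that general idea, not the paper's exact formulation: it assumes scikit-learn-style classifiers, derives each model's per-class weight from its class-wise true-positive rate on a held-out validation split, and combines predicted probabilities using those weights. The dataset, model choices, and the helper name `weighted_vote` are illustrative assumptions.

```python
# Sketch: per-class accuracy-weighted voting for a multi-class ensemble.
import numpy as np
from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier
from sklearn.naive_bayes import GaussianNB
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)
X_tr, X_val, y_tr, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

models = [DecisionTreeClassifier(random_state=0), GaussianNB(), KNeighborsClassifier()]
n_classes = len(np.unique(y))

# Per-class weight for each model: fraction of validation samples of class c
# that the model predicts correctly (its class-wise true-positive rate).
weights = np.zeros((len(models), n_classes))
for m, clf in enumerate(models):
    clf.fit(X_tr, y_tr)
    pred = clf.predict(X_val)
    for c in range(n_classes):
        mask = y_val == c
        weights[m, c] = (pred[mask] == c).mean() if mask.any() else 0.0

def weighted_vote(X_new):
    # Each model's predicted class probabilities are scaled by its per-class
    # weights; the class with the highest accumulated score wins the vote.
    scores = np.zeros((len(X_new), n_classes))
    for m, clf in enumerate(models):
        scores += clf.predict_proba(X_new) * weights[m]
    return scores.argmax(axis=1)

print("weighted-vote accuracy:", (weighted_vote(X_val) == y_val).mean())
```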

Citations: Cited by 12 publications (8 citation statements)
References: 79 publications (112 reference statements)
“…achieved through different methods. Rojarath and Songpan [119] addressed the issue of multi-class data. They proposed a cost-sensitivity matrix of the true positive (TP).…”
Section: Related Work
confidence: 99%
“…Kim et al [113] proposed a two-stage weighting method for voting in ensemble learning. Some works proposed different methods for voting in classification problems [114]- [119]. In the related works section details of voting works will be discussed.…”
Section: Introduction
confidence: 99%
“…The weights were obtained from the correct prediction rate of the classes, computed from the true positive values. The basic classifier techniques [38] are combined as shown in equations (6), (7), and (8).…”
Section: Multi-objective Optimization
confidence: 99%
“…Despite the numerous proposed weight assignment methods, finding the suitable weight configuration remains a challenging task. At present, the most common practice is to assign weights according to the prediction accuracy of the predictor [36, 37]. However, when the prediction accuracy gap between the predictors is too large, this method cannot guarantee the integrated results better than the results of a single predictor.…”
Section: Introduction
confidence: 99%