2015
DOI: 10.1016/j.neucom.2014.07.063

Dynamic ensemble pruning based on multi-label classification

Cited by 30 publications (6 citation statements)
References 30 publications (57 reference statements)
“…In the past years, an additional intermediate step, consisting in selecting only some models before combining them, has been studied in depth [9]. The goal of this process, named ensemble pruning [10,11,12] or ensemble selection [13,14,15], depending on the author, is to improve an ensemble's accuracy and reduce its complexity by using only a subset of the models instead of all of them [16]. Ensemble model selection can be further divided into two phases: 1) defining a function or criterion for evaluating/ranking the models, and 2) using a search algorithm to find the best group of models.…”
Section: Introduction
confidence: 99%
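The excerpt above describes ensemble pruning as a two-phase process: first rank the base models with an evaluation criterion, then search for the best subset. Below is a minimal, illustrative sketch of that idea in Python with scikit-learn, using held-out validation accuracy as an assumed ranking criterion and a greedy forward search; it is not the specific algorithm of the cited paper or of the paper indexed on this page.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1000, random_state=0)
X_train, X_val, y_train, y_val = train_test_split(X, y, random_state=0)

# Pool of base models, each trained on a bootstrap sample of the training set.
rng = np.random.RandomState(0)
pool = []
for _ in range(20):
    idx = rng.choice(len(X_train), size=len(X_train), replace=True)
    pool.append(DecisionTreeClassifier(random_state=0).fit(X_train[idx], y_train[idx]))

# Phase 1: evaluate/rank every base model with a criterion
# (here, plain accuracy on a held-out validation set).
scores = [accuracy_score(y_val, m.predict(X_val)) for m in pool]
ranking = np.argsort(scores)[::-1]

# Phase 2: a simple greedy forward search over the ranking -- keep adding
# models while the majority vote of the selected subset improves.
def vote_accuracy(indices):
    votes = np.mean([pool[i].predict(X_val) for i in indices], axis=0)
    return accuracy_score(y_val, (votes >= 0.5).astype(int))

selected = [int(ranking[0])]
best = vote_accuracy(selected)
for i in ranking[1:]:
    score = vote_accuracy(selected + [int(i)])
    if score > best:
        selected.append(int(i))
        best = score

print(f"kept {len(selected)} of {len(pool)} models, validation accuracy {best:.3f}")
```

The pruned subset is then used in place of the full pool at prediction time; other ranking criteria (e.g. diversity-aware measures) and search strategies (e.g. genetic or backward search) slot into the same two-phase template.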
“…Zhou et al 32 considered the interactive functions between features and designed an online streaming feature selection method based on feature interaction. Multi-label learning studies the problem in which each object is associated with multiple categories simultaneously. 33–35 For multi-label data, Liu et al 36 combined online group selection with online inter-group selection, and designed a criterion in the group-selection step to select feature groups that are important to the label set.…”
Section: Introduction
confidence: 99%
“…Multi-label learning studies the problem in which each object is associated with multiple categories simultaneously. [33][34][35] For multi-label data, Liu et al 36 combined online group selection with online inter-group selection, and designed a criterion in the group-selection step to select feature groups that are important to the label set. You et al 37 developed a new online multi-label streaming feature selection scheme that considers label correlation.…”
Section: Introduction
confidence: 99%
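For readers unfamiliar with the multi-label setting mentioned in these excerpts, here is a minimal, self-contained sketch (plain binary relevance on synthetic data with scikit-learn); it only illustrates the problem formulation, not any of the cited feature-selection methods.

```python
from sklearn.datasets import make_multilabel_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import hamming_loss
from sklearn.model_selection import train_test_split
from sklearn.multioutput import MultiOutputClassifier

# Synthetic multi-label data: Y is a binary indicator matrix, so one sample
# can belong to several of the 5 categories at the same time.
X, Y = make_multilabel_classification(n_samples=500, n_classes=5, n_labels=2,
                                      random_state=0)
X_train, X_test, Y_train, Y_test = train_test_split(X, Y, random_state=0)

# Binary relevance: fit one classifier per label column.
clf = MultiOutputClassifier(RandomForestClassifier(n_estimators=50, random_state=0))
clf.fit(X_train, Y_train)
print("Hamming loss:", hamming_loss(Y_test, clf.predict(X_test)))
```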
“…A dynamic ensemble selection algorithm based on the group method of data handling has been proposed, and it has strong noise-immunity ability. 14 Besides, some methods are presented from the perspectives of meta-learning and multi-label classification, respectively. 17–19 Recently, dynamic ensemble selection has been used in quantification tasks, 20 imbalance learning, 21 and the one-versus-one scheme of multi-class classification problems. 22 …”
Section: Introduction
confidence: 99%
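The dynamic ensemble selection methods surveyed in this excerpt choose a different subset of base models for every test instance. The sketch below illustrates that general idea with an assumed, simple competence measure (local accuracy over the k nearest validation neighbours, loosely in the spirit of KNORA-style methods); it is not the algorithm of the indexed paper or of the cited works.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split
from sklearn.neighbors import NearestNeighbors
from sklearn.tree import DecisionTreeClassifier

X, y = make_classification(n_samples=1200, random_state=1)
X_train, X_rest, y_train, y_rest = train_test_split(X, y, test_size=0.5, random_state=1)
X_val, X_test, y_val, y_test = train_test_split(X_rest, y_rest, test_size=0.5, random_state=1)

# Pool of base models trained on bootstrap samples of the training set.
rng = np.random.RandomState(1)
pool = []
for _ in range(15):
    idx = rng.choice(len(X_train), size=len(X_train), replace=True)
    pool.append(DecisionTreeClassifier(max_depth=5, random_state=0).fit(X_train[idx], y_train[idx]))

# Pre-compute every model's predictions on the validation (competence) set.
val_preds = np.array([m.predict(X_val) for m in pool])   # shape (n_models, n_val)
nn = NearestNeighbors(n_neighbors=7).fit(X_val)

def predict_dynamic(x):
    """Select the locally most competent models for x, then majority-vote with them."""
    neigh = nn.kneighbors(x.reshape(1, -1), return_distance=False)[0]
    local_acc = (val_preds[:, neigh] == y_val[neigh]).mean(axis=1)
    selected = np.flatnonzero(local_acc >= local_acc.max())
    votes = np.array([pool[i].predict(x.reshape(1, -1))[0] for i in selected])
    return np.bincount(votes).argmax()

y_pred = np.array([predict_dynamic(x) for x in X_test])
print("dynamic-selection accuracy:", accuracy_score(y_test, y_pred))
```

Static pruning fixes one subset for all inputs, whereas this per-instance selection re-evaluates competence in the neighbourhood of each query, which is what distinguishes the "dynamic" family discussed above.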