2019
DOI: 10.1016/j.amc.2018.12.020
An instance-based learning recommendation algorithm of imbalance handling methods

Cited by 24 publications (25 citation statements)
References 18 publications
“…However, the minority class is usually the class of interest and the more important one. Existing methods such as data-level [43], algorithmic-level [44], and cost-sensitive learning [45], [46] have been proposed to address this problem.…”
Section: Weighted Decision-making Table
Mentioning confidence: 99%
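To make the method families in this excerpt concrete, here is a minimal sketch, assuming a synthetic scikit-learn dataset: a data-level fix (random oversampling of the minority class) next to a cost-sensitive fix (per-class error weights). None of this code comes from the cited papers.

```python
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.utils import resample

# Synthetic imbalanced binary problem: roughly 95% majority class.
X, y = make_classification(n_samples=2000, weights=[0.95, 0.05],
                           random_state=0)

# Data-level method: randomly oversample the minority class until it
# matches the majority class size, then train on the balanced set.
X_maj, X_min = X[y == 0], X[y == 1]
X_min_up = resample(X_min, replace=True, n_samples=len(X_maj),
                    random_state=0)
X_bal = np.vstack([X_maj, X_min_up])
y_bal = np.hstack([np.zeros(len(X_maj)), np.ones(len(X_min_up))])
clf_data_level = LogisticRegression(max_iter=1000).fit(X_bal, y_bal)

# Cost-sensitive method: leave the data unchanged but weight errors on
# the minority class more heavily during training.
clf_cost_sensitive = LogisticRegression(max_iter=1000,
                                        class_weight="balanced").fit(X, y)
```

Algorithmic-level methods, the third family named in the excerpt, instead modify the learner itself and have no one-line equivalent here.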
“…These measures were improved and extended to ordinary classification problems in later studies [6]. This group of meta-features has shown its significance in algorithm recommendation systems [12], [67]. One such measure adopts the frequencies of itemsets with respect to the parity function to characterize a dataset [66].…”
Section: Structural-information-based Measures
Mentioning confidence: 99%
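The parity-based characterization is only named in passing here, so the following is a heavily simplified sketch of the general idea rather than the actual measure from [66]: score each small feature subset of a binary dataset by how often its XOR agrees with the class label. The helper name `parity_agreement` and the toy data are hypothetical.

```python
from itertools import combinations
import numpy as np

def parity_agreement(X, y, subset_size=2):
    """Fraction of rows where the XOR over a feature subset equals y,
    computed for every subset of the given size (hypothetical helper)."""
    freqs = {}
    for subset in combinations(range(X.shape[1]), subset_size):
        parity = np.bitwise_xor.reduce(X[:, list(subset)], axis=1)
        freqs[subset] = float(np.mean(parity == y))
    return freqs

# Toy binary dataset in which y is exactly the XOR of features 0 and 1.
X = np.array([[0, 0, 1], [0, 1, 0], [1, 0, 1], [1, 1, 1]])
y = np.array([0, 1, 1, 0])
print(parity_agreement(X, y))  # the subset (0, 1) agrees on every row
```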
“…With KNN as the meta-learner, meta-examples from new experimental results can quickly be integrated into the existing meta-knowledge database without remodeling the relationship between meta-features and the performance of candidate algorithms. This approach is used in the majority of studies on classifier selection; examples include [39], [46], [51], [66], [67], [71]-[73]. However, the drawback of this approach is the selection of an optimal value for the parameter K, because the number of similar datasets varies for each problem in the meta-knowledge database [12].…”
Section: B Meta-learner
Mentioning confidence: 99%
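A minimal sketch of the KNN meta-learner workflow this excerpt describes: each historical dataset is stored as a meta-feature vector labeled with its best-performing method, and a new dataset is matched to its K most similar neighbors. The meta-feature values and method names below are invented placeholders, and in practice the meta-features would be scaled before computing distances.

```python
import numpy as np
from collections import Counter
from sklearn.neighbors import NearestNeighbors

# Meta-knowledge base: one meta-feature vector per historical dataset,
# labeled with the method that performed best on it (placeholder values).
meta_features = np.array([
    [0.95, 20, 3.1],   # e.g. imbalance ratio, #features, class entropy
    [0.60, 8,  1.2],
    [0.90, 15, 2.8],
    [0.55, 5,  0.9],
])
best_method = ["SMOTE", "none", "SMOTE", "cost-sensitive"]

def recommend(new_meta, k=3):
    """Recommend the method most frequent among the K nearest datasets."""
    nn = NearestNeighbors(n_neighbors=k).fit(meta_features)
    _, idx = nn.kneighbors([new_meta])
    votes = Counter(best_method[i] for i in idx[0])
    return votes.most_common(1)[0][0]

print(recommend([0.92, 18, 3.0]))  # "SMOTE", given its two close neighbors
```

The drawback the excerpt notes shows up directly here: the recommendation can flip as k changes, because the number of genuinely similar datasets differs per problem.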
“…KNN), the algorithm memorizes the training data samples and generalizes to new observations by matching them to the learned samples according to a similarity measure. The algorithm stores instances of the training data instead of constructing a general internal model [7,12]. In model-based learning, generalization to new data samples is achieved by predicting them with a model built from a set of examples [7,13].…”
Section: Instance-based Versus Model-based Learning
Mentioning confidence: 99%
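The contrast in this excerpt is easy to see in code. A minimal sketch, assuming scikit-learn: KNN's fit step essentially stores the training set and defers all work to prediction time, while logistic regression compresses the data into a learned coefficient vector.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.neighbors import KNeighborsClassifier

X, y = make_classification(n_samples=200, random_state=0)

# Instance-based: fit() just stores X and y; predict() searches for the
# K most similar stored samples under a distance measure.
knn = KNeighborsClassifier(n_neighbors=5).fit(X, y)

# Model-based: fit() estimates a general internal model (here a weight
# vector); the training samples are not needed at prediction time.
lr = LogisticRegression(max_iter=1000).fit(X, y)

print(knn.predict(X[:1]), lr.predict(X[:1]))
```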