2017 IEEE International Conference on Bioinformatics and Biomedicine (BIBM)
DOI: 10.1109/bibm.2017.8217782

Feature selection and resampling in class imbalance learning: Which comes first? An empirical study in the biological domain

Cited by 16 publications (12 citation statements)
References 18 publications
Citation types: 0 supporting, 12 mentioning, 0 contrasting
Citing publications: 2019–2022
“…Deep metric learning methods, such as Siamese networks [9], may also be used to achieve better performance. Future work may also consider loss function methods that perform well on imbalanced datasets [57][58][59][60][61]. There are some limitations in terms of the number of pain datasets from facial expressions in pain detection research.…”
Section: Classification Models AUC Accuracy (mentioning)
Confidence: 99%
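The specific loss functions referenced in [57]-[61] are not detailed in this excerpt; as a rough illustration of the general idea, the sketch below shows one common option, a class-weighted cross-entropy loss in PyTorch (an assumption, not the cited works' method), where the minority class is up-weighted by inverse class frequency. The class counts, logits and labels are made up for the example.

# Minimal sketch (assumes PyTorch): class-weighted cross-entropy for an
# imbalanced binary problem; weights follow inverse class frequency.
import torch
import torch.nn as nn

class_counts = torch.tensor([900.0, 100.0])        # hypothetical majority/minority counts
weights = class_counts.sum() / (len(class_counts) * class_counts)

criterion = nn.CrossEntropyLoss(weight=weights)    # minority-class errors cost more

logits = torch.randn(4, 2)                         # dummy model outputs for 4 samples
labels = torch.tensor([0, 1, 1, 0])
print(criterion(logits, labels).item())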
“…After dealing with missing data and class imbalance, keeping only relevant attributes is a building block in the design of predictive models. According to [33]-[34], feature selection methods can be divided into three categories: filter methods, wrapper methods and embedded methods.…”
Section: Features Selection (mentioning)
Confidence: 99%
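To make the three categories concrete, here is a minimal sketch using scikit-learn (an assumption; the cited works [33]-[34] do not prescribe a particular library): a univariate filter, a recursive-elimination wrapper, and an embedded L1-penalized model.

# Minimal sketch (assumes scikit-learn) of filter, wrapper and embedded selection.
from sklearn.datasets import make_classification
from sklearn.feature_selection import SelectKBest, f_classif, RFE, SelectFromModel
from sklearn.linear_model import LogisticRegression

X, y = make_classification(n_samples=200, n_features=30, n_informative=5, random_state=0)

# Filter: rank features by a univariate statistic, independent of any classifier.
filter_sel = SelectKBest(score_func=f_classif, k=5).fit(X, y)

# Wrapper: repeatedly refit a classifier and drop the weakest features.
wrapper_sel = RFE(LogisticRegression(max_iter=1000), n_features_to_select=5).fit(X, y)

# Embedded: selection happens during training, here via L1 regularization.
embedded_sel = SelectFromModel(
    LogisticRegression(penalty="l1", solver="liblinear", C=0.1)).fit(X, y)

for name, sel in [("filter", filter_sel), ("wrapper", wrapper_sel), ("embedded", embedded_sel)]:
    print(name, sel.get_support(indices=True))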
“…The first category, filter methods, includes the CiS, Fish, Ttest, Info and Gini methods [34]. Variable selection is made independently of the learning algorithm.…”
Section: Features Selection (mentioning)
Confidence: 99%
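As a small illustration of that point (not a reproduction of the CiS/Fish/Ttest/Info/Gini implementations in [34]), the sketch below ranks features with a per-feature two-sample t-test using NumPy and SciPy; the ranking uses only the data and involves no learning algorithm.

# Minimal sketch (assumes NumPy/SciPy) of a t-test filter: rank features by
# how strongly they separate the two classes, with no classifier involved.
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 20))
y = rng.integers(0, 2, size=100)
X[y == 1, :3] += 1.0                      # make the first three features informative

t_stats, _ = stats.ttest_ind(X[y == 1], X[y == 0], axis=0)
ranking = np.argsort(-np.abs(t_stats))    # larger |t| means a more class-relevant feature
print("Top 5 features:", ranking[:5])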