2018
DOI: 10.1007/978-981-13-1648-7_2
Mutual-Information-SMOTE: A Cost-Free Learning Method for Imbalanced Data Classification

Cited by 1 publication (1 citation statement)
References 18 publications
“…Another work on text classification also introduced a successful MI-constrained oversampling mechanism (MISO) that safely and robustly re-embeds challenging samples [135]. MI-based SMOTE is also widely applied across settings with various classifiers, including the MI classifier [136], the KNN classifier, and the decision tree classifier [137]. Several studies also show that MI performs well for FTS problems.…”
Section: Information Theory (mentioning)
confidence: 99%