2019
DOI: 10.1007/s10916-018-1154-8
Distribution-Sensitive Unbalanced Data Oversampling Method for Medical Diagnosis

Cited by 52 publications (28 citation statements)
References 12 publications
“…In the method proposed by [12], the authors tackled data imbalance in a medical diagnosis dataset by introducing a distribution-sensitive oversampling approach. In the proposed method, the minority samples were divided into noise samples, unstable samples, boundary samples, and stable samples according to their location in the distribution.…”
Section: Methods Used for the Data Imbalance Problem (mentioning)
confidence: 99%
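The categorization described in this statement (noise, unstable, boundary, and stable minority samples) can be illustrated with a small sketch. The neighborhood criterion, the `categorize_minority` helper, and the thresholds below are assumptions chosen for illustration; the cited paper's actual rules are not reproduced here.

```python
# Minimal sketch, assuming the category of a minority sample is decided by
# how many of its k nearest neighbors are also minority samples.
import numpy as np
from sklearn.neighbors import NearestNeighbors

def categorize_minority(X, y, minority_label=1, k=5):
    """Label each minority sample as noise, unstable, boundary, or stable
    based on the composition of its k-nearest-neighbor neighborhood."""
    X, y = np.asarray(X), np.asarray(y)
    nn = NearestNeighbors(n_neighbors=k + 1).fit(X)
    _, idx = nn.kneighbors(X[y == minority_label])
    categories = []
    for neighbors in idx:
        # Skip the sample itself (first neighbor) and count minority neighbors.
        same = int(np.sum(y[neighbors[1:]] == minority_label))
        if same == 0:
            categories.append("noise")      # surrounded only by majority samples
        elif same <= k // 3:
            categories.append("unstable")   # mostly majority neighborhood
        elif same < k:
            categories.append("boundary")   # mixed neighborhood
        else:
            categories.append("stable")     # purely minority neighborhood
    return categories
```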
“…In [15], Han and coworkers proposed Distribution-Sensitive (DS), an oversampling algorithm for medical diagnosis on imbalanced data.…”
Section: Related Work (mentioning)
confidence: 99%
“…In the training set, oversampling techniques are used to increase the number of minority-class samples so that a balanced dataset is obtained when the model is fitted. • The random oversampling method (ROS) duplicates data by selecting a random subset of minority-class samples for replication [22]. Because the sampling is performed randomly, random oversampling has the disadvantages of requiring a long training time and of possible overfitting.…”
Section: Introduction (mentioning)
confidence: 99%
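The random oversampling (ROS) behavior described in this quote can be sketched as follows. The `random_oversample` helper is a hypothetical illustration, not the implementation used in [22]: it duplicates randomly chosen minority rows until the class counts match, which is why exact-copy overfitting is a known risk.

```python
# Minimal sketch of random oversampling (ROS): duplicate randomly chosen
# minority samples until the classes are balanced. Because existing rows
# are copied verbatim, the resampled set can encourage overfitting.
import numpy as np

def random_oversample(X, y, minority_label=1, seed=0):
    X, y = np.asarray(X), np.asarray(y)
    rng = np.random.default_rng(seed)
    minority_idx = np.where(y == minority_label)[0]
    majority_count = int(np.sum(y != minority_label))
    n_needed = max(majority_count - len(minority_idx), 0)
    # Sample minority indices with replacement and append the duplicates.
    extra = rng.choice(minority_idx, size=n_needed, replace=True)
    X_bal = np.vstack([X, X[extra]])
    y_bal = np.concatenate([y, y[extra]])
    return X_bal, y_bal
```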