2008
DOI: 10.1007/978-3-540-85920-8_59
An Empirical Study for the Multi-class Imbalance Problem with Neural Networks

Abstract: The latest research in neural networks demonstrates that the class imbalance problem is a critical factor in classifier performance when working with multi-class datasets. This problem occurs when the number of samples in some classes is much smaller than in others. In this work, four different options to reduce the influence of the class imbalance problem on neural networks are studied. These options consist of introducing several cost functions into the learning algorithm in order to improve the ge…

Cited by 12 publications (9 citation statements)
References 8 publications
“…The machine learning community suggests many techniques to address the deficiency of its models in predicting small classes, such as the U2R and R2L classes of the ID problem mentioned in the previous review. Different approaches have been suggested [14,15] to address imbalanced classes, such as resampling techniques and algorithmic approaches. Oversampling and undersampling are the two common resampling methods in the literature, while cost functions have been added to different machine learning algorithms to address their sensitivity to imbalanced classes.…”
Section: Related Work
confidence: 99%
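The two resampling methods named in the excerpt above can be sketched in a few lines. This is a minimal NumPy illustration of random oversampling and undersampling, not the specific procedures used in the cited works; the function names are hypothetical:

```python
import numpy as np

def random_oversample(X, y, rng=None):
    """Duplicate minority-class samples until every class matches the largest one."""
    rng = np.random.default_rng(rng)
    classes, counts = np.unique(y, return_counts=True)
    target = counts.max()
    Xs, ys = [], []
    for c, n in zip(classes, counts):
        idx = np.flatnonzero(y == c)
        extra = rng.choice(idx, size=target - n, replace=True)  # resample with replacement
        keep = np.concatenate([idx, extra])
        Xs.append(X[keep])
        ys.append(y[keep])
    return np.concatenate(Xs), np.concatenate(ys)

def random_undersample(X, y, rng=None):
    """Drop majority-class samples until every class matches the smallest one."""
    rng = np.random.default_rng(rng)
    classes, counts = np.unique(y, return_counts=True)
    target = counts.min()
    Xs, ys = [], []
    for c in classes:
        idx = np.flatnonzero(y == c)
        keep = rng.choice(idx, size=target, replace=False)  # subsample without replacement
        Xs.append(X[keep])
        ys.append(y[keep])
    return np.concatenate(Xs), np.concatenate(ys)
```

Oversampling risks overfitting on duplicated minority samples, while undersampling discards majority-class information; which trade-off is acceptable depends on the dataset, which is one reason the cost-function alternative is also studied.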
“…Oversampling and undersampling are the two common resampling methods in the literature, while cost functions have been added to different machine learning algorithms to address their sensitivity to imbalanced classes. Different cost functions were suggested in both works and applied with neural network [15] and Extreme Learning Machine (ELM) [14] algorithms. This improves prediction for the small classes in all of the data sets used.…”
Section: Related Work
confidence: 99%
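The cost-function approach described in the excerpt above amounts to weighting each training error by a per-class cost so that errors on small classes count more. The sketch below is a minimal NumPy illustration using inverse-frequency weights and a weighted cross-entropy; it is an assumption for illustration, not the specific cost functions of [14,15]:

```python
import numpy as np

def inverse_frequency_weights(y, n_classes):
    """Per-class cost: classes with fewer samples get proportionally larger weight."""
    counts = np.bincount(y, minlength=n_classes).astype(float)
    # Normalized so the average weight over classes is balanced against the counts.
    return counts.sum() / (n_classes * counts)

def weighted_cross_entropy(probs, y, weights):
    """Mean cross-entropy where each sample's error is scaled by its class cost.

    probs: (n_samples, n_classes) predicted class probabilities.
    y:     (n_samples,) integer class labels.
    """
    eps = 1e-12  # guard against log(0)
    per_sample = -np.log(probs[np.arange(len(y)), y] + eps)
    return np.mean(weights[y] * per_sample)
```

With weights of 1 for every class this reduces to the ordinary mean cross-entropy; inverse-frequency weights instead penalize misclassifying a minority-class sample more heavily, which pushes the learned decision boundary away from the bias toward large classes noted in the surrounding excerpts.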
“…Some learning algorithms based on cost-errors, such as those in [29], have been proposed and shown to be efficient in dealing with imbalanced classification problems. So in this paper, in order to correct the learning deviation of the widths, we will also adopt cost-errors, using the count ratio of the samples in each region to modify the objective function in (2).…”
Section: The Imbalanced Problem
confidence: 99%
“…Classification problems that breach this tacit assumption are ill-suited to such algorithms (Alejo et al, 2008). The learning objectives of these algorithms focus on classification accuracy over the training data set; under-sampled classes are poorly represented under mean-squared-error-like decision rules, and the resulting classification rule becomes unduly biased (Sun et al, 2009).…”
Section: Introduction
confidence: 99%