2016
DOI: 10.1016/j.eswa.2015.10.001
Associative learning on imbalanced environments: An empirical study

Abstract: Associative memories have emerged as a powerful computational neural network model for several pattern classification problems. Like most traditional classifiers, these models assume that the classes share similar prior probabilities. However, in many real-life applications the ratios of prior probabilities between classes are extremely skewed. Although the literature has provided numerous studies that examine the performance degradation of renowned classifiers on different imbalanced scenarios, so far this ef…

Cited by 10 publications (10 citation statements)
References 62 publications
“…For further evaluation, we compared our method with recent state-of-the-art techniques, using their most recently reported results and the same experimental settings and datasets. First, we consider Cleofas-Sanchez et al. [47], who attempted class-imbalance classification on 31 of the datasets through a hybrid associative classifier with translation (HACT) based on SMOTE, and used Gmean for evaluating the results. Table 4 lists the performance of CDSMOTE against this method.…”
Section: Results
confidence: 99%
“…Table 6 compares CDSMOTE performance against BEPILD using the two metrics reported by the authors (AUC and Gmean). Table 4 compares CDSMOTE with [47] in terms of Gmean. Notice that CDSMOTE obtains better results in 20 out of 31 datasets. Using a paired t-test on the 20 datasets where CDSMOTE wins shows a statistically significant difference, with a p-value of 0.000506.…”
Section: Results
confidence: 99%
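The quoted comparison rests on two standard ingredients: the geometric mean (Gmean) of the per-class recall rates, and a paired t-test over matched per-dataset scores. A minimal sketch of both, assuming equal-length score lists per method (function names are illustrative, not taken from the cited papers):

```python
import math

def gmean(tpr, tnr):
    """Geometric mean of true-positive rate (sensitivity) and
    true-negative rate (specificity) -- the Gmean metric used
    for imbalanced classification."""
    return math.sqrt(tpr * tnr)

def paired_t_statistic(scores_a, scores_b):
    """Paired t statistic for two methods evaluated on the same
    datasets: t = mean(d) / (s_d / sqrt(n)), where d are the
    per-dataset score differences."""
    n = len(scores_a)
    d = [a - b for a, b in zip(scores_a, scores_b)]
    mean_d = sum(d) / n
    var_d = sum((x - mean_d) ** 2 for x in d) / (n - 1)  # sample variance
    return mean_d / math.sqrt(var_d / n)
```

For example, `gmean(0.9, 0.8)` is about 0.849; the t statistic would then be compared against a t distribution with n-1 degrees of freedom to obtain a p-value such as the 0.000506 reported above.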
“…Several state-of-the-art articles [48][49][50][51][52] employ dataset preprocessing to reduce the impact caused by the class distribution. Such research has empirically demonstrated that applying a preprocessing stage to balance the class distribution is usually a useful way to improve the quality of the identification of new instances.…”
Section: Sampling Algorithms For Imbalanced Data
confidence: 99%
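The balancing preprocessing mentioned above is typified by SMOTE, which synthesizes minority-class points by interpolating between a minority sample and one of its nearest minority-class neighbours. A simplified, illustrative sketch of that idea (not the reference SMOTE implementation; names and the single-neighbour default are assumptions):

```python
import random

def smote_like_oversample(minority, n_new, k=1, seed=0):
    """Generate n_new synthetic minority points. Each one is a random
    interpolation x + u * (nb - x), u in [0, 1), between a randomly
    chosen minority point x and one of its k nearest minority
    neighbours nb (Euclidean distance)."""
    rng = random.Random(seed)

    def dist2(a, b):
        return sum((ai - bi) ** 2 for ai, bi in zip(a, b))

    synthetic = []
    for _ in range(n_new):
        x = rng.choice(minority)
        # k nearest neighbours of x among the remaining minority points
        neighbours = sorted((p for p in minority if p is not x),
                            key=lambda p: dist2(x, p))[:k]
        nb = rng.choice(neighbours)
        u = rng.random()
        synthetic.append(tuple(xi + u * (ni - xi)
                               for xi, ni in zip(x, nb)))
    return synthetic
```

Because each synthetic point lies on a segment between two existing minority points, the oversampled class stays inside the convex hull of the original minority samples, which is the design choice that distinguishes SMOTE-style interpolation from plain random duplication.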