Learning From Imbalanced Data Sets 2018
DOI: 10.1007/978-3-319-98074-4_11
Learning from Imbalanced Data Streams

Cited by 48 publications
(46 citation statements)
References 58 publications
“…For binary outcomes, the Brier score is defined as Brier score = (1/N) Σ_{t=1}^{N} (f_t − o_t)², where f_t is the predicted probability for example t, o_t is the actual outcome of example t, and N is the total number of examples in the sample. Because the Brier score measures the mean squared difference between the predicted probability of a certain outcome for a particular instance and the actual outcome, lower Brier scores indicate better performance (Fernández et al., 2018). However, for imbalanced datasets, the Brier score may appear very promising overall but poor for the rare class (cases; Wallace & Dahabreh, 2012).…”
Section: Methods
confidence: 99%
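The definition above translates directly into code. This is a minimal illustration of the formula, not an implementation from the chapter; the function name `brier_score` is chosen here for clarity:

```python
def brier_score(probs, outcomes):
    """Brier score for binary outcomes: (1/N) * sum_t (f_t - o_t)^2,
    where probs[t] = f_t is the predicted probability that outcome t is 1,
    and outcomes[t] = o_t is the actual outcome (0 or 1). Lower is better."""
    assert len(probs) == len(outcomes)
    n = len(probs)
    return sum((f - o) ** 2 for f, o in zip(probs, outcomes)) / n

# A perfect predictor scores 0.0; confident wrong predictions are punished hardest.
print(brier_score([1.0, 0.0], [1, 0]))        # -> 0.0
print(brier_score([0.9, 0.2, 0.7], [1, 0, 1]))
```

Note that, as the quoted statement warns, a low overall score can hide poor calibration on the rare class, since the majority class dominates the average.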
“…For binary outcomes, the Brier score is defined as Brier score = 1 N 140%true t = 1 N ( f t o t ) 2 where f t is the predicted probability for example t , o t is the actual outcome of example t and N is the total number of examples in the sample. Because the Brier score measures the mean squared difference between the predicted probability of a certain outcome for a particular instance and the actual outcome, lower Brier scores indicate better performance (Fernández et al, 2018). However, for imbalanced datasets, the Brier score may appear very promising overall but poor for the rare class (cases; Wallace & Dahabreh, 2012).…”
Section: Methodsmentioning
confidence: 99%
“…Over- and under-sampling strategies are popular and effective approaches to dealing with the class imbalance problem [21,25,33,50]. To compensate for the class imbalance, the ROS algorithm randomly replicates samples from the minority classes, while the RUS technique randomly eliminates samples from the majority classes, until a relative balance between the classes is achieved [23,60].…”
Section: Sampling Class Imbalance Approaches
confidence: 99%
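The two strategies described above can be sketched in a few lines. This is an illustrative in-memory version under assumed list inputs; the function names are not from the chapter:

```python
import random
from collections import Counter

def random_oversample(samples, labels, rng):
    """ROS sketch: randomly replicate minority-class samples (with
    replacement) until every class reaches the majority-class count."""
    counts = Counter(labels)
    target = max(counts.values())
    out_s, out_l = list(samples), list(labels)
    for cls, n in counts.items():
        pool = [s for s, l in zip(samples, labels) if l == cls]
        out_s += rng.choices(pool, k=target - n)   # sample with replacement
        out_l += [cls] * (target - n)
    return out_s, out_l

def random_undersample(samples, labels, rng):
    """RUS sketch: randomly discard samples (without replacement) until
    every class is reduced to the minority-class count."""
    counts = Counter(labels)
    target = min(counts.values())
    out_s, out_l = [], []
    for cls in counts:
        pool = [s for s, l in zip(samples, labels) if l == cls]
        out_s += rng.sample(pool, target)          # sample without replacement
        out_l += [cls] * target
    return out_s, out_l

rng = random.Random(0)
X = list(range(10))
y = [0] * 8 + [1] * 2                  # 8 majority vs 2 minority samples
Xo, yo = random_oversample(X, y, rng)  # both classes grown to 8
Xu, yu = random_undersample(X, y, rng) # both classes shrunk to 2
print(Counter(yo), Counter(yu))
```

The trade-off is the usual one: ROS risks overfitting to replicated minority samples, while RUS discards potentially informative majority samples.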
“…Moreover, they have been combined with editing methods such as Tomek's Links (TL) [26], Edited Nearest Neighbor (ENN) [27], the Condensed Nearest Neighbor rule (CNN) [28] and others [22,[29][30][31][32][33]63], yielding hybrid methods such as SMOTE+TL, SMOTE+CNN, SMOTE+OSS and SMOTE+ENN [22,64,65]. These have been successfully applied to deal with the class imbalance problem [21].…”
Section: Sampling Class Imbalance Approaches
confidence: 99%
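The hybrid idea, oversample with SMOTE, then clean the boundary with an editing method such as Tomek's Links, can be sketched in one dimension. This is a toy illustration under assumed 1-D float inputs, not the cited algorithms' implementations; `smote_1d` and `tomek_links` are illustrative names:

```python
import random

def smote_1d(minority, k, n_new, rng):
    """SMOTE sketch (1-D): create each synthetic sample by interpolating
    between a random minority point and one of its k nearest minority
    neighbours."""
    new = []
    for _ in range(n_new):
        x = rng.choice(minority)
        neighbours = sorted(minority, key=lambda p: abs(p - x))[1:k + 1]
        nb = rng.choice(neighbours)
        new.append(x + rng.random() * (nb - x))   # point on the segment x..nb
    return new

def tomek_links(points, labels):
    """Return index pairs (i, j), i < j, that form Tomek links:
    mutual nearest neighbours carrying different labels."""
    def nearest(i):
        return min((j for j in range(len(points)) if j != i),
                   key=lambda j: abs(points[i] - points[j]))
    links = []
    for i in range(len(points)):
        j = nearest(i)
        if labels[i] != labels[j] and nearest(j) == i and i < j:
            links.append((i, j))
    return links

rng = random.Random(42)
majority = [0.0, 0.2, 0.4, 0.6, 0.8, 1.0]
minority = [5.0, 5.5, 6.0]
synthetic = smote_1d(minority, k=2, n_new=3, rng=rng)
points = majority + minority + synthetic
labels = [0] * len(majority) + [1] * (len(minority) + len(synthetic))
# Hybrid (SMOTE+TL) step: drop both ends of every Tomek link to clean
# the class boundary after oversampling.
drop = {i for pair in tomek_links(points, labels) for i in pair}
cleaned = [(p, l) for i, (p, l) in enumerate(zip(points, labels)) if i not in drop]
```

Variants differ mainly in the editing step: ENN removes samples misclassified by their neighbours, while CNN keeps only a consistent subset.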