2020
DOI: 10.1016/j.comcom.2020.01.061
Handling imbalanced data with concept drift by applying dynamic sampling and ensemble classification model

Cited by 34 publications (15 citation statements) · References 13 publications
“…The most popular approach lies in combining resampling techniques with Online Bagging (Wang et al., 2015; Wang and Pineau, 2016). Similar strategies can be applied to Adaptive Random Forest (Gomes et al., 2017), Online Boosting (Klikowski and Woźniak, 2019; Gomes et al., 2019), Dynamic Feature Selection (Wu et al., 2014), Adaptive Random Forest with resampling (Ferreira et al., 2019), Kappa Updated Ensemble (Cano and Krawczyk, 2020), Robust Online Self-Adjusting Ensemble (Cano and Krawczyk, 2022), or any ensemble that can incrementally update its base learners (Ancy and Paulraj, 2020; Li et al., 2020). It is interesting to note that preprocessing approaches enhance diversity among base classifiers (Zyblewski et al., 2019).…”
Section: Ensembles For Imbalanced Data Streams
confidence: 99%
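The combination of resampling with Online Bagging mentioned in the excerpt above can be sketched roughly as follows: each base learner sees an incoming example k ~ Poisson(λ) times, and minority-class examples get a larger λ, which oversamples them on the fly. This is a minimal illustration in the spirit of Wang and Pineau (2016), not their exact scheme; the class name, the `minority_lambda` parameter, and the fixed Poisson rates are illustrative assumptions, and base learners are assumed to expose a scikit-learn-style `partial_fit`/`predict` API.

```python
import math
import random
from collections import defaultdict


class OversamplingOnlineBagging:
    """Sketch: Online Bagging with class-dependent Poisson rates.

    Minority examples are presented to each base learner more often
    (larger lambda), oversampling them online. Illustrative only."""

    def __init__(self, base_learners, minority_label, minority_lambda=3.0):
        self.learners = base_learners          # assumed scikit-learn-style API
        self.minority_label = minority_label
        self.minority_lambda = minority_lambda # assumed fixed rate; adaptive in practice

    @staticmethod
    def _poisson(lam):
        # Knuth's inversion sampler; adequate for small lambda.
        threshold = math.exp(-lam)
        k, p = 0, 1.0
        while p > threshold:
            k += 1
            p *= random.random()
        return k - 1

    def partial_fit(self, x, y, classes):
        lam = self.minority_lambda if y == self.minority_label else 1.0
        for learner in self.learners:
            for _ in range(self._poisson(lam)):
                learner.partial_fit([x], [y], classes=classes)

    def predict(self, x):
        # Unweighted majority vote over the base learners.
        votes = defaultdict(float)
        for learner in self.learners:
            votes[learner.predict([x])[0]] += 1.0
        return max(votes, key=votes.get)
```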
“…Similar to the rank-based approach, a heterogeneous dynamic weighted majority method [20] is also applied to ensemble modeling. Ancy and Paulraj [39] used dynamic sampling in their proposed ensemble model to handle imbalanced data with concept drift. A mixture of neural-network- and support-vector-machine-based ensemble methods using transfer learning and incremental learning was developed [21] to lessen the effects of concept drift.…”
Section: Literature Review
confidence: 99%
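The dynamic-weighted-majority idea referenced above can be illustrated with a bare-bones sketch: expert weights decay multiplicatively on mistakes and weak experts are pruned. The parameter names `beta` and `theta` follow common usage in the classic algorithm (Kolter and Maloof) and are not taken from the heterogeneous variant cited as [20]; the full algorithm also adds a new expert when the whole ensemble errs, which is omitted here for brevity.

```python
class DynamicWeightedMajority:
    """Sketch of dynamic weighted majority: penalize mispredicting
    experts, normalize, and prune experts below a weight threshold."""

    def __init__(self, experts, beta=0.5, theta=0.01):
        self.experts = list(experts)   # assumed partial_fit/predict API
        self.weights = [1.0] * len(self.experts)
        self.beta = beta               # penalty factor applied on a mistake
        self.theta = theta             # pruning threshold after normalization

    def predict(self, x):
        votes = {}
        for weight, expert in zip(self.weights, self.experts):
            label = expert.predict([x])[0]
            votes[label] = votes.get(label, 0.0) + weight
        return max(votes, key=votes.get)

    def update(self, x, y, classes):
        # Penalize experts that mispredict, then train everyone online.
        for i, expert in enumerate(self.experts):
            if expert.predict([x])[0] != y:
                self.weights[i] *= self.beta
            expert.partial_fit([x], [y], classes=classes)
        # Normalize weights and drop experts that fell below theta.
        total = sum(self.weights) or 1.0
        self.weights = [w / total for w in self.weights]
        kept = [i for i, w in enumerate(self.weights) if w >= self.theta]
        self.experts = [self.experts[i] for i in kept]
        self.weights = [self.weights[i] for i in kept]
```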
“…Several chunk-based approaches have been proposed based on the storage of old minority class examples to help overcome class imbalance [1,14,15,27,28,35,45,82]. Some of these approaches train components by combining all minority class examples seen so far with majority examples from the most recent chunk [27,28,45,82].…”
Section: Classifiers For Imbalanced Data Streams
confidence: 99%
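The chunk-based strategy described in the excerpt above can be sketched as follows: every minority example seen so far is retained and combined with the majority examples of the newest chunk to train each new ensemble component, with the oldest components discarded under a fixed budget. The `make_learner` factory and the `max_components` budget are illustrative assumptions, not details of the cited methods.

```python
class MinorityMemoryChunkEnsemble:
    """Sketch of a chunk-based ensemble that accumulates minority
    examples across chunks and trains one new component per chunk."""

    def __init__(self, make_learner, minority_label, max_components=10):
        self.make_learner = make_learner     # e.g. lambda: DecisionTreeClassifier()
        self.minority_label = minority_label
        self.max_components = max_components
        self.minority_memory = []            # (x, y) pairs kept across chunks
        self.components = []

    def process_chunk(self, X, y):
        chunk = list(zip(X, y))
        self.minority_memory.extend(
            (xi, yi) for xi, yi in chunk if yi == self.minority_label
        )
        majority = [(xi, yi) for xi, yi in chunk if yi != self.minority_label]

        # Train a new component on all stored minority examples plus
        # the majority examples of the most recent chunk only.
        Xs, ys = zip(*(self.minority_memory + majority))
        component = self.make_learner()
        component.fit(list(Xs), list(ys))

        self.components.append(component)
        if len(self.components) > self.max_components:
            self.components.pop(0)           # forget the oldest component

    def predict(self, x):
        votes = {}
        for component in self.components:
            label = component.predict([x])[0]
            votes[label] = votes.get(label, 0) + 1
        return max(votes, key=votes.get)
```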
“…A detailed description of the generator is given in Section A of the supplementary materials. The source code of a MOA [2] compatible implementation of the generator is available at: https://github.com/dabrze/imbalanced-stream-generator.…”
Section: Experimental Aims and Setup
confidence: 99%