2021
DOI: 10.1109/access.2021.3118568
A Pruning Optimized Fast Learn++NSE Algorithm

Abstract: Owing to its many typical applications, the fast classification learning of accumulated big data in nonstationary environments is an important and urgent research topic. The recently proposed Learn++.NSE algorithm is one of the important results in this field, and a pruned variant, Learn++.NSE-Error-based, was introduced to improve learning efficiency on accumulated big data. However, studies have found that the Learn++.NSE-Error-based algorithm often encounters a situ…
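To make the pruning idea concrete, here is a minimal sketch of error-based ensemble pruning in the spirit of Learn++.NSE-Error-based. This is a hypothetical simplification, not the authors' exact procedure: it simply keeps the k ensemble members with the lowest error on the most recent data batch (the function name and data are illustrative assumptions).

```python
# Hypothetical sketch of error-based ensemble pruning: retain only the
# k classifiers with the lowest error on the most recent batch.

def prune_by_error(classifiers, errors, k):
    """Return the k (classifier, error) pairs with the lowest recent error."""
    ranked = sorted(zip(classifiers, errors), key=lambda pair: pair[1])
    return ranked[:k]

ensemble = ["h1", "h2", "h3", "h4"]          # placeholder base classifiers
recent_errors = [0.30, 0.10, 0.25, 0.05]     # errors on the newest batch
kept = prune_by_error(ensemble, recent_errors, k=2)
print(kept)  # -> [('h4', 0.05), ('h2', 0.10)]
```

Bounding the ensemble size this way is what keeps per-batch combination cost from growing without limit as new base classifiers accumulate.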

Cited by 3 publications (2 citation statements); References 30 publications.
“…FASE, LevBag, OBag, OBoost, DWM and Learn++.NSE use HT as the base learner, whereas ARF uses its built-in ARFHoeffdingTree as the base learner. As the ensemble time of Learn++.NSE increases exponentially with the number of base classifiers (Chen et al, 2021), we use its pruning-enabled version within MOA for execution efficiency. Heterogeneous ensembles: We include BLAST, as it is a SOTA technique based on online algorithm selection (Luong et al, 2021). The fading-factor-based variant of BLAST is used, due to its merits reported by the authors (van Rijn et al, 2018).…”
Section: Experimental Methodology
confidence: 99%