2009
DOI: 10.1007/978-3-642-05224-8_4

Improving Adaptive Bagging Methods for Evolving Data Streams

Abstract: We propose two new improvements for bagging methods on evolving data streams. Recently, two new variants of Bagging were proposed: ADWIN Bagging and Adaptive-Size Hoeffding Tree (ASHT) Bagging. ASHT Bagging uses trees of different sizes, and ADWIN Bagging uses ADWIN as a change detector to decide when to discard underperforming ensemble members. We improve ADWIN Bagging using Hoeffding Adaptive Trees, trees that can adaptively learn from data streams that change over time. To speed up the time for ad…
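The abstract's core mechanism, pairing each ensemble member with a change detector and replacing a member when its detector fires, can be illustrated with a rough Python sketch. The base learner and detector below are simplified stand-ins (a majority-class learner and a fixed-window error comparison), not the paper's Hoeffding Adaptive Trees or the actual ADWIN algorithm, and all class names are invented for illustration.

```python
# Simplified sketch of detector-triggered ensemble member replacement,
# in the spirit of ADWIN Bagging as summarized in the abstract.
from collections import Counter, deque

class MajorityClassLearner:
    """Toy stand-in for a Hoeffding (Adaptive) Tree base learner."""
    def __init__(self):
        self.counts = Counter()
    def learn_one(self, x, y):
        self.counts[y] += 1
    def predict_one(self, x):
        return self.counts.most_common(1)[0][0] if self.counts else None

class WindowErrorDetector:
    """Crude stand-in for ADWIN: flags change when the recent half of a
    fixed error window is clearly worse than the older half."""
    def __init__(self, size=200, threshold=0.2):
        self.errors = deque(maxlen=size)
        self.threshold = threshold
    def add(self, error):  # error: 0 = correct, 1 = wrong
        self.errors.append(error)
        if len(self.errors) < self.errors.maxlen:
            return False
        half = len(self.errors) // 2
        e = list(self.errors)
        return (sum(e[half:]) - sum(e[:half])) / half > self.threshold

class DetectorBaggingSketch:
    """Ensemble whose members are reset when their detector flags change;
    the paper replaces the worst-performing member, which this sketch
    simplifies to resetting the flagged member."""
    def __init__(self, n_models=10):
        self.members = [(MajorityClassLearner(), WindowErrorDetector())
                        for _ in range(n_models)]
    def learn_one(self, x, y):
        for i, (model, det) in enumerate(self.members):
            err = 0 if model.predict_one(x) == y else 1
            model.learn_one(x, y)
            if det.add(err):  # change detected: replace this member
                self.members[i] = (MajorityClassLearner(), WindowErrorDetector())
    def predict_one(self, x):
        votes = Counter(m.predict_one(x) for m, _ in self.members)
        return votes.most_common(1)[0][0]
```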

Cited by 50 publications (26 citation statements); citing publications span 2012–2024.
References 16 publications (23 reference statements).

Citation statements (ordered by relevance):
“…For bagging, using 10,000 instances for training in the classifier, the percentage correctly classified… For a more thorough analysis (more datasets), please see [20,174]. The main problem with this online method of boosting and bagging is its inability to deal with drifting concepts, since it assumes a stationary distribution [49].…”
Section: Overview
confidence: 99%
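The online bagging scheme discussed in this quotation (Oza and Russell's method, on which ADWIN Bagging builds) replaces bootstrap resampling with per-instance Poisson(1) weights. A minimal sketch of just that weighting step, assuming a learn_one(x, y) interface on the base learners purely for illustration:

```python
# Online bagging weighting: each member sees each instance k ~ Poisson(1) times.
import math, random

def poisson1(rng=random):
    """Sample k ~ Poisson(lambda=1) by Knuth's inversion method."""
    limit, k, p = math.exp(-1.0), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def online_bagging_update(members, x, y):
    """Train each ensemble member on (x, y) a Poisson(1)-distributed number of times."""
    for model in members:
        for _ in range(poisson1()):
            model.learn_one(x, y)
```

As the quotation notes, this weighting assumes a stationary distribution; it is the change-detection layer (ADWIN) that adds the ability to react to drift.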
“…(close_{t−i} − SMA(n₂))²; MovAvgVar(n₁, n₂) = Moving Average of Variance(n₁) / Moving Average of Variance(n₂). Demonstrating price with MovAvgVar(5, 20) …”
confidence: 99%
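One plausible reading of the fragment above, assuming the indicator is the ratio of the smoothed price variance over a short window n₁ to that over a longer window n₂ (the fragment does not specify the smoothing length, so avg_len below is an assumption, and all names are illustrative):

```python
# Hedged sketch of a MovAvgVar(n1, n2)-style indicator on a close-price series.
def rolling_variance(closes, n):
    """Variance of each length-n window of closes around its own SMA(n)."""
    out = []
    for t in range(n, len(closes) + 1):
        window = closes[t - n:t]
        mean = sum(window) / n
        out.append(sum((c - mean) ** 2 for c in window) / n)
    return out

def mov_avg_var(closes, n1=5, n2=20, avg_len=5):
    """Ratio of the moving averages of the two rolling-variance series,
    e.g. MovAvgVar(5, 20) as in the quotation; avg_len is assumed."""
    v1 = rolling_variance(closes, n1)[-avg_len:]
    v2 = rolling_variance(closes, n2)[-avg_len:]
    return (sum(v1) / len(v1)) / (sum(v2) / len(v2))

# usage on a toy price series
closes = [10 + 0.05 * i + (0.3 if i % 4 == 0 else 0.0) for i in range(30)]
print(mov_avg_var(closes, 5, 20))
```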
“…Within the framework, it is possible to define the probability that instances of the stream belong to the new concept after the drift. It uses the sigmoid function, as an elegant and practical solution [15,21].…”
Section: Classification in MOA
confidence: 99%
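The sigmoid-based drift model described in this quotation can be sketched as follows; the 4/width slope constant and the generator interface are assumptions for illustration, not taken from the quoted text.

```python
# Sketch of sigmoid-weighted concept drift: the probability that an instance
# at time t comes from the new concept follows a sigmoid centred on the drift point.
import math, random

def p_new_concept(t, drift_point, width):
    """Probability that the instance at time t belongs to the new concept."""
    return 1.0 / (1.0 + math.exp(-4.0 * (t - drift_point) / width))

def drifting_stream(old_gen, new_gen, length, drift_point, width):
    """Yield instances, switching gradually from old_gen to new_gen."""
    for t in range(length):
        use_new = random.random() < p_new_concept(t, drift_point, width)
        yield (new_gen if use_new else old_gen)(t)

# usage with two toy concept generators
old = lambda t: ("old", t)
new = lambda t: ("new", t)
stream = list(drifting_stream(old, new, length=1000, drift_point=500, width=100))
```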
“…A VFDT-like algorithm for learning regression and model trees appears in [39]. Bagging and boosting ensemble models, using VFDT-like algorithms, appear in [14,13].…”
Section: Predictive Learning from Data Streams
confidence: 99%