2020 International Joint Conference on Neural Networks (IJCNN)
DOI: 10.1109/ijcnn48605.2020.9207555
Adaptive XGBoost for Evolving Data Streams

Abstract: Boosting is an ensemble method that combines base models sequentially to achieve high predictive accuracy. A popular learning algorithm based on this ensemble method is eXtreme Gradient Boosting (XGB). We present an adaptation of XGB for the classification of evolving data streams. In this setting, new data arrives over time and the relationship between the class and the features may change in the process, thus exhibiting concept drift. The proposed method creates new members of the ensemble from mini-batches…
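The abstract describes the core idea: the ensemble grows incrementally, with each new member fitted on a mini-batch as it arrives. Below is a minimal sketch of that idea under stated assumptions, not the authors' implementation: it substitutes scikit-learn regression trees for XGBoost boosters, assumes binary labels in {0, 1} with a logistic-loss objective, and uses an illustrative fixed capacity with a FIFO update (the "push" strategy discussed in the citing papers below).

import numpy as np
from sklearn.tree import DecisionTreeRegressor

class MiniBatchBoostedEnsemble:
    """Sketch of boosting over mini-batches; NOT the paper's implementation."""

    def __init__(self, max_members=10, learning_rate=0.3, max_depth=3):
        self.max_members = max_members      # fixed ensemble capacity (assumed)
        self.learning_rate = learning_rate  # shrinkage applied to each member
        self.max_depth = max_depth
        self.members = []                   # oldest member first

    def _margin(self, X):
        # Sum of shrunken member outputs; zero before any member exists.
        margin = np.zeros(X.shape[0])
        for tree in self.members:
            margin += self.learning_rate * tree.predict(X)
        return margin

    def partial_fit(self, X, y):
        # Fit the next member on the logistic-loss gradient residuals of the
        # current ensemble, then append it, dropping the oldest member when
        # the ensemble is full ("push", i.e. First In, First Out).
        p = 1.0 / (1.0 + np.exp(-self._margin(X)))
        residuals = y - p
        tree = DecisionTreeRegressor(max_depth=self.max_depth).fit(X, residuals)
        if len(self.members) >= self.max_members:
            self.members.pop(0)
        self.members.append(tree)
        return self

    def predict(self, X):
        return (self._margin(X) > 0.0).astype(int)

# Illustrative usage: a simulated stream processed one mini-batch at a time.
rng = np.random.default_rng(0)
model = MiniBatchBoostedEnsemble(max_members=10)
for _ in range(20):
    X = rng.normal(size=(100, 4))
    y = (X[:, 0] + X[:, 1] > 0).astype(int)  # toy concept for demonstration
    model.partial_fit(X, y)

Replacing each DecisionTreeRegressor with a small XGBoost booster would bring the sketch closer to the method named in the title; the FIFO update is one of the two drift-handling strategies quoted in the citation statements below.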

Cited by 33 publications (35 citation statements). References 19 publications.
“…ARF is an adaptation of RF, modified to work with evolving data streams [18]. ARF has proved to be an effective online model [7,35], and given that ASXGB is an extension of gradient boosting for online learning, an extension of RF was also tested. AXGB is an online adaptation of XGBoost [35] that inspired the development of ASXGB, and hence it was fundamental to compare against this model.…”
Section: Methods
Confidence: 99%
“…ARF has proved to be an effective online model [7,35], and given that ASXGB is an extension of gradient boosting for online learning, an extension of RF was also tested. AXGB is an online adaptation of XGBoost [35] that inspired the development of ASXGB, and hence it was fundamental to compare against this model. In the original paper [35], two mechanisms were proposed to update the ensemble under concept drift: (a) a push strategy, where older models are removed before newer models are appended, similar to First In, First Out (FIFO); and (b) a replace strategy, where older models are replaced with newer ones [35].…”
Section: Methods
Confidence: 99%
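The two update mechanisms quoted above amount to simple list operations on a fixed-capacity ensemble. The sketch below is illustrative only; the function names and the rotating replacement index are assumptions, not the paper's exact bookkeeping.

def push_update(members, new_model, capacity):
    # (a) Push strategy: drop the oldest member before appending the new
    # one, i.e. First In, First Out.
    if len(members) >= capacity:
        members.pop(0)
    members.append(new_model)

def replace_update(members, new_model, capacity, next_slot):
    # (b) Replace strategy: overwrite an existing (older) member in place.
    # The rotating slot index is a hypothetical choice of which member to
    # replace; the paper may select the victim differently.
    if len(members) < capacity:
        members.append(new_model)
        return next_slot
    members[next_slot % capacity] = new_model
    return (next_slot + 1) % capacity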
“…The learning algorithms therefore need to take the evolution of the underlying data distribution into consideration, while remaining stable on concepts that are historical but not yet outdated. Such adaptation is normally enabled by: (i) incorporating new instances from the stream into the model [32], [33], and (ii) forgetting previous, outdated knowledge in the model [34], [35].…”
Section: B. Data Stream Learning
Confidence: 99%
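Mechanism (ii), forgetting outdated knowledge, is often realized with a sliding window: new instances enter, the oldest fall out, and the model is refit on what remains. A minimal sketch, assuming a fixed window size and a full refit on every batch (both illustrative choices, not prescribed by the cited works):

from collections import deque
import numpy as np
from sklearn.tree import DecisionTreeClassifier

class SlidingWindowClassifier:
    def __init__(self, window_size=1000):
        # deque(maxlen=...) silently discards the oldest instances, which
        # implements the "forgetting" half of the adaptation.
        self.window = deque(maxlen=window_size)
        self.model = DecisionTreeClassifier(max_depth=5)

    def partial_fit(self, X, y):
        for xi, yi in zip(X, y):
            self.window.append((xi, yi))  # incorporate new instances
        Xw = np.array([xi for xi, _ in self.window])
        yw = np.array([yi for _, yi in self.window])
        self.model.fit(Xw, yw)            # refit on the current window only
        return self

    def predict(self, X):
        return self.model.predict(X)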