2020
DOI: 10.1016/j.knosys.2020.105694
Incremental learning imbalanced data streams with concept drift: The dynamic updated ensemble algorithm

Cited by 57 publications
(32 citation statements)
References 26 publications
“…This pattern represents a synthesized view of the data records analyzed in the past and is progressively updated as new records become available. On-line and incremental learning methods are used to deal with the rapid arrival of continuous, unbounded, and time-varying data streams [24][25][26]. The online environment is non-stationary and copes with the learning issues that arise under big-data conditions [27].…”
Section: Introduction
confidence: 99%
“…An error cost function model was designed to guide the Convolutional Neural Network (CNN) parameters optimization in the direction of feature classification and was applied to the heavy-duty industrial robot system diagnosis procedure [19]. By using ensemble learning, Li et al proposed a Dynamic Updated Ensemble (DUE) for learning imbalanced data streams with concept drift [20].…”
Section: Introduction
confidence: 99%
“…Compared to online methods, a chunk-based approach learns faster at the beginning of training, since the model is seeded with a large initial chunk of data that makes the update process more effective. However, concept drift may occur later in the learning process, and if the drift falls inside a chunk it can be missed, causing a significant drop in model accuracy. Online learning does not suffer from this problem, but the model may have slower initial learning performance [37].…”
Section: Introduction
confidence: 99%
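The tradeoff quoted above can be illustrated with a minimal sketch. This is a toy 1-D mean-tracking "model" invented for illustration (it is not the DUE algorithm or any method from the cited papers): a stream drifts from one concept to another in the middle of a chunk, the chunk-based learner averages the drift away, while the per-sample online learner adapts.

```python
# Toy illustration (assumption: a 1-D mean estimate stands in for a model)
# of why a concept drift located inside a chunk can be missed by
# chunk-based learning but tracked by online (per-sample) learning.

def make_stream():
    # Concept A: values near 0.0 for 50 steps, then drift to concept B near 5.0.
    return [0.0] * 50 + [5.0] * 50

def chunk_based(stream, chunk_size=100):
    # Model = mean of each full chunk; a drift inside the chunk is averaged away.
    estimates = []
    for i in range(0, len(stream), chunk_size):
        chunk = stream[i:i + chunk_size]
        model = sum(chunk) / len(chunk)
        estimates.extend([model] * len(chunk))
    return estimates

def online(stream, lr=0.5):
    # Model updated after every sample; tracks the drift quickly.
    model, estimates = 0.0, []
    for x in stream:
        model += lr * (x - model)
        estimates.append(model)
    return estimates

stream = make_stream()
print(chunk_based(stream)[-1])        # 2.5  -> the mid-chunk drift is blurred
print(round(online(stream)[-1], 3))   # 5.0  -> the online model has adapted
```

The same per-sample update loop also hints at the runtime remark in the next excerpt: the online learner performs one model update per arriving sample, whereas the chunk-based learner performs one update per chunk.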
“…Another problem with such a model is its runtime. Compared to chunk-based models, online learning methods are less efficient and require more computation [37].…”
Section: Introduction
confidence: 99%