2007
DOI: 10.3233/ida-2007-11604

Efficient instance-based learning on data streams

Abstract: The processing of data streams in general and the mining of such streams in particular have recently attracted considerable attention in various research fields. A key problem in stream mining is to extend existing machine learning and data mining methods so as to meet the increased requirements imposed by the data stream scenario, including the ability to analyze incoming data in an online, incremental manner, to observe tight time and memory constraints, and to appropriately respond to changes of the data ch…

Cited by 50 publications (40 citation statements); references 51 publications. Citing statements were published between 2011 and 2022. Selected citation contexts, ordered by relevance:
“…f) Decision stump [52]: one-level decision trees that are updatable through appropriate counts in the information criterion; this classifier is also available in the MOA tool. g) IB1 [45], [53]: an instance-based classifier [54] whose reference base keeps growing, so that for each classification query the whole history is available rather than single snapshots of the data stream. We use this classifier from the WEKA tool to provide a rigorous comparison of our novel approach against classifiers that see all data at once.…”
Section: Discussion (mentioning)
confidence: 99%
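The IB1-style baseline described in this excerpt answers every query against the full, ever-growing set of past instances. The following is a minimal sketch of such a test-then-train loop, assuming WEKA 3.x and its IBk classifier (a configurable k-NN successor of IB1 that supports incremental updates); the file name stream.arff, the 1-NN setting, the warm-up size, and the class name are illustrative assumptions, not details taken from the cited papers.

// Minimal sketch (assumption: WEKA 3.x on the classpath), not the cited papers' own code.
// IBk keeps every training instance it has seen, so each prediction is made
// against the whole history of the stream rather than a windowed snapshot.
import weka.classifiers.lazy.IBk;
import weka.core.Instance;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class GrowingReferenceBaseDemo {
    public static void main(String[] args) throws Exception {
        Instances data = DataSource.read("stream.arff");     // hypothetical ARFF file with the stream
        data.setClassIndex(data.numAttributes() - 1);

        IBk knn = new IBk(1);                                 // 1-NN, i.e. IB1-style behaviour
        int warmup = 100;                                     // small initial batch (illustrative)
        knn.buildClassifier(new Instances(data, 0, warmup));

        int correct = 0, total = 0;
        // Prequential (test-then-train) pass: predict first, then add the
        // instance to the ever-growing reference base.
        for (int i = warmup; i < data.numInstances(); i++) {
            Instance x = data.instance(i);
            double predicted = knn.classifyInstance(x);
            if (predicted == x.classValue()) correct++;
            total++;
            knn.updateClassifier(x);                          // reference base keeps growing
        }
        System.out.printf("Prequential accuracy: %.3f%n", (double) correct / total);
    }
}

Because such a classifier stores every instance it has seen, its memory footprint and query time grow with the stream; that cost is exactly what the tight time and memory constraints mentioned in the abstract above are meant to rule out, which is why the excerpt treats it as a "sees all data at once" baseline rather than a stream learner.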
“…The idea is closely related to the work by Beringer and Hüllermeier [3] and has the same limitations regarding following the current concept. The former work was presented four years later than the latter.…”
Section: Positioning Within the Related Work (mentioning)
confidence: 93%
“…The issue of systematic training set selection in space under concept drift has been brought up in [3, 13–17]. Ganti et al. [13] give a generic interpretation of systematic training data selection without a real plug-and-play algorithm.…”
Section: Positioning Within the Related Work (mentioning)
confidence: 99%