2016 IEEE 16th International Conference on Data Mining (ICDM) 2016
DOI: 10.1109/icdm.2016.0040

KNN Classifier with Self Adjusting Memory for Heterogeneous Concept Drift

Abstract: Data mining in non-stationary data streams is gaining more attention recently, especially in the context of the Internet of Things and Big Data. It is a highly challenging task, since the fundamentally different types of possibly occurring drift undermine classical assumptions such as data independence or stationary distributions. Available algorithms either struggle with certain forms of drift or require a priori knowledge in terms of a task-specific setting. We propose the Self Adjusting Memory (S…

Cited by 176 publications (165 citation statements)
References 27 publications (39 reference statements)
“…The Rialto dataset (Losing et al 2016) consists of images of colorful buildings next to the famous Rialto bridge in Venice, encoded in a normalized 27-dimensional RGB histogram. Images are obtained from time-lapse videos captured by a webcam with fixed position.…”
Section: Discussion
confidence: 99%
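The Rialto dataset encodes each webcam image as a normalized 27-dimensional RGB histogram. A minimal sketch of such a feature encoder, assuming 9 uniform bins per color channel concatenated into one vector (the snippet above does not spell out the exact binning scheme):

```python
import numpy as np

def rgb_histogram_27(image):
    """Encode an RGB image as a normalized 27-dimensional feature vector.

    Assumption: 9 uniform bins per channel, concatenated (3 x 9 = 27);
    other binnings (e.g. a joint 3x3x3 histogram) would also yield 27 dims.
    """
    image = np.asarray(image)
    features = []
    for channel in range(3):
        hist, _ = np.histogram(image[..., channel], bins=9, range=(0, 256))
        features.append(hist)
    feat = np.concatenate(features).astype(float)
    return feat / feat.sum()  # normalize so the histogram sums to 1

# Tiny synthetic "image": 4x4 pixels, 3 channels, 8-bit values
img = np.random.default_rng(0).integers(0, 256, size=(4, 4, 3))
vec = rgb_histogram_27(img)
```

Normalizing the histogram makes the feature invariant to image size, which matters when frames from a fixed webcam are compared over time.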
See 1 more Smart Citation
“…The Rialto dataset (Losing et al 2016) consists of images of colorful buildings next to the famous Rialto bridge in Venice, encoded in a normalized 27-dimensional RGB histogram. Images are obtained from time-lapse videos captured by a webcam with fixed position.…”
Section: Discussionmentioning
confidence: 99%
“…Moreover, existing approaches are specialized for a particular type of change (e.g., sudden, progressive, cyclic). There exist few methods which can handle different types of concept drift, such as (Losing et al 2016;Dongre and Malik 2014;Webb et al 2016;Brzezinski and Stefanowski 2014), however, most of those methods are dedicated for supervised learning problems, where the change is primarily detected by estimating a degradation in the classification performance. Other approaches such as (Kifer et al 2004;Bifet 2010) are designed to explicitly detect, in an unsupervised way, when a change happens.…”
Section: Introduction
confidence: 99%
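The quote above contrasts supervised drift handling with approaches such as Kifer et al. (2004) that detect a change unsupervised, by comparing the data distribution across windows. A minimal sketch of that idea, using a two-sample Kolmogorov-Smirnov statistic between an older and a recent window (the specific test and threshold here are illustrative assumptions, not the cited papers' exact procedures):

```python
import numpy as np

def ks_statistic(a, b):
    """Two-sample Kolmogorov-Smirnov statistic:
    the maximum gap between the two empirical CDFs."""
    a, b = np.sort(a), np.sort(b)
    all_vals = np.concatenate([a, b])
    cdf_a = np.searchsorted(a, all_vals, side="right") / len(a)
    cdf_b = np.searchsorted(b, all_vals, side="right") / len(b)
    return np.max(np.abs(cdf_a - cdf_b))

rng = np.random.default_rng(1)
reference = rng.normal(0.0, 1.0, 500)      # older window
recent_same = rng.normal(0.0, 1.0, 500)    # same distribution: no drift
recent_drift = rng.normal(2.0, 1.0, 500)   # shifted mean: sudden drift

stat_same = ks_statistic(reference, recent_same)
stat_drift = ks_statistic(reference, recent_drift)
```

No labels are needed: drift is flagged purely from the input distribution, which is the distinguishing property of this family of detectors.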
“…• Weather dataset (Elwell & Polikar, 2011). • Moving squares dataset (Losing, Hammer & Wersing, 2016): 4 equidistantly separated, squared uniform distributions are moving horizontally with constant speed.…”
Section: Datasets
confidence: 99%
“…This approach uses several data memories, or windows of different lengths over an incoming data stream; each window has its own classifier. The SAM-KNN algorithm uses nearest neighbor approach to select the window closest to a new data sample for classification [23]. Nested windows are used in [12] to obtain multiple training sets over the same data that each exclude a region of the data space.…”
Section: A. Concept Drift
confidence: 99%
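The quoted description — several memories (windows of different lengths), each with its own classifier, with the window "closest" to a new sample chosen for classification — can be sketched as follows. This is a simplified illustration of that description only; the actual SAM-KNN selection rule, which weighs short-term against long-term memory based on recent accuracy, is more involved:

```python
import numpy as np

def predict_with_memories(memories, x, k=3):
    """Simplified window-selection kNN, per the description above.

    `memories` is a list of (X, y) windows of different lengths.
    The window containing the point nearest to x is selected, and a
    k-NN majority vote inside that window classifies x. Illustrative
    sketch only; not the full SAM-KNN algorithm.
    """
    best, best_dist = None, np.inf
    for X, y in memories:
        d = np.linalg.norm(X - x, axis=1).min()
        if d < best_dist:
            best, best_dist = (X, y), d
    X, y = best
    nearest = np.argsort(np.linalg.norm(X - x, axis=1))[:k]
    labels, counts = np.unique(y[nearest], return_counts=True)
    return labels[np.argmax(counts)]

# Two toy windows: an older long window and a recent short one
rng = np.random.default_rng(0)
long_win = (rng.normal(0, 1, (50, 2)), np.zeros(50, dtype=int))
short_win = (rng.normal(5, 1, (10, 2)), np.ones(10, dtype=int))
pred = predict_with_memories([long_win, short_win], np.array([5.0, 5.0]))
# the recent short window lies closest to the query, so its labels win
```

Keeping windows of different lengths lets long memories preserve stable concepts while short memories track the current one, which is the intuition behind this family of methods.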