2019
DOI: 10.1145/3363573

Multi-Label Punitive kNN with Self-Adjusting Memory for Drifting Data Streams

Abstract: In multi-label learning, data may simultaneously belong to more than one class. When multi-label data arrives as a stream, the challenges associated with multi-label learning are joined by those of data stream mining, including the need for algorithms that are fast and flexible, able to match both the speed and evolving nature of the stream. This article presents a punitive k nearest neighbors algorithm with a self-adjusting memory (MLSAMPkNN) for multi-label, drifting data streams. The…
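The rest of the abstract is truncated in this record, but the core mechanism the title names can be illustrated. Below is a minimal sketch of multi-label kNN prediction over a stored window, assuming Euclidean distance and per-label majority voting; the paper's self-adjusting memory and punitive bookkeeping are omitted, and all names are illustrative rather than the authors' implementation.

```python
import numpy as np

def ml_knn_predict(window_X, window_Y, x, k=3):
    """Predict a multi-label output for query x from a stored window.

    window_X : (n, d) array of stored instances
    window_Y : (n, L) binary label matrix (1 = label present)
    Returns an (L,) binary vector: label j is predicted present when
    more than half of the k nearest neighbours carry it.
    """
    dists = np.linalg.norm(window_X - x, axis=1)   # Euclidean distances
    nearest = np.argsort(dists)[:k]                # indices of the k closest
    votes = window_Y[nearest].mean(axis=0)         # per-label neighbour fraction
    return (votes > 0.5).astype(int)
```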

Cited by 40 publications (24 citation statements)
References 65 publications
“…Contrary to active learning, passive learning adopts an adaptation mechanism; for example, [112] used a sophisticated parameterized windowing technique to phase out old examples over time instead of detecting drifts. The use of two self-adjusting memories in [132] allows ML-SAM-kNN to adapt to various types of drift, such as gradual and recurring drift.…”
Section: E. Discussion on Multi-Label Data Stream Classification
Citation type: mentioning
confidence: 99%
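The two memories mentioned in this statement follow the SAM (self-adjusting memory) design: a short-term memory tracking the current concept and a long-term memory preserving past ones, with predictions routed to whichever memory has been most accurate recently. A hedged sketch of that selection step, with illustrative names and an assumed evaluation horizon:

```python
from collections import deque

class DualMemorySelector:
    """Track recent correctness of each memory and route predictions
    to whichever one (STM, LTM, or both combined) performs best."""

    def __init__(self, horizon=50):
        # One deque of recent 0/1 correctness flags per memory.
        self.recent = {m: deque(maxlen=horizon) for m in ("stm", "ltm", "combined")}

    def record(self, memory, was_correct):
        self.recent[memory].append(int(was_correct))

    def best_memory(self):
        # Default to zero accuracy until a memory has any feedback.
        def acc(flags):
            return sum(flags) / len(flags) if flags else 0.0
        return max(self.recent, key=lambda m: acc(self.recent[m]))
```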
“…There are several commonly used methods for data stream classification, and each group has a few excellent algorithms. For example, SAM [16], ML-SAM-kNN [24], and ML-SAMPkNN [25] are typical instance-based learning methods. For ensemble learning methods, the most common approaches are Bagging and Boosting, whose online variants were first proposed by Oza and Russell [20], as in OBA [19] and LB [4].…”
Section: Related Work
Citation type: mentioning
confidence: 99%
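The online Bagging of Oza and Russell cited here replaces bootstrap resampling with a per-example Poisson(1) weight: each base learner trains on the incoming example k times, with k drawn from Poisson(1). A brief sketch, assuming base learners expose an incremental partial_fit-style method:

```python
import numpy as np

rng = np.random.default_rng(0)

def online_bagging_update(ensemble, x, y):
    """Oza-Russell online bagging step: Poisson(1) replication of the
    new example approximates bootstrap sampling on an endless stream."""
    for model in ensemble:
        for _ in range(rng.poisson(1.0)):
            model.partial_fit(x, y)  # assumed incremental-learning interface
```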
“…Some real-world problems are characterized by instances that simultaneously belong to multiple labels. This problem is known as multi-label learning [19][20]. The complexity of correctly classifying an instance increases with the size of the output space.…”
Section: Data Stream Mining for Online Learning
Citation type: mentioning
confidence: 99%
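The output-space growth this statement refers to is exponential: with L binary labels there are 2^L possible label sets, which a one-liner makes concrete:

```python
for L in (3, 10, 20):
    print(f"{L} labels -> {2 ** L} possible label sets")
# 3 labels -> 8 possible label sets
# 10 labels -> 1024 possible label sets
# 20 labels -> 1048576 possible label sets
```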
“…Therefore, it is more difficult to detect and adapt to concept drift. Authors have proposed solutions for multi-label data streams, including self-adjusting windows that identify the most accurate and most recent subset of instances in a sliding window [19]. Moreover, punitive systems have shown that penalizing instances that lead to erroneous label predictions and removing them from the window early increases the overall accuracy of the classifier [21].…”
Section: Data Stream Mining for Online Learning
Citation type: mentioning
confidence: 99%
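The punitive mechanism described in this statement (penalize stored instances whose votes produce wrong labels, and evict them before they age out) can be sketched as follows. The penalty increments and threshold are illustrative assumptions, not the paper's parameters:

```python
class PunitiveWindow:
    """Sliding-window store whose members accumulate penalties when
    they vote for wrong labels; heavily penalised members are evicted
    early instead of waiting to age out of the window."""

    def __init__(self, max_penalty=3):
        self.instances = []   # stored (x, y) pairs
        self.penalties = []   # parallel penalty counts
        self.max_penalty = max_penalty  # illustrative eviction threshold

    def add(self, x, y):
        self.instances.append((x, y))
        self.penalties.append(0)

    def punish(self, neighbour_indices, erred):
        """After a prediction, penalise the neighbours that voted if it
        was wrong, then drop any instance over the penalty threshold."""
        if erred:
            for i in neighbour_indices:
                self.penalties[i] += 1
        keep = [i for i, p in enumerate(self.penalties) if p < self.max_penalty]
        self.instances = [self.instances[i] for i in keep]
        self.penalties = [self.penalties[i] for i in keep]
```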