2009
DOI: 10.1007/s10844-009-0101-z
Noise reduction for instance-based learning with a local maximal margin approach

Abstract: To some extent the problem of noise reduction in machine learning has been finessed by the development of learning techniques that are noise-tolerant. However, it is difficult to make instance-based learning noise-tolerant, and noise reduction still plays an important role in k-nearest neighbour classification. There are also other motivations for noise reduction: for instance, the elimination of noise may result in simpler models, or data cleansing may be an end in itself. In this paper we present a novel approac…

Cited by 44 publications (31 citation statements); references 55 publications.
“…Meanwhile, filtering techniques use an independent noise-filtering stage in which noisy instances that meet certain criteria are determined and discarded. Noise filtering has been implemented in different forms with different types of classifiers [26], [6], [27], [28], [4], [9], [10], [3], [5], [8], [29] and has been proven to be effective in improving the classification accuracy [25].…”

Section: A. IR Techniques for Noise Filtering (mentioning, confidence: 99%)
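The two-stage scheme this excerpt describes — flag noisy instances with some criterion, discard them, then train on what remains — can be sketched as follows. This is a generic illustration, not the paper's method; the nearest-neighbor-disagreement criterion and all function names here are our own hypothetical choices.

```python
import numpy as np

def nn_disagreement(X, y):
    """Example criterion (hypothetical): flag each instance whose nearest
    other instance carries a different label."""
    n = len(X)
    mask = np.zeros(n, dtype=bool)
    for i in range(n):
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf                              # exclude the point itself
        mask[i] = y[int(np.argmin(d))] != y[i]
    return mask

def filter_noise(X, y, criterion=nn_disagreement):
    """Independent filtering stage: discard the instances the criterion
    flags; the cleaned (X, y) is then handed to any classifier."""
    mask = criterion(X, y)
    return X[~mask], y[~mask]
```

Any classifier-specific criterion from the cited works could be dropped in for `nn_disagreement`; the filtering stage itself stays unchanged.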
“…In Instance-Based Learning (IBL), the effect of noise is mitigated by using several similar instances instead of just one, as done by the k Nearest Neighbor (kNN) algorithm, where k (the number of neighbors) is usually set to 3. Another general approach that can mitigate the effect of noise is to use a noise-filtering algorithm that determines and eliminates the outlier instances [3], [4], [5], [6], [7], [8]. Although most of these methods were designed for IBL methods, they can also be used to preprocess the training data before using it with other ML approaches, such as decision trees [9] and neural networks [10], [11].…”

Section: Introduction (mentioning, confidence: 99%)
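The excerpt's point about k > 1 — that a single mislabeled neighbor can be outvoted — can be illustrated with a minimal kNN classifier (a generic sketch, not the cited paper's implementation; names are ours):

```python
import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x, k=3):
    """Classify x by majority vote among its k nearest training points.
    With k > 1, one noisy (mislabeled) neighbor is outvoted by the rest."""
    d = np.linalg.norm(X_train - x, axis=1)        # distances to all training points
    nearest = np.argsort(d)[:k]                    # indices of the k closest
    return Counter(y_train[nearest].tolist()).most_common(1)[0][0]
```

With a mislabeled point sitting inside a cluster, k=1 returns the noisy label while k=3 recovers the cluster's majority label.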
“…[12], [26]) appearing in the literature reveal that RMHC and two other non-sampling-based methods, AkNN (All k-NN) [31] and DROP3 (Decremental Reduction Optimization Procedure 3) [32], offer an excellent balance between size reduction and accuracy. Both AkNN and DROP3 build on top of ENN (the Edited Nearest Neighbor rule) [31], which edits out noisy instances from datasets and leaves a smoother decision boundary.…”

Section: Benchmarking (mentioning, confidence: 99%)
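The ENN rule the excerpt refers to removes every instance whose label disagrees with the majority label of its own k nearest neighbors; that is what smooths the decision boundary. A minimal sketch of Wilson's rule with k=3 (our own illustrative code, not the benchmarked implementation):

```python
import numpy as np
from collections import Counter

def enn_filter(X, y, k=3):
    """Wilson's Edited Nearest Neighbor rule: keep only instances whose
    label matches the majority label of their k nearest other instances."""
    keep = np.ones(len(X), dtype=bool)
    for i in range(len(X)):
        d = np.linalg.norm(X - X[i], axis=1)
        d[i] = np.inf                              # leave-one-out: skip self
        nearest = np.argsort(d)[:k]
        majority = Counter(y[nearest].tolist()).most_common(1)[0][0]
        keep[i] = (majority == y[i])               # drop if neighbors outvote it
    return X[keep], y[keep]
```

AkNN repeats this edit for several values of k, and DROP3 runs it as a pre-pass before its decremental reduction, which is why both inherit ENN's smoothing effect.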
“…In decision-tree algorithms, the influence of such objects is reduced by a pruning procedure: subtrees with low statistical reliability, built from outlier objects, are removed [4,5]. Other algorithms provide a data-preprocessing step in which noisy objects are detected with some criterion and filtered out [6][7][8][9][10][11]. In some cases an attempt is made to correct individual features of an outlier object in order to turn it into a typical object [4,12,13].…”

Section: Introduction (unclassified)