2011
DOI: 10.1007/978-3-642-23866-6_11
Control of Variables in Reducts - kNN Classification with Confidence

Cited by 15 publications (2 citation statements)
References 9 publications
“…The reduct concept is a useful method for classification. The proposed classification scheme is shown in Fig. 1; it consists of several reducts, each followed by its respective Nearest Neighbor (NN) system 10,11 . The nearest neighbor method differs from other classification methods such as linear discriminant analysis, neural networks, support vector machines, or kernel methods 13 .…”
Section: Generation of Reducts Based on Nearest Neighbor Relation
confidence: 99%
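The excerpt above describes a classifier built from reducts (attribute subsets that preserve discrimination), each followed by a nearest neighbor system. Below is a minimal sketch of nearest neighbor classification restricted to a single reduct; the function name, the Euclidean distance, and the toy data are illustrative assumptions, not the authors' implementation.

```python
# Minimal sketch (not the authors' code): 1-NN classification restricted to a
# reduct, i.e. a subset of attribute indices. Data and reduct are hypothetical.
import numpy as np

def nn_classify_on_reduct(train_X, train_y, query, reduct):
    """Classify `query` by its nearest training sample, using only the
    attributes whose indices are listed in `reduct`."""
    Xr = train_X[:, reduct]                    # project training data onto the reduct
    qr = query[reduct]                         # project the query onto the reduct
    dists = np.linalg.norm(Xr - qr, axis=1)    # Euclidean distance in reduct space
    return train_y[int(np.argmin(dists))]      # label of the nearest neighbor

# Toy usage: 4 samples, 3 attributes, hypothetical reduct {0, 2}.
X = np.array([[1.0, 5.0, 0.2],
              [0.9, 1.0, 0.1],
              [3.0, 5.0, 2.0],
              [3.2, 0.0, 2.1]])
y = np.array([0, 0, 1, 1])
print(nn_classify_on_reduct(X, y, np.array([2.9, 9.0, 1.8]), reduct=[0, 2]))  # -> 1
```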
“…Then, for efficient and fast classification with parallel processing, the nearest neighbor method of Cover and Hart 8 is simple and fast when applied with multi-reducts. We propose here a parallel processing scheme of multi-reducts followed by nearest neighbor classification 10,11,12,13 . The nearest neighbor relation 11,12 between different classes carries basic information for classification.…”
Section: Introduction
confidence: 99%
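The excerpt describes parallel processing of multi-reducts, each followed by its own nearest neighbor classifier. The sketch below assumes the per-reduct decisions are combined by a simple majority vote, which is an illustrative assumption; the reducts and data are hypothetical, and the loop is written sequentially even though each per-reduct classifier is independent and could run in parallel.

```python
# Minimal sketch, not the paper's code: several reducts, each followed by its
# own 1-NN classifier, combined by majority vote (assumed combination rule).
from collections import Counter
import numpy as np

def multi_reduct_nn(train_X, train_y, query, reducts):
    votes = []
    for reduct in reducts:                        # one NN classifier per reduct
        Xr = train_X[:, reduct]
        qr = query[reduct]
        d = np.linalg.norm(Xr - qr, axis=1)
        votes.append(train_y[int(np.argmin(d))])  # each reduct casts one vote
    return Counter(votes).most_common(1)[0][0]    # majority class wins

# Toy usage with the data from the previous sketch and three hypothetical reducts.
X = np.array([[1.0, 5.0, 0.2],
              [0.9, 1.0, 0.1],
              [3.0, 5.0, 2.0],
              [3.2, 0.0, 2.1]])
y = np.array([0, 0, 1, 1])
print(multi_reduct_nn(X, y, np.array([2.9, 4.5, 1.8]), reducts=[[0, 2], [0], [2]]))  # -> 1
```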