Eighth ACIS International Conference on Software Engineering, Artificial Intelligence, Networking, and Parallel/Distributed Computing (SNPD 2007)
DOI: 10.1109/snpd.2007.296
An Effective Method To Improve kNN Text Classifier

Cited by 13 publications (7 citation statements)
References 7 publications
“…Nonetheless, the classification problem presented here has many rare classes (see Table 1), and some experiments have shown that Precision and F1 may not be adequate metrics for evaluating this kind of problem [8,13]. Thus we adopt a set of more appropriate metrics for this type of problem [15].…”
Section: Evaluating the Results
confidence: 99%
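The statement above notes that micro-averaged Precision and F1 can be misleading when many classes are rare, since frequent classes dominate the average. One common alternative (an illustrative assumption here; the specific metrics adopted in [15] are not named in this excerpt) is to compute per-class recall and macro-average it, giving every class equal weight:

```python
# Illustrative sketch: per-class recall and its macro-average, a common
# choice for problems with many rare classes. Micro-averaged metrics are
# dominated by frequent classes; the macro-average weights classes equally.
from collections import defaultdict


def per_class_recall(y_true, y_pred):
    """Fraction of each class's true instances that were predicted correctly."""
    correct = defaultdict(int)
    total = defaultdict(int)
    for t, p in zip(y_true, y_pred):
        total[t] += 1
        if t == p:
            correct[t] += 1
    return {c: correct[c] / total[c] for c in total}


def macro_recall(y_true, y_pred):
    """Unweighted mean of per-class recalls."""
    recalls = per_class_recall(y_true, y_pred)
    return sum(recalls.values()) / len(recalls)


# A rare class ("c") that is always missed drags the macro-average down,
# even though plain accuracy (6/7) still looks high.
y_true = ["a", "a", "a", "a", "b", "b", "c"]
y_pred = ["a", "a", "a", "a", "b", "b", "a"]
print(per_class_recall(y_true, y_pred))  # {'a': 1.0, 'b': 1.0, 'c': 0.0}
print(macro_recall(y_true, y_pred))
```

The toy labels make the failure mode concrete: the classifier never predicts the rare class, yet a frequency-weighted score would barely notice.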
“…When k is 1, the test entity is assigned to the class of its nearest neighbour [38]. Feature vectors and classes of the training samples are stored in the training stage of the algorithm, while in the classification stage k is decided by the user. A test vector is classified by assigning it the label which is most common among its k nearest training neighbours.…”
Section: K-Nearest Neighbour Algorithm Classifier
confidence: 99%
“…The improved KNN algorithm has higher accuracy and flexibility. For example, one of the many KNN variations is based on weighted distance to improve accuracy [12]. Our work presented here, however, mainly deals with the comparison of the KNN algorithm and the EM algorithm in terms of classification accuracy.…”
Section: K-Nearest Neighbor Algorithm
confidence: 99%
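The weighted-distance variation mentioned above replaces the plain majority vote with a vote in which closer neighbours count more. Inverse-distance weighting is a common scheme and is assumed here for illustration; [12] may use a different weighting function.

```python
# Sketch of distance-weighted kNN: each of the k nearest neighbours votes
# with weight inversely proportional to its distance, so near samples
# outweigh far ones. Inverse-distance weighting is assumed here; the
# specific scheme of [12] is not given in this excerpt.
import math
from collections import defaultdict


def weighted_knn_classify(train, test_vec, k=3, eps=1e-9):
    """train: list of (feature_vector, label) pairs."""
    dists = sorted(
        (math.dist(vec, test_vec), label) for vec, label in train
    )
    votes = defaultdict(float)
    for d, label in dists[:k]:
        votes[label] += 1.0 / (d + eps)  # eps guards against zero distance
    return max(votes, key=votes.get)


train = [((0.0, 0.0), "neg"), ((0.2, 0.1), "neg"),
         ((1.0, 1.0), "pos"), ((1.1, 0.9), "pos")]
# With k=4 an unweighted vote would tie 2-2; distance weighting breaks
# the tie in favour of the closer class.
print(weighted_knn_classify(train, (0.9, 1.0), k=4))  # "pos"
```

The k=4 example shows the practical gain: where an unweighted vote is ambiguous, the weighted vote resolves it using proximity, which is one way such variants improve accuracy.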