2017 8th Annual Industrial Automation and Electromechanical Engineering Conference (IEMECON)
DOI: 10.1109/iemecon.2017.8079572
Optical character recognition using KNN on custom image dataset

Cited by 21 publications (12 citation statements). References 0 publications.
“…In the case of k > 1, a majority vote is taken to determine the class of the unknown sample. However, [5], [14], [17] show that k = 1 consistently yields the highest accuracy, and this was verified by experimentation during development. Training the KNN classifier means saving the feature vectors of the training samples, as opposed to other classifiers such as Support Vector Machines (SVM) and Artificial Neural Networks (ANN), whose parameters adapt to or are learned from the training samples.…”
Section: K-Nearest Neighbor (KNN) as Character Classifier (mentioning)
confidence: 88%
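To make the point in the excerpt concrete, here is a minimal sketch of a 1-NN character classifier in which "training" is nothing more than storing labeled feature vectors; the class, feature values, and labels are illustrative assumptions, not the cited authors' implementation:

```python
import numpy as np

class OneNNCharacterClassifier:
    """Minimal 1-NN classifier: training just stores the labeled feature vectors."""

    def fit(self, features, labels):
        # Unlike SVMs or neural networks, no parameters are adapted here;
        # the training data itself is the model.
        self.features = np.asarray(features, dtype=float)
        self.labels = np.asarray(labels)
        return self

    def predict(self, query):
        # With k = 1, the predicted class is that of the single nearest
        # training sample under Euclidean distance.
        query = np.asarray(query, dtype=float)
        distances = np.linalg.norm(self.features - query, axis=1)
        return self.labels[np.argmin(distances)]


# Hypothetical usage: made-up 4-dimensional feature vectors for 'A' and 'B'.
clf = OneNNCharacterClassifier().fit(
    [[0.1, 0.9, 0.2, 0.8], [0.8, 0.1, 0.7, 0.2]], ["A", "B"]
)
print(clf.predict([0.15, 0.85, 0.25, 0.75]))  # -> "A"
```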
“…KNN is an effective and widely used classifier in industry despite its simplicity [4], [5]. It also has high fault tolerance on non-linear multiclass problems because it does not assume any model for the distribution of feature vectors in space [5], [8], [14].…”
Section: K-Nearest Neighbor (KNN) as Character Classifier (mentioning)
confidence: 99%
“…In the training phase, KNN stores the training samples in a multidimensional feature-vector space, with a class label assigned to each training sample. Many researchers have suggested using the KNN classifier for text/digit recognition and classification, such as Hazra et al. [22], who presented a KNN classifier for recognizing both handwritten and printed English letters based on a sophisticated feature-extraction technique.…”
Section: B. K-Nearest Neighbor (KNN) (mentioning)
confidence: 99%
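A hedged sketch of this kind of recognition setup is shown below, using scikit-learn's built-in 8x8 digits dataset as a stand-in for the custom image dataset and raw pixel intensities as a stand-in feature extractor; both substitutions are assumptions, not the cited authors' pipeline:

```python
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier

# Stand-in dataset: 8x8 handwritten digit images flattened into
# 64-dimensional feature vectors with one class label per sample.
digits = load_digits()
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.2, random_state=0
)

# k = 1, the setting the citing papers report as most accurate.
knn = KNeighborsClassifier(n_neighbors=1)
knn.fit(X_train, y_train)  # "training" = storing the labeled feature vectors

print("test accuracy:", knn.score(X_test, y_test))
```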