2020
DOI: 10.11591/eei.v9i1.1464

Optimization of distance formula in K-Nearest Neighbor method

Abstract: K-Nearest Neighbor (KNN) is a method for classifying objects based on the learning data closest to the object, i.e. by comparing previous data with current data. In the learning process, KNN calculates the distance to the nearest neighbors with the Euclidean distance formula, whereas in other methods the distance formula has been optimized by comparing it against other similar formulas in order to obtain optimal results. This study will discuss the calculation of the Euclidean distance formu…
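
The abstract centres on KNN's use of the Euclidean distance to find the learning data nearest to a query object. Below is a minimal sketch of that step in Python; the function names, feature vectors, and choice of k are illustrative assumptions rather than anything taken from the paper.

```python
import math
from collections import Counter

def euclidean_distance(a, b):
    """Euclidean distance between two equal-length feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def knn_classify(query, training_data, k=3):
    """Label `query` by majority vote among its k Euclidean-nearest neighbors.

    `training_data` is a list of (feature_vector, label) pairs.
    """
    nearest = sorted(training_data,
                     key=lambda item: euclidean_distance(query, item[0]))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Made-up 2-D points for illustration only.
train = [((1.0, 1.0), "A"), ((1.5, 2.0), "A"), ((5.0, 5.0), "B"), ((6.0, 5.5), "B")]
print(knn_classify((1.2, 1.4), train, k=3))  # -> "A"
```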

Cited by 70 publications (35 citation statements)
References 14 publications

Citation statements
“…The KNN classification algorithm works on the basis of the Euclidean distance and calculates the probability that the test inputs are in close proximity to the data points [31]. The results acquired in our study indicate that the lowest accuracies were generated via LDA for both FAP and FHP conditions. LDA executes mathematical operations on the basis of probability estimation and estimates a new set of inputs that belong to each class.…”
Section: Discussion (mentioning)
confidence: 54%
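
The statement above reads KNN as estimating how likely a test input is to belong to each class from the labels of its Euclidean-nearest neighbors. One possible reading of that step, sketched with invented data (the function name, training points, and k are assumptions, not from the cited study):

```python
import math
from collections import Counter

def knn_class_probabilities(query, training_data, k=3):
    """Estimate class probabilities for `query` as the fraction of its k
    Euclidean-nearest neighbors carrying each class label."""
    def euclidean(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    nearest = sorted(training_data, key=lambda item: euclidean(query, item[0]))[:k]
    counts = Counter(label for _, label in nearest)
    return {label: n / k for label, n in counts.items()}

# Illustrative data only, not taken from the cited study.
train = [((1.0, 1.0), "A"), ((1.5, 2.0), "A"), ((5.0, 5.0), "B"), ((6.0, 5.5), "B")]
print(knn_class_probabilities((1.2, 1.4), train, k=3))  # -> A gets 2/3, B gets 1/3
```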
“…The formation of a cluster is based on the Euclidean distance formula. If nodes are within the range of the Euclidean distance [69], then they become neighbors. These neighboring nodes are used in cluster formation on the basis of density.…”
Section: K AES-Clustering Algorithm (mentioning)
confidence: 99%
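
The quoted passage builds clusters from nodes that lie within a Euclidean distance range of each other and then uses neighbor counts as a density criterion. A rough sketch of that neighborhood test is below; the node coordinates, radius, and density threshold are hypothetical and not taken from the cited paper.

```python
import math

def euclidean(a, b):
    """Euclidean distance between two coordinate tuples."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def neighbors_within_range(node, nodes, radius):
    """Nodes whose Euclidean distance to `node` is at most `radius` become its neighbors."""
    return [other for other in nodes if other != node and euclidean(node, other) <= radius]

def dense_cluster_seeds(nodes, radius, min_neighbors):
    """Nodes with at least `min_neighbors` in-range neighbors are dense enough to start a cluster."""
    return [n for n in nodes if len(neighbors_within_range(n, nodes, radius)) >= min_neighbors]

# Hypothetical node positions and thresholds.
nodes = [(0.0, 0.0), (0.5, 0.5), (0.4, 0.1), (5.0, 5.0)]
print(dense_cluster_seeds(nodes, radius=1.0, min_neighbors=2))  # the three close nodes qualify
```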
“…To obtain a small error value, a technique for measuring accuracy must of course be applied, as done by Lubis [24], who analysed bank customer data and measured accuracy by searching for the smallest error with MAPE and MSE; in that research the smallest error value was achieved using MAPE. In addition, reference [25] also applies smallest-error accuracy in the Simple Evolving Connectionist System method using MAPE, although the result is influenced by the normalized distance formula.…”
Section: Introduction (mentioning)
confidence: 99%
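
Both quoted references select the configuration with the smallest error as measured by MAPE and MSE. For reference, the two error measures can be computed as below; the sample series is invented purely for illustration.

```python
def mse(actual, predicted):
    """Mean squared error."""
    return sum((a - p) ** 2 for a, p in zip(actual, predicted)) / len(actual)

def mape(actual, predicted):
    """Mean absolute percentage error (assumes no actual value is zero)."""
    return 100.0 * sum(abs((a - p) / a) for a, p in zip(actual, predicted)) / len(actual)

# Made-up series for illustration only.
actual = [100.0, 120.0, 130.0]
predicted = [98.0, 125.0, 128.0]
print(f"MSE = {mse(actual, predicted):.2f}, MAPE = {mape(actual, predicted):.2f}%")
```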