2016
DOI: 10.1016/j.chemolab.2016.06.013
A new concept of higher-order similarity and the role of distance/similarity measures in local classification methods

Cited by 30 publications (8 citation statements)
References 34 publications
“…Todeschini et al. [22,23] analyzed the effect of 18 different distance measures on the performance of the KNN classifier using eight benchmark data sets. The investigated distance measures included MD, ED, Soergel distance (SoD), Lance-Williams distance, contracted Jaccard-Tanimoto distance, Jaccard-Tanimoto distance, Bhattacharyya distance (BD), Lagrange distance, Mahalanobis distance, Canberra distance (CanD), Wave-Edge distance, Clark distance (ClaD), CosD, CorD, and four locally centered Mahalanobis distances.…”
Section: Introduction
confidence: 99%
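A few of the distance measures named in the excerpt above can be written compactly. This is an illustrative sketch, not code from the cited study; the vectors are made-up data chosen only to exercise each formula:

```python
import math

# Plain-Python versions of four measures mentioned in the excerpt.
# Input vectors are illustrative, not taken from the cited benchmarks.

def manhattan(a, b):
    # MD: sum of absolute coordinate differences.
    return sum(abs(x - y) for x, y in zip(a, b))

def euclidean(a, b):
    # ED: square root of the sum of squared differences.
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def canberra(a, b):
    # CanD: sum of |x - y| / (|x| + |y|); terms with a zero
    # denominator contribute 0 by convention.
    return sum(abs(x - y) / (abs(x) + abs(y))
               for x, y in zip(a, b) if abs(x) + abs(y) > 0)

def cosine_distance(a, b):
    # CosD: 1 minus the cosine of the angle between the vectors.
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return 1.0 - dot / (na * nb)

u, v = [1.0, 2.0, 3.0], [2.0, 4.0, 6.0]
print(manhattan(u, v))        # 6.0
print(euclidean(u, v))        # sqrt(14) ~ 3.742
print(canberra(u, v))         # 1.0
print(cosine_distance(u, v))  # ~0 (parallel vectors)
```

Note how the cosine distance is scale-invariant (parallel vectors have distance ~0) while the Manhattan and Euclidean measures are not, which is one reason the choice of measure changes KNN behaviour.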
“…For example, in [28] it was shown that the Manhattan distance outperforms the Chebyshev and Euclidean metrics on intrusion data. Good performance of the Manhattan distance was also demonstrated in [29,30].…”
Section: K-Nearest Neighbors (KNN)
confidence: 78%
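The reason the choice among Manhattan, Euclidean, and Chebyshev matters is that they can rank neighbours differently. A toy illustration (the points are made up, not from [28]):

```python
# Toy example: the nearest neighbour of a query can differ depending on
# whether Manhattan, Euclidean, or Chebyshev distance is used.

def manhattan(a, b):
    return sum(abs(x - y) for x, y in zip(a, b))

def euclidean(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

def chebyshev(a, b):
    # L-infinity norm: the largest single coordinate difference.
    return max(abs(x - y) for x, y in zip(a, b))

query = (0, 0)
points = {"a": (3, 3), "b": (4, 1)}

for name, dist in [("manhattan", manhattan),
                   ("euclidean", euclidean),
                   ("chebyshev", chebyshev)]:
    nearest = min(points, key=lambda p: dist(query, points[p]))
    print(name, "->", nearest)
# manhattan -> b  (6 vs 5)
# euclidean -> b  (sqrt(18) vs sqrt(17))
# chebyshev -> a  (3 vs 4)
```

Since a 1-NN classifier copies the label of the nearest neighbour, such ranking flips translate directly into different predictions.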
“…According to [21], the k-NN classifier can be used to classify new data objects using only their distances to labelled samples. However, some works consider metric or non-metric measures used with this classifier: several studies have evaluated the k-NN classifier under different metric and non-metric measures, such as those presented in [7,10,22-26].…”
Section: K-Nearest Neighbour Classifier (k-NN)
confidence: 99%
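The idea in the excerpt, classifying a new object by the majority label of its k nearest labelled samples under a pluggable distance measure, can be sketched as follows. The function name, the training data, and the default Euclidean metric are all illustrative choices, not taken from the cited works:

```python
from collections import Counter

def knn_predict(train, query, k=3, dist=None):
    """Minimal k-NN: train is a list of (vector, label) pairs.
    dist is any distance measure; Euclidean is used if none is given."""
    if dist is None:
        dist = lambda a, b: sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5
    # Sort labelled samples by distance to the query and keep the k nearest.
    neighbours = sorted(train, key=lambda s: dist(s[0], query))[:k]
    # Majority vote over the neighbours' labels.
    labels = [label for _, label in neighbours]
    return Counter(labels).most_common(1)[0][0]

train = [((1.0, 1.0), "A"), ((1.2, 0.8), "A"),
         ((5.0, 5.0), "B"), ((5.5, 4.5), "B"), ((4.8, 5.2), "B")]
print(knn_predict(train, (1.1, 0.9), k=3))  # A
print(knn_predict(train, (5.1, 5.0), k=3))  # B
```

Because `dist` is a parameter, swapping in any of the metric or non-metric measures discussed above requires no change to the classifier itself.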
“…However, k-NN can also be applied to other types of data, including categorical data [6]. Several investigations have been conducted to find proper categorical measures for such data, such as the works presented in [7-12].…”
Section: Introduction
confidence: 99%
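For categorical attributes, where differences cannot be subtracted, a common baseline dissimilarity is the overlap (Hamming-style) measure: count the attributes on which two objects disagree. This is only one simple example of the family of categorical measures studied in the works cited above; the records below are invented for illustration:

```python
def overlap_dissimilarity(a, b):
    # Number of attribute positions where the two categorical
    # records take different values.
    return sum(1 for x, y in zip(a, b) if x != y)

rec1 = ("red", "small", "round")
rec2 = ("red", "large", "round")
print(overlap_dissimilarity(rec1, rec2))  # 1 (they differ only in size)
```

Plugged into a k-NN classifier in place of a numeric metric, this makes the same nearest-neighbour machinery applicable to purely categorical data.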