Efficient kNN Algorithm Based on Graph Sparse Reconstruction
2014
DOI: 10.1007/978-3-319-14717-8_28

Cited by 26 publications (26 citation statements)
References 30 publications
“…In our experiments, we selected state-of-the-art methods, including the standard kNN method, the kNN with the parameter k determined by cross-validation (we briefly denote it as CV-kNN), the L-kNN method [Zhang et al. 2014] (compared to CM-kNN, it does not consider the importance of removing the noisy data points), the LL-kNN method (compared to CM-kNN, it does not consider the importance of preserving the local consistency of the structures of the data), the kNN-based Applicability Domain approach (AD-kNN) [Sahigara et al. 2014], and the Large Margin Nearest Neighbor approach (LMNN) [Weinberger and Saul 2006].…”
Section: Competing Methods (mentioning)
confidence: 99%
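The quote above identifies CV-kNN only by its selection rule, so the implementation details are not given in this report. A minimal sketch of a kNN classifier whose k is chosen by cross-validation, assuming scikit-learn and an illustrative candidate grid and dataset:

```python
# Hypothetical sketch of the CV-kNN baseline: a standard kNN classifier
# whose parameter k is selected by cross-validation. The dataset and the
# candidate grid are illustrative assumptions, not the cited experiments.
from sklearn.datasets import load_iris
from sklearn.model_selection import GridSearchCV
from sklearn.neighbors import KNeighborsClassifier

X, y = load_iris(return_X_y=True)

search = GridSearchCV(
    KNeighborsClassifier(),
    param_grid={"n_neighbors": list(range(1, 11))},  # assumed candidate range
    cv=5,                                            # 5-fold cross-validation
    scoring="accuracy",
)
search.fit(X, y)
print("selected k:", search.best_params_["n_neighbors"])
print("cross-validated accuracy:", round(search.best_score_, 3))
```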
“…, 10, and the square root of the sample size. - The L-kNN method [Zhang et al. 2014], that is, Equation (6) with the setting ρ2 = 0, on which we would like to show the importance of removing the noisy data points. - The LL-kNN method, that is, Equation (6) with the setting ρ3 = 0, on which we would like to show the importance of preserving the local consistency of the structures of the data.…”
Section: Competing Methods (mentioning)
confidence: 99%
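Equation (6) of the citing paper is not reproduced in this report, so only the special case the quote isolates can be illustrated: with the extra regularizers switched off, L-kNN reduces to an ℓ1-penalized least-squares reconstruction of each test sample from the training samples, and the training samples with non-zero coefficients serve as its neighbors. A minimal sketch under that assumption, using scikit-learn's Lasso on synthetic data:

```python
# Hypothetical sketch of the L-kNN idea referenced in the quote: reconstruct
# one test sample as a sparse combination of the training samples and treat
# the non-zero coefficients as its neighbors. The objective below is only
# the rho2 = rho3 = 0 special case; alpha and the data are assumptions.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(0)
X_train = rng.normal(size=(100, 20))   # 100 training samples, 20 features
x_test = rng.normal(size=20)           # one test sample

# min_w ||X_train^T w - x_test||^2 + alpha * ||w||_1
lasso = Lasso(alpha=0.05, max_iter=10_000)
lasso.fit(X_train.T, x_test)           # columns of the design = training samples

neighbors = np.flatnonzero(lasso.coef_)
print("training samples selected as neighbors:", neighbors)
print("k for this test sample:", neighbors.size)
```

One practical consequence the quote alludes to: because the sparsity pattern is learned per test sample, different test samples can end up with different numbers of neighbors, unlike the fixed k of standard kNN.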
“…To develop Lasso-based SR classification systems, group lasso (GL) [17], [18], which uses the ℓ2-norm within a group and the ℓ1-norm between groups, has been applied to obtain sparse coefficients [19][20][21][22][23]. In [19], a locality-constrained GL coding method is used for microvessel image classification and realizes the automatic "hot spot" detection of angiogenesis for human liver carcinoma.…”
Section: Introduction (mentioning)
confidence: 99%
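The penalty described in this quote (an ℓ2-norm within each group, an ℓ1-norm across groups) can be minimized by proximal-gradient iterations whose proximal step is block soft-thresholding. A minimal self-contained sketch on synthetic data, assuming an ISTA-style solver rather than the specific coding schemes of [19]-[23]:

```python
# Hypothetical group-lasso sparse coding: 0.5*||D w - x||^2 + lam * sum_g ||w_g||_2,
# solved by proximal gradient (ISTA). Group layout, lam and data are assumptions.
import numpy as np

def group_soft_threshold(w, groups, threshold):
    """Proximal operator of the group-lasso penalty: shrink each group's
    l2-norm, zeroing the whole group when its norm falls below threshold."""
    out = np.zeros_like(w)
    for g in groups:
        norm = np.linalg.norm(w[g])
        if norm > threshold:
            out[g] = (1.0 - threshold / norm) * w[g]
    return out

def group_lasso_coding(D, x, groups, lam=0.1, n_iter=300):
    """Sparse coding of x over dictionary D (columns = atoms) with a
    group-lasso penalty, via ISTA-style proximal-gradient iterations."""
    step = 1.0 / np.linalg.norm(D, 2) ** 2       # 1 / Lipschitz constant of the gradient
    w = np.zeros(D.shape[1])
    for _ in range(n_iter):
        grad = D.T @ (D @ w - x)                 # gradient of 0.5*||D w - x||^2
        w = group_soft_threshold(w - step * grad, groups, lam * step)
    return w

rng = np.random.default_rng(0)
D = rng.normal(size=(30, 12))                    # 12 atoms split into 4 groups of 3
groups = [np.arange(i, i + 3) for i in range(0, 12, 3)]
x = D[:, groups[1]] @ np.array([1.0, -0.5, 0.3]) # signal built from group 1 only
w = group_lasso_coding(D, x, groups, lam=0.5)
print("per-group coefficient norms:",
      [round(float(np.linalg.norm(w[g])), 3) for g in groups])
```

Zeroing whole groups is what the between-group ℓ1-norm buys; within a surviving group the coefficients stay dense, which is the within-group ℓ2 behavior the quote refers to.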