2019
DOI: 10.1007/978-3-030-16145-3_46
Efficient Autotuning of Hyperparameters in Approximate Nearest Neighbor Search

Abstract: Approximate nearest neighbor algorithms are used to speed up nearest neighbor search in a wide array of applications. However, current indexing methods feature several hyperparameters that need to be tuned to reach an acceptable accuracy-speed trade-off. A grid search in the parameter space is often impractically slow due to a time-consuming index-building procedure. Therefore, we propose an algorithm for automatically tuning the hyperparameters of indexing methods based on randomized space-partitioning trees. …
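The indexing methods the abstract refers to partition the space with random hyperplanes. As an illustrative sketch only (not the paper's implementation; function names and the dictionary-based tree representation are hypothetical), a single random projection tree can be built and queried like this:

```python
import numpy as np

def build_rp_tree(points, indices, leaf_size, rng):
    """Recursively split points by a random hyperplane until leaves are small."""
    if len(indices) <= leaf_size:
        return {"leaf": indices}
    direction = rng.standard_normal(points.shape[1])
    proj = points[indices] @ direction          # project onto a random direction
    split = np.median(proj)                     # split at the median projection
    left, right = indices[proj <= split], indices[proj > split]
    if len(left) == 0 or len(right) == 0:       # degenerate split: stop here
        return {"leaf": indices}
    return {"dir": direction, "split": split,
            "left": build_rp_tree(points, left, leaf_size, rng),
            "right": build_rp_tree(points, right, leaf_size, rng)}

def query_leaf(tree, q):
    """Route a query point to its leaf; its points are the candidate neighbors."""
    while "leaf" not in tree:
        tree = tree["left"] if q @ tree["dir"] <= tree["split"] else tree["right"]
    return tree["leaf"]

rng = np.random.default_rng(0)
X = rng.standard_normal((1000, 16))
tree = build_rp_tree(X, np.arange(len(X)), leaf_size=32, rng=rng)
candidates = query_leaf(tree, X[0])
```

Hyperparameters such as the leaf size and the number of trees control the accuracy-speed trade-off the abstract describes: larger leaves and more trees raise recall but cost more distance evaluations per query.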

Cited by 7 publications (4 citation statements) · References 16 publications (41 reference statements)
“…Easy speed-ups include replacing the current NN method 53 by approximated and parallelized versions such as FLANN. 51,52 Multiple random projection trees (MRPT) 77,78 reduces expensive distance evaluations, thereby achieving higher speed than ANN and FLANN. However, MRPT has several issues: an insufficient number of requested nearest neighbors are returned; single-precision floating point is used; retrieving distances between neighbor points is not easily supported; and inaccurate search – in the worst case, points that are far away from the query point are returned as nearest neighbors.…”
Section: Discussion
“…Hyperparameter tuning is an important optimization technique in machine learning algorithms, particularly for k-NN [38]. A grid search is essential for selecting the best hyperparameter configuration, which includes the value of 'k' and the weighting scheme used in k-NN [39].…”
Section: Hyperparameter Tuning
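The grid search over k and the weighting scheme that this citing paper describes can be sketched with scikit-learn; the dataset and parameter grid here are illustrative, not taken from the cited work:

```python
from sklearn.datasets import make_classification
from sklearn.model_selection import GridSearchCV
from sklearn.neighbors import KNeighborsClassifier

# Toy dataset standing in for the citing paper's data.
X, y = make_classification(n_samples=200, n_features=8, random_state=0)

# Grid over the two k-NN hyperparameters the excerpt mentions:
# the neighborhood size k and the neighbor-weighting scheme.
grid = GridSearchCV(
    KNeighborsClassifier(),
    param_grid={"n_neighbors": [1, 3, 5, 7],
                "weights": ["uniform", "distance"]},
    cv=3,
)
grid.fit(X, y)
print(grid.best_params_)
```

Note that this exhaustive search is exactly the procedure the paper under discussion tries to avoid for ANN indexes, where each grid point would require rebuilding a costly index.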
“…MRPT starts by creating a binary space-partitioning tree index, using the auto-tuning process proposed in [29]. This process uses sparse random projection trees (sRPT), whose parametrization is determined by the target recall level for the nearest neighbors, provided to the algorithm along with the number k of nearest neighbors to search for.…”
Section: Speeding Up the kNN Classification
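The recall-targeted tuning loop described in that excerpt can be sketched as: measure recall of candidate configurations against exact neighbors on sample queries, then keep the cheapest configuration meeting the target. This is a hypothetical illustration of the idea, not the algorithm of [29]; the subsample "index" merely stands in for a real ANN structure:

```python
import numpy as np

def exact_knn(X, q, k):
    """Brute-force k nearest neighbors, used as ground truth."""
    d = np.linalg.norm(X - q, axis=1)
    return set(np.argsort(d)[:k])

def subsample_search(X, q, k, setting):
    """Toy stand-in for an ANN index: search only the first n points."""
    d = np.linalg.norm(X[:setting["n"]] - q, axis=1)
    return set(np.argsort(d)[:k])

def autotune(candidate_settings, approx_search, X, queries, k, target_recall):
    """Return the cheapest setting whose mean recall meets the target."""
    truths = [exact_knn(X, q, k) for q in queries]
    for setting in sorted(candidate_settings, key=lambda s: s["cost"]):
        r = np.mean([len(approx_search(X, q, k, setting) & t) / k
                     for q, t in zip(queries, truths)])
        if r >= target_recall:
            return setting
    return None

rng = np.random.default_rng(0)
X = rng.standard_normal((500, 8))
queries = X[:10]
settings = [{"n": n, "cost": n} for n in (50, 100, 250, 500)]
best = autotune(settings, subsample_search, X, queries, k=5, target_recall=1.0)
```

Lowering `target_recall` lets the loop return a cheaper setting, which is the accuracy-speed trade-off the paper's auto-tuning exposes through a single recall parameter.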