Proceedings of the 36th International ACM SIGIR Conference on Research and Development in Information Retrieval 2013
DOI: 10.1145/2484028.2484162
Neighbourhood preserving quantisation for LSH

Abstract: We introduce a scheme for optimally allocating multiple bits per hyperplane for Locality Sensitive Hashing (LSH). Existing approaches binarise LSH projections by thresholding at zero, yielding a single bit per dimension. We demonstrate that this is a sub-optimal bit allocation approach that can easily destroy the neighbourhood structure in the original feature space. Our proposed method, dubbed Neighbourhood Preserving Quantization (NPQ), assigns multiple bits per hyperplane based upon adaptively learned thresh…
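The baseline the abstract argues against, thresholding each LSH projection at zero for one bit per hyperplane, can be sketched in a few lines. This is an illustrative pure-Python sketch, not code from the paper; `sbq_hash`, the hyperplanes, and the sample points are all hypothetical names chosen here:

```python
import random

def sbq_hash(x, hyperplanes):
    """Single Bit Quantisation (SBQ): one bit per hyperplane, obtained by
    thresholding each projection at zero (the scheme the paper improves on)."""
    code = 0
    for h in hyperplanes:
        proj = sum(xi * hi for xi, hi in zip(x, h))  # dot product with the hyperplane normal
        code = (code << 1) | (1 if proj >= 0 else 0)  # 1 bit per hyperplane
    return code

random.seed(0)
# Two illustrative random hyperplanes in 3-d, with Gaussian entries as in classic LSH.
planes = [[random.gauss(0, 1) for _ in range(3)] for _ in range(2)]
point_a = [1.0, 0.2, -0.1]
point_b = [0.9, 0.25, -0.05]  # a close neighbour of point_a
codes = (sbq_hash(point_a, planes), sbq_hash(point_b, planes))
```

Nearby points usually fall on the same side of each hyperplane and so share a code, but a point lying just either side of zero on some projection flips that bit entirely; this fragility near the threshold is the neighbourhood-destroying effect the abstract refers to.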

Cited by 15 publications (16 citation statements); references 7 publications.
“…In the most closely related past research, authors have generally focused on either learning the hashing hyperplanes [1,4,6] or the quantisation thresholds [2,3,7,8] based on the distribution of the data. Seminal approaches for data-dependent hyperplane learning either solve an eigenvalue problem to generate a set of orthogonal hyperplanes, for example using Principal Components Analysis (PCA) [9], or frame a custom objective function that uses pairwise labels to appropriately position the hyperplanes within the feature space [1].…”
Section: Related Work
confidence: 99%
“…This approach is commonly known as Single Bit Quantisation (SBQ). Recently, there has been significant interest in improving on SBQ by learning one or more thresholds to quantise projections [2,3,7,8]. These quantisation models either use an unsupervised objective, such as k-means or squared-error minimisation, to learn a good set of thresholds for each projected dimension [7,8], or propose a semi-supervised objective that takes into consideration the neighbourhood structure between the data-points in the input feature space [2,3].…”
Section: Related Work
confidence: 99%
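The threshold-learning idea this citation statement describes can be illustrated with a minimal sketch. NPQ itself learns thresholds adaptively to preserve neighbourhood structure; the stand-in below simply places thresholds at quantiles of a projected dimension, closer in spirit to the unsupervised approaches the quote cites as [7,8], and assigns a 2-bit code per hyperplane. All function names and data here are hypothetical:

```python
def learn_thresholds(projections, bits=2):
    """Unsupervised stand-in for threshold learning: place 2**bits - 1
    thresholds at equally spaced quantiles of one projected dimension.
    (NPQ learns its thresholds semi-supervised; quantiles are just a
    simple illustrative choice.)"""
    s = sorted(projections)
    k = 2 ** bits  # number of quantisation regions
    return [s[(i * len(s)) // k] for i in range(1, k)]

def quantise(value, thresholds):
    """Map one projection to a multi-bit code: the index of its region,
    i.e. the number of thresholds at or below it."""
    return sum(1 for t in thresholds if value >= t)

# Projections of eight points onto a single LSH hyperplane (made-up values).
projs = [-2.1, -0.5, -0.3, 0.1, 0.4, 0.9, 1.7, 3.2]
ts = learn_thresholds(projs, bits=2)          # 3 thresholds -> codes 0..3
codes = [quantise(p, ts) for p in projs]      # 2 bits per hyperplane
```

With multiple thresholds, points that straddle zero but are otherwise close can still land in the same or adjacent regions, rather than being forced into opposite single-bit buckets; in practice the resulting multi-bit codes are compared with a distance suited to the chosen binary encoding.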