Proceedings of the 30th ACM International Conference on Information & Knowledge Management, 2021
DOI: 10.1145/3459637.3482132
Learning Sparse Binary Code for Maximum Inner Product Search

Cited by 4 publications (3 citation statements). References 21 publications.
“…The hashing methods (Shrivastava and Li 2014; Neyshabur and Srebro 2015; Andoni et al. 2015; Chen et al. 2019; Ma et al. 2021) usually partition the space using a similarity-preserving hash function, find the relevant buckets for a given query and score only the items in these buckets.…”
Section: MIPS
confidence: 99%
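The bucket-based search described in this excerpt can be sketched with sign random projections (SimHash-style hashing). Note this is a minimal illustration of the bucketing idea only: the cited works (e.g. Shrivastava and Li 2014; Neyshabur and Srebro 2015) additionally apply asymmetric transformations to reduce MIPS to a problem such hashes preserve, which is omitted here. All data and names below are hypothetical.

```python
import random
from collections import defaultdict

random.seed(0)
DIM, BITS = 32, 8

# Toy corpus of item vectors (hypothetical data, for illustration only).
items = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(1000)]
query = [random.gauss(0, 1) for _ in range(DIM)]

# Similarity-preserving hash: one bit per random hyperplane.
planes = [[random.gauss(0, 1) for _ in range(DIM)] for _ in range(BITS)]

def hash_code(v):
    # Each bit records which side of a random hyperplane v falls on;
    # nearby vectors tend to agree on many bits.
    return tuple(int(sum(p * x for p, x in zip(plane, v)) > 0)
                 for plane in planes)

# Index: partition the space by grouping items into hash buckets.
buckets = defaultdict(list)
for i, v in enumerate(items):
    buckets[hash_code(v)].append(i)

# Query: score only the items in the query's own bucket,
# instead of the full corpus.
candidates = buckets[hash_code(query)]
best = max(candidates,
           key=lambda i: sum(a * b for a, b in zip(items[i], query)),
           default=None)
```

With 8 bits the corpus is split across up to 256 buckets, so the query is scored against roughly 1/256 of the items on average; more bits give smaller buckets but a higher chance of missing the true maximizer.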
“…Faiss is a library for efficient similarity search and clustering of dense vectors. Faiss employs MIPS (Maximum Inner Product Search) to find the document vectors with the highest inner product with the query vector (Mussmann and Ermon, 2016).…”
Section: A Appendix
confidence: 99%
“…The sparse inner product (SIP) [1], [2], as the basis for sparse linear algebra including matrix multiplication, matrix-vector inner product, and matrix inversion, has shown an irreplaceable role in various applications, especially in accelerating large-scale machine learning (ML) where sparsity is intertwined. Take the classification task with the 20Newsgroups dataset [3] as an example.…”
Section: Introduction
confidence: 99%
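The sparse inner product mentioned in this excerpt exploits the fact that only coordinates nonzero in both vectors contribute. A minimal sketch, assuming the usual dictionary representation of sparse vectors (the names and values below are illustrative):

```python
def sparse_inner(u, v):
    # u, v: sparse vectors as {index: value} dicts.
    # Iterate over the sparser vector so the cost is
    # O(min(nnz(u), nnz(v))) dict lookups, independent of dimension.
    if len(u) > len(v):
        u, v = v, u
    return sum(val * v[i] for i, val in u.items() if i in v)

# Example: tf-idf-style feature vectors (hypothetical values).
doc = {3: 0.5, 17: 1.2, 42: 0.7}
qry = {17: 2.0, 42: 1.0, 99: 3.0}
score = sparse_inner(doc, qry)  # 1.2*2.0 + 0.7*1.0 = 3.1
```

On bag-of-words data like 20Newsgroups, where each document touches only a tiny fraction of the vocabulary, this makes the cost proportional to the overlap of supports rather than the full dimension.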