Proceedings of the 42nd International ACM SIGIR Conference on Research and Development in Information Retrieval 2019
DOI: 10.1145/3331184.3331206
Compositional Coding for Collaborative Filtering

Abstract: Efficiency is crucial to online recommender systems, especially those that need to deal with tens of millions of users and items. Because representing users and items as binary vectors for Collaborative Filtering (CF) enables fast user-item affinity computation in the Hamming space, recent years have seen a growing research effort to exploit binary hashing techniques in CF methods. However, CF with binary codes naturally suffers from low accuracy due to limited representation…
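To make the efficiency claim concrete, here is a minimal sketch (my own illustration, not the paper's code) of user-item affinity computed in the Hamming space once users and items are encoded as binary codes packed into 64-bit integers; all names, sizes, and the ranking step are illustrative assumptions.

```python
import numpy as np

# Minimal sketch: users and items as 64-bit binary codes; affinity in the
# Hamming space is an XOR followed by a bit count, so ranking a catalogue
# needs no floating-point inner products.
rng = np.random.default_rng(0)
user_code = rng.integers(0, 2**63, size=1, dtype=np.uint64)            # one user's code
item_codes = rng.integers(0, 2**63, size=1_000_000, dtype=np.uint64)   # one code per item

def popcount(codes: np.ndarray) -> np.ndarray:
    """Count set bits per 64-bit code via a uint8 view and unpackbits."""
    bits = np.unpackbits(codes.view(np.uint8).reshape(-1, 8), axis=1)
    return bits.sum(axis=1)

hamming = popcount(user_code ^ item_codes)  # distance in [0, 64]; smaller = closer
top_items = np.argsort(hamming)[:10]        # 10 most similar items for this user
```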

Cited by 15 publications (19 citation statements). References 32 publications (68 reference statements).
“…(Wang et al., 2019) exploits a graph convolutional network (GCN) to model high-order features from implicit feedback and distills the ranking information derived from the GCN into binarized collaborative filtering to improve the efficiency of online recommendation. (Liu et al., 2019a) introduces a new approach, Compositional Coding for Collaborative Filtering (CCCF), that represents each user/item with a set of binary vectors, instead of one as in (Zhang et al., 2016), which are associated with a set of sparse real-valued weight vectors. CCCF claims to achieve better recommendation efficiency than many other discrete learning methods.…”
Section: Optimization-based Discretization
confidence: 99%
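The compositional-coding idea described in the excerpt above can be illustrated roughly as follows; this is a sketch under my own assumptions about shapes and the scoring rule, not the authors' implementation, and helpers such as cccf_score are hypothetical.

```python
import numpy as np

# Rough illustration of compositional coding: each user/item holds G binary
# component codes plus a sparse real-valued weight per component, and the
# user-item score sums the weighted per-component code similarities.
G, B = 4, 16                                   # components, bits per component
rng = np.random.default_rng(0)

user_codes = rng.choice([-1, 1], size=(G, B))  # G binary (+/-1) codes for a user
item_codes = rng.choice([-1, 1], size=(G, B))  # G binary (+/-1) codes for an item
user_w = np.array([0.7, 0.0, 0.3, 0.0])        # sparse component weights (user)
item_w = np.array([0.5, 0.2, 0.0, 0.3])        # sparse component weights (item)

def cccf_score(u_codes, i_codes, u_w, i_w):
    """Weighted sum of per-component binary inner products (hypothetical helper)."""
    per_component = np.einsum("gb,gb->g", u_codes, i_codes)
    return float(np.sum(u_w * i_w * per_component))

print(cccf_score(user_codes, item_codes, user_w, item_w))
```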
“…To reduce the time complexity, an approximate nearest neighbor search based on the Hdidx library is used to index the extracted image features and save them in the retrieval engine. This reduces the search time by converting the high-dimensional image features into compact binary codes using the Hdidx library [30][31][32].…”
Section: Image-based Similar Product Retrieval Model
confidence: 99%
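The passage above relies on the Hdidx library for indexing; as a library-agnostic sketch of the same idea, random-hyperplane (sign) hashing can turn high-dimensional features into compact binary codes compared by Hamming distance. The hashing scheme here is my substitution for whatever Hdidx does internally, and all dimensions and names are illustrative.

```python
import numpy as np

# Library-agnostic sketch: compress high-dimensional image features into
# compact binary codes (random-hyperplane signs) and search by Hamming
# distance. This stands in for the Hdidx-based indexing in the excerpt.
DIM, BITS, N = 512, 64, 100_000
rng = np.random.default_rng(0)

features = rng.normal(size=(N, DIM)).astype(np.float32)   # database features
query = rng.normal(size=(1, DIM)).astype(np.float32)      # query feature
planes = rng.normal(size=(DIM, BITS)).astype(np.float32)  # random hyperplanes

def to_codes(x: np.ndarray) -> np.ndarray:
    """Project onto random hyperplanes and keep only the signs as bits."""
    return (x @ planes > 0).astype(np.uint8)

db_codes = to_codes(features)                           # (N, BITS) 0/1 matrix
q_code = to_codes(query)                                # (1, BITS)
hamming = np.count_nonzero(db_codes != q_code, axis=1)  # per-item distance
nearest = np.argsort(hamming)[:10]                      # approximate neighbours
```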
“…For efficiency reasons, a number of hashing-based approaches have also been proposed for the recommendation domain. These approaches have primarily focused on the collaborative filtering setting [39,45,65,68,71], and less so on content-aware approaches addressing the cold-start problem. Existing content-aware hashing approaches, DDL [67] and DCMF [42], learn to generate user and item hash codes for use in both standard and cold-start settings; however, they both share the problem of generating item hash codes differently depending on whether the item is considered cold-start or not.…”
Section: Chapter 5 Content-aware Neural Hashing For Cold-start Recomm...
confidence: 99%
“…For example, in a collaborative filtering setting the user hash code represents the query, and depending on the user's historical item interactions, it may be possible to infer that certain underlying properties are more important for the item ranking. While approaches have been proposed for assigning real-valued weights to certain substrings of bits in a hash code [14,45], such a weighting makes the core similarity computation (e.g., Hamming distance) significantly slower, which limits its usage in large-scale settings where hashing-based solutions are most needed. This leads to the next research question:…”
Section: Chapter 6 Projected Hamming Dissimilarity For Bit-level Impo...
confidence: 99%
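To make the efficiency concern in this excerpt concrete, the following small sketch (my own illustration, not the cited methods) contrasts the plain Hamming distance, which reduces to an XOR and a bit count, with a bit-weighted variant that must multiply every differing bit by a real-valued weight.

```python
import numpy as np

# Illustration of the cost argument above: the plain Hamming distance is an
# XOR plus a bit count, whereas per-bit real-valued weights force a full
# floating-point pass over the bits of every candidate code.
BITS = 64
rng = np.random.default_rng(0)
a_bits = rng.integers(0, 2, size=BITS, dtype=np.uint8)  # query hash code (0/1 bits)
b_bits = rng.integers(0, 2, size=BITS, dtype=np.uint8)  # item hash code (0/1 bits)
weights = rng.random(BITS)                              # per-bit importance weights

def hamming(a: np.ndarray, b: np.ndarray) -> int:
    """Plain Hamming distance: count differing bits."""
    return int(np.count_nonzero(a ^ b))

def weighted_hamming(a: np.ndarray, b: np.ndarray, w: np.ndarray) -> float:
    """Weighted variant: each differing bit contributes its real-valued weight."""
    return float(np.sum(w * (a ^ b)))

print(hamming(a_bits, b_bits), weighted_hamming(a_bits, b_bits, weights))
```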