Fifteenth ACM Conference on Recommender Systems 2021
DOI: 10.1145/3460231.3474228

Shared Neural Item Representations for Completely Cold Start Problem

Cited by 5 publications (27 citation statements)
References 23 publications
“…While this approach might be effective at recommending items with similar features, we believe this over-constrained setting is limited beyond item features. As pointed out in a previous study [15], a user who only reads sports news may still be interested in news on crime, which is why we believe the method in [18], with its tightly coupled towers, underperforms.…”
Section: Introduction (supporting)
confidence: 52%
“…The recently introduced method [18] based on this two-tower RS has further improved performance for item cold-start recommendation. In [18], the item representation is shared directly with the user encoder in an attempt to better unify the two representations, which achieved state-of-the-art results for item cold-start recommendation. While this approach might be effective at recommending items with similar features, we believe this over-constrained setting is limited beyond item features.…”
Section: Introduction (mentioning)
confidence: 99%
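The shared-representation idea attributed to [18] above can be sketched in a few lines. This is an illustrative assumption, not the paper's actual architecture: the dimensions, the mean-pooling aggregation, and the `item_encoder`/`user_encoder` names are all hypothetical. The key point the excerpt makes is that the user tower reuses the item tower's embeddings instead of learning an independent user representation, so even a completely cold item (with only features, no interactions) lives in the same space as users:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical sizes for illustration only.
n_items, d_feat, d_emb = 100, 32, 16

item_feats = rng.normal(size=(n_items, d_feat))        # raw item features
W_item = rng.normal(scale=0.1, size=(d_feat, d_emb))   # item-tower weights

def item_encoder(feats):
    # Item tower: maps raw item features to the shared embedding space.
    return np.tanh(feats @ W_item)

def user_encoder(history_item_ids):
    # User tower that *shares* the item representations: the user is
    # encoded by aggregating item-tower embeddings of interacted items,
    # rather than via a separately learned user embedding table.
    return item_encoder(item_feats[history_item_ids]).mean(axis=0)

def score(user_vec, candidate_ids):
    # Dot-product relevance scores; candidates may be completely
    # cold-start items, since only their features are needed.
    return item_encoder(item_feats[candidate_ids]) @ user_vec

u = user_encoder([1, 5, 9])
s = score(u, [2, 3])
```

Because both towers go through the same `item_encoder`, this is the "tightly coupled towers" setting the excerpt argues can be over-constrained beyond item features.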
“…Recent studies [7], [28] suggest that using dynamic parameters in the neural network can improve recommendation in the cold-start setting. CMML [7] uses weight modulation in the last layer of the neural network to decrease the impact of insignificant features from the previous layer.…”
Section: B Parameter Generation For the Cold-start Problem And Hypern... (mentioning)
confidence: 99%
“…CMML [7] uses weight modulation in the last layer of the neural network to decrease the impact of insignificant features from the previous layer. Raziperchikolaei et al. used dynamic parameters generated from items to handle the item cold-start problem [28]. In these works, only a portion of the parameters of the neural network that predicts item scores is generated.…”
Section: B Parameter Generation For the Cold-start Problem And Hypern... (mentioning)
confidence: 99%
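The last-layer weight modulation these excerpts describe can be sketched as follows. This is a minimal sketch under stated assumptions, not the implementation of CMML [7] or [28]: the sizes, the sigmoid gate, and the `modulated_score` name are hypothetical. The idea it illustrates is that a small auxiliary network maps a context vector (e.g. cold-start item metadata) to per-feature gates that rescale only the final layer's weights, down-weighting uninformative features while the rest of the network stays fixed:

```python
import numpy as np

rng = np.random.default_rng(1)
d_hidden, d_ctx = 8, 4  # hypothetical sizes

W_hyper = rng.normal(scale=0.1, size=(d_ctx, d_hidden))  # context -> gates
W_last = rng.normal(scale=0.1, size=(d_hidden,))         # static last layer

def modulated_score(hidden, context):
    # Gates in (0, 1) are generated from the context and elementwise
    # rescale the last-layer weights: only this portion of the
    # score-predicting network's parameters is dynamically generated.
    gate = 1.0 / (1.0 + np.exp(-(context @ W_hyper)))  # sigmoid
    return float(hidden @ (gate * W_last))
```

With a zero context all gates are 0.5, i.e. a uniformly damped last layer; informative contexts push individual gates toward 0 or 1.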
“…However, a separate neural model needs to be trained to obtain the cold-start embeddings, which limits the scalability of the solution. Raziperchikolaei et al. [22] and Zheng et al. [45] suggest hybrid and metadata-aware recommendation models, respectively, to predict implicit feedback for cold-start items. However, those models are not designed for the session-based recommendation setting.…”
Section: Related Work (mentioning)
confidence: 99%