2020 IEEE International Conference on Data Mining (ICDM)
DOI: 10.1109/icdm50108.2020.00029
CITIES: Contextual Inference of Tail-Item Embeddings for Sequential Recommendation

Abstract: Sequential recommendation techniques provide users with product recommendations fitting their current preferences by handling dynamic user preferences over time. Previous studies have focused on modeling sequential dynamics without much regard to which of the best-selling products (i.e., head items) or niche products (i.e., tail items) should be recommended. We scrutinize the structural reason for why tail items are barely served in the current sequential recommendation model, which consists of an item-embeddi…

Cited by 9 publications (16 citation statements) · References 24 publications (30 reference statements)
“…Evaluation Metric We choose HR@K (Hit Ratio), MRR@K (Mean Reciprocal Rank) and NDCG@K (Normalized Discounted Cumulative Gain) as our evaluation metrics. These metrics are widely used in many previous works (Jang et al 2020; Lv et al 2020). We also compare the results using different K ∈ {10, 50, 100} to verify robustness in various scenarios.…”
Section: Methods
confidence: 97%
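The three metrics named in the quote above have simple closed forms when each test case has a single held-out ground-truth item. The sketch below (illustrative; function and variable names are my own, not from the cited papers) computes all three for one ranked list:

```python
import math

def hr_mrr_ndcg_at_k(ranked_items, target, k):
    """Compute HR@K, MRR@K and NDCG@K for a single ground-truth item.

    ranked_items: item ids sorted by predicted score, best first.
    target: the held-out ground-truth item id.
    """
    top_k = ranked_items[:k]
    if target not in top_k:
        return 0.0, 0.0, 0.0          # a miss scores zero on all three metrics
    rank = top_k.index(target) + 1    # 1-based rank of the hit
    hr = 1.0                          # HR@K: target appeared in the top K
    mrr = 1.0 / rank                  # MRR@K: reciprocal rank of the hit
    ndcg = 1.0 / math.log2(rank + 1)  # NDCG@K with one relevant item (IDCG = 1)
    return hr, mrr, ndcg
```

For example, with the target ranked third, `hr_mrr_ndcg_at_k([7, 3, 5, 1], 5, 10)` returns `(1.0, 1/3, 0.5)`; per-user scores are then averaged over the test set.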
“…In this section, we formulate a general SBRs' model as an equation with trainable parameters θ. A SBRs' model can be divided into 1) an embedding layer with θ_e, 2) a sequential layer with θ_seq, and 3) a recommendation layer (Jang et al 2020; Lv et al 2020). The trainable parameters consist of the following: θ = {θ_e, θ_seq, θ_concat (optional)}, where θ_e = {θ_e^i, θ_e^u}.…”
Section: General Session-based Recommendation Model
confidence: 99%
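The parameter decomposition θ = {θ_e, θ_seq}, with θ_e split into item embeddings θ_e^i and user embeddings θ_e^u, can be sketched as a nested dictionary of parameter groups. This is a minimal illustration using plain Python lists as stand-ins for tensors; the shapes and names are assumptions, not the cited model's actual layout:

```python
import random

def init_sbr_params(n_items, n_users, d, seed=0):
    """Group the trainable parameters of a session-based recommender as
    theta = {theta_e, theta_seq}, with theta_e = {theta_e^i, theta_e^u}."""
    rng = random.Random(seed)
    randvec = lambda: [rng.gauss(0.0, 0.01) for _ in range(d)]
    theta_e = {
        "item": [randvec() for _ in range(n_items)],  # theta_e^i: item embeddings
        "user": [randvec() for _ in range(n_users)],  # theta_e^u: user embeddings
    }
    theta_seq = {
        "W": [randvec() for _ in range(d)],  # a single d x d sequential weight
        "b": [0.0] * d,                      # and its bias, as a placeholder
    }
    return {"e": theta_e, "seq": theta_seq}  # theta_concat omitted (optional)
```

Grouping parameters this way makes it easy to, e.g., freeze θ_seq while re-estimating only the tail-item rows of θ_e^i.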
“…2. Generally, a sequential recommendation model F is composed of three layers: item embedding, sequence modelling, and recommendation layers [28]. The item embedding layer encodes items into a low-dimensional real-valued latent space to measure the similarity between items.…”
Section: Overview
confidence: 99%
“…These vectors are very sparse, and the similarity between items is difficult to measure. Recent works [8, 11, 28, 32] employ an embedding layer to resolve these limitations, and this study also uses an item embedding layer. The item embedding layer f: ℕ → ℝ^{d_u} projects the item index onto a d_u-dimensional real-valued dense vector as…”
Section: Item Embedding Layer
confidence: 99%
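The map f: ℕ → ℝ^{d_u} described above is just a row lookup into a trainable table, replacing the sparse one-hot representation. A minimal sketch (names and initialization scale are illustrative assumptions):

```python
import random

def make_item_embedding(n_items, d_u, seed=0):
    """Sketch of the item embedding layer f: N -> R^{d_u}, mapping an
    item index to a dense vector via a lookup table (the embedding matrix)."""
    rng = random.Random(seed)
    table = [[rng.gauss(0.0, 0.01) for _ in range(d_u)]
             for _ in range(n_items)]

    def f(i):
        return table[i]  # row lookup, equivalent to one_hot(i) @ table

    return f
```

Usage: `f = make_item_embedding(1000, 64)` gives `f(42)` as a 64-dimensional dense vector; in a real model the table rows are trained jointly with the sequential layer.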