2023
DOI: 10.1145/3618298

Contrastive Graph Prompt-tuning for Cross-domain Recommendation

Zixuan Yi,
Iadh Ounis,
Craig MacDonald

Abstract: Recommender systems commonly suffer from the long-standing data sparsity problem, where insufficient user-item interaction data limits the systems’ ability to make accurate recommendations. This problem can be alleviated using cross-domain recommendation techniques. In particular, in a cross-domain setting, knowledge sharing between domains permits improved effectiveness on the target domain. While recent cross-domain recommendation techniques used a pre-training configuration, we argue that such techniques lea…

Cited by 4 publications (2 citation statements)
References: 52 publications
“…A Triple Sequence Learning (Tri-CDR) model for cross-domain recommendation was proposed (Ma et al., 2023) that achieves precise mining of triple correlations by jointly modeling source-domain, target-domain, and mixed behavior sequences, while considering the global information of recommendation items and users’ target preferences. A personalized graph prompt recommendation (PGPRec) framework for CDR has been proposed (Yi et al., 2023) that uses prompt-tuning to address the user cold-start problem and improve the model’s training efficiency. A comparison of the latest CDR models is listed in Table 1.…”
Section: Cross-domain Recommendation
“…Not surprisingly, different pretext tasks capture different self-supervised signals from the graph. For example, link-prediction tasks are more concerned with the connectivity or relationships between nodes [40], node/edge feature-based tasks focus more on the feature space [42], and subgraph-based tasks focus more on local or global information [30,55]. To cater to diverse downstream tasks, the pre-training step should aim to broadly extract knowledge from various aspects, such as node connectivity, node or edge features, and local or global graph patterns.…”
Section: Introduction
confidence: 99%
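The statement above distinguishes pretext tasks by the self-supervised signal they extract; the most common of these, link prediction, can be sketched concretely. The following is a minimal numpy illustration, not the cited papers' method: the toy graph, embedding dimension, and helper names (`edge_score`, `link_prediction_loss`) are all illustrative assumptions, and in practice the embeddings would come from a trainable GNN encoder rather than random initialization.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy graph: 6 nodes connected in a chain (illustrative assumption).
edges = [(0, 1), (1, 2), (2, 3), (3, 4), (4, 5)]
num_nodes = 6

# Stand-in node embeddings; a real pipeline would produce these
# with a GNN encoder and update them by gradient descent.
emb = rng.normal(size=(num_nodes, 8))

def edge_score(u: int, v: int) -> float:
    """Dot-product link score between two node embeddings."""
    return float(emb[u] @ emb[v])

def link_prediction_loss(pos_edges, num_neg: int = 5) -> float:
    """Negative-sampling pretext loss: observed edges should score
    high, random node pairs should score low.

    Uses the numerically stable softplus form of binary
    cross-entropy: -log sigmoid(s) == log(1 + exp(-s)).
    """
    loss = 0.0
    for u, v in pos_edges:
        # Positive term: penalize low scores on real edges.
        loss += np.log1p(np.exp(-edge_score(u, v)))
        # Negative terms: penalize high scores on random pairs.
        for _ in range(num_neg):
            w = int(rng.integers(num_nodes))
            loss += np.log1p(np.exp(edge_score(u, w)))
    return loss / len(pos_edges)

print(link_prediction_loss(edges))
```

During pre-training this loss would be minimized with respect to the encoder parameters, pushing connected nodes together in embedding space; the node-feature and subgraph pretext tasks mentioned in the quote follow the same pattern with different positive/negative definitions.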