Proceedings of the ACM Web Conference 2023
DOI: 10.1145/3543507.3583361
Debiased Contrastive Learning for Sequential Recommendation

Abstract: Current sequential recommender systems tackle dynamic user preference learning with various neural techniques, such as Transformers and Graph Neural Networks (GNNs). However, inference over highly sparse user behavior data may hinder the representation ability of sequential pattern encoding. To address this label shortage issue, contrastive learning (CL) methods have recently been proposed to perform data augmentation in two fashions: (i) randomly corrupting the sequence data (e.g., stochastic …
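The "randomly corrupting the sequence data" augmentation the abstract refers to can be illustrated with a minimal sketch. The crop/mask/reorder operators and their ratios below are common choices in the contrastive-learning-for-recommendation literature, not this paper's exact procedure, and the item ids and mask token are hypothetical:

```python
import random

def crop(seq, ratio=0.6):
    """Keep a random contiguous subsequence (one stochastic corruption)."""
    n = max(1, int(len(seq) * ratio))
    start = random.randrange(len(seq) - n + 1)
    return seq[start:start + n]

def mask(seq, ratio=0.3, mask_token=0):
    """Randomly replace items with a mask token; length is preserved."""
    return [mask_token if random.random() < ratio else item for item in seq]

def reorder(seq, ratio=0.3):
    """Shuffle a random contiguous span of the sequence."""
    n = max(1, int(len(seq) * ratio))
    start = random.randrange(len(seq) - n + 1)
    span = seq[start:start + n]
    random.shuffle(span)
    return seq[:start] + span + seq[start + n:]

# Two independently corrupted views of the same user sequence
# are treated as a positive pair for contrastive learning.
user_seq = [3, 17, 42, 8, 25, 91, 6]
view_a, view_b = crop(list(user_seq)), mask(list(user_seq))
```

Each operator discards or perturbs part of the interaction history, so an encoder that maps both views close together must rely on the sequence's invariant preference signal rather than any single item.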

Cited by 41 publications (4 citation statements)
References 33 publications
“…The introduction of contrastive learning in sequence recommendation systems mainly solves the problems of sparse user-item interaction and noise. Scholars improve recommendation performance by designing auxiliary tasks or loss functions [39]. CBiT [40] combines the cloze task mask and the dropout mask to generate high-quality positive samples and perform multi-pair contrastive learning.…”
Section: Contrastive Learning
confidence: 99%
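The CBiT approach quoted above pairs cloze-task masking with dropout masking to obtain several positives per sequence. A minimal sketch of the multi-pair idea is below, covering only the input-level cloze masking; the dropout mask operates at the representation level (same input, different dropout noise) and is omitted, and the mask-token id and ratios are hypothetical:

```python
import random

MASK = 0  # hypothetical mask-token id

def cloze_mask(seq, ratio=0.2):
    """Cloze-style masking: randomly replace items with a mask token,
    as in BERT-style sequence encoders."""
    return [MASK if random.random() < ratio else item for item in seq]

def multi_pair_views(seq, num_views=3):
    """Generate several masked views of one user sequence. Every pair of
    views from the same user is a positive pair; cross-user pairs in the
    batch serve as negatives."""
    return [cloze_mask(seq) for _ in range(num_views)]

views = multi_pair_views([5, 12, 7, 33, 9])
# All within-user view pairs become positive pairs for the contrastive loss.
pairs = [(a, b) for i, a in enumerate(views) for b in views[i + 1:]]
```

With `num_views` views per sequence, each user contributes `num_views * (num_views - 1) / 2` positive pairs, which is the "multi-pair" aspect the citation statement describes.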
“…ICLRec [41] models user intentions through clustering of item sequences, maximizing the agreement between a view of the sequence and its corresponding intentions to improve recommendation performance. DCRec [39] employs contrastive learning to learn consistent perception enhancement representations from sequential pattern encoding and global collaborative relationship modeling.…”
Section: Contrastive Learning
confidence: 99%
“…Self-Supervised Learning for Recommendation. Recently, data augmentation with self-supervised learning (SSL) has emerged as a promising approach for mitigating the label scarcity and noise issue in recommender systems [30,32]. One important SSL paradigm is contrastive learning-based augmentation, where semantic-relevant instances are aligned with sampled positive pairs, while unrelated samples as negative pairs are pushed away.…”
Section: Preliminaries and Related Work
confidence: 99%
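The alignment-and-repulsion objective described above (pull sampled positive pairs together, push unrelated negatives apart) is typically instantiated as an InfoNCE-style loss. A self-contained NumPy sketch, assuming in-batch negatives and a cosine-similarity critic; the temperature value is illustrative:

```python
import numpy as np

def info_nce(z1, z2, temperature=0.1):
    """InfoNCE loss over a batch: row i of z1 and row i of z2 are a
    positive pair; all other rows of z2 act as in-batch negatives."""
    z1 = z1 / np.linalg.norm(z1, axis=1, keepdims=True)
    z2 = z2 / np.linalg.norm(z2, axis=1, keepdims=True)
    logits = z1 @ z2.T / temperature           # (B, B) cosine similarities
    logits -= logits.max(axis=1, keepdims=True)  # numerical stability
    log_prob = logits - np.log(np.exp(logits).sum(axis=1, keepdims=True))
    return -np.mean(np.diag(log_prob))         # positives sit on the diagonal
```

When the two views of each instance already agree (high diagonal similarity) the loss is near zero; mismatched pairings drive it up, which is what pushes semantically unrelated samples apart in embedding space.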
“…Contrastive learning: The DCRec model by Yang Y. et al (2023) leverages debiased contrastive learning to counteract popularity bias and address the challenge of disentangling user conformity from genuine interest, focusing on user fairness. The TAGCL framework also capitalizes on the contrastive learning paradigm, ensuring item fairness by reducing biases in social tagging systems (Xu et al, 2023).…”
Section: Fairness in GNN-based Recommender Systems
confidence: 99%