Proceedings of the Fifteenth ACM International Conference on Web Search and Data Mining 2022
DOI: 10.1145/3488560.3498527
Contrastive Meta Learning with Behavior Multiplicity for Recommendation

Abstract: A well-informed recommendation framework could not only help users identify their interested items, but also benefit the revenue of various online platforms (e.g., e-commerce, social media). Traditional recommendation models usually assume that only a single type of interaction exists between user and item, and fail to model the multiplex user-item relationships from multi-typed user behavior data, such as page view, add-to-favourite and purchase. While some recent studies propose to capture the dependencies a…

Citations: Cited by 108 publications (34 citation statements)
References: 44 publications (52 reference statements)
“…It aims to learn quality discriminative representations by contrasting positive and negative samples from different views. Several recent attempts have brought self-supervised learning to recommendation [21,22,43,44]. For example, SGL [44] performs dropout operations over the graph connection structures with different strategies, i.e., node dropout, edge dropout and random walk.…”
Section: Related Work (mentioning)
confidence: 99%
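As a rough illustration of the view-and-contrast idea described in this excerpt, the sketch below pairs an edge-dropout augmentation with an InfoNCE-style objective. It is a minimal PyTorch approximation under assumed names and default hyperparameters, not SGL's actual implementation.

```python
import torch
import torch.nn.functional as F

def edge_dropout(edge_index: torch.Tensor, drop_rate: float = 0.1) -> torch.Tensor:
    """Randomly drop a fraction of edges to build one perturbed graph view.

    edge_index is assumed to be a [2, E] tensor of (source, target) node ids.
    """
    keep_mask = torch.rand(edge_index.size(1)) >= drop_rate
    return edge_index[:, keep_mask]

def info_nce(z1: torch.Tensor, z2: torch.Tensor, temperature: float = 0.2) -> torch.Tensor:
    """Contrast node embeddings from two views: the same node across views
    is a positive pair, every other node in the second view is a negative."""
    z1 = F.normalize(z1, dim=-1)
    z2 = F.normalize(z2, dim=-1)
    logits = z1 @ z2.t() / temperature                     # [N, N] similarity logits
    labels = torch.arange(z1.size(0), device=z1.device)    # positives on the diagonal
    return F.cross_entropy(logits, labels)
```

In this sketch, two views produced by independent calls to edge_dropout would be encoded by the same GNN and their node embeddings passed to info_nce; node dropout and random-walk sampling are the alternative augmentation strategies named in the excerpt.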
“…For example, SGL [44] performs dropout operations over the graph connection structures with different strategies, i.e., node dropout, edge dropout and random walk. Additionally, CML [43] enhances the recommender system by modeling multi-behavior relationships between users and items with contrastive learning. Motivated by these existing contrastive learning frameworks, this work develops a new graph contrastive learning paradigm for recommendation by effectively integrating knowledge graph representation and user-item interaction augmentation.…”
Section: Related Work (mentioning)
confidence: 99%
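The cross-behavior contrast attributed to CML in this excerpt can be sketched in the same style: align a user's embedding learned from an auxiliary behavior (e.g., page view) with the embedding learned from the target behavior (purchase), and add that term to a standard ranking loss. The names bpr_loss, ssl_weight, and the behavior keys below are assumptions for illustration; the sketch omits CML's meta-learning component entirely.

```python
import torch
import torch.nn.functional as F

def bpr_loss(user_emb, pos_item_emb, neg_item_emb):
    """Pairwise ranking loss on the target behavior (e.g., purchase)."""
    pos = (user_emb * pos_item_emb).sum(dim=-1)
    neg = (user_emb * neg_item_emb).sum(dim=-1)
    return -F.logsigmoid(pos - neg).mean()

def cross_behavior_nce(z_aux, z_target, temperature=0.2):
    """Treat the same user's two behavior-specific embeddings as a positive
    pair and the other users in the batch as negatives."""
    a = F.normalize(z_aux, dim=-1)
    t = F.normalize(z_target, dim=-1)
    logits = a @ t.t() / temperature
    labels = torch.arange(a.size(0), device=a.device)
    return F.cross_entropy(logits, labels)

def multi_behavior_loss(user_emb_by_behavior, pos_item_emb, neg_item_emb, ssl_weight=0.1):
    """Combine the ranking loss on the target behavior with contrastive terms
    aligning each auxiliary behavior's user view with the target view."""
    target = user_emb_by_behavior["purchase"]
    loss = bpr_loss(target, pos_item_emb, neg_item_emb)
    for behavior, emb in user_emb_by_behavior.items():
        if behavior != "purchase":
            loss = loss + ssl_weight * cross_behavior_nce(emb, target)
    return loss
```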
“…Contrastive learning has become an effective self-supervised framework to capture feature representation consistency under different views [27,37]. It has achieved promising performance in various domains, such as visual data representation [5,28], language data understanding [2,31], graph representation learning [29,51] and recommender systems [22,38,45,46].…”
Section: Contrastive Representation Learning (mentioning)
confidence: 99%
“…In self-supervised learning paradigms, models explore supervision signals from the data itself with auxiliary learning tasks. Furthermore, contrastive-based SSL methods aim to reach agreement between generated correlated contrastive views [39]. However, self-supervised learning is relatively less explored in spatial-temporal data prediction.…”
Section: Self-supervised Learning (mentioning)
confidence: 99%