Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery & Data Mining 2021
DOI: 10.1145/3447548.3467140
Dual Attentive Sequential Learning for Cross-Domain Click-Through Rate Prediction

Abstract: Cross-domain recommender systems constitute a powerful method for tackling the cold-start and sparsity problems by aggregating and transferring user preferences across multiple category domains. They therefore have great potential to improve click-through rate (CTR) prediction performance on online commerce platforms that host many product domains. While several cross-domain sequential recommendation models have been proposed to leverage information from a source domain to improve CTR prediction in a target domain, they…

Cited by 36 publications (10 citation statements) | References 63 publications
“…[Table excerpt; columns: Model, Base, Cat., Enhancements]
HRNN-meta [157] | GRU, RNN | CE, CA | Hierarchical sessions for inter- and intra-session relations
CTA [158] | bi-RNN, ATT | CA | Modeling temporal dynamics with contextualized self-attention
CSRM [159] | RNN, ATT | CF (I) | Collaborative information in session-based recommendation
DASL [160] | GRU, ATT | CA | Cross-domain recommendation with dual attention
CAML [161] | GRU, ATT, FM | CE | Enhance accuracy and explainability through cross-task learning
CatDM [162] | LSTM, ATT | CE, CA | Integration of POI categories, geographical and temporal influences
ESRM-KG [163] | Tran, GRU | CE, CA | Multi-task learning by generating keywords to extract intent
SDM [164] | LSTM, ATT | CE, CA | Dynamic preference modeling using short- and long-term behaviors
SSRM [165] | GRU | …
…most popularly implementing a gating mechanism such as Long Short-Term Memory (LSTM) units [170] and Gated Recurrent Units (GRU) [171], so as to solve the various challenges faced by vanilla RNNs (e.g., the vanishing gradient problem). Furthermore, recent years have seen a dramatic increase in the proposal of approaches utilizing the attention mechanism [172], [173], which was indeed first popularized in the context of recurrent networks.…”
Section: Methods
confidence: 99%
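The gated-RNN-plus-attention pattern described in the citation above can be made concrete with a short sketch. The following PyTorch example is purely illustrative and is not the code of any model in the quoted table: a GRU encodes the clicked-item sequence, and an additive attention layer pools its hidden states into a session vector used to score the next item. All class names, parameter names, and sizes are assumptions made for this sketch.

```python
# Minimal sketch (not any surveyed model's code): a GRU encoder over an item
# sequence with additive attention over its hidden states, scoring all items
# for next-item recommendation. Names and sizes are illustrative assumptions.
import torch
import torch.nn as nn


class AttentiveGRURecommender(nn.Module):
    def __init__(self, num_items: int, emb_dim: int = 64, hidden_dim: int = 64):
        super().__init__()
        self.item_emb = nn.Embedding(num_items, emb_dim, padding_idx=0)
        self.gru = nn.GRU(emb_dim, hidden_dim, batch_first=True)  # gated RNN
        self.attn = nn.Linear(hidden_dim, 1)                      # additive attention score
        self.out = nn.Linear(hidden_dim, num_items)               # next-item logits

    def forward(self, item_seq: torch.Tensor) -> torch.Tensor:
        # item_seq: (batch, seq_len) of item ids, 0 = padding
        h, _ = self.gru(self.item_emb(item_seq))                   # (batch, seq_len, hidden)
        scores = self.attn(h).squeeze(-1)                          # (batch, seq_len)
        scores = scores.masked_fill(item_seq == 0, float("-inf"))  # ignore padding
        weights = torch.softmax(scores, dim=-1).unsqueeze(-1)      # attention weights
        session = (weights * h).sum(dim=1)                         # weighted session vector
        return self.out(session)                                   # (batch, num_items)


model = AttentiveGRURecommender(num_items=1000)
logits = model(torch.randint(1, 1000, (8, 20)))  # toy batch of 8 sequences of length 20
print(logits.shape)                              # torch.Size([8, 1000])
```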
“…3) Other RNN-based approaches: Lastly, we introduce some RNN-based methods that explore different research directions. The work by [160], for instance, explores cross-domain sequential recommendation to improve CTR accuracy. The proposed Dual Attentive Sequential Learning (DASL) model learns cross-domain user representations using a dual embedding strategy, which extracts latent embeddings in both domains simultaneously through metric learning.…”
Section: RNNs for Anonymous Session-based Recommendation
confidence: 99%
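The "dual embedding plus metric learning" idea mentioned in the citation above can be illustrated with a small sketch. This is not DASL's released implementation; it only shows, under assumed names and sizes, how one user can hold a latent embedding per domain while a triplet-style metric-learning loss aligns the two embeddings of the same user.

```python
# Illustrative sketch only (not DASL's code): each user has a latent embedding
# in each domain, and a triplet-style metric-learning loss pulls the two
# embeddings of the same user together while pushing apart embeddings of
# different users. All names and dimensions are assumptions for illustration.
import torch
import torch.nn as nn
import torch.nn.functional as F


class DualUserEmbeddings(nn.Module):
    def __init__(self, num_users: int, dim: int = 64):
        super().__init__()
        self.domain_a = nn.Embedding(num_users, dim)  # latent user factors, domain A
        self.domain_b = nn.Embedding(num_users, dim)  # latent user factors, domain B

    def alignment_loss(self, users: torch.Tensor, margin: float = 0.5) -> torch.Tensor:
        a = self.domain_a(users)               # (batch, dim)
        b = self.domain_b(users)               # (batch, dim)
        neg = b[torch.randperm(b.size(0))]     # shuffled users serve as rough negatives
        pos_dist = F.pairwise_distance(a, b)   # same user across the two domains
        neg_dist = F.pairwise_distance(a, neg) # (mostly) different users across domains
        return F.relu(pos_dist - neg_dist + margin).mean()


model = DualUserEmbeddings(num_users=500)
users = torch.randint(0, 500, (32,))
loss = model.alignment_loss(users)  # in practice added to each domain's CTR loss
loss.backward()
print(float(loss))
```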
“…Cross-Domain Recommendation (CDR) was proposed to handle the cold-start and data sparsity problems that commonly exist in traditional single-domain recommender systems [35]. The basic assumption of CDR is that different behavioral patterns from multiple domains jointly characterize the way users interact with items [21,34]. Thus, CDR models can improve the recommendation performance of single-domain recommender systems by transferring the knowledge learnt from other domains as auxiliary information.…”
Section: Related Work 2.1 Cross Domain Recommendation
confidence: 99%
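The knowledge-transfer idea in the citation above can be sketched in a few lines. This is a minimal illustration, not any specific CDR model: a user representation shared across two domains feeds two domain-specific CTR heads, so signal from a data-rich domain can regularize the sparse one. Module names and sizes are assumptions.

```python
# Minimal sketch of cross-domain knowledge transfer via a shared user
# representation; not taken from any cited model. A single user embedding is
# shared by two domain-specific CTR prediction heads.
import torch
import torch.nn as nn


class SharedUserCTR(nn.Module):
    def __init__(self, num_users: int, items_a: int, items_b: int, dim: int = 32):
        super().__init__()
        self.user = nn.Embedding(num_users, dim)   # shared across both domains
        self.item_a = nn.Embedding(items_a, dim)   # domain-A items
        self.item_b = nn.Embedding(items_b, dim)   # domain-B items
        self.head_a = nn.Linear(2 * dim, 1)        # domain-A CTR head
        self.head_b = nn.Linear(2 * dim, 1)        # domain-B CTR head

    def forward(self, users, items, domain: str):
        u = self.user(users)
        if domain == "a":
            x = torch.cat([u, self.item_a(items)], dim=-1)
            return torch.sigmoid(self.head_a(x)).squeeze(-1)
        x = torch.cat([u, self.item_b(items)], dim=-1)
        return torch.sigmoid(self.head_b(x)).squeeze(-1)


model = SharedUserCTR(num_users=100, items_a=50, items_b=80)
p = model(torch.randint(0, 100, (16,)), torch.randint(0, 50, (16,)), domain="a")
print(p.shape)  # torch.Size([16])
```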
“…Second, although several representative CDR approaches [2,9,15,47] address the data sparsity and cold-start problems by transferring knowledge across domains, they cannot take full advantage of sequential patterns. Third, existing CDSR methods [1,4,14,18,30] have made breakthroughs in exploring sequential dependencies and modeling the structure information that bridges two domains. However, these CDSR models cannot extract and incorporate both the intra-domain and the inter-domain item transitions in a dynamic and synchronous way.…”
Section: Introduction
confidence: 99%
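The distinction between intra-domain and inter-domain item transitions drawn in the citation above can be shown with a toy example. The sketch below is not from any cited model; it assumes, purely for illustration, that a user's history is one chronologically merged sequence of (domain, item) pairs and simply separates the two kinds of consecutive transitions.

```python
# Toy illustration of intra- vs inter-domain item transitions; not taken from
# any cited model. Items are (domain, item_id) pairs in one merged sequence,
# an assumption made purely for illustration.
from typing import List, Tuple

Interaction = Tuple[str, int]  # (domain label, item id)


def split_transitions(seq: List[Interaction]):
    """Return intra-domain and inter-domain consecutive item transitions."""
    intra, inter = [], []
    for (d1, i1), (d2, i2) in zip(seq, seq[1:]):
        (intra if d1 == d2 else inter).append(((d1, i1), (d2, i2)))
    return intra, inter


# A user alternates between a "books" domain and a "movies" domain.
history = [("books", 3), ("books", 7), ("movies", 12), ("movies", 5), ("books", 9)]
intra, inter = split_transitions(history)
print(len(intra), "intra-domain transitions:", intra)
print(len(inter), "inter-domain transitions:", inter)
```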