The 41st International ACM SIGIR Conference on Research & Development in Information Retrieval 2018
DOI: 10.1145/3209978.3210071

Ad Click Prediction in Sequence with Long Short-Term Memory Networks

Cited by 11 publications (8 citation statements)
References 17 publications
“…Because recurrent neural networks are able to retain memory between samples and capture relations between instances over long time steps in the input data, RNN-based methods have been leveraged to model sequential dependency in click data. The authors in [14] used these networks to model user browsing behavior for click-through prediction and to deal with externalities: in this setting, a click on an ad can be affected by the quality of the ads shown earlier in a long sequence of ads.…”
Section: Click-Through Rate (CTR) Prediction
confidence: 99%
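The statement above describes modeling a user's sequence of ad impressions with a recurrent network so that the click prediction for each ad can depend on the ads shown before it. Below is a minimal sketch of that idea in PyTorch; the class name `AdSequenceLSTM`, the feature dimensions, and the toy data are illustrative assumptions, not the cited paper's actual architecture.

```python
import torch
import torch.nn as nn

class AdSequenceLSTM(nn.Module):
    """Hypothetical sketch: predict a click probability for each ad in a
    user's impression sequence, conditioning on the ads shown before it."""

    def __init__(self, feature_dim: int, hidden_dim: int = 64):
        super().__init__()
        self.lstm = nn.LSTM(feature_dim, hidden_dim, batch_first=True)
        self.head = nn.Linear(hidden_dim, 1)

    def forward(self, ad_features: torch.Tensor) -> torch.Tensor:
        # ad_features: (batch, seq_len, feature_dim) -- one row per shown ad
        hidden_states, _ = self.lstm(ad_features)
        # One click probability per position in the ad sequence
        return torch.sigmoid(self.head(hidden_states)).squeeze(-1)

# Toy usage: a batch of 2 users, each shown 5 ads with 16 features per ad
model = AdSequenceLSTM(feature_dim=16)
clicks = model(torch.randn(2, 5, 16))
print(clicks.shape)  # torch.Size([2, 5])
```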
“…In the literature on advertising CTR prediction (e.g., Shan et al., 2016; Deng et al., 2018; Ling et al., 2017; Liao et al., 2014; Li et al., 2015; Zhang et al., 2017), features can be categorized into five classes, as summarized in Table 2: (1) advertising features; (2) user features; (3) context features; (4) query features; and (5) publisher features.…”
Section: Features for CTR Prediction
confidence: 99%
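As a concrete illustration of that five-way feature grouping, the short sketch below organizes example CTR features by class; the specific field names are hypothetical placeholders, not features taken from the cited papers.

```python
# Hypothetical grouping of CTR-prediction features into the five classes
# described above; the field names are illustrative only.
ctr_feature_classes = {
    "advertising": ["ad_id", "campaign_id", "creative_size", "historical_ad_ctr"],
    "user":        ["user_id", "age_bucket", "gender", "past_click_count"],
    "context":     ["hour_of_day", "device_type", "ad_slot_position"],
    "query":       ["query_terms", "query_length", "query_ad_similarity"],
    "publisher":   ["site_id", "page_category", "domain_historical_ctr"],
}

for class_name, features in ctr_feature_classes.items():
    print(f"{class_name}: {', '.join(features)}")
```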
“…Hochreiter et al. [19] proposed the LSTM to address the problem that ordinary RNNs cannot capture long-term dependencies and may suffer from vanishing or exploding gradients. LSTM has since been shown to handle long-sequence dependencies effectively [20-22].…”
Section: Related Work
confidence: 99%
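To illustrate the long-term-dependence point in the statement above: the LSTM's gated, largely additive cell-state update lets gradient signal reach early time steps of a long sequence more easily than a plain RNN's repeated multiplication through the same recurrent weights. The sketch below simply compares, in PyTorch, how much gradient reaches the first input step for the two cells; it is a didactic assumption, not an experiment from the cited works.

```python
import torch
import torch.nn as nn

seq_len, feature_dim, hidden_dim = 200, 8, 32
x = torch.randn(1, seq_len, feature_dim, requires_grad=True)

for name, cell in [("RNN", nn.RNN(feature_dim, hidden_dim, batch_first=True)),
                   ("LSTM", nn.LSTM(feature_dim, hidden_dim, batch_first=True))]:
    out, _ = cell(x)
    # Backpropagate from the last time step only, then inspect how much
    # gradient reaches the very first input step of the long sequence.
    out[:, -1].sum().backward()
    grad_at_start = x.grad[0, 0].abs().mean().item()
    print(f"{name}: mean |grad| at step 0 = {grad_at_start:.2e}")
    x.grad = None
```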