2018 IEEE 3rd Advanced Information Technology, Electronic and Automation Control Conference (IAEAC)
DOI: 10.1109/iaeac.2018.8577694
Neural Attentive Personalization Model for Query Auto-Completion

Cited by 9 publications (6 citation statements)
References 16 publications
“…One advantage of the neural language modeling architecture is that additional features can be seamlessly incorporated. For example, personalization can be modeled by incorporating user ID embeddings [11,16,17] in the network. Time-aware [11] and spelling-error-aware [40] models have also been developed under this framework.…”
Section: Deep Learning Approach for QAC (mentioning)
Confidence: 99%
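The excerpt above notes that user ID embeddings can be folded directly into a neural language model to personalize completions. A minimal sketch of that idea follows, assuming a character-level LSTM language model in PyTorch; the class name, feature dimensions, and the choice to concatenate the user vector at every timestep are illustrative assumptions, not details taken from the cited papers.

```python
# Sketch: personalized character-level LM for QAC. A learned per-user
# vector is concatenated to each character embedding before the LSTM,
# so next-character predictions are conditioned on the user identity.
import torch
import torch.nn as nn

class PersonalizedCharLM(nn.Module):
    def __init__(self, vocab_size, num_users,
                 char_dim=64, user_dim=32, hidden_dim=256):
        super().__init__()
        self.char_emb = nn.Embedding(vocab_size, char_dim)
        self.user_emb = nn.Embedding(num_users, user_dim)  # per-user vector
        self.rnn = nn.LSTM(char_dim + user_dim, hidden_dim, batch_first=True)
        self.out = nn.Linear(hidden_dim, vocab_size)

    def forward(self, chars, user_ids):
        # chars: (B, T) character ids; user_ids: (B,)
        c = self.char_emb(chars)                        # (B, T, char_dim)
        u = self.user_emb(user_ids).unsqueeze(1)        # (B, 1, user_dim)
        u = u.expand(-1, c.size(1), -1)                 # broadcast over time
        h, _ = self.rnn(torch.cat([c, u], dim=-1))
        return self.out(h)                              # next-char logits

model = PersonalizedCharLM(vocab_size=100, num_users=1000)
logits = model(torch.randint(0, 100, (4, 12)), torch.randint(0, 1000, (4,)))
print(logits.shape)  # torch.Size([4, 12, 100])
```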
“…Recurrent Neural Networks (RNNs) [11] have also been studied for QAC. Three RNN models (session-based, personalized, and attention-based) have been proposed in [12]. Fiorini and Lu [9] use user-history-based features as well as time features as input to an RNN model.…”
Section: Related Work (mentioning)
Confidence: 99%
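The attention-based variant mentioned above can be illustrated with plain dot-product attention over encodings of the user's earlier session queries. This is a hedged sketch, not the architecture from [12]: the function name and tensor shapes are assumptions.

```python
# Sketch: the decoder's current hidden state attends over vectors
# encoding the user's previous session queries, yielding a context
# vector that can be mixed into the next-character prediction.
import torch
import torch.nn.functional as F

def session_attention(state, history):
    # state:   (B, H) current decoder hidden state
    # history: (B, N, H) encodings of N previous queries in the session
    scores = torch.bmm(history, state.unsqueeze(-1)).squeeze(-1)   # (B, N)
    weights = F.softmax(scores, dim=-1)                            # (B, N)
    context = torch.bmm(weights.unsqueeze(1), history).squeeze(1)  # (B, H)
    return context, weights

ctx, w = session_attention(torch.randn(2, 256), torch.randn(2, 5, 256))
print(ctx.shape, w.shape)  # torch.Size([2, 256]) torch.Size([2, 5])
```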
“…To alleviate the shortcomings of the retrieve-and-rank framework, another line of research uses generative sequence models to generate potential query completions, starting from the given prefix and conditioned on previous queries [10,17,35,38]. Such models leverage powerful deep learning models that can generate novel queries as well as generalize to unseen prefixes.…”
Section: Introduction (mentioning)
Confidence: 99%
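To make the generative setup concrete, here is a hedged sketch of prefix-conditioned decoding: the typed prefix is fed through a character-level model and extended one character at a time until an end-of-query token. The function name, `eos_id`, and the model interface are hypothetical, and real systems would typically use beam search rather than greedy decoding.

```python
# Sketch: greedy completion of a typed prefix with a character LM.
import torch

def complete(model, prefix_ids, user_id, eos_id, max_len=20):
    # `model` is any callable mapping (chars, user_ids) -> next-char logits,
    # e.g. the PersonalizedCharLM sketched earlier (a hypothetical interface).
    seq = list(prefix_ids)
    for _ in range(max_len):
        chars = torch.tensor([seq])           # (1, T) character ids so far
        users = torch.tensor([user_id])       # (1,) user id
        logits = model(chars, users)[0, -1]   # scores for the next character
        nxt = int(logits.argmax())            # greedy choice
        if nxt == eos_id:                     # stop at end-of-query marker
            break
        seq.append(nxt)
    return seq
```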