Proceedings of the 42nd International ACM SIGIR Conference on Research and Development in Information Retrieval 2019
DOI: 10.1145/3331184.3331246
Context Attentive Document Ranking and Query Suggestion

Abstract: We present a context-aware neural ranking model to exploit users' on-task search activities and enhance retrieval performance. In particular, a two-level hierarchical recurrent neural network is introduced to learn search context representation of individual queries, search tasks, and corresponding dependency structure by jointly optimizing two companion retrieval tasks: document ranking and query suggestion. To identify variable dependency structure between search context and users' ongoing search activities,…
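The architecture sketched in the abstract can be illustrated structurally: a query-level encoder summarizes each query's terms, a session-level encoder summarizes the sequence of query vectors into a search-context vector, and two task heads (document ranking and query suggestion) share that context. The sketch below uses mean pooling as a stand-in for the paper's recurrent encoders, and dot products as stand-ins for the learned scoring heads; every function name here is illustrative, not the authors' code.

```python
from typing import List

def encode_query(term_vectors: List[List[float]]) -> List[float]:
    """Query-level encoder (stand-in): mean-pool term embeddings."""
    dim = len(term_vectors[0])
    return [sum(v[i] for v in term_vectors) / len(term_vectors) for i in range(dim)]

def encode_session(query_vectors: List[List[float]]) -> List[float]:
    """Session-level encoder (stand-in): mean-pool the query vectors
    into a single search-context representation."""
    dim = len(query_vectors[0])
    return [sum(v[i] for v in query_vectors) / len(query_vectors) for i in range(dim)]

def rank_score(context: List[float], doc_vector: List[float]) -> float:
    """Document-ranking head (stand-in): dot product with the context."""
    return sum(c * d for c, d in zip(context, doc_vector))

def suggest_score(context: List[float], cand_query_vector: List[float]) -> float:
    """Query-suggestion head (stand-in): dot product with the context."""
    return sum(c * q for c, q in zip(context, cand_query_vector))

# A toy two-query session in a 3-dimensional embedding space.
session = [
    [[1.0, 0.0, 0.0], [0.0, 1.0, 0.0]],  # terms of query 1
    [[0.0, 0.0, 1.0]],                   # terms of query 2
]
context = encode_session([encode_query(q) for q in session])  # [0.25, 0.25, 0.5]

# Both heads score candidates against the shared search context.
doc_a, doc_b = [1.0, 0.0, 0.0], [0.0, 0.0, 1.0]
ranking = sorted([("doc_a", rank_score(context, doc_a)),
                  ("doc_b", rank_score(context, doc_b))],
                 key=lambda x: -x[1])
```

The point of the structure is that both heads read the same session-level context, which is what the joint multi-task optimization in the paper exploits; in the actual model the pooling functions are recurrent networks and the heads are trained jointly.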

Cited by 83 publications (64 citation statements)
References 49 publications (85 reference statements)
“…A similar trend can be observed when comparing HREDCap with its multitask variants. For all three metrics for query reformulation, the best-performing model is a multitask model; this validates the observations from [1] in our context.…”
Section: 2 (supporting)
confidence: 84%
“…This target reformulation q_reform can either be (i) the subsequent query q_{i+1} in the same session S, or (ii) the caption C_i^clicked corresponding to the clicked image I_i^clicked. Note that obtaining contextual query suggestions via a translation model that has learnt a mapping between successive queries within a session (i.e., (i)) has been previously proposed in our reference baseline papers [1,26]. In the current paper, we utilize a linguistically richer supervision signal, in the form of captions of clicked images (i.e., (ii)), and analyze the behavior of the different models across three high-level axes: relevance, descriptiveness, and diversity of generated reformulations.…”
Section: Notation (mentioning)
confidence: 99%
“…Ahmad et al. [149] incorporated short-term history information into a neural ranking model by multi-task training of document ranking and query suggestion. Short- and long-term history have also been used by Chen et al. [150] for query suggestion.…”
Section: Learning With Context (mentioning)
confidence: 99%
“…Although we have seen significant progress in multi-task learning of unimodal tasks of vision [18,32] or language [26,1,35] so far, there has been only a limited amount of progress in multi-task learning of vision-language tasks. This may be attributable to the diversity of these tasks.…”
Section: Introduction (mentioning)
confidence: 99%