Proceedings of the 30th ACM International Conference on Information & Knowledge Management, 2021
DOI: 10.1145/3459637.3482124
Improving Query Representations for Dense Retrieval with Pseudo Relevance Feedback

Abstract: Dense retrieval systems conduct first-stage retrieval using embedded representations and simple similarity metrics to match a query to documents. Their effectiveness depends on the encoded embeddings capturing the semantics of queries and documents, a challenging task due to the shortness and ambiguity of search queries. This paper proposes ANCE-PRF, a new query encoder that uses pseudo relevance feedback (PRF) to improve query representations for dense retrieval. ANCE-PRF uses a BERT encoder that consumes the quer…
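Below is a minimal, illustrative sketch of the query-encoding idea the abstract describes: the query text and its top-retrieved feedback passages are concatenated and passed through a BERT-style encoder, and the [CLS] embedding is taken as the refined dense query vector. The checkpoint name, the `encode_prf_query` helper, and all hyperparameters are placeholder assumptions, not the authors' released model or code; an off-the-shelf encoder only shows the input/output shape, since ANCE-PRF itself is trained end-to-end from relevance labels.

```python
# Hedged sketch of PRF query encoding, not the ANCE-PRF implementation.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_NAME = "bert-base-uncased"  # placeholder; not the actual ANCE-PRF weights
tokenizer = AutoTokenizer.from_pretrained(MODEL_NAME)
encoder = AutoModel.from_pretrained(MODEL_NAME)


def encode_prf_query(query: str, feedback_docs: list[str], max_len: int = 512) -> torch.Tensor:
    """Concatenate the query with its top-k feedback passages and return a
    single dense query vector taken from the [CLS] position."""
    # Join the query and feedback passages with [SEP] so the encoder sees
    # them as one sequence ("query + PRF docs" input).
    text = f" {tokenizer.sep_token} ".join([query] + feedback_docs)
    inputs = tokenizer(text, truncation=True, max_length=max_len, return_tensors="pt")
    with torch.no_grad():
        hidden = encoder(**inputs).last_hidden_state  # (1, seq_len, hidden)
    return hidden[:, 0]  # [CLS] embedding used as the refined query vector


# Example: refine a short, ambiguous query with two pseudo-relevant passages.
q_vec = encode_prf_query(
    "apple sales",
    ["Apple reported record iPhone sales in the fourth quarter ...",
     "Quarterly revenue figures for Apple Inc. ..."],
)
print(q_vec.shape)  # torch.Size([1, 768])
```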

Cited by 32 publications (38 citation statements); references 28 publications (24 reference statements).
“…Dense retrieval has made great progress in recent years. Since dense retrievers [23,24,56] use embedding vectors to represent queries and documents, a few methods [28,29,54,59] have been studied to integrate pseudo-relevance information into reformulated query vectors. ColBERT-PRF [54] first verified the effectiveness of PRF in multi-representation dense retrieval [24].…”
Section: PRF for Dense Retrieval (citation type: mentioning; confidence: 99%)
“…[28] investigated two simple methods, Average and Rocchio [50], to utilize feedback documents in single-representation dense retrievers (e.g., ANCE [56]) without introducing new neural models or further training. Instead of refining the query vector heuristically, ANCE-PRF [59] uses RoBERTa [32] to consume the original query and the top-retrieved documents from ANCE [56]. Keeping the document index unchanged, ANCE-PRF is trained end-to-end with relevance labels and learns to optimize the query vector by exploiting the relevant information in the feedback documents.…”
Section: PRF for Dense Retrieval (citation type: mentioning; confidence: 99%)
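To make the contrast in the statement above concrete, here is a minimal sketch of the two heuristic vector-PRF variants it mentions (Average and Rocchio), which refine the query embedding directly from feedback-document embeddings without introducing a new neural model. The function names, the alpha/beta weights, and the 768-dimensional vectors are illustrative assumptions, not values from the cited works.

```python
# Hedged sketch of heuristic vector PRF (Average and Rocchio variants).
import numpy as np


def average_prf(query_vec: np.ndarray, doc_vecs: np.ndarray) -> np.ndarray:
    """New query = mean of the original query vector and the top-k feedback
    document vectors (the 'Average' variant)."""
    return np.mean(np.vstack([query_vec[None, :], doc_vecs]), axis=0)


def rocchio_prf(query_vec: np.ndarray, doc_vecs: np.ndarray,
                alpha: float = 0.4, beta: float = 0.6) -> np.ndarray:
    """New query = alpha * original query + beta * centroid of the feedback
    documents (Rocchio without a negative-feedback term)."""
    return alpha * query_vec + beta * doc_vecs.mean(axis=0)


# Usage: rerun retrieval with the refined vector against the unchanged index.
rng = np.random.default_rng(0)
q = rng.normal(size=768)                 # original dense query embedding
top_k_docs = rng.normal(size=(3, 768))   # embeddings of the top-3 feedback docs
q_new = rocchio_prf(q, top_k_docs)
print(q_new.shape)  # (768,)
```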