Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
DOI: 10.18653/v1/2021.emnlp-main.297
Phrase Retrieval Learns Passage Retrieval, Too

Abstract: Dense retrieval methods have shown great promise over sparse retrieval methods in a range of NLP problems. Among them, dense phrase retrieval — the most fine-grained retrieval unit — is appealing because phrases can be directly used as the output for question answering and slot filling tasks. In this work, we follow the intuition that retrieving phrases naturally entails retrieving larger text blocks and study whether phrase retrieval can serve as the basis for coarse-level retrieval including passages and docu…
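The abstract's core intuition — that phrase-level scores can induce a ranking over coarser units — can be sketched in a few lines. This is a hypothetical illustration, not the paper's implementation: it assumes each passage is associated with the retrieval scores of the phrases it contains, and ranks passages by their best-scoring phrase.

```python
# Hypothetical sketch: derive a passage-level ranking from phrase-level
# retrieval scores by taking the maximum phrase score per passage.
# (Function names and the max-aggregation choice are illustrative assumptions.)

def passage_score(phrase_scores):
    """Score a passage as the best retrieval score among its phrases."""
    return max(phrase_scores)

def rank_passages(passages):
    """passages: dict mapping passage id -> list of phrase scores inside it.
    Returns passage ids sorted from highest to lowest aggregated score."""
    return sorted(passages, key=lambda pid: passage_score(passages[pid]),
                  reverse=True)

ranking = rank_passages({"p1": [0.2, 0.9], "p2": [0.5, 0.6], "p3": [0.1]})
```

Here "p1" ranks first because its best phrase (0.9) beats every phrase in the other passages, even though its other phrase scores poorly — the aggregation only rewards a passage's strongest evidence.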

Cited by 11 publications (6 citation statements)
References 17 publications
“…We observe that our retrieval result with a small number of retrieved documents (i.e., K = 10) significantly improves the performance of the reader. This implies that more accurate retrieval at smaller K in Table 1 helps achieve improved QA performance, as Lee et al. (2021a) described. Furthermore, our reader performance may be further enhanced with advanced reading schemes (Mao et al., 2021a; Mao et al., 2021b).…”
Section: Effectiveness of Interpolation and Perturbation
confidence: 83%
“…Phrase-based retrieval (Seo et al., 2018, 2019) eliminates the need for a reader during inference, as it directly retrieves the answer span given a query. Lee et al. (2021a) demonstrated strong end-to-end ODQA results with this approach, and Lee et al. (2021b) showed that it is also effective for passage retrieval. Our pretraining scheme can be seamlessly used for those architectures as well.…”
Section: Related Work
confidence: 93%
“…As a baseline for recent work, the dense passage retriever (DPR) [32] uses a sophisticated sample mining and training approach to unlock the potential of the dual-encoder retriever architecture, that is, using two independent BERTs to encode the question and the context separately and calculating the similarity between the two to select relevant passages. On the basis of this architecture, subsequent work included improvements in the calculation of similarity [33], [62], highly efficient training methods [60], [63], [66], and hierarchical retrieval processes [40], [49]. However, such models act like a “black box,” and it is difficult to understand what knowledge has been accurately stored and how the models internally filter and process passages and output the final results.…”
Section: Transformers in OpenQA
confidence: 99%
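The dual-encoder scheme described in the excerpt above reduces, at retrieval time, to an inner product between independently computed question and passage vectors. A minimal sketch, assuming precomputed embeddings (in DPR these would come from two separate BERT encoders, which are omitted here):

```python
# Minimal dual-encoder retrieval sketch (illustrative, not DPR's code):
# the question and each passage are represented by precomputed vectors,
# and relevance is scored by their inner product.

def dot(u, v):
    """Inner product of two equal-length vectors."""
    return sum(a * b for a, b in zip(u, v))

def retrieve(question_vec, passage_vecs, k=2):
    """Return ids of the k passages with the highest inner-product score."""
    ranked = sorted(passage_vecs,
                    key=lambda pid: dot(question_vec, passage_vecs[pid]),
                    reverse=True)
    return ranked[:k]

q = [1.0, 0.0, 1.0]
passages = {"p1": [0.9, 0.1, 0.8], "p2": [0.0, 1.0, 0.0], "p3": [0.5, 0.2, 0.4]}
top = retrieve(q, passages, k=2)  # highest-scoring passage ids first
```

Because the passage vectors are independent of the question, they can be encoded once offline and indexed (e.g., with approximate nearest-neighbor search), which is what makes the dual-encoder design efficient at scale.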