2021
DOI: 10.48550/arxiv.2101.07918
Preprint

PGT: Pseudo Relevance Feedback Using a Graph-Based Transformer

Abstract: Most research on pseudo relevance feedback (PRF) has been done in vector space and probabilistic retrieval models. This paper shows that Transformer-based rerankers can also benefit from the extra context that PRF provides. It presents PGT, a graph-based Transformer that sparsifies attention between graph nodes to enable PRF while avoiding the high computational complexity of most Transformer architectures. Experiments show that PGT improves upon a non-PRF Transformer reranker, and it is at least as accurate as …
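
The core idea in the abstract, restricting attention to the edges of a small document graph instead of attending over everything, can be illustrated with a short sketch. Below is a minimal PyTorch sketch, assuming one node each for the query, the feedback documents, and the candidate document; the class name, adjacency pattern, and dimensions are illustrative assumptions rather than PGT's actual implementation.

```python
import torch
import torch.nn as nn

class GraphSparseAttention(nn.Module):
    """Single-head attention restricted to the edges of a node graph.

    Nodes might represent the query, each pseudo relevance feedback
    document, and the candidate document; `adj` is a boolean adjacency
    matrix, so each node attends only to its graph neighbours instead
    of to every other node.
    """

    def __init__(self, dim: int):
        super().__init__()
        self.q_proj = nn.Linear(dim, dim)
        self.k_proj = nn.Linear(dim, dim)
        self.v_proj = nn.Linear(dim, dim)
        self.scale = dim ** -0.5

    def forward(self, x: torch.Tensor, adj: torch.Tensor) -> torch.Tensor:
        # x: (num_nodes, dim); adj: (num_nodes, num_nodes) boolean mask.
        q, k, v = self.q_proj(x), self.k_proj(x), self.v_proj(x)
        scores = (q @ k.T) * self.scale
        # Mask out non-edges so softmax assigns them zero attention.
        scores = scores.masked_fill(~adj, float("-inf"))
        return torch.softmax(scores, dim=-1) @ v

# Toy usage: node 0 = query, nodes 1-3 = feedback docs, node 4 = candidate.
num_nodes, dim = 5, 16
x = torch.randn(num_nodes, dim)
adj = torch.eye(num_nodes, dtype=torch.bool)   # every node sees itself
adj[0, :] = True; adj[:, 0] = True             # query connects to all nodes
adj[4, 1:4] = True; adj[1:4, 4] = True         # candidate sees feedback docs
out = GraphSparseAttention(dim)(x, adj)
print(out.shape)  # torch.Size([5, 16])
```

Because each node attends only to its neighbours, the attention cost grows with the number of graph edges rather than quadratically in the total input, which is the complexity saving the abstract refers to.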

Cited by 1 publication (1 citation statement)
References 20 publications (25 reference statements)

“…Pan et al. [18] integrated kernel co-occurrence information into Rocchio and RM3 and proposed KRoc and KRM3 to achieve improved retrieval performance. Recently, Yu et al. [19] proposed PGT, a feedback method based on the Transformer architecture. In PGT, pseudo-relevance feedback is performed by feeding each feedback document and the target document into BERT in series.…”
Section: Related Work (citation type: mentioning, confidence: 99%)
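
The quoted description, in which each feedback document is paired with the target document and run through BERT, can be sketched as follows. This is a hedged illustration assuming the Hugging Face transformers API: the checkpoint name, the [CLS] pooling, and the encode_pair helper are illustrative assumptions, and the resulting vectors would serve as node inputs to a graph attention layer like the one sketched above.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Illustrative checkpoint; the paper's exact encoder is an assumption here.
tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
encoder = AutoModel.from_pretrained("bert-base-uncased")

def encode_pair(text_a: str, text_b: str) -> torch.Tensor:
    """Run a document pair through BERT and keep the [CLS] vector,
    which can serve as one node embedding for the graph Transformer."""
    inputs = tokenizer(text_a, text_b, return_tensors="pt",
                       truncation=True, max_length=256)
    with torch.no_grad():
        return encoder(**inputs).last_hidden_state[:, 0]  # shape (1, 768)

target_doc = "the candidate document being reranked ..."
feedback_docs = ["top-ranked feedback document one ...",
                 "top-ranked feedback document two ..."]

# One BERT pass per (feedback document, target document) pair, in series.
nodes = torch.cat([encode_pair(d, target_doc) for d in feedback_docs], dim=0)
print(nodes.shape)  # (num_feedback_docs, 768)
```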