2024
DOI: 10.1111/coin.12656
Utilizing passage‐level relevance and kernel pooling for enhancing BERT‐based document reranking

Min Pan,
Shuting Zhou,
Teng Li
et al.

Abstract: The pre‐trained language model (PLM) based on the Transformer encoder, namely BERT, has achieved state‐of‐the‐art results in the field of Information Retrieval. Existing BERT‐based ranking models divide documents into passages and aggregate passage‐level relevance to rank the document list. However, these common score aggregation strategies cannot capture important semantic information such as document structure and have not been extensively studied. In this article, we propose a novel kernel‐based score pooli…
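The abstract describes aggregating passage‐level relevance scores into a document score via kernel pooling. A minimal sketch of KNRM‐style RBF kernel pooling over passage scores is given below; the function name, kernel means, bandwidth, and uniform combination weights are illustrative assumptions, not the paper's actual implementation (which learns the weights end to end):

```python
import numpy as np

def kernel_pool(passage_scores, mus=(0.1, 0.3, 0.5, 0.7, 0.9), sigma=0.1):
    """Soft-count passage-level relevance scores into RBF kernel bins,
    then combine the bin activations into a single document score."""
    s = np.asarray(passage_scores, dtype=float)
    # One RBF kernel per mean mu: a kernel activates strongly for
    # passage scores near its mu, giving a soft histogram of scores.
    feats = np.array(
        [np.exp(-((s - mu) ** 2) / (2 * sigma ** 2)).sum() for mu in mus]
    )
    # Log-scale the soft counts (as in KNRM) and take a linear
    # combination; uniform weights stand in for learned ones.
    log_feats = np.log1p(feats)
    weights = np.ones_like(log_feats) / len(log_feats)
    return float(log_feats @ weights)

# Example: pool four hypothetical passage scores into one document score.
doc_score = kernel_pool([0.92, 0.40, 0.15, 0.88])
```

Because the kernels soft-count how many passages fall near each relevance level, this pooling preserves the distribution of passage scores rather than collapsing it to a single max or mean.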

Cited by 0 publications.
References 60 publications.