Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP 2020)
DOI: 10.18653/v1/2020.emnlp-main.136

Multi-document Summarization with Maximal Marginal Relevance-guided Reinforcement Learning

Abstract: While neural sequence learning methods have made significant progress in single-document summarization (SDS), they produce unsatisfactory results on multi-document summarization (MDS). We observe two major challenges when adapting SDS advances to MDS: (1) MDS involves a larger search space and yet more limited training data, setting obstacles for neural methods to learn adequate representations; (2) MDS needs to resolve higher information redundancy among the source documents, which SDS methods are less effective…

Cited by 28 publications (19 citation statements)
References 33 publications
“…BM25 and query likelihood (QL) are the most popular ad hoc retrieval models for the first stage (Nogueira et al., 2019; Boudin et al., 2020; Tang and Arnold, 2020). More recently, instead of using sparse models, methods based on dense representations have been proposed (Karpukhin et al., 2020; Xiong et al., 2020; Qu et al., 2020), which can help alleviate the vocabulary mismatch problem through a dense representation space. However, recent work has revealed limitations in their performance and efficiency (Lin, 2019; Xiong et al., 2020; Luan et al., 2020).…”
Section: Related Work (mentioning), confidence: 99%
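The excerpt above names BM25 and query likelihood as the usual sparse first-stage retrieval models. As a rough illustration only, the following minimal Python sketch scores one document against a query with the classic BM25 formula; k1 and b are the standard BM25 parameters, while the whitespace-token inputs and the name bm25_score are illustrative assumptions, not details from the cited papers.

import math
from collections import Counter
from typing import List

def bm25_score(query: List[str], doc: List[str], corpus: List[List[str]],
               k1: float = 1.5, b: float = 0.75) -> float:
    # Score a single tokenized document against a tokenized query with BM25.
    n = len(corpus)
    avgdl = sum(len(d) for d in corpus) / n          # average document length
    tf = Counter(doc)                                # term frequencies in this document
    score = 0.0
    for term in query:
        df = sum(1 for d in corpus if term in d)     # document frequency of the term
        if df == 0 or tf[term] == 0:
            continue
        idf = math.log((n - df + 0.5) / (df + 0.5) + 1.0)
        freq = tf[term]
        denom = freq + k1 * (1.0 - b + b * len(doc) / avgdl)
        score += idf * freq * (k1 + 1.0) / denom
    return score

A first-stage retriever would compute this score for every document in the collection and keep only the top-ranked candidates for a more expensive second stage, such as dense or neural re-ranking.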
“…Query / Document Expansion: Query and document expansion have been widely used in IR systems. In terms of query expansion, Jaleel et al. (2004) proposed pseudo-relevance feedback (RM3), which is revisited in more recent work (Dibia, 2020; Mao et al., 2020) for its strength. There are also methods that expand queries using generation schemes (Mao et al., 2020; Claveau, 2020).…”
Section: Related Work (mentioning), confidence: 99%
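To make the pseudo-relevance feedback idea concrete, here is a minimal sketch of RM3-style query expansion: the original query model is interpolated with a relevance model estimated from the top-ranked (feedback) documents. The mixing weight alpha, the term count n_terms, and the name rm3_expand are illustrative defaults, not values from Jaleel et al. (2004) or the other cited work.

from collections import Counter
from typing import Dict, List

def rm3_expand(query_terms: List[str],
               feedback_docs: List[List[str]],
               alpha: float = 0.5,
               n_terms: int = 10) -> Dict[str, float]:
    # Relevance model: term probabilities pooled over the feedback documents.
    pooled = Counter(t for doc in feedback_docs for t in doc)
    total = sum(pooled.values())
    relevance_model = {t: c / total for t, c in pooled.most_common(n_terms)}
    # Original query model: uniform over the query terms.
    query_model = {t: 1.0 / len(query_terms) for t in query_terms}
    # RM3 interpolation: alpha * original query + (1 - alpha) * relevance model.
    expanded: Dict[str, float] = {}
    for term in set(query_model) | set(relevance_model):
        expanded[term] = (alpha * query_model.get(term, 0.0)
                          + (1.0 - alpha) * relevance_model.get(term, 0.0))
    return expanded

The resulting term weights can then be issued as a weighted query to the first-stage retriever, which is how pseudo-relevance feedback typically closes the loop.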
“…Inspired by the PageRank algorithm (Page et al., 1999), they treat the document as a graph in which sentences are vertices and edges represent relations between pairs of sentences. Shortly thereafter, some researchers (Carbonell and Goldstein, 1998; Kurmi and Jain, 2014; Mao et al., 2020) incorporated a query-biased strategy, Maximal Marginal Relevance (MMR) (Carbonell and Goldstein, 1998), into their summarizers. MMR balances relevance and diversity by controlling the trade-off parameter λ.…”
Section: Related Work (mentioning), confidence: 99%
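Since the cited paper builds on MMR, a minimal sketch of greedy MMR selection may help: at each step it picks the candidate that maximizes λ · relevance(query, c) − (1 − λ) · (maximum similarity to anything already selected). The similarity callback sim, the defaults for lam and k, and the name mmr_select are assumptions for illustration, not the paper's implementation.

from typing import Callable, List, Sequence

def mmr_select(query: str,
               candidates: Sequence[str],
               sim: Callable[[str, str], float],
               lam: float = 0.7,
               k: int = 3) -> List[str]:
    # Greedily pick k candidates, balancing relevance to the query
    # against redundancy with the candidates selected so far.
    selected: List[str] = []
    remaining = list(candidates)
    while remaining and len(selected) < k:
        def mmr_score(c: str) -> float:
            relevance = sim(query, c)
            redundancy = max((sim(c, s) for s in selected), default=0.0)
            # lam close to 1 favors relevance; lam close to 0 favors diversity.
            return lam * relevance - (1.0 - lam) * redundancy
        best = max(remaining, key=mmr_score)
        selected.append(best)
        remaining.remove(best)
    return selected

With λ = 1 this reduces to plain relevance ranking; lowering λ penalizes sentences that repeat what has already been selected, which is exactly the redundancy control the excerpt refers to.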