2020
DOI: 10.1609/aaai.v34i05.6420

Controlling the Amount of Verbatim Copying in Abstractive Summarization

Abstract: An abstract must not change the meaning of the original text. A single most effective way to achieve that is to increase the amount of copying while still allowing for text abstraction. Human editors can usually exercise control over copying, resulting in summaries that are more extractive than abstractive, or vice versa. However, it remains poorly understood whether modern neural abstractive summarizers can provide the same flexibility, i.e., learning from single reference summaries to generate multiple summaries…


Cited by 43 publications (34 citation statements) · References 27 publications
“…Abstractive Summarization. The majority of research in abstractive summarization has focused on monolingual summarization in English (Gehrmann et al., 2018; Song et al., 2020; Narayan et al., 2018). Rush et al. (2015) propose the first neural abstractive summarization model, using an attention-based convolutional neural network encoder and a feed-forward decoder.…”
Section: Related Work
confidence: 99%
“…They further train the extractor and abstractor end-to-end with a policy-gradient method, using ROUGE-L F1 as the reward function. Recently, pre-trained language models have achieved state-of-the-art results in abstractive summarization (Lewis et al., 2019b; Liu and Lapata, 2019; Song et al., 2020). Therefore, we use mBART for all the baselines and our direct cross-lingual models.…”
Section: Related Work
confidence: 99%
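For reference, the reward mentioned in this statement, ROUGE-L F1, is the harmonic mean of LCS-based precision and recall between a sampled summary and its reference. Below is a minimal sketch, assuming whitespace-tokenized inputs and a plain dynamic program for the longest common subsequence; it is illustrative, not the cited authors' implementation.

```python
def rouge_l_f1(candidate: list[str], reference: list[str]) -> float:
    """ROUGE-L F1: harmonic mean of LCS-based precision and recall."""
    m, n = len(candidate), len(reference)
    # Dynamic program for the longest common subsequence (LCS) length.
    dp = [[0] * (n + 1) for _ in range(m + 1)]
    for i in range(m):
        for j in range(n):
            dp[i + 1][j + 1] = (dp[i][j] + 1 if candidate[i] == reference[j]
                                else max(dp[i][j + 1], dp[i + 1][j]))
    lcs = dp[m][n]
    if lcs == 0:
        return 0.0
    precision, recall = lcs / m, lcs / n
    return 2 * precision * recall / (precision + recall)

# The scalar score of a sampled summary against the reference can then be
# plugged in as the reward of a REINFORCE-style policy-gradient update.
print(rouge_l_f1("the cat sat on the mat".split(),
                 "a cat sat on a mat".split()))  # LCS=4 -> F1 ~= 0.667
```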
“…However, they are still quite limited (Jain and Wallace, 2019) in consistently explaining all aspects of a neural summarizer. This leaves a gap in the ongoing efforts (Song et al., 2020a; Song et al., 2020b) to generate abstractive summaries that are guided by human-interpretable semantic/syntactic qualities. Briefly, the main goal of the attention mechanism in an encoder-decoder network is to assign a softmax score to every encoder hidden state (based on its relevance to the token being decoded) and amplify those that are assigned high scores through a weighted average.…”
Section: Prototype Summary
confidence: 99%
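The attention description in this statement maps directly to a few lines of code: score each encoder state against the current decoder state, softmax the scores, and take the weighted average. A minimal numpy sketch follows, assuming dot-product scoring (the passage does not fix a particular scoring function); attend, encoder_states, and decoder_state are hypothetical names.

```python
import numpy as np

def attend(encoder_states: np.ndarray, decoder_state: np.ndarray):
    """Dot-product attention over encoder hidden states.

    encoder_states: (src_len, hidden) -- one hidden state per source token
    decoder_state:  (hidden,)         -- state of the token being decoded
    Returns the softmax scores and the attention-weighted context vector.
    """
    # Score every encoder state by its relevance to the decoder state.
    scores = encoder_states @ decoder_state                # (src_len,)
    # Softmax turns raw scores into a distribution over source positions.
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()
    # Amplify highly scored states through a weighted average.
    context = weights @ encoder_states                     # (hidden,)
    return weights, context

rng = np.random.default_rng(0)
weights, context = attend(rng.normal(size=(6, 4)), rng.normal(size=4))
print(weights.round(3), context.shape)
```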
“…Recent years have witnessed success in abstractive summarization using the encoder-decoder framework with sequence-to-sequence models (Rush et al., 2015; Nallapati et al., 2016; See et al., 2017; Celikyilmaz et al., 2018). The encoder, which is leveraged for syntactic compression, can be implemented using recurrent neural networks (Chopra et al., 2016; Tan et al., 2017; Chen and Bansal, 2018), convolutional networks (Allamanis et al., 2016; Liu et al., 2018), and transformer-based methods (Devlin et al., 2019; Song et al., 2020b). To handle the problem that many OOV words are generated by the vanilla sequence-to-sequence decoder, a copy mechanism is proposed to copy a word from the source text or select an unseen word from the vocabulary (See et al., 2017; Zhou et al., 2018; …).…”
Section: Related Work
confidence: 99%
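As a concrete illustration of the copy mechanism mentioned above, here is a minimal numpy sketch in the spirit of the pointer-generator of See et al. (2017): the final output distribution mixes a generation distribution over the vocabulary with attention-weighted copy probabilities over source tokens. The names (p_gen, vocab_dist, src_ids) and the extended-vocabulary bookkeeping are illustrative assumptions, not the cited papers' code.

```python
import numpy as np

def final_distribution(p_gen: float,
                       vocab_dist: np.ndarray,
                       attention: np.ndarray,
                       src_ids: np.ndarray) -> np.ndarray:
    """Pointer-generator-style mixture of generating and copying.

    p_gen:      probability of generating from the vocabulary (vs. copying)
    vocab_dist: (extended_vocab,) decoder softmax output, with zero mass
                on slots reserved for source-only OOV words
    attention:  (src_len,) attention weights over source tokens
    src_ids:    (src_len,) extended-vocabulary id of each source token
    """
    out = p_gen * vocab_dist
    # Route the copy mass to each source token's vocabulary slot; OOV source
    # words occupy extra "extended vocabulary" slots, which is how copying
    # can emit words the decoder softmax alone cannot.
    np.add.at(out, src_ids, (1.0 - p_gen) * attention)
    return out

vocab_dist = np.array([0.5, 0.3, 0.2, 0.0])  # last slot: source-only OOV word
attention = np.array([0.7, 0.3])             # over a 2-token source
src_ids = np.array([1, 3])                   # 2nd source token is OOV
print(final_distribution(0.8, vocab_dist, attention, src_ids))
# -> [0.4 0.38 0.16 0.06], a valid distribution summing to 1
```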