Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics 2019
DOI: 10.18653/v1/p19-1099

Global Optimization under Length Constraint for Neural Text Summarization

Abstract: We propose a global optimization method under length constraint (GOLC) for neural text summarization models. GOLC increases the probabilities of generating summaries that have high evaluation scores, ROUGE in this paper, within a desired length. We compared GOLC with two optimization methods, maximum log-likelihood and minimum risk training, on CNN/Daily Mail and a Japanese single-document summarization data set of The Mainichi Shimbun Newspapers. The experimental results show that a state-of-the-art neural…
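The abstract describes GOLC as raising the probability of summaries that score well (by ROUGE) while staying within a length budget. As a rough illustration only, the sketch below implements a generic minimum-risk-style loss whose reward ignores tokens beyond the budget; the toy rouge_1 function, the truncation rule, and the sampler interface are assumptions for illustration, not the paper's actual formulation.

```python
# Minimal sketch (NOT the paper's code) of a minimum-risk-style objective
# whose reward only credits the part of a summary that fits a length budget.
# rouge_1(), the truncation rule, and the inputs are illustrative assumptions.

import math

def rouge_1(candidate, reference):
    """Toy clipped unigram recall, standing in for a real ROUGE implementation."""
    cand, ref = candidate.split(), reference.split()
    if not ref:
        return 0.0
    overlap = sum(min(cand.count(w), ref.count(w)) for w in set(ref))
    return overlap / len(ref)

def length_constrained_reward(candidate, reference, max_len):
    """Score only the within-budget prefix, so overlength text earns no extra credit."""
    truncated = " ".join(candidate.split()[:max_len])
    return rouge_1(truncated, reference)

def risk_loss(samples, log_probs, reference, max_len):
    """Expected negative reward over sampled summaries (minimum-risk style).
    `samples` are decoded strings; `log_probs` are their model log-probabilities,
    renormalized over the sample set as minimum risk training typically does."""
    weights = [math.exp(lp) for lp in log_probs]
    total = sum(weights) or 1.0
    return sum(
        (w / total) * -length_constrained_reward(s, reference, max_len)
        for s, w in zip(samples, weights)
    )

# Usage with dummy samples: the overlength sample is credited only for the
# prefix that fits the 6-token budget.
ref = "the cat sat on the mat"
samples = ["the cat sat", "the cat sat on the mat and then slept all day"]
print(risk_loss(samples, log_probs=[-1.2, -0.8], reference=ref, max_len=6))
```

The design intent of the truncation step is that samples earn reward only for what fits the constraint, which pushes probability mass toward summaries that both score well and stay within length.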

Cited by 33 publications (16 citation statements)
References 24 publications
“…In this paper, we present a two-stage strategy to over-generate, then score system summaries externally for faithfulness and overall quality. Previous work has sought to control various aspects of the generated summary, including the style, length and amount of reused text (Kikuchi et al., 2016; Hu et al., 2017; Fan et al., 2018; Makino et al., 2019; Song et al., 2020). In contrast, our generator focuses on producing multiple variants of the target summary that have diverse content and varying lengths.…”
Section: Related Work (citation type: mentioning)
confidence: 99%
“…Furthermore, unlike most previous methods that only control a single binary attribute (e.g., positive and negative sentiments), our approach can further control multiple fine-grained attributes such as sentence length and the existence of specific words. Note that controlling such fine-grained attributes has already been studied in previous work for other tasks (Post and Vilar, 2018; Makino et al., 2019), which only serves as a case study to demonstrate the generality of our method.…”
Section: Related Work (citation type: mentioning)
confidence: 99%
“…More recently, Takase and Okazaki (2019) modified the positional encoding from the Transformer (Vaswani et al., 2017) to encode allowable lengths. Makino et al. (2019) proposed a loss function that encourages summaries within desired lengths. Saito et al. (2020) introduced a model that controls both output length and informativeness.…”
Section: Length-controlled Text Generation (citation type: mentioning)
confidence: 99%
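The last excerpt mentions two length-control mechanisms: length-aware positional encodings (Takase and Okazaki, 2019) and length-penalizing losses (Makino et al., 2019). Below is a minimal sketch of the first idea, assuming a standard sinusoidal encoding applied to the remaining length budget rather than the absolute position; the function names and dimensions are illustrative, not the cited implementation.

```python
# Minimal sketch (assumptions, not the cited implementation) of encoding the
# *remaining* length budget via a sinusoidal positional encoding, the general
# idea the excerpt attributes to Takase and Okazaki (2019).

import math

def sinusoidal(value, d_model):
    """Standard Transformer-style sinusoidal embedding of a scalar index."""
    return [
        math.sin(value / 10000 ** (i / d_model)) if i % 2 == 0
        else math.cos(value / 10000 ** ((i - 1) / d_model))
        for i in range(d_model)
    ]

def remaining_length_encoding(position, desired_length, d_model=8):
    """Encode how many tokens remain in the budget instead of the raw
    position, so the decoder can 'see' the length limit approaching."""
    return sinusoidal(desired_length - position, d_model)

# At each decoding step this vector would be added to the token embedding:
for pos in range(3):
    print(pos, [round(x, 3) for x in remaining_length_encoding(pos, desired_length=10)])
```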