2021
DOI: 10.1145/3419106

Neural Abstractive Text Summarization with Sequence-to-Sequence Models

Abstract: In the past few years, neural abstractive text summarization with sequence-to-sequence (seq2seq) models has gained a lot of popularity. Many interesting techniques have been proposed to improve seq2seq models, making them capable of handling different challenges, such as saliency, fluency, and human readability, and of generating high-quality summaries. Generally speaking, most of these techniques differ in one of three categories: network structure, parameter inference, and decoding/generation. There are also …
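The three technique categories in the abstract map onto concrete pieces of a seq2seq system: the network structure is the encoder-decoder itself, parameter inference is how it is trained, and decoding/generation is how summaries are produced at test time. Below is a minimal, hedged sketch of the first piece, a toy PyTorch encoder-decoder; it is not any specific architecture reviewed in the survey, and the dimensions, vocabulary size, and names such as `Seq2SeqSummarizer` are illustrative assumptions.

```python
import torch
import torch.nn as nn

class Seq2SeqSummarizer(nn.Module):
    def __init__(self, vocab_size=10000, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, emb_dim)
        # Encoder reads the source article; the decoder generates the summary.
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, src_ids, tgt_ids):
        # Encode the source token ids; keep only the final hidden state.
        _, h = self.encoder(self.embedding(src_ids))
        # Teacher forcing: feed the gold summary tokens, conditioned on the encoder state.
        dec_out, _ = self.decoder(self.embedding(tgt_ids), h)
        return self.out(dec_out)  # per-step logits over the vocabulary

# Toy usage: a batch of 2 "articles" (20 tokens) and 2 "summaries" (5 tokens).
model = Seq2SeqSummarizer()
src = torch.randint(0, 10000, (2, 20))
tgt = torch.randint(0, 10000, (2, 5))
print(model(src, tgt).shape)  # torch.Size([2, 5, 10000])
```

In practice, the models reviewed in the survey replace the plain GRUs with attention, copy/pointer mechanisms, or Transformer layers, but the encode-then-decode skeleton stays the same.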

Cited by 135 publications (68 citation statements). References 103 publications.
“…In the future, we plan to investigate its effectiveness on other generation tasks, such as code generation (Jiang et al., 2021), summarization (Shi et al., 2021) and so on.…”
Section: Discussion
confidence: 99%
“…We employ an encoder-decoder architecture for the problem defined above, which is similar to most Seq2Seq models in machine translation (Vaswani et al., 2017; Zhang et al., 2018), automatic text summarization (Song et al., 2019; Shi et al., 2021), and speech recognition (Tüske et al., 2019; Hannun et al., 2019) from a high-level perspective. … ($\overrightarrow{y}_i$ and $\overleftarrow{y}_i$), respectively.…”
Section: The Proposed Approach
confidence: 99%
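The excerpt above treats the encoder-decoder as a black box; the survey's third category, decoding/generation, concerns how summary tokens are actually produced from it. The following is a hedged sketch of the simplest strategy, greedy decoding; the `step_fn` interface, the toy embedding/projection, and the special-token ids (BOS=1, EOS=2) are illustrative assumptions, not any paper's actual decoder.

```python
import torch
import torch.nn as nn

def greedy_decode(step_fn, init_state, bos_id=1, eos_id=2, max_len=30):
    """Generate a summary token-by-token, always picking the most likely next token."""
    state, token = init_state, torch.tensor([bos_id])
    output = []
    for _ in range(max_len):
        logits, state = step_fn(state, token)   # logits: (1, vocab_size)
        token = logits.argmax(dim=-1)           # greedy choice of the next token
        if token.item() == eos_id:              # stop at the end-of-summary token
            break
        output.append(token.item())
    return output

# Toy stand-in for one decoder step: a fixed projection of the last token's embedding.
emb, proj = nn.Embedding(100, 16), nn.Linear(16, 100)
def toy_step(state, token):
    return proj(emb(token)), state

print(greedy_decode(toy_step, init_state=None))
```

Beam search, which keeps the top-k partial summaries at each step instead of a single greedy choice, is the usual drop-in replacement for this loop.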
“…The research in Automatic Text Summarization is enriched with many surveys that have been conducted and published in past years. The surveys conducted by [3] and [4] are quite extensive, while those conducted by [1] and [5]-[7] are centered on extractive and abstractive summarization. This research study intends to survey the scientific literature to obtain information and knowledge about recent research in automatic text summarization, specifically abstractive summarization based on neural networks, in order to gain an understanding of and familiarity with the design of state-of-the-art models in this realm.…”
Section: Introduction
confidence: 99%