2020
DOI: 10.48550/arxiv.2005.11988
Preprint

Deep Learning Models for Automatic Summarization

Pirmin Lemberger

Abstract: Text summarization is an NLP task that aims to convert a textual document into a shorter one while preserving as much of its meaning as possible. This pedagogical article reviews a number of recent Deep Learning architectures that have helped advance research in this field. In particular, we discuss applications of pointer networks, hierarchical Transformers and Reinforcement Learning. We assume basic knowledge of the Seq2Seq architecture and of Transformer networks within NLP.
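As an illustration of the first technique the abstract names, here is a minimal sketch, assuming PyTorch and illustrative tensor shapes, of the pointer-generator copy mechanism (See et al., 2017), a well-known application of pointer networks to summarization. This is not code from the paper; final_distribution and all names and shapes below are hypothetical.

import torch

def final_distribution(p_vocab, attention, src_ids, p_gen):
    """Mix generation and copying (pointer-generator sketch).

    p_vocab:   (batch, vocab)    softmax over the fixed vocabulary
    attention: (batch, src_len)  attention weights over source positions
    src_ids:   (batch, src_len)  vocabulary id of each source token
    p_gen:     (batch, 1)        probability of generating vs. copying
    """
    gen = p_gen * p_vocab                                   # generator share
    copy = torch.zeros_like(p_vocab)                        # copy share: route
    copy.scatter_add_(1, src_ids, (1 - p_gen) * attention)  # attention mass onto
    return gen + copy                                       # source token ids

# Tiny usage check with random inputs: the result is a valid distribution,
# since the two shares sum to p_gen and (1 - p_gen) respectively.
b, s, v = 2, 5, 10
dist = final_distribution(
    torch.softmax(torch.randn(b, v), -1),
    torch.softmax(torch.randn(b, s), -1),
    torch.randint(0, v, (b, s)),
    torch.sigmoid(torch.randn(b, 1)))
assert torch.allclose(dist.sum(-1), torch.ones(b))

The key design point is that out-of-vocabulary source words can receive probability mass through the copy term even when the generator assigns them none.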

Cited by 1 publication (1 citation statement)
References 6 publications
“…Zhang et al [39] proposed HIBERT for document encoding, designed a method for pre-training it for document modeling using unlabeled data and then applied their pre-trained HIBERT to document summarization to achieve state-of-the-art performance on both CNN/Dailymail and New York Times datasets. Lemberger et al [40] reviewed several deep learning architectures for automatic text summarization. Our approach is semantically similar to [18], [22] and [23], but the differences lie in our contributions, as listed in Section III.…”
Section: Related Work
confidence: 99%
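The HIBERT approach mentioned in the citation statement rests on two-level document encoding. The following sketch, an illustrative assumption rather than Zhang et al.'s implementation, shows the core idea: a token-level Transformer encodes each sentence, and a document-level Transformer then contextualizes the resulting sentence vectors. HIBERT's actual model uses sentence-boundary tokens and masked-sentence pre-training, both omitted here; mean pooling and all dimensions are simplifications.

import torch
import torch.nn as nn

class HierarchicalEncoder(nn.Module):
    def __init__(self, vocab_size=1000, d_model=64, nhead=4, layers=2):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        make_enc = lambda: nn.TransformerEncoder(
            nn.TransformerEncoderLayer(d_model, nhead, batch_first=True),
            num_layers=layers)
        self.sent_encoder = make_enc()  # token level, within each sentence
        self.doc_encoder = make_enc()   # sentence level, across the document

    def forward(self, token_ids):
        # token_ids: (batch, n_sents, sent_len) integer ids
        b, s, t = token_ids.shape
        x = self.embed(token_ids.view(b * s, t))  # embed all sentences at once
        x = self.sent_encoder(x).mean(dim=1)      # pool tokens -> sentence vector
        sent_vecs = x.view(b, s, -1)              # (batch, n_sents, d_model)
        return self.doc_encoder(sent_vecs)        # contextual sentence reps

# Usage: 2 documents, 3 sentences each, 7 tokens per sentence.
doc = torch.randint(0, 1000, (2, 3, 7))
reps = HierarchicalEncoder()(doc)
print(reps.shape)  # torch.Size([2, 3, 64])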