2021
DOI: 10.1007/978-981-16-0401-0_15

NEWS Article Summarization with Pretrained Transformer

Cited by 6 publications (2 citation statements)
References 9 publications
“…Kerui [42] uses BERT, Seq2seq, and reinforcement learning to form a text summarization model. Garg [43] uses T5, one of the most advanced pre-trained models, to perform a summarization task on a dataset of 80,000 news articles, and the results indicate that the summaries generated by T5 are of higher quality than those generated by other models. Daiya [44] has developed a pre-trained language model, ENEMAbst, that can be used for both extractive and abstractive summarization.…”
Section: Related Work
confidence: 99%
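
The T5 approach described in the statement above can be sketched with the Hugging Face transformers library. The model size, task prefix, and generation parameters below are illustrative assumptions, not details taken from Garg [43]:

```python
# Minimal sketch of abstractive news summarization with T5.
# "t5-small" and the generation settings are assumptions for illustration;
# the cited work may use a larger T5 variant and different hyperparameters.
from transformers import T5ForConditionalGeneration, T5Tokenizer

model_name = "t5-small"  # assumed checkpoint, not from the cited paper
tokenizer = T5Tokenizer.from_pretrained(model_name)
model = T5ForConditionalGeneration.from_pretrained(model_name)

article = "The central bank raised interest rates by half a point on Tuesday, citing persistent inflation. Markets had largely anticipated the decision."

# T5 is a text-to-text model and expects a task prefix for summarization.
inputs = tokenizer(
    "summarize: " + article,
    return_tensors="pt",
    max_length=512,
    truncation=True,
)
summary_ids = model.generate(
    inputs["input_ids"],
    max_length=60,      # cap on summary length in tokens
    min_length=10,
    num_beams=4,        # beam search for higher-quality output
    length_penalty=2.0,
    early_stopping=True,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```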
“…Furthermore, because our task requires extensive domain knowledge beyond the meaning of individual words and sentences, in this work we limit our focus to abstractive summarization algorithms. On the other hand, recent work on text summarization increasingly relies on pre-trained language models [35][36][37][38][39][40][41][42][43][44]. Among these, BERT is the pre-trained model used by most research teams.…”
Section: Related Work
confidence: 99%
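
For context on the BERT-based line of work mentioned above, one common extractive approach ranks sentences by the similarity of their BERT embeddings to a document-level embedding. The sketch below is an illustrative assumption about that general technique, not the method of any specific cited paper:

```python
# Minimal sketch of extractive summarization with BERT sentence embeddings:
# score each sentence by cosine similarity to the mean document embedding
# and keep the top-scoring sentences in their original order.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModel.from_pretrained("bert-base-uncased")

def embed(sentence: str) -> torch.Tensor:
    """Mean-pooled BERT embedding of one sentence."""
    inputs = tokenizer(sentence, return_tensors="pt",
                       truncation=True, max_length=128)
    with torch.no_grad():
        out = model(**inputs)
    return out.last_hidden_state.mean(dim=1).squeeze(0)

sentences = [
    "The central bank raised interest rates by half a point on Tuesday.",
    "Officials cited persistent inflation as the main reason.",
    "Markets had largely anticipated the decision.",
]
embs = torch.stack([embed(s) for s in sentences])        # (n, hidden)
doc = embs.mean(dim=0)                                   # document embedding
scores = torch.nn.functional.cosine_similarity(embs, doc.unsqueeze(0))
top = scores.topk(k=2).indices.sort().values             # keep original order
print(" ".join(sentences[i] for i in top))
```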