2020
DOI: 10.1109/access.2020.3045748

Leveraging Pre-Trained Language Model for Summary Generation on Short Text

Cited by 10 publications (2 citation statements)
References 22 publications (28 reference statements)
“…The finetuning-based methods reduce the number of trainable parameters by fine-tuning a part of the model (Mo, Cho, and Shin 2020; Zhao, Cong, and Carin 2020), training additional parameters while keeping the main model fixed (Noguchi and Harada 2019; Wang et al. 2020), or decomposing the parameters (Robb et al. 2020). With regard to image quality and diversity, earlier methods were generally not the most competitive.…”

Section: Related Work (citation type: mentioning, confidence: 99%)
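The parameter-efficient strategies quoted above (tuning only part of a model versus adding new trainable parameters) are easy to see in code. Below is a minimal, hypothetical PyTorch sketch of the first strategy: everything except one layer is frozen, so only that layer's weights are updated. The toy two-layer model, shapes, and learning rate are illustrative assumptions, not details from the cited papers.

import torch
import torch.nn as nn

# Hypothetical stand-in for a pretrained model: a frozen backbone
# followed by a small head that we choose to fine-tune.
model = nn.Sequential(
    nn.Linear(128, 64),  # "backbone": stays frozen
    nn.ReLU(),
    nn.Linear(64, 10),   # "head": the only part we fine-tune
)

# Freeze all parameters, then re-enable gradients for the head only.
for p in model.parameters():
    p.requires_grad = False
for p in model[-1].parameters():
    p.requires_grad = True

# The optimizer sees only the trainable subset.
trainable = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.Adam(trainable, lr=1e-4)

# One illustrative training step on random data.
x = torch.randn(32, 128)
y = torch.randint(0, 10, (32,))
loss = nn.functional.cross_entropy(model(x), y)
loss.backward()
optimizer.step()
print("trainable parameters:", sum(p.numel() for p in trainable))

Freezing most of the network is what actually shrinks the trainable-parameter count the quoted passage refers to; the optimizer never sees the frozen weights, so they cannot drift from their pretrained values.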
“…Since DL models have proven effective in several areas, such as computer vision [42] and NLP tasks, e.g., summarization [43] and opinion mining [44], most of the experiments proposed in this section were carried out by fine-tuning RoBERTa-base-bne, a well-known DL language model pre-trained for Spanish [36]. We used this language model because it obtained the best results when building the classifiers for the AL process and for the pre-annotation proposed in the methodology.…”

Section: B. Measuring Effectiveness (citation type: mentioning, confidence: 99%)
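As a rough illustration of the fine-tuning setup described in the statement above, here is a hedged sketch using the Hugging Face Transformers API. The hub identifier "PlanTL-GOB-ES/roberta-base-bne" is an assumption for the Spanish RoBERTa-base-bne checkpoint, and the two-sentence toy batch, binary label scheme, and learning rate are invented for illustration; none of this is taken from the cited paper's actual training code.

import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Assumed hub id for the Spanish RoBERTa-base-bne checkpoint.
model_id = "PlanTL-GOB-ES/roberta-base-bne"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=2)

# Toy Spanish batch with invented binary labels.
texts = ["Un ejemplo de frase positiva.", "Un ejemplo de frase negativa."]
labels = torch.tensor([1, 0])
batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

# One fine-tuning step: the model computes cross-entropy internally
# when labels are passed in.
optimizer = torch.optim.AdamW(model.parameters(), lr=2e-5)
outputs = model(**batch, labels=labels)
outputs.loss.backward()
optimizer.step()

Unlike the partial-freezing sketch earlier, this example updates all model weights, which is the conventional full fine-tuning setup implied by the quoted passage.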