2020 International Conference on Control, Robotics and Intelligent System 2020
DOI: 10.1145/3437802.3437832
Survey on Automatic Text Summarization and Transformer Models Applicability

Cited by 22 publications (5 citation statements)
References 16 publications
“…Upon reviewing the literature, it is evident that many studies typically focus on various aspects of transformer models, including their architecture, efficiency, computational power, memory efficiency, and the development of fast and lightweight variants [52]. On the other hand, in other studies, various NLP applications have been explored, including visualization of transformers for NLP [53], examination of pre-training methods used in transformer models [54], usage of transformers for text summarization tasks [55], application of transformer models for detecting different sentiment levels from text-based data [56], and using transformers for extracting useful information from large datasets [57]. In our study, however, unlike the existing research, the aim was to detect the general perception in society regarding newly established/green energy systems and to assist decision-makers in policymaking.…”
Section: Deep Learning and Transformers
confidence: 99%
“…Seq2seq models are the basis of natural language processing (NLP) systems that are frequently used to translate from one language to another [10], summarize text [11] and even map images to textual summaries of them [12]. Transformer-based models [8] are now considered the state-of-the-art in NLP applications.…”
Section: Related Work
confidence: 99%
“…Generative AI refers to models and techniques that have the ability to generate new and original content, and within this domain, LLMs specialize in generating text. An LLM such as OpenAI's GPT (Generative Pre-trained Transformer) is basically trained to generate text, or rather to answer questions with paragraphs of text (Guan et al, 2020). Once trained, it can generate complete sentences and paragraphs that are coherent and, in many cases, indistinguishable from those written by humans, simply from an initial stimulus or prompt (Madotto et al, 2021).…”
Section: What Is Generative AI?
confidence: 99%