2023
DOI: 10.17762/ijritcc.v11i4.6454
Abstractive Summarization with Efficient Transformer Based Approach

Abstract: With the rapid proliferation of online data, one of the most significant research areas is how to condense a document while preserving its essential information. This information must be summarized in order to recover meaningful knowledge in an acceptable time; this task is called text summarization. Summarization comes in two types: extractive and abstractive. In recent years, the field of abstractive text summarization has become increasingly popular. Abstractive Text Summarization…
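The abstract distinguishes extractive summarization (selecting existing sentences) from abstractive summarization (generating new ones). As an illustrative contrast only, here is a minimal frequency-based extractive summarizer, a sketch and not the paper's transformer-based approach; the function name and the toy document are invented for this example:

```python
import re
from collections import Counter

def extractive_summary(text, n_sentences=1):
    """Score sentences by the average frequency of their words and keep
    the top n. This classic heuristic selects sentences verbatim;
    abstractive models (e.g. transformers) instead generate new text."""
    sentences = re.split(r"(?<=[.!?])\s+", text.strip())
    freq = Counter(re.findall(r"[a-z]+", text.lower()))

    def score(sentence):
        tokens = re.findall(r"[a-z]+", sentence.lower())
        # Normalize by length so long sentences are not automatically favoured.
        return sum(freq[t] for t in tokens) / (len(tokens) or 1)

    ranked = sorted(sentences, key=score, reverse=True)[:n_sentences]
    # Emit the selected sentences in their original document order.
    return " ".join(s for s in sentences if s in ranked)

doc = ("Online data grows rapidly. Summarization keeps the essential "
       "information of a document. Extractive summarization selects "
       "existing sentences, while abstractive summarization generates "
       "new ones.")
print(extractive_summary(doc, n_sentences=1))
```

Because "summarization" is the most frequent content word, the heuristic picks the sentence that uses it twice, illustrating how extractive methods can only reuse the source's own wording.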

Cited by 2 publications (1 citation statement)
References 25 publications
“…• Recurrent Neural Networks (RNNs) and Transformers: These neural network architectures are especially good for processing sequences, such as time series or text, and can identify relationships and patterns within these sequences (Karnik & Kodavade, 2023).
• Recommender Systems: These systems, which often rely on collaborative or content-based filtering techniques, can infer relationships between items (such as articles, books, or movies) based on consumption or preference patterns (Zhang & Hara, 2023).…”
Section: Results (mentioning)
Confidence: 99%
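The citation statement above mentions recommender systems that infer relationships between items from consumption patterns. A minimal sketch of that idea, assuming a toy user-item rating matrix (all users, items, and ratings here are invented for illustration), computes item-item cosine similarity over the users' rating vectors:

```python
from math import sqrt

# Toy user-item ratings: each user maps items A..C to a 0-5 rating.
ratings = {
    "alice": {"A": 5, "B": 3, "C": 0},
    "bob":   {"A": 4, "B": 0, "C": 4},
    "carol": {"A": 1, "B": 1, "C": 5},
}

def item_vector(item):
    """Ratings for one item across all users, in a fixed user order."""
    return [ratings[u][item] for u in sorted(ratings)]

def cosine(u, v):
    """Cosine similarity between two rating vectors (0.0 if either is empty)."""
    dot = sum(a * b for a, b in zip(u, v))
    nu, nv = sqrt(sum(a * a for a in u)), sqrt(sum(b * b for b in v))
    return dot / (nu * nv) if nu and nv else 0.0

# Items rated highly by the same users score as similar, which a
# recommender can use to suggest related items.
sim_ab = cosine(item_vector("A"), item_vector("B"))
sim_ac = cosine(item_vector("A"), item_vector("C"))
print(f"sim(A,B)={sim_ab:.2f}  sim(A,C)={sim_ac:.2f}")
```

Here A and B come out more similar than A and C because alice rates both A and B highly, while C appeals mainly to the users who rate A low.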