STV-BEATS: Skip Thought Vector and Bi-Encoder based Automatic Text Summarizer
Published: 2022
DOI: 10.1016/j.knosys.2021.108108

Cited by 7 publications (3 citation statements); references 26 publications.

Citation statements:
“…Srivastava et al. (2022) investigated an unsupervised extractive summarization approach that combined clustering with topic modelling to reduce topic bias. Tomer and Kumar (2022) proposed the STV-BEATS summarization framework, which uses skip-thought vectors to generate sentence-based embeddings and a long short-term memory (LSTM)-based deep autoencoder to reduce the dimensionality of the skip-thought vectors.…”
Section: Related Work 2.1 Text Summarization Approaches (mentioning)
Confidence: 99%
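The statement above describes STV-BEATS as pairing skip-thought sentence embeddings with an LSTM-based deep autoencoder for dimensionality reduction. The sketch below is a minimal, hypothetical illustration of that second step in PyTorch, not the authors' implementation; the 4800-dimensional input (a commonly used combined skip-thought size) and the 600-dimensional latent size are assumptions.

```python
# Minimal sketch (not the authors' code): an LSTM autoencoder that compresses
# high-dimensional sentence embeddings (e.g. 4800-d skip-thought vectors,
# an assumed size) into a lower-dimensional latent representation.
import torch
import torch.nn as nn

class LSTMAutoencoder(nn.Module):
    def __init__(self, input_dim=4800, latent_dim=600):
        super().__init__()
        # Encoder LSTM maps each sentence vector to a compact hidden state.
        self.encoder = nn.LSTM(input_dim, latent_dim, batch_first=True)
        # Decoder LSTM reconstructs the original embedding from the latent code.
        self.decoder = nn.LSTM(latent_dim, input_dim, batch_first=True)

    def forward(self, x):
        # x: (batch, n_sentences, input_dim), a document as a sequence of sentence vectors.
        latent, _ = self.encoder(x)              # (batch, n_sentences, latent_dim)
        reconstructed, _ = self.decoder(latent)  # (batch, n_sentences, input_dim)
        return latent, reconstructed

# Training minimizes reconstruction error; the latent states then serve as the
# reduced-dimension sentence representations.
model = LSTMAutoencoder()
docs = torch.randn(2, 10, 4800)  # dummy batch: 2 documents, 10 sentences each
latent, recon = model(docs)
nn.functional.mse_loss(recon, docs).backward()
```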
“…Skip-thought expands the skip-gram model beyond word embeddings to encompass sentence embeddings. Instead of forecasting context based on adjacent words, skip-thought anticipates the target sentence by considering both the preceding and subsequent sentences [15]. The global vector (GloVe) model, on the other hand, combines global matrix factorization and local context window techniques through a bilinear regression model.…”
Section: What Are Embeddings and How Are They Generated? (mentioning)
Confidence: 99%
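The statement above contrasts sentence-level skip-thought training with word-level skip-gram: a sentence encoder is trained jointly with decoders that reconstruct the neighbouring sentences. The fragment below is an illustrative PyTorch sketch of that objective, not the cited implementation; the vocabulary, embedding, and hidden sizes are arbitrary assumptions, and the GRU choice follows the original skip-thought formulation rather than anything stated above.

```python
# Illustrative sketch of the skip-thought objective: encode sentence s_i and
# train two decoders to reconstruct the previous sentence s_{i-1} and the
# next sentence s_{i+1}. All sizes below are assumed for the example.
import torch
import torch.nn as nn

vocab_size, emb_dim, hid_dim = 10000, 300, 600

embed = nn.Embedding(vocab_size, emb_dim)
encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
prev_decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
next_decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
out_proj = nn.Linear(hid_dim, vocab_size)

def skip_thought_loss(s_prev, s_curr, s_next):
    # Encode the current sentence into a single "thought" vector.
    _, h = encoder(embed(s_curr))                        # h: (1, batch, hid_dim)
    loss = 0.0
    for decoder, target in ((prev_decoder, s_prev), (next_decoder, s_next)):
        # Decode the neighbouring sentence conditioned on the thought vector
        # (teacher forcing: feed the target shifted by one position).
        out, _ = decoder(embed(target[:, :-1]), h)
        logits = out_proj(out)
        loss = loss + nn.functional.cross_entropy(
            logits.reshape(-1, vocab_size), target[:, 1:].reshape(-1))
    return loss

# Dummy batch of token-id tensors (batch=2, sentence length=7).
s_prev, s_curr, s_next = (torch.randint(0, vocab_size, (2, 7)) for _ in range(3))
skip_thought_loss(s_prev, s_curr, s_next).backward()
```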
“…The automated generation of simplified summaries of scholarly articles is a popular mechanism for the public dissemination of scientific discoveries (Hou et al., 2022). This task can be performed by leveraging text summarization (Tomer and Kumar, 2022) and text simplification (Al-Thanyyan and Azmi, 2021), which are well-defined tasks in Natural Language Processing (NLP) (Iqbal et al., 2021). Text summarization is the task of reducing text size while maintaining the information presented in the text.…”
Section: Introduction (mentioning)
Confidence: 99%