2022
DOI: 10.11591/eei.v11i1.3278

Extractive text summarization for scientific journal articles using long short-term memory and gated recurrent units

Abstract: With the increasing number of scientific publications, members of the scientific community must read an entire text to extract the essential information from a journal article. This becomes quite inconvenient when a scientific journal article is long and more than one article must be read. Motivated by this problem, there is a need for a text summarization method that can automatically, concisely, and accurately summarize a scientific article document. The purpose of this research is to create an extrac…

Cited by 5 publications (1 citation statement)
References 21 publications (21 reference statements)
“…When compared with its predecessor, the vanilla RNN algorithm, which is unable to retain past information over long sequences, LSTM outperforms it with its long-term memory. LSTM transforms the memory structure of cells in the RNN by reworking the tanh activation function layer into a structure containing memory units and gate mechanisms, which decide how to use and update the information stored in the memory cells [21]. There is now a newer mechanism for these sequence-processing neural networks, called bidirectional.…”
Section: Spatial Dropout 1D Layer
Citation type: mentioning
Confidence: 99%
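
To make the layers named above concrete, here is a minimal Keras sketch of an embedding followed by a SpatialDropout1D layer and a bidirectional LSTM, matching the mechanism the citation statement describes. This is not the cited paper's exact architecture: the vocabulary size, sequence length, layer widths, and the binary "sentence in summary" output head are illustrative assumptions.

```python
from tensorflow import keras
from tensorflow.keras import layers

VOCAB_SIZE = 10_000  # assumed vocabulary size (hypothetical)
MAX_LEN = 200        # assumed padded sequence length (hypothetical)

model = keras.Sequential([
    keras.Input(shape=(MAX_LEN,)),
    layers.Embedding(VOCAB_SIZE, 128),
    # SpatialDropout1D drops entire embedding channels rather than
    # individual activations, a common regularizer for sequence input.
    layers.SpatialDropout1D(0.2),
    # The Bidirectional wrapper runs one LSTM forward and one backward
    # over the sequence, so each timestep sees both past and future
    # context: the "bidirectional" mechanism the statement refers to.
    layers.Bidirectional(layers.LSTM(64)),
    # Illustrative head: score whether a sentence belongs in the summary.
    layers.Dense(1, activation="sigmoid"),
])
model.compile(optimizer="adam", loss="binary_crossentropy",
              metrics=["accuracy"])
model.summary()
```

The LSTM's gate mechanisms (input, forget, and output gates around the tanh cell update) are what let it keep or discard information over long spans, which is the advantage over the vanilla RNN that the statement attributes to reference [21].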