2020
DOI: 10.1007/s00521-019-04638-3
CRHASum: extractive text summarization with contextualized-representation hierarchical-attention summarization network

Cited by 32 publications (8 citation statements)
References 23 publications
“…Furthermore, LSTM makes it easier to modify the parameters, whereas the GRU takes less time to train [5]. [153], [291], [155] are studies where the writers focused on the GRU-based method for summarization tasks. 4) Restricted Boltzmann Machine (RBM): An RBM is a neural network with random probability distributions.…”
Section: Deep Learning Algorithm
confidence: 99%
“…The model has two levels of attention mechanism, applied at the word level and at the sentence level. A hybrid neural extractive text summarization model known as the Contextualized-Representation Hierarchical-Attention Summarization (CRHA-Sum) network is proposed in [34]. The model has the ability to learn contextual semantic meaning and feature relations for the purpose of text summarization.…”
Section: Multi-Document Summarization (MDS)
confidence: 99%
“…Other situations require a textual summary of the content taught during the video conference, as well as the main questions asked by the listeners. In this sense, Diao et al [5] perform text summarization with a neural network model, using the attention mechanism to capture context information and relationships between sentences, improving the performance of phrase regression for text summarization.…”
Section: Future Vision
confidence: 99%
“…music, life-style and gaming) and enabled social interaction. This new video-conferencing-based age, however, still needs improvement. After some months of home office or remote classes during the pandemic, there are many facets regarding communication, ethics, and user experience that can already be pointed out.…”
confidence: 99%