2020
DOI: 10.1016/j.eswa.2019.112958
Text document summarization using word embedding

Cited by 96 publications (52 citation statements)
References 16 publications
“…In this area, a major breakthrough has been achieved through deep learning techniques. Several researchers have worked on convolutional neural networks (CNN) [16], recurrent neural networks (RNN) [17], reinforcement learning, and generative adversarial networks (GAN) [18], reporting higher accuracy than other approaches in the literature. The major shortcoming of applying deep learning methods is the unavailability of training data, as these are supervised approaches and standard gold summaries are not available in every domain.…”
Section: To Oversee Diverse Number Of Bug Reports Several Automation
confidence: 99%
“…Mainly focused on text cleansing with no corpus processing reduction; extractive summarization based semantic framework [10]…”
Section: Concept-based Abstractive and Extractive Text Mining Implement
confidence: 99%
“…After the metadata processing, extractive summarization reduces the corpus by removing less necessary text [3], [10]. The potential sentences are either selected through the sentence score algorithm or word embedding principles, i.e., Word2Vec model [11], [18], [29].…”
Section: Introduction
confidence: 99%
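The word-embedding selection principle described in the statement above can be sketched minimally: represent each sentence as the average of its word vectors, then rank sentences by cosine similarity to the document centroid. The toy 3-dimensional vectors below and the centroid-ranking heuristic are illustrative assumptions, not the exact algorithm of any cited paper; a real system would load trained Word2Vec embeddings instead.

```python
import math

# Toy 3-d "word embeddings" (hypothetical stand-ins for Word2Vec vectors).
EMB = {
    "cats": [1.0, 0.1, 0.0], "dogs": [0.9, 0.2, 0.0],
    "pets": [0.95, 0.15, 0.0], "tax": [0.0, 0.1, 1.0],
    "law": [0.1, 0.0, 0.9],
}

def sent_vec(sentence):
    """Sentence vector = mean of its known word vectors."""
    vecs = [EMB[w] for w in sentence.split() if w in EMB]
    if not vecs:
        return [0.0, 0.0, 0.0]
    return [sum(col) / len(vecs) for col in zip(*vecs)]

def cosine(a, b):
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    if na == 0.0 or nb == 0.0:
        return 0.0
    return sum(x * y for x, y in zip(a, b)) / (na * nb)

def rank_sentences(sentences):
    """Rank sentences by similarity to the whole-document centroid."""
    centroid = sent_vec(" ".join(sentences))
    return sorted(sentences,
                  key=lambda s: cosine(sent_vec(s), centroid),
                  reverse=True)

ranking = rank_sentences(["cats and dogs are pets", "tax law is complex"])
```

The top-ranked sentences would then form the extractive summary; sentence-score algorithms differ mainly in how this per-sentence score is computed.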
“…Others have intrinsically evaluated word embeddings by clustering biomedical terms from the Unified Medical Language System and Ranker [21], assessing cluster quality with metrics such as the Davies-Bouldin index and the Dunn index. Word embeddings have advanced the state of the art for many intrinsic natural language processing subtasks (i.e., reading comprehension [22], natural language inference [23], text summarization [24], vocabulary development [8], and document classification [25]). An extrinsic or summative evaluation of clinical word embeddings can involve evaluating the performance of machine learning models that use word embeddings to complete a biomedical research task or clinical operation such as patient phenotyping [26, 27], patient fall prediction [25], and patient hospital readmission prediction [28].…”
Section: Introduction
confidence: 99%
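The Davies-Bouldin index named in the statement above rewards clusters that are internally tight and mutually well separated (lower is better). A minimal stdlib-only sketch, using hypothetical 2-d points in place of real term embeddings:

```python
import math

def davies_bouldin(points, labels):
    """Davies-Bouldin index: mean, over clusters, of the worst-case
    ratio (spread_i + spread_j) / dist(centroid_i, centroid_j)."""
    clusters = {}
    for p, l in zip(points, labels):
        clusters.setdefault(l, []).append(p)
    cent, spread = {}, {}
    for l, pts in clusters.items():
        c = [sum(col) / len(pts) for col in zip(*pts)]
        cent[l] = c
        # mean intra-cluster distance to the centroid
        spread[l] = sum(math.dist(p, c) for p in pts) / len(pts)
    labs = list(clusters)
    total = 0.0
    for i in labs:
        total += max((spread[i] + spread[j]) / math.dist(cent[i], cent[j])
                     for j in labs if j != i)
    return total / len(labs)

# Two tight, well-separated toy clusters -> index close to 0.
X = [(0.0, 0.0), (0.2, 0.0), (5.0, 5.0), (5.2, 5.0)]
score = davies_bouldin(X, [0, 0, 1, 1])
```

In the intrinsic evaluations described, the points would be embedding vectors of biomedical terms and the labels their semantic groups; the Dunn index plays a complementary role with higher values indicating better clustering.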