Proceedings of the 2021 Conference on Empirical Methods in Natural Language Processing
DOI: 10.18653/v1/2021.emnlp-main.11
Multiplex Graph Neural Network for Extractive Text Summarization

Abstract: Extractive text summarization aims at extracting the most representative sentences from a given document as its summary. To extract a good summary from a long text document, sentence embedding plays an important role. Recent studies have leveraged graph neural networks to capture the inter-sentential relationship (e.g., the discourse graph) to learn contextual sentence embedding. However, those approaches neither consider multiple types of inter-sentential relationships (e.g., semantic similarity & natural con…

Cited by 22 publications (9 citation statements) · References 42 publications
“…Recent years have witnessed remarkable success in graph neural network (GNN) for natural language understanding tasks. Specifically, GNN shows superior performance on text summarization task (Feng et al 2021;Jing et al 2021), sentiment analysis task (Huang et al 2019;Liang et al 2022), dialogue understanding task (Qin et al 2021a,b) and language modeling (Meng et al 2022). Inspired by the above work, we explore GNN for better capturing relationship across multiple KBs in EToDs.…”
Section: Graph Neural Network for NLP
Confidence: 98%
“…The algorithms used in previous studies include sentence compression [29], sentence fusion [30,31], and sentence revision [32]. However, encoder-decoder architectures are commonly used at present [16][17][18][19]. In the medical field, extractive summarization methods are commonly used for knowledge acquisition of clinical features, such as diseases, prescriptions, and examinations.…”
Section: Related Work
Confidence: 99%
“…Because abstractive summarization can generate more flexible summaries, it has become a major approach in automatic summarization research [14][15][16][17][18][19]. However, abstractive summarization may sometimes unintentionally generate unfaithful descriptions known as hallucinations.…”
Section: PLOS Digital Health
Confidence: 99%
“…Early studies, such as unsupervised LexRank [2] and TextRank [13], built similarity graphs among sentences and leveraged PageRank [16] to rank them by estimating summary-worthy features of sentence importance. Recently, some works have applied graph representation learning techniques on various semantic graphs [3,5,6,17,28,32] with consideration of semantic similarity and the natural topology. However, they usually rely on external tools to construct the graphs, in which the error propagation problem is serious.…”
Section: Introduction
Confidence: 99%
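The graph-ranking approach this quote describes (LexRank/TextRank: build a similarity graph among sentences, then score nodes with PageRank) can be sketched as follows. This is an illustrative reconstruction using the classic word-overlap similarity and power-iteration PageRank of the original TextRank formulation, not code from the cited papers; the function names are ours.

```python
import math
import re

def sentence_similarity(a: str, b: str) -> float:
    """Word-overlap similarity: shared words normalized by the
    log lengths of the two sentences (TextRank-style)."""
    wa = set(re.findall(r"\w+", a.lower()))
    wb = set(re.findall(r"\w+", b.lower()))
    if len(wa) < 2 or len(wb) < 2:  # avoid log(1) = 0 in the denominator
        return 0.0
    return len(wa & wb) / (math.log(len(wa)) + math.log(len(wb)))

def textrank(sentences: list[str], d: float = 0.85, iters: int = 50) -> list[float]:
    """Score sentences by running weighted PageRank (power iteration)
    on the sentence-similarity graph; higher score = more central."""
    n = len(sentences)
    # Symmetric weighted adjacency matrix; no self-loops.
    sim = [[sentence_similarity(sentences[i], sentences[j]) if i != j else 0.0
            for j in range(n)] for i in range(n)]
    row_sum = [sum(row) for row in sim]
    scores = [1.0 / n] * n
    for _ in range(iters):
        scores = [
            (1 - d) / n
            + d * sum(scores[j] * sim[j][i] / row_sum[j]
                      for j in range(n) if row_sum[j] > 0)
            for i in range(n)
        ]
    return scores
```

An extractive summary is then the top-k sentences by score, restored to document order. The limitation the quoted passage raises still applies: when the graph comes from an external parser or similarity tool, errors in graph construction propagate into the ranking.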