Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics (NAACL 2019)
DOI: 10.18653/v1/n19-1182

Untitled

Abstract: Graph Convolutional Networks (GCNs) are a class of spectral clustering techniques that leverage localized convolution filters to perform supervised classification directly on graphical structures. While such methods model nodes' local pairwise importance, they lack the capability to model global importance relative to other nodes of the graph. This causes such models to miss critical information in tasks where global ranking is a key component for the task, such as in keyphrase extraction. We address this shor…
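The local convolution the abstract refers to can be illustrated with the standard GCN propagation rule, in which each node aggregates features only from its immediate neighbours. The sketch below is a generic, minimal NumPy implementation of that rule, not the paper's specific model; the function name `gcn_layer` and the toy graph are illustrative assumptions.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph-convolution layer: H' = ReLU(D^-1/2 (A+I) D^-1/2 H W).

    A: (n, n) adjacency matrix, H: (n, d) node features,
    W: (d, d') weight matrix. This is the common symmetrically
    normalized propagation rule; each node only sees its 1-hop
    neighbourhood, which is the "local" behaviour the abstract
    contrasts with global ranking.
    """
    n = A.shape[0]
    A_hat = A + np.eye(n)                       # add self-loops
    d_inv_sqrt = 1.0 / np.sqrt(A_hat.sum(axis=1))
    A_norm = A_hat * d_inv_sqrt[:, None] * d_inv_sqrt[None, :]
    return np.maximum(A_norm @ H @ W, 0.0)      # ReLU activation

# Toy 3-node path graph with 2-dimensional node features.
A = np.array([[0, 1, 0],
              [1, 0, 1],
              [0, 1, 0]], dtype=float)
H = np.eye(3, 2)          # one-hot-ish features for 3 nodes
W = np.ones((2, 2))       # dummy weights
out = gcn_layer(A, H, W)
print(out.shape)  # (3, 2)
```

Stacking k such layers lets information travel at most k hops, which is why purely local convolutions cannot express a ranking of a node against all other nodes in the graph.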

Cited by 7 publications (1 citation statement)
References 13 publications
“…Previously proposed models include the RNN-based sequence model (Nallapati et al., 2017), the attention-based neural encoder-decoder model (Cheng and Lapata, 2016), and the sequence model with a global learning objective (e.g., ROUGE) for ranking sentences optimized via RL (Narayan et al., 2018; Paulus et al., 2018). More recently, graph convolutional neural networks (GCNs) have also been adapted to allow the incorporation of global information in text summarization tasks (Sun et al., 2019; Prasad and Kan, 2019). Abstractive summarization is typically cast as a sequence-to-sequence learning problem.…”
Section: Text Summarization
confidence: 99%