2021
DOI: 10.48550/arxiv.2105.05727
Preprint
BertGCN: Transductive Text Classification by Combining GCN and BERT
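The paper's core idea, per its title, is combining a GCN branch and a BERT branch for transductive text classification. A minimal sketch of the commonly reported design — linearly interpolating the two branches' class distributions with a weight lambda — is below; the function names and toy logits are illustrative, not the authors' code:

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def interpolate_predictions(z_gcn, z_bert, lam=0.7):
    """Convex combination of the GCN and BERT class distributions.
    lam=1.0 uses only the graph branch; lam=0.0 only the BERT branch."""
    return lam * z_gcn + (1 - lam) * z_bert

# toy class distributions for 2 documents over 3 classes
z_gcn = softmax(np.array([[2.0, 0.5, 0.1], [0.2, 1.5, 0.3]]))
z_bert = softmax(np.array([[1.8, 0.7, 0.2], [0.1, 0.4, 2.0]]))
z = interpolate_predictions(z_gcn, z_bert, lam=0.7)
pred = z.argmax(axis=1)  # per-document predicted class
```

Because each row of `z_gcn` and `z_bert` sums to 1, the interpolated rows are still valid distributions for any lambda in [0, 1].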


Cited by 24 publications (21 citation statements). References 25 publications.
“…We first compare the proposed method with several state-of-the-art methods on the text classification task, including HAN [28], TextGCN [30], XLNet [6], BertGCN [37]. Micro-F1 and Weighted-F1 are adopted as evaluation metrics.…”
Section: Event Text Classification Results
confidence: 99%
“…trieve a set of relevant information to improve the model performance, under the merit that an open-book exam is easier than a closed-book exam. Recent success on various NLP tasks has shown the effectiveness of retrieval-augmented models in improving the quality of neural NLP models, such as language modeling (Khandelwal et al., 2019), question answering (Guu et al., 2020; Lewis et al., 2020a,b; Xiong et al., 2020), text classification (Lin et al., 2021b), dialog generation (Thulke et al., 2021; Weston et al., 2018) and neural machine translation (Khandelwal et al., 2019; Meng et al., 2021a; Wang et al., 2021).…”
Section: Related Work
confidence: 99%
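The excerpt above describes retrieval-augmented models that fetch relevant stored examples to help a prediction. A toy stand-in for that idea — nearest-neighbour retrieval over a small datastore with label voting, far simpler than the cited systems; all names and data below are hypothetical:

```python
import numpy as np

def knn_augmented_label(query, memory_keys, memory_labels, k=3):
    """Retrieve the k stored examples most similar to `query`
    (dot-product similarity) and vote over their labels."""
    scores = memory_keys @ query           # similarity to every stored example
    top = np.argsort(-scores)[:k]          # indices of the k most similar
    votes = np.bincount(memory_labels[top])
    return int(votes.argmax())

# toy datastore: 2-d "embeddings" with binary labels
memory_keys = np.array([[1.0, 0.0], [0.9, 0.1], [0.0, 1.0], [0.1, 0.9]])
memory_labels = np.array([0, 0, 1, 1])
label = knn_augmented_label(np.array([0.95, 0.05]), memory_keys, memory_labels, k=3)
```

Here the query sits near the two label-0 examples, so two of its three retrieved neighbours vote for class 0.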
“…Graph attention network (GAT) [43] leverages masked self-attentional layers to address the shortcomings of prior work based on graph convolutions or their approximations. GNNs have demonstrated effectiveness in a wide variety of tasks such as text classification [24], question answering [6], recommendation [44] and relation extraction [23]. For example, Guo et al. [9] present Star-Transformer, which reduces the computation complexity of the standard Transformer by replacing the fully-connected structure in self-attention with a star-like topology.…”
Section: Graph Neural Network
confidence: 99%
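The excerpt above notes that GAT uses masked self-attentional layers: attention logits are computed pairwise but masked so each node attends only to its graph neighbours. A minimal single-head sketch (toy graph and random weights; a simplified illustration, not the reference implementation):

```python
import numpy as np

def gat_layer(h, adj, W, a, slope=0.2):
    """Single-head graph attention: logits e_ij = LeakyReLU(a^T [z_i || z_j]),
    masked to the edges of `adj`, then softmax-normalised per node."""
    z = h @ W                                       # (N, F') projected features
    n = z.shape[0]
    e = np.empty((n, n))
    for i in range(n):
        for j in range(n):
            s = a @ np.concatenate([z[i], z[j]])    # a^T [z_i || z_j]
            e[i, j] = s if s > 0 else slope * s     # LeakyReLU
    e = np.where(adj > 0, e, -1e9)                  # mask non-neighbours
    att = np.exp(e - e.max(axis=1, keepdims=True))
    att /= att.sum(axis=1, keepdims=True)           # softmax over neighbours
    return att @ z                                  # neighbour-weighted features

# toy 4-node path graph with self-loops
adj = np.array([[1, 1, 0, 0],
                [1, 1, 1, 0],
                [0, 1, 1, 1],
                [0, 0, 1, 1]], dtype=float)
rng = np.random.default_rng(0)
h = rng.normal(size=(4, 3))     # input features
W = rng.normal(size=(3, 2))     # projection to F' = 2
a = rng.normal(size=(4,))       # attention vector over [z_i || z_j]
h_out = gat_layer(h, adj, W, a)
```

The `-1e9` mask ensures non-edges receive (numerically) zero attention after the softmax, which is the "masked" part the excerpt refers to; self-loops keep every softmax row well-defined.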