Proceedings of the 28th International Conference on Computational Linguistics 2020
DOI: 10.18653/v1/2020.coling-main.69
Syntax-Aware Graph Attention Network for Aspect-Level Sentiment Classification

Abstract: Aspect-level sentiment classification aims to distinguish the sentiment polarities of aspect terms in a sentence. Existing approaches mostly focus on modeling the relationship between the given aspect words and their context with attention, ignoring the more elaborate knowledge implicit in the context. In this paper, we bring syntactic awareness to the model through a graph attention network over the dependency tree structure, and external pre-training knowledge through the BERT language model, which helps to m…

Cited by 62 publications (40 citation statements)
References 28 publications
“…Graph-based Methods. A number of recent works employ graph neural networks such as graph convolutional networks (GCN) [4], [6], [9], [19] and graph attention networks (GAT) [3], [7], [20] to encode the syntax graph predicted by off-the-shelf dependency parsers. [1] employ a GCN to capture syntactic features and facilitate the information exchange between the aspect and its related context words.…”
Section: Related Work
confidence: 99%
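The excerpt above describes encoding a parser-produced dependency graph with graph neural networks. A minimal, hypothetical sketch of the core idea — one GCN-style propagation step that lets an aspect token pool features from its syntactic neighbors rather than only linearly adjacent words — might look like the following. The toy sentence, arcs, and feature values are illustrative, not from the paper; real models use learned weight matrices and parser output.

```python
# Minimal sketch: one GCN-style propagation step over a dependency graph.
# All tokens, arcs, and feature values below are toy assumptions.

def gcn_step(features, edges):
    """Average each token's features with those of its dependency neighbors."""
    n = len(features)
    neighbors = {i: {i} for i in range(n)}  # self-loop on every node
    for head, dep in edges:
        neighbors[head].add(dep)
        neighbors[dep].add(head)  # treat the dependency tree as undirected
    dim = len(features[0])
    return [
        [sum(features[j][d] for j in neighbors[i]) / len(neighbors[i])
         for d in range(dim)]
        for i in range(n)
    ]

# "The food was great": the aspect token "food" (index 1) aggregates from
# its syntactic head "great" and its dependent "The".
tokens = ["The", "food", "was", "great"]
edges = [(1, 0), (3, 1), (3, 2)]  # dependency arcs as (head, dependent)
feats = [[0.0], [1.0], [0.0], [2.0]]
updated = gcn_step(feats, edges)  # "food" now mixes in the "great" signal
```

Stacking several such steps (with learned projections and a nonlinearity between them) gives the multi-hop syntactic information exchange the cited works rely on.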
“…Graph Neural Networks (GNNs) have been used to address many problems that are inherently graph-like, such as traffic networks, social networks, and physical and biological systems (Liu and Zhou, 2020). GNNs achieve impressive performance in many domains, including social networks (Wu et al., 2020) and natural science (Sanchez-Gonzalez et al., 2018), as well as NLP tasks like sentence classification (Huang et al., 2020), question generation (Pan et al., 2020), summarization (Fernandes et al., 2019) and derivational morphology (Hofmann et al., 2020). 7 github.com/robertostling/eflomal…”
Section: Annotation Projection
confidence: 99%
“…• SAGAT [14] utilizes a graph attention network and BERT to fully capture both syntactic and semantic information. • KGCapsAN-BERT [48] utilizes multi-prior knowledge to guide the capsule attention process and uses a GCN-based syntactic layer to integrate syntactic knowledge.…”
Section: Compared Baselines
confidence: 99%
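SAGAT's graph attention component weighs syntactic neighbors unequally instead of averaging them. As a hedged sketch of that mechanism (not the paper's actual parameterization): a real GAT computes scores with a learned attention vector and LeakyReLU, but plain dot-product scores already show how softmax-normalized coefficients let the aspect node emphasize the most relevant neighbor.

```python
# Sketch of GAT-style neighbor weighting; dot-product scoring is a
# simplification of the learned attention used in actual GAT layers.
import math

def attention_weights(query, keys):
    """Softmax-normalized attention of one node over its neighbors."""
    raw = [sum(q * k for q, k in zip(query, key)) for key in keys]
    m = max(raw)                          # subtract max for stability
    exps = [math.exp(r - m) for r in raw]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical 2-d features: the aspect attends to two syntactic neighbors.
aspect = [1.0, 0.0]
neighbor_feats = [[1.0, 0.0], [0.0, 1.0]]
alphas = attention_weights(aspect, neighbor_feats)
pooled = [sum(a * f[d] for a, f in zip(alphas, neighbor_feats))
          for d in range(2)]
# The neighbor aligned with the aspect receives the larger weight.
```

In SAGAT-style models, the query and key vectors would come from BERT-contextualized token representations, so the attention combines semantic similarity with the syntactic neighborhood structure.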
“…between the aspect and its related words that are distant in the context sequence. Graph Convolutional Networks (GCN) [12], [13] and Graph Attention Networks (GAT) [14], [15] are adopted to encode the syntax graphs predicted by off-the-shelf dependency parsers. [12], [16] employed GCNs to capture the local syntactic information.…”
Section: Introduction
confidence: 99%