Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics 2019
DOI: 10.18653/v1/p19-1053
Progressive Self-Supervised Attention Learning for Aspect-Level Sentiment Analysis

Abstract (Introduction): Aspect-level sentiment classification (ASC), an indispensable task in sentiment analysis, aims at inferring the sentiment polarity of an input sentence with respect to a given aspect. In this regard, pre-…

* Equal contribution. † Corresponding author. ¹ https://github.com/DeepLearnXMU/PSSAttention

Cited by 85 publications (55 citation statements) · References 21 publications
“…MN(+AS) [45]: MN(+AS) is built on MN. At each iteration, the context word with the highest attention weight is extracted from each correctly or incorrectly predicted instance, and that word is then masked before the iteration continues.…”
Section: Experimental Comparison Models (mentioning)
confidence: 99%
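To make the “+AS” procedure quoted above easier to follow, here is a minimal sketch of the iterative attend-predict-mask loop on a single training instance. All names here (ToyAttnClassifier, MASK_ID, extract_supervision_words) are hypothetical scaffolding rather than the authors' code; the official implementation lives at https://github.com/DeepLearnXMU/PSSAttention.

import torch
import torch.nn as nn

MASK_ID = 0  # assumed id of a special mask/pad token

class ToyAttnClassifier(nn.Module):
    """Stand-in for an attention-based ASC model such as MN or TNET:
    embeds the sentence, attends over it with an aspect query, and
    classifies the attended representation."""
    def __init__(self, vocab_size=1000, dim=32, n_classes=3):
        super().__init__()
        self.emb = nn.Embedding(vocab_size, dim, padding_idx=MASK_ID)
        self.out = nn.Linear(dim, n_classes)

    def forward(self, sent_ids, aspect_ids):
        ctx = self.emb(sent_ids)                  # (seq_len, dim)
        query = self.emb(aspect_ids).mean(dim=0)  # (dim,)
        attn = torch.softmax(ctx @ query, dim=0)  # (seq_len,) context attention
        logits = self.out(attn @ ctx)             # (n_classes,)
        return logits, attn

def extract_supervision_words(model, sent_ids, aspect_ids, label, n_iters=3):
    """One instance of the iterative loop: take the most-attended context
    word, record it as 'active' if the current prediction is correct or
    'misleading' if it is wrong, then mask that word and iterate again."""
    active, misleading = [], []
    masked = sent_ids.clone()
    for _ in range(n_iters):
        logits, attn = model(masked, aspect_ids)
        top = int(attn.argmax())
        (active if int(logits.argmax()) == label else misleading).append(top)
        masked[top] = MASK_ID  # hide the word so a weaker one surfaces next
    return active, misleading

sentence = torch.randint(1, 1000, (10,))  # toy word ids
aspect = torch.randint(1, 1000, (2,))
active, misleading = extract_supervision_words(ToyAttnClassifier(), sentence, aspect, label=1)

In the paper's full procedure this loop runs over the whole training set, and the extracted word sets then supervise the attention of a final model; the toy loop above shows only the per-instance extraction step.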
“…TNET-ATT(+AS) [45]: TNET-ATT(+AS) is built on TNET and treats the context attention weights with the same technique as MN(+AS). The results of our model and the baseline models on the Restaurant, Laptop, and Twitter datasets are shown in Table 2; we find that IRAN outperforms most frontier models in accuracy and F1-measure, which verifies the effectiveness of our model.…”
Section: Experimental Comparison Models (mentioning)
confidence: 99%
“…Gu et al. [19] proposed a Position-Aware Bidirectional Attention Network (PBAN), which not only concentrates on the position information of aspect terms but also mutually models the relation between aspect terms and the sentence through a bidirectional attention mechanism. Tang et al. [20] addressed the problems of over-learning strong patterns and under-learning weak patterns during neural network training, proposing a progressive self-supervised attention algorithm that properly constrains the attention mechanism.…”
Section: Related Work (mentioning)
confidence: 99%
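The constraint on the attention mechanism mentioned above can be pictured as a regularizer added to the usual classification loss: reward attention mass on previously extracted active words and penalize mass on misleading ones. The sketch below is an illustrative guess at such a constraint, not the paper's exact objective; regularized_loss and gamma are hypothetical names.

import torch
import torch.nn.functional as F

def regularized_loss(logits, label, attn, active, misleading, gamma=0.1):
    """Cross-entropy plus a sketched attention constraint. gamma balances
    the classification and attention terms (hypothetical value)."""
    ce = F.cross_entropy(logits.unsqueeze(0), torch.tensor([label]))
    reg = logits.new_zeros(())
    if active:      # pull attention toward words that supported correct predictions
        reg = reg - torch.log(attn[active] + 1e-8).mean()
    if misleading:  # push attention away from words that caused wrong predictions
        reg = reg - torch.log(1.0 - attn[misleading] + 1e-8).mean()
    return ce + gamma * reg

Training would then minimize this combined loss, so the final model learns to attend to the supervising words instead of over-fitting the single strongest pattern.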
“…There are several kinds of sentiment analysis tasks, such as document-level (Thongtan and Phienthrakul, 2019), sentence-level, aspect-level (Pontiki et al., 2014; Wang et al., 2019a), and multimodal (Chen et al., 2018; Akhtar et al., 2019) sentiment analysis. For aspect-level sentiment analysis, previous work typically applies an attention mechanism (Luong et al., 2015) combined with a memory network (Weston et al., 2014) or gating units (Tang et al., 2016b; He et al., 2018a; Xue and Li, 2018; Duan et al., 2018; Tang et al., 2019; Yang et al., 2019; Bao et al., 2019), where an aspect-independent encoder is used to generate the sentence representation. In addition, some work leverages an aspect-weakly-associative encoder to generate aspect-specific sentence representations (Tang et al., 2016a; Wang et al., 2016; Majumder et al., 2018).…”
Section: Related Work (mentioning)
confidence: 99%