Proceedings of the 23rd Conference on Computational Natural Language Learning (CoNLL) 2019
DOI: 10.18653/v1/k19-1090

Variational Semi-Supervised Aspect-Term Sentiment Analysis via Transformer

Abstract: Aspect-term sentiment analysis (ATSA) is a long-standing challenge in natural language processing. It requires fine-grained semantic reasoning about a target entity that appears in the text. Because manual annotation of aspects is laborious and time-consuming, labeled data for supervised learning are limited. This paper proposes a semi-supervised method for the ATSA problem using a Transformer-based Variational Autoencoder. The model learns the latent distribution via variational inference. …
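The core idea in the abstract, encoding the input with a Transformer and learning a latent Gaussian via variational inference, can be illustrated with a minimal sketch. This is not the authors' implementation: the mean pooling, the bag-of-words decoder, and all layer sizes are simplifying assumptions made for brevity.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TransformerVAE(nn.Module):
    """Transformer encoder -> diagonal Gaussian latent -> bag-of-words decoder (illustrative)."""
    def __init__(self, vocab_size, d_model=128, nhead=4, num_layers=2, latent_dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, d_model)
        layer = nn.TransformerEncoderLayer(d_model, nhead, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, num_layers)
        self.to_mu = nn.Linear(d_model, latent_dim)       # mean of q(z|x)
        self.to_logvar = nn.Linear(d_model, latent_dim)   # log-variance of q(z|x)
        self.decode = nn.Linear(latent_dim, vocab_size)   # bag-of-words reconstruction logits

    def forward(self, tokens):                            # tokens: (B, T) integer ids
        h = self.encoder(self.embed(tokens))              # (B, T, d_model)
        pooled = h.mean(dim=1)                            # mean-pool over positions (assumption)
        mu, logvar = self.to_mu(pooled), self.to_logvar(pooled)
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)  # reparameterization trick
        return self.decode(z), mu, logvar

def elbo_loss(logits, tokens, mu, logvar):
    """Negative ELBO: token reconstruction NLL plus KL to a standard normal prior."""
    log_probs = F.log_softmax(logits, dim=-1)             # (B, vocab)
    recon = -log_probs.gather(1, tokens).sum(dim=1).mean()
    kl = -0.5 * torch.sum(1 + logvar - mu.pow(2) - logvar.exp(), dim=1).mean()
    return recon + kl

# Toy usage: 8 sequences of 16 token ids from a 1000-word vocabulary.
model = TransformerVAE(vocab_size=1000)
tokens = torch.randint(0, 1000, (8, 16))
logits, mu, logvar = model(tokens)
elbo_loss(logits, tokens, mu, logvar).backward()
```

The unlabeled text contributes only to this reconstruction objective, which is what makes the setup semi-supervised; a sentiment classifier head on the latent (or pooled) representation would supply the supervised term.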



Cited by 24 publications (24 citation statements) · References 20 publications (21 reference statements)
“…Many semi-supervised methods have been explored for sentence-level sentiment classification, such as pretraining with a Restricted Boltzmann Machine or an autoencoder [23,26], auxiliary-task learning [24], and adversarial training [25,27]. However, there are only a few studies [16,19] on semi-supervised target-level sentiment classification. [19] explored both pretraining and multi-task learning for transferring knowledge from document-level data, which is much less expensive to obtain.…”
Section: Deep Methods (mentioning, confidence: 99%)
“…[19] explored both pretraining and multi-task learning for transferring knowledge from document-level data, which is much less expensive to obtain. [16] used a Transformer-based VAE for pretraining, which modeled the latent distributions via variational inference. However, it failed to distinguish relevant from irrelevant features with respect to the sentiment.…”
Section: Deep Methods (mentioning, confidence: 99%)
“…To address this problem, Ma et al. [36] propose a novel solution to targeted aspect-based sentiment analysis, which tackles the challenges of both aspect-based sentiment analysis and targeted sentiment analysis by exploiting commonsense knowledge. Cheng et al. [37] propose a semi-supervised method for the ATSA problem using a Transformer-based Variational Autoencoder. The model learns the latent distribution via variational inference.…”
Section: Related Work (mentioning, confidence: 99%)
“…Since the self-attention mechanism in the Transformer can draw global dependencies between input and output regardless of their distance in the sequence, the Transformer is better suited to capturing long-distance dependencies and allows more parallelization than RNNs. The Transformer has been demonstrated to be more effective than RNNs across a wide range of NLP tasks [74][75][76][77][78]. In particular, the emergence of Transformer-based pre-trained language models has led to a series of breakthroughs and state-of-the-art results in many NLP applications [3,46,79,80].…”
Section: Transformer (mentioning, confidence: 99%)
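The claim that self-attention captures dependencies regardless of distance follows directly from its computation: attention scores are formed between every pair of positions at once, so position offset never enters the score. A minimal single-head sketch, where the shapes and random projection matrices are illustrative assumptions:

```python
import torch
import torch.nn.functional as F

def self_attention(x, w_q, w_k, w_v):
    # x: (T, d) token representations; w_q/w_k/w_v: (d, d_k) projections.
    q, k, v = x @ w_q, x @ w_k, x @ w_v
    scores = q @ k.T / (k.shape[-1] ** 0.5)   # (T, T): every position pair, distance-agnostic
    weights = F.softmax(scores, dim=-1)       # one attention distribution per position
    return weights @ v                        # each output mixes the whole sequence

T, d, d_k = 5, 16, 8
x = torch.randn(T, d)
out = self_attention(x, torch.randn(d, d_k), torch.randn(d, d_k), torch.randn(d, d_k))
print(out.shape)  # torch.Size([5, 8])
```

Because the (T, T) score matrix is a single matrix product, all positions are processed in parallel, in contrast to the sequential state updates of an RNN.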