Proceedings of the 24th ACM SIGKDD International Conference on Knowledge Discovery & Data Mining, 2018
DOI: 10.1145/3219819.3219956

Learning Adversarial Networks for Semi-Supervised Text Classification via Policy Gradient

Cited by 43 publications (17 citation statements)
References 14 publications
“…Semi-supervised Methods. Many semi-supervised methods have been explored for sentence-level sentiment classification, such as pretraining with a Restricted Boltzmann Machine or autoencoder [23,26], auxiliary task learning [24], and adversarial training [25,27]. However, there are only a few studies [16,19] on semi-supervised target-level sentiment classification.…”
Section: Deep Methods (mentioning)
confidence: 99%
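The adversarial training mentioned in this statement is in the spirit of GAN-based semi-supervised classification. As a rough, hypothetical illustration (not the policy-gradient method of the indexed paper), the sketch below shows the common (K+1)-class discriminator formulation in PyTorch, where generated samples occupy an extra "fake" class and unlabeled sentences only need to be recognized as "not fake"; all module names and sizes are assumptions.

```python
# Hypothetical sketch: GAN-style semi-supervised classification with a
# (K+1)-class discriminator over fixed-size sentence embeddings.
# Not the indexed paper's exact method; names and sizes are illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

K = 2          # number of real classes (e.g., positive / negative)
EMB_DIM = 128  # dimensionality of the sentence embeddings

class Discriminator(nn.Module):
    """Maps a sentence embedding to logits over K real classes + 1 fake class."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(EMB_DIM, 256), nn.ReLU(),
                                 nn.Linear(256, K + 1))
    def forward(self, x):
        return self.net(x)

class Generator(nn.Module):
    """Maps noise to a synthetic sentence embedding."""
    def __init__(self, noise_dim=64):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(noise_dim, 256), nn.ReLU(),
                                 nn.Linear(256, EMB_DIM))
    def forward(self, z):
        return self.net(z)

def discriminator_loss(disc, x_lab, y_lab, x_unlab, x_fake):
    # Labeled data: ordinary cross-entropy on the true class (index < K).
    loss_sup = F.cross_entropy(disc(x_lab), y_lab)
    # Unlabeled real data: the 'fake' probability (index K) should be small.
    p_fake_unlab = F.softmax(disc(x_unlab), dim=1)[:, K]
    loss_unlab = -torch.log(1.0 - p_fake_unlab + 1e-8).mean()
    # Generated data: should be assigned to the 'fake' class.
    fake_target = torch.full((x_fake.size(0),), K, dtype=torch.long)
    loss_fake = F.cross_entropy(disc(x_fake), fake_target)
    return loss_sup + loss_unlab + loss_fake

def generator_loss(disc, x_fake):
    # The generator tries to make its samples look 'not fake'.
    p_fake = F.softmax(disc(x_fake), dim=1)[:, K]
    return -torch.log(1.0 - p_fake + 1e-8).mean()
```

At test time only the discriminator is kept and its first K logits act as the classifier; the unlabeled term is what lets the extra data shape the decision boundary.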
“…Recently, generative adversarial networks (GANs) [10] have demonstrated superior performance in many tasks [21,37]. A few methods leverage adversarial training to learn more robust network representations [8,12,36,48].…”
Section: Anomaly Detection and Robust Representation Learning On Graphs (mentioning)
confidence: 99%
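"Adversarial training for more robust representations" is commonly realized either as a GAN-style regularizer or as training on adversarially perturbed inputs. A minimal, hypothetical sketch of the latter (FGSM-style perturbation) follows; the function name, model interface, and epsilon value are assumptions, not taken from the cited works.

```python
# Hypothetical sketch: perturbation-based adversarial training, one common way
# to make learned representations more robust. Names and epsilon are assumptions.
import torch
import torch.nn.functional as F

def adversarial_training_step(model, optimizer, x, y, epsilon=0.01):
    # Forward pass on clean inputs to obtain the gradient w.r.t. the inputs.
    x = x.clone().detach().requires_grad_(True)
    clean_loss = F.cross_entropy(model(x), y)
    (grad_x,) = torch.autograd.grad(clean_loss, x)

    # FGSM-style perturbation: step in the direction that increases the loss.
    x_adv = (x + epsilon * grad_x.sign()).detach()

    # Optimize on the clean and the adversarially perturbed batch jointly.
    optimizer.zero_grad()
    loss = F.cross_entropy(model(x), y) + F.cross_entropy(model(x_adv), y)
    loss.backward()
    optimizer.step()
    return loss.item()
```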
“…Hence, VAEs initially achieved great success in generating latent representations from high-dimensional data. Recently, semi-supervised methods with generative models have attracted more attention, as they can achieve better results than purely supervised methods [20], [29]-[31]. Xu et al. propose a semi-supervised sequential variational autoencoder framework using a conditional RNN for text classification.…”
Section: B. Generative Model (mentioning)
confidence: 99%
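The quoted passage describes semi-supervised generative models built on VAEs. The sketch below shows only the generic sequential (RNN) VAE building block for sentences that such methods extend with a classifier; it is a hypothetical simplification, not the model of Xu et al., and all layer sizes and the omission of BOS/EOS shifting are assumptions made for brevity.

```python
# Hypothetical sketch: a sequential VAE over token sequences (GRU encoder/decoder).
# Sizes are illustrative; the decoder reconstructs the input tokens directly,
# with no BOS/EOS shifting, to keep the example short.
import torch
import torch.nn as nn
import torch.nn.functional as F

class SentenceVAE(nn.Module):
    def __init__(self, vocab_size=10000, emb_dim=128, hid_dim=256, z_dim=32):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim)
        self.encoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.to_mu = nn.Linear(hid_dim, z_dim)
        self.to_logvar = nn.Linear(hid_dim, z_dim)
        self.z_to_h = nn.Linear(z_dim, hid_dim)
        self.decoder = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, tokens):
        # tokens: (batch, seq_len) integer ids
        emb = self.embed(tokens)
        _, h = self.encoder(emb)                      # h: (1, batch, hid_dim)
        mu, logvar = self.to_mu(h[-1]), self.to_logvar(h[-1])

        # Reparameterization trick: z = mu + sigma * eps
        z = mu + torch.exp(0.5 * logvar) * torch.randn_like(mu)

        # Decode the sequence conditioned on z (teacher forcing on the inputs).
        h0 = torch.tanh(self.z_to_h(z)).unsqueeze(0)
        dec_out, _ = self.decoder(emb, h0)
        logits = self.out(dec_out)                    # (batch, seq_len, vocab)

        # Negative ELBO = reconstruction loss + KL(q(z|x) || p(z))
        recon = F.cross_entropy(logits.reshape(-1, logits.size(-1)),
                                tokens.reshape(-1))
        kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
        return recon + kl
```

Semi-supervised variants typically add a classifier on the latent code (or on the label variable of a conditional decoder) and train it jointly on the labeled subset while the ELBO term exploits the unlabeled text.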