Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics 2020
DOI: 10.18653/v1/2020.acl-main.227

Improving Adversarial Text Generation by Modeling the Distant Future

Abstract: Auto-regressive text generation models usually focus on local fluency, and may cause inconsistent semantic meaning in long text generation. Further, automatically generating words with similar semantics is challenging, and hand-crafted linguistic rules are difficult to apply. We consider a text planning scheme and present a model-based imitation-learning approach to alleviate the aforementioned issues. Specifically, we propose a novel guider network to focus on the generative process over a longer horizon, whi…
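
To make the planning idea in the abstract concrete, below is a minimal, hypothetical sketch of coupling a step-by-step generator with a second "guider" network that scores words by their longer-horizon promise. Every module, size, and the additive combination rule here are invented for illustration; this is not the paper's actual guider architecture.

```python
import torch
import torch.nn as nn

# Hypothetical sizes; the paper's real dimensions are not given here.
vocab_size, hidden = 100, 32

generator = nn.GRUCell(hidden, hidden)   # local, step-by-step language model
to_vocab = nn.Linear(hidden, vocab_size) # local next-word logits
guider = nn.Linear(hidden, vocab_size)   # stand-in for a network scoring words
                                         # by long-horizon (planning) value

state = torch.zeros(1, hidden)
inp = torch.zeros(1, hidden)
state = generator(inp, state)

# Combine local fluency with the guider's long-horizon preference, so that
# planning information influences each step's word choice.
logits = to_vocab(state) + 0.5 * guider(state)
next_word = torch.distributions.Categorical(logits=logits).sample()
```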

Cited by 26 publications (15 citation statements) · References 22 publications
“…Based on GAN, many algorithms have been developed, such as conditional GAN (Mirza & Osindero, 2014), StackGAN (Zhang et al., 2017), and GP‐GAN (Wu, Zheng, Zhang, & Huang, 2019). In particular, adversarial training has been used for generating realistic text (Zhang et al., 2017).…”
Section: Deep Learning (mentioning)
confidence: 99%
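
Since the citing works above build on adversarial training, a generic two-player GAN update may help ground the term. The toy example below (invented networks and 1-D data, not any cited model) shows the alternating discriminator/generator steps:

```python
import torch
import torch.nn as nn

# Toy 1-D GAN: all shapes and networks are placeholders for illustration.
G = nn.Sequential(nn.Linear(8, 16), nn.ReLU(), nn.Linear(16, 1))
D = nn.Sequential(nn.Linear(1, 16), nn.ReLU(), nn.Linear(16, 1))
opt_g = torch.optim.Adam(G.parameters(), lr=1e-3)
opt_d = torch.optim.Adam(D.parameters(), lr=1e-3)
bce = nn.BCEWithLogitsLoss()

for step in range(100):
    real = torch.randn(32, 1) * 2 + 3   # "real" data: samples from N(3, 2)
    fake = G(torch.randn(32, 8))        # generator output from noise

    # Discriminator step: distinguish real (label 1) from fake (label 0).
    loss_d = bce(D(real), torch.ones(32, 1)) + \
             bce(D(fake.detach()), torch.zeros(32, 1))
    opt_d.zero_grad()
    loss_d.backward()
    opt_d.step()

    # Generator step: fool the discriminator into labeling fakes as real.
    loss_g = bce(D(fake), torch.ones(32, 1))
    opt_g.zero_grad()
    loss_g.backward()
    opt_g.step()
```
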
“…The model is tested on four different datasets, including Yelp, Amazon, Film, and Obama Speech, and the reported results show that it outperforms all three baseline methods used for comparison (Markov chain, RNN, and Seq2Seq). A similar adversarial text generation approach is used in [12], where special focus is placed on the generative process over a longer horizon so that the model can capture semantic meaning in long text generation.…”
Section: Related Work (mentioning)
confidence: 99%
“…To tackle this issue of non-differentiability, policy learning has been suggested, but it suffers from high variance during training (13, 67). Therefore, specialized distributions, such as the Gumbel-softmax (68, 69), the Concrete distribution (70), or a soft-argmax function (71), have been proposed to approximate the gradient of the model from discrete samples.…”
Section: A2 Conditional Generation With Autoencoders (mentioning)
confidence: 99%
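
The Gumbel-softmax relaxation mentioned in this excerpt can be sketched in a few lines. The snippet below is a minimal illustration (not any cited paper's implementation) of drawing a differentiable "soft" token from categorical logits; the vocabulary size, temperature, and embedding dimension are arbitrary assumptions.

```python
import torch
import torch.nn.functional as F

# Hypothetical sizes, chosen only for illustration.
vocab_size, embed_dim, batch = 50, 16, 4

logits = torch.randn(batch, vocab_size, requires_grad=True)  # generator's next-token logits
embedding = torch.nn.Embedding(vocab_size, embed_dim)

# Gumbel-softmax: sample a relaxed one-hot vector. With hard=True the forward
# pass is a discrete one-hot sample, while the backward pass uses the soft
# relaxation (straight-through estimator), so gradients still reach `logits`.
soft_onehot = F.gumbel_softmax(logits, tau=0.5, hard=True)

# A "soft embedding" of the sampled token: a convex combination of embedding
# rows, which stays differentiable (unlike an argmax followed by a lookup).
token_embed = soft_onehot @ embedding.weight

loss = token_embed.sum()   # stand-in for a discriminator score
loss.backward()            # gradients flow back to the generator's logits
print(logits.grad.shape)   # torch.Size([4, 50])
```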