Proceedings of the 4th Workshop on Representation Learning for NLP (RepL4NLP-2019) 2019
DOI: 10.18653/v1/w19-4303
Generative Adversarial Networks for Text Using Word2vec Intermediaries

Abstract: Generative adversarial networks (GANs) have shown considerable success, especially in the realistic generation of images. In this work, we apply similar techniques for the generation of text. We propose a novel approach to handle the discrete nature of text, during training, using word embeddings. Our method is agnostic to vocabulary size and achieves competitive results relative to methods with various discrete gradient estimators.

$$L = L_{\text{critic}} + \lambda\,\mathbb{E}_{x \sim p(x)}\left[\left(\lVert \nabla_{x} D(x) \rVert_2 - 1\right)^2\right] \qquad (2)$$
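The regularizer in Eq. (2) is the WGAN-GP gradient penalty: the critic's input gradient is pushed toward unit norm on samples drawn between the real and generated distributions. A minimal numpy sketch, assuming a toy linear critic D(x) = w·x (whose input gradient is just w, so the penalty has a closed form) and uniform interpolation between real and fake batches; all names and values here are illustrative, not the paper's implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy linear critic D(x) = w . x; its gradient w.r.t. the input is w,
# so the gradient penalty of Eq. (2) can be computed in closed form.
w = np.array([1.0, 0.0])  # ||w||_2 = 1, so the penalty should be 0

def gradient_penalty(w, real, fake, lam=10.0, rng=rng):
    """lam * E_x[(||grad_x D(x)||_2 - 1)^2], with x sampled uniformly
    on straight lines between paired real and fake samples."""
    eps = rng.uniform(size=(real.shape[0], 1))
    x_hat = eps * real + (1.0 - eps) * fake  # interpolated samples
    # For a linear critic the gradient at every x_hat equals w.
    grad_norms = np.full(real.shape[0], np.linalg.norm(w))
    return lam * np.mean((grad_norms - 1.0) ** 2)

real = rng.normal(size=(4, 2))
fake = rng.normal(size=(4, 2))
print(gradient_penalty(w, real, fake))  # → 0.0, since ||w||_2 == 1
```

With a nonlinear critic the per-sample gradients would instead come from automatic differentiation, but the penalty term itself is computed exactly as above.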

Cited by 4 publications (4 citation statements) · References 20 publications
“…Moreover, there are many specific tricks that can accelerate the training of GAN. In addition to the way mentioned in Section 4.6.2, one can see Bose, Ling, and Cao (2018) and Budhkar et al (2019) for more suggestions.…”
Section: Further Discussion C1 Limitation: Time Complexitymentioning
confidence: 99%
“…GAN-based negative samplers have also been used in other lines of NLP research (Wang, Liu, and Zhao, 2017; Wang, Li, and Pan, 2018; Dai et al, 2019; Budhkar et al, 2019). For example, Wang, Li, and Pan (2018) employed a GAN-based framework for knowledge representation learning, and Budhkar et al (2019) used a GAN-based method to generate natural language.…”
Section: Related Workmentioning
confidence: 99%
“…The discrete nature of text is handled using the GAN2vec technique. The experimental GAN2vec evaluation on a Chinese poetry dataset yields a BLEU score of 66.08% (Budhkar et al 2019). An ensemble approach is suggested to classify brief text sequences from the texts of various Arabic-speaking nations.…”
Section: Named Entity Recognition and Recommendation Systemmentioning
confidence: 99%
“…Common pretrained word embedding techniques include Word2Vec [102,103], GloVe [104], and fastText [105]. Word embedding representations are often used as input to models that tackle NLP tasks, such as sentiment analysis (e.g., a CNN with Word2Vec and GloVe in [106], a CNN with Word2Vec in [107]), text classification (e.g., Support Vector Machines (SVM) [108] accompanied by Word2Vec and TF-IDF in [109]), question answering (e.g., a dependency-tree recurrent neural network with Word2Vec in [110]), and text generation (e.g., generative adversarial networks with Word2Vec in [111]).…”
Section: Text Modalitymentioning
confidence: 99%