2018
DOI: 10.1155/2018/7529286

An Integrated Deep Generative Model for Text Classification and Generation

Abstract: Text classification and generation are two important tasks in natural language processing. In this paper, we address both tasks with a variational autoencoder (VAE), a powerful deep generative model. A self-attention mechanism is introduced into the encoder. The modified encoder extracts a global feature of the input text to produce the hidden code, and we train a neural network classifier on the hidden code to perform the classification. On the other hand, the label of the text is fed i…
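The encoder described in the abstract can be illustrated with a minimal numpy sketch: a single self-attention head contextualizes the token embeddings, the result is pooled into one "global feature" vector (the hidden code), and a linear-softmax classifier is applied to that code. All dimensions, weights, and names here are hypothetical illustrations, not the authors' implementation, and the VAE sampling step is omitted for brevity.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions (not taken from the paper).
vocab, d, n_classes, seq_len = 100, 16, 2, 10

# Randomly initialized parameters: token embeddings, one attention
# head (query/key/value projections), and a classifier head.
E = rng.normal(scale=0.1, size=(vocab, d))
Wq = rng.normal(scale=0.1, size=(d, d))
Wk = rng.normal(scale=0.1, size=(d, d))
Wv = rng.normal(scale=0.1, size=(d, d))
Wc = rng.normal(scale=0.1, size=(d, n_classes))

def softmax(x, axis=-1):
    x = x - x.max(axis=axis, keepdims=True)
    e = np.exp(x)
    return e / e.sum(axis=axis, keepdims=True)

def encode(tokens):
    """Self-attention over the token sequence, mean-pooled to a hidden code."""
    X = E[tokens]                       # (seq_len, d) token embeddings
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    A = softmax(Q @ K.T / np.sqrt(d))   # (seq_len, seq_len) attention weights
    H = A @ V                           # contextualized token representations
    return H.mean(axis=0)               # global feature / hidden code, (d,)

def classify(tokens):
    z = encode(tokens)
    return softmax(z @ Wc)              # class probabilities, (n_classes,)

tokens = rng.integers(0, vocab, size=seq_len)
probs = classify(tokens)
```

In the full model this hidden code would parameterize the VAE's latent distribution and the label would be fed back into the decoder for generation; the sketch only shows the classification path.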




Cited by 5 publications (4 citation statements) | References 19 publications
“…5): autoregressive generative models (ARM), flow-based models, and latent variable models. Text analysis [30], image analysis [29], audio analysis [31], active learning [32], reinforcement learning [33], graph analysis [34], medical imaging [35], image compression [36], and other applications use deep generative modeling. Unsupervised learning is a subfield of machine learning that contains numerous algorithms with varying objectives.…”
Section: Generative Model
confidence: 99%
“…Although EDA reduces overfitting when training on smaller datasets, the improvement is at times marginal. Wang and Wu [25] proposed a framework that combines a variational autoencoder (VAE) and neural networks to handle text classification and generation tasks. GAN [10,26] was first proposed for continuous data (image generation, inpainting, style transfer, etc.)…”
Section: Related Work
confidence: 99%
“…The experimental results demonstrate that classification models based on deep learning perform markedly better than kernel-function approaches [3]. Since the proposal of the CNN classification model, more and more deep neural network models [4]-[8] have achieved strong results on text classification tasks.…”
Section: Introduction
confidence: 97%