Proceedings of the 2019 Conference on Empirical Methods in Natural Language Processing and the 9th International Joint Conference on Natural Language Processing (EMNLP-IJCNLP), 2019
DOI: 10.18653/v1/d19-1048

Latent-Variable Generative Models for Data-Efficient Text Classification

Abstract: Generative classifiers offer potential advantages over their discriminative counterparts, namely in the areas of data efficiency, robustness to data shift and adversarial examples, and zero-shot learning (Ng and Jordan, 2002; Yogatama et al., 2017; Lewis and Fan, 2019). In this paper, we improve generative text classifiers by introducing discrete latent variables into the generative story, and explore several graphical model configurations. We parameterize the distributions using standard neural architectures us…
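To make the generative story concrete, here is a minimal PyTorch sketch of one possible configuration, p(x, y) = p(y) Σ_z p(z | y) p(x | y, z), with the small discrete latent z marginalized exactly inside the objective. The LSTM parameterization, the conditioning scheme, and all names below are illustrative assumptions, not the paper's exact architecture.

```python
import torch
import torch.nn as nn

class LatentGenerativeClassifier(nn.Module):
    """Generative text classifier: p(x, y) = p(y) * sum_z p(z|y) p(x|y,z).

    The discrete latent z is marginalized exactly, which stays tractable
    because z takes only a few values. Prediction is argmax_y p(x, y),
    i.e. Bayes' rule without ever computing p(x).
    """

    def __init__(self, vocab_size, num_classes, num_latent, dim=128):
        super().__init__()
        self.log_py = nn.Parameter(torch.zeros(num_classes))              # -> log p(y)
        self.log_pz = nn.Parameter(torch.zeros(num_classes, num_latent))  # -> log p(z|y)
        self.embed = nn.Embedding(vocab_size, dim)
        self.cond = nn.Embedding(num_classes * num_latent, dim)           # code for (y, z)
        self.lstm = nn.LSTM(dim, dim, batch_first=True)
        self.proj = nn.Linear(dim, vocab_size)
        self.num_classes, self.num_latent = num_classes, num_latent

    def log_px_given_yz(self, x):
        """log p(x | y, z) for every (y, z); x is (B, T) token ids, BOS first."""
        B = x.size(0)
        scores = []
        for y in range(self.num_classes):
            for z in range(self.num_latent):
                code = torch.full((B,), y * self.num_latent + z, device=x.device)
                h0 = self.cond(code).unsqueeze(0)        # condition the LSTM on (y, z)
                out, _ = self.lstm(self.embed(x[:, :-1]), (h0, torch.zeros_like(h0)))
                logp = torch.log_softmax(self.proj(out), dim=-1)
                tok = logp.gather(-1, x[:, 1:].unsqueeze(-1)).squeeze(-1)
                scores.append(tok.sum(-1))               # sequence log-likelihood
        return torch.stack(scores, -1).view(B, self.num_classes, self.num_latent)

    def log_pxy(self, x):
        """log p(x, y) = log p(y) + logsumexp_z [log p(z|y) + log p(x|y,z)]."""
        lpy = torch.log_softmax(self.log_py, -1)                       # (Y,)
        lpz = torch.log_softmax(self.log_pz, -1)                       # (Y, Z)
        return lpy + torch.logsumexp(lpz + self.log_px_given_yz(x), -1)

# Training minimizes -log_pxy(x)[range(B), y_gold].mean(); no variational
# approximation is needed since z is summed out exactly.
# Prediction: log_pxy(x).argmax(-1).
```

Because z ranges over only a handful of values, the marginal likelihood can be maximized directly, which is one reason discrete (rather than continuous) latent variables are convenient in this setting.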

Cited by 4 publications (7 citation statements) | References 30 publications
“…The accuracies become comparable when we have 1000 instances per label. We also see that on the full training set, the discriminative baselines outperform GenNLI, which accords with our expectations and the findings of prior work (Ding and Gimpel, 2019).…”
Section: Data Efficiency (supporting)
confidence: 91%
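The learning-curve experiment this statement describes can be sketched with off-the-shelf models. Below, naïve Bayes stands in for the generative classifier and logistic regression for the discriminative baseline, trained at increasing numbers of instances per label; the dataset and models are illustrative stand-ins, not GenNLI or the cited paper's setup.

```python
import numpy as np
from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.linear_model import LogisticRegression

train = fetch_20newsgroups(subset="train")
test = fetch_20newsgroups(subset="test")
vec = CountVectorizer(max_features=20000)
Xtr, Xte = vec.fit_transform(train.data), vec.transform(test.data)
ytr, yte = np.array(train.target), np.array(test.target)

# 20 Newsgroups has only ~500 training docs per class, so we stop at 500
# rather than the 1000/label point discussed above.
for n_per_label in (10, 100, 500):
    # subsample n_per_label training instances from each class
    idx = np.concatenate([
        np.flatnonzero(ytr == c)[:n_per_label] for c in np.unique(ytr)
    ])
    for name, clf in (("gen (NB)", MultinomialNB()),
                      ("disc (LR)", LogisticRegression(max_iter=1000))):
        clf.fit(Xtr[idx], ytr[idx])
        print(f"{n_per_label:>4}/label  {name}: {clf.score(Xte, yte):.3f}")
```

The expected pattern matches the quote: the generative model tends to lead at small training sizes, with the discriminative model catching up and eventually overtaking it as data grows.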
“…Most neural network classifiers are trained as discriminative classifiers, as these work better when conditions are favorable for supervised learning, namely when training data is plentiful and the training and test data are drawn from the same distribution. While discriminative classifiers are generally preferred in practice, prior work has shown that generative classifiers can have advantages in certain conditions, especially when training data is scarce, noisy, and imbalanced (Yogatama et al., 2017; Lewis and Fan, 2019; Ding and Gimpel, 2019). Ng and Jordan (2002) proved theoretically that a generative classifier can approach its asymptotic error much faster than its discriminative analogue, as naïve Bayes does relative to logistic regression.…”
Section: Generative Classifiers (mentioning)
confidence: 99%
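The naïve Bayes / logistic regression pairing referenced here can be written out explicitly. The following is the standard formulation of the generative/discriminative pair, in our notation rather than quoted from the cited papers:

```latex
% Generative classifier: fit the joint p(x, y), classify via Bayes' rule.
\hat{y}_{\text{gen}} = \operatorname*{arg\,max}_{y} \; p(y)\, p(x \mid y)
% Naive Bayes factorizes the class-conditional likelihood over features:
p(x \mid y) = \prod_{j=1}^{d} p(x_j \mid y)
% Discriminative analogue: fit the conditional p(y | x) directly,
% e.g. multinomial logistic regression:
p(y \mid x) = \frac{\exp(w_y^\top x + b_y)}{\sum_{y'} \exp(w_{y'}^\top x + b_{y'})}
% Ng and Jordan (2002): in d dimensions, naive Bayes needs O(log d)
% examples to come within a constant of its asymptotic error, while
% logistic regression needs O(d); the generative model's asymptotic
% error is typically higher, hence the crossover as data grows.
```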
“…However, when data quality and conditions are not ideal, existing discriminative models suffer a substantial performance decrease, including both simple model architectures and more complex ones. Prior work on document classification and question answering has shown that generative classifiers have advantages over their discriminative counterparts in non-ideal conditions (Yogatama et al., 2017; Lewis and Fan, 2019; Ding and Gimpel, 2019).…”
Section: Introduction (mentioning)
confidence: 99%
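One common way to simulate the "non-ideal conditions" mentioned above is to inject label noise into the training set. The hedged sketch below corrupts a fraction of labels and compares the degradation of a generative and a discriminative classifier; everything here is an illustrative stand-in for the cited experiments.

```python
import numpy as np
from sklearn.datasets import fetch_20newsgroups
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.linear_model import LogisticRegression

train = fetch_20newsgroups(subset="train")
test = fetch_20newsgroups(subset="test")
vec = CountVectorizer(max_features=20000)
Xtr, Xte = vec.fit_transform(train.data), vec.transform(test.data)
ytr, yte = np.array(train.target), np.array(test.target)
rng = np.random.default_rng(0)
n_classes = len(np.unique(ytr))

for noise in (0.0, 0.2, 0.4):
    y_noisy = ytr.copy()
    flip = rng.random(len(ytr)) < noise          # corrupt a fraction of labels
    y_noisy[flip] = rng.integers(0, n_classes, flip.sum())
    for name, clf in (("gen (NB)", MultinomialNB()),
                      ("disc (LR)", LogisticRegression(max_iter=1000))):
        clf.fit(Xtr, y_noisy)
        print(f"noise={noise:.1f}  {name}: {clf.score(Xte, yte):.3f}")
```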