Findings of the Association for Computational Linguistics: EMNLP 2023
DOI: 10.18653/v1/2023.findings-emnlp.313

Dior-CVAE: Pre-trained Language Models and Diffusion Priors for Variational Dialog Generation

Tianyu Yang, Thy Tran, Iryna Gurevych

Abstract: Current variational dialog models have employed pre-trained language models (PLMs) to parameterize the likelihood and posterior distributions. However, the Gaussian assumption made on the prior distribution is incompatible with these distributions, thus restricting the diversity of generated responses. These models also suffer from posterior collapse, i.e., the decoder tends to ignore latent variables and directly access information captured in the encoder through the cross-attention mechanism. In this work, we…
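
To make the baseline setup the abstract critiques concrete, below is a minimal PyTorch sketch of a conditional VAE for response generation with a standard Gaussian prior. This is an illustration only, not the paper's Dior-CVAE implementation: the module names, layer sizes, and the GRU encoder/decoder are hypothetical choices. When the KL term in this objective is driven toward zero, the decoder is effectively ignoring the latent variable, which is the posterior collapse the abstract describes.

```python
# Illustrative sketch of a Gaussian-prior CVAE for dialog response
# generation (NOT the paper's Dior-CVAE code; all names/sizes are assumed).
import torch
import torch.nn as nn
from torch.distributions import Normal, kl_divergence

class GaussianPriorCVAE(nn.Module):
    def __init__(self, hidden=256, latent=64, vocab=32000):
        super().__init__()
        self.encoder = nn.GRU(hidden, hidden, batch_first=True)   # encodes context / response embeddings
        self.post_net = nn.Linear(2 * hidden, 2 * latent)         # posterior q(z | context, response)
        self.prior_net = nn.Linear(hidden, 2 * latent)            # Gaussian prior p(z | context)
        self.decoder = nn.GRU(hidden + latent, hidden, batch_first=True)
        self.out = nn.Linear(hidden, vocab)

    def forward(self, ctx_emb, resp_emb):
        # ctx_emb, resp_emb: (batch, seq_len, hidden) token embeddings.
        _, h_ctx = self.encoder(ctx_emb)
        _, h_resp = self.encoder(resp_emb)
        h_ctx, h_resp = h_ctx[-1], h_resp[-1]          # (batch, hidden)

        # Posterior conditions on context + response; prior on context only.
        mu_q, logvar_q = self.post_net(torch.cat([h_ctx, h_resp], -1)).chunk(2, -1)
        mu_p, logvar_p = self.prior_net(h_ctx).chunk(2, -1)
        q = Normal(mu_q, (0.5 * logvar_q).exp())
        p = Normal(mu_p, (0.5 * logvar_p).exp())

        z = q.rsample()                                # reparameterized latent sample
        # If this KL term collapses to ~0 during training, the decoder is
        # ignoring z: the "posterior collapse" problem the abstract describes.
        kl = kl_divergence(q, p).sum(-1).mean()

        # Condition the decoder on z at every step.
        z_seq = z.unsqueeze(1).expand(-1, resp_emb.size(1), -1)
        dec_out, _ = self.decoder(torch.cat([resp_emb, z_seq], -1))
        return self.out(dec_out), kl

# Usage sketch: reconstruction loss on the logits plus the KL term forms the ELBO.
model = GaussianPriorCVAE()
logits, kl = model(torch.randn(2, 10, 256), torch.randn(2, 12, 256))
```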

Cited by 0 publications.
References 29 publications.