Proceedings of the 56th Annual Meeting of the Association for Computational Linguistics (Volume 1: Long Papers) 2018
DOI: 10.18653/v1/p18-1101
Unsupervised Discrete Sentence Representation Learning for Interpretable Neural Dialog Generation

Abstract: The encoder-decoder dialog model is one of the most prominent methods used to build dialog systems in complex domains. Yet it is limited because it cannot output interpretable actions as in traditional systems, which hinders humans from understanding its generation process. We present an unsupervised discrete sentence representation learning method that can integrate with any existing encoder-decoder dialog models for interpretable response generation. Building upon variational autoencoders (VAEs), we present t…
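The abstract describes learning discrete sentence representations with a VAE. A standard way to backpropagate through a discrete latent code is the Gumbel-Softmax relaxation; the sketch below shows the sampling step only, in NumPy, as a minimal illustration (the paper's actual model and hyperparameters are not reproduced here):

```python
import numpy as np

def gumbel_softmax_sample(logits, temperature=1.0, rng=None):
    """Draw a relaxed one-hot sample from the categorical defined by `logits`.

    Low temperatures push the sample toward a one-hot vector (a discrete
    code); higher temperatures give a smoother, more uniform vector.
    """
    rng = np.random.default_rng(0) if rng is None else rng
    # Gumbel(0, 1) noise via the inverse-CDF trick; epsilons guard log(0).
    g = -np.log(-np.log(rng.uniform(size=logits.shape) + 1e-20) + 1e-20)
    y = (logits + g) / temperature
    y = np.exp(y - y.max())  # numerically stable softmax
    return y / y.sum()

sample = gumbel_softmax_sample(np.array([1.0, 2.0, 0.5]), temperature=0.5)
```

Because the output is a proper probability vector, it can stand in for a one-hot latent code during training while remaining differentiable with respect to the logits.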


Cited by 107 publications (83 citation statements) · References 26 publications
“…Inspired by the early template-based generation method [26] and statistical machine translation (SMT) [72], sequence-to-sequence (Seq2seq) models [79,85,89,94] have become the most popular choice for dialog generation. Other frameworks, including the conditional variational autoencoder (CVAE) [13,33,77,80,124,125] and the generative adversarial network (GAN) [40,108], are also applied to dialog generation.…”
Section: Generation-based Methods
confidence: 99%
“…Connection to mutual information: The proposed latent variable model coincides with (Zhao et al., 2017a, 2018), where mutual information is introduced into the optimization, based on the following decomposition result (please see the detailed proof in Appendix A):…”
Section: Mutual Information Regularized iVAE
confidence: 99%
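The decomposition result itself is not reproduced in this excerpt. One commonly cited identity of this kind (sometimes called "ELBO surgery") splits the averaged KL term of a VAE objective into a mutual-information term and an aggregate-posterior term; it is shown here as a sketch and may differ in notation from the decomposition in the cited work's Appendix A:

```latex
\mathbb{E}_{p(x)}\!\left[\mathrm{KL}\!\left(q(z \mid x)\,\|\,p(z)\right)\right]
  = I_{q}(x; z) + \mathrm{KL}\!\left(q(z)\,\|\,p(z)\right),
\qquad q(z) = \mathbb{E}_{p(x)}\!\left[q(z \mid x)\right]
```

Under this view, the KL regularizer penalizes the mutual information $I_{q}(x;z)$ between inputs and latent codes, which motivates adding mutual information back into the objective to prevent the latent code from being ignored.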
“…Datasets. We consider two mainstream datasets used in recent studies (Zhao et al., 2017b, 2018; Fu et al., 2019; Gu et al., 2018): Switchboard (Godfrey and Holliman, 1997) and DailyDialog (Li et al., 2017c). Switchboard contains 2,400 two-way telephone conversations under 70 specified topics.…”
Section: Dialog Response Generation
confidence: 99%
“…Though recent progress in recursive models allows representation learning from tree-structured data, previous studies have pointed out that, in practice, sequence models serve as a simpler yet robust alternative (Li et al., 2015). In this work, we follow the common practice in most conversation modeling research (Ritter et al., 2010; Joty et al., 2011; Zhao et al., 2018) and take a conversation as a sequence of turns. To this end, each conversation tree is flattened into root-to-leaf paths.…”
Section: Model Overview
confidence: 99%
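The flattening step described above can be sketched with a small recursive traversal. The nested-dict tree representation (`turn`/`children` keys) is a hypothetical stand-in; the cited work's actual data structures are not shown in this excerpt:

```python
def root_to_leaf_paths(tree):
    """Flatten a conversation tree into a list of root-to-leaf turn sequences.

    `tree` is assumed to be {"turn": str, "children": [subtrees]}.
    Each returned path is one linear conversation usable by a sequence model.
    """
    if not tree.get("children"):
        return [[tree["turn"]]]  # leaf: a path of length one
    paths = []
    for child in tree["children"]:
        for path in root_to_leaf_paths(child):
            paths.append([tree["turn"]] + path)  # prepend this turn
    return paths

# A reply tree where turn A receives two responses, B and C, and C is
# itself answered by D, yields two linear conversations.
tree = {"turn": "A", "children": [
    {"turn": "B", "children": []},
    {"turn": "C", "children": [{"turn": "D", "children": []}]},
]}
paths = root_to_leaf_paths(tree)  # → [["A", "B"], ["A", "C", "D"]]
```

Each flattened path can then be fed to a turn-level sequence model exactly as an ordinary two-party conversation would be.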