Proceedings of the 16th Annual Meeting of the Special Interest Group on Discourse and Dialogue 2015
DOI: 10.18653/v1/w15-4639

Stochastic Language Generation in Dialogue using Recurrent Neural Networks with Convolutional Sentence Reranking

Abstract: The natural language generation (NLG) component of a spoken dialogue system (SDS) usually needs a substantial amount of handcrafting or a well-labeled dataset to be trained on. These limitations add significantly to development costs and make cross-domain, multi-lingual dialogue systems intractable. Moreover, human languages are context-aware. The most natural response should be directly learned from data rather than depending on predefined syntaxes or rules. This paper presents a statistical language generator…

Cited by 85 publications (94 citation statements); references 38 publications. Citing publications span 2016–2023. Citation statements below are ordered by relevance.
“…To ensure that the output trees/strings correspond semantically to the input DA, we implemented a classifier to rerank the n-best beam search outputs and penalize those missing required information and/or adding irrelevant information. Similarly to Wen et al. (2015a), the classifier provides a binary decision for an output tree/string on the presence of all dialogue act types and slot-value combinations seen in the training data, producing a 1-hot vector.…”
Section: Reranker
confidence: 99%
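The reranking idea quoted above can be illustrated compactly. The sketch below is a hypothetical Python rendering, not either cited paper's actual implementation: it encodes slot-value pairs as an indicator vector and penalizes n-best candidates whose realized slots diverge from the input dialogue act. The names `slot_vector`, `rerank`, and `extract_slots` are illustrative assumptions.

```python
# Hypothetical sketch of DA-coverage reranking: penalize n-best candidates
# whose realized slot-value pairs diverge from the input dialogue act.
from typing import Callable, Dict, List, Tuple

def slot_vector(slots: Dict[str, str], vocab: List[Tuple[str, str]]) -> List[int]:
    """Indicator (1-hot style) vector over all slot-value pairs seen in training."""
    present = set(slots.items())
    return [1 if sv in present else 0 for sv in vocab]

def rerank(candidates: List[Tuple[float, str]],
           input_da_slots: Dict[str, str],
           extract_slots: Callable[[str], Dict[str, str]],
           vocab: List[Tuple[str, str]],
           penalty: float = 1.0) -> List[Tuple[float, str]]:
    """Re-score (log_prob, text) candidates by subtracting a penalty proportional
    to the Hamming distance between the candidate's slot vector and the input DA's."""
    target = slot_vector(input_da_slots, vocab)
    rescored = []
    for log_prob, text in candidates:
        realized = slot_vector(extract_slots(text), vocab)
        hamming = sum(a != b for a, b in zip(target, realized))  # missing + spurious slots
        rescored.append((log_prob - penalty * hamming, text))
    return sorted(rescored, key=lambda c: c[0], reverse=True)
```

Subtracting the coverage penalty from the beam-search log-probability keeps the generator's own score as the tie-breaker among semantically faithful candidates.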
“…Both of them learn entirely from data and thus have no limitations on domain transfer. Recently, given the power of deep neural networks to learn from large-scale data, [16] proposed a statistical dialogue generator based on a joint recurrent and convolutional neural network, which can learn directly from data without any semantic alignment or handcrafted rules. Further, [15] and [14] proposed a semantically conditioned LSTM to generate dialogue responses and then compared it with an RNN encoder-decoder generator on multi-domain data to verify the domain-adaptation ability of the two generators.…”
Section: Task-oriented Dialogue Generation
confidence: 99%
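The semantically conditioned LSTM mentioned in this excerpt augments a standard LSTM cell with a dialogue-act vector that is gradually consumed as slots get realized. The NumPy sketch below is a rough rendering of that idea under assumed shapes and weight names (`W`, `U`, `W_r`, `U_r`, `W_d` are illustrative), not the cited authors' exact formulation.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sc_lstm_step(x, h_prev, c_prev, d_prev, W, U, W_r, U_r, W_d):
    """One step of an SC-LSTM-style cell: standard LSTM gates plus a reading
    gate r that gradually consumes the dialogue-act vector d."""
    H = h_prev.shape[0]
    z = W @ x + U @ h_prev                      # stacked gate pre-activations, shape (4H,)
    i, f, o = (sigmoid(z[k * H:(k + 1) * H]) for k in range(3))
    g = np.tanh(z[3 * H:4 * H])                 # candidate cell input
    r = sigmoid(W_r @ x + U_r @ h_prev)         # reading gate over the DA vector
    d = r * d_prev                              # already-realized slots decay away
    c = f * c_prev + i * g + np.tanh(W_d @ d)   # DA signal injected into the cell
    h = o * np.tanh(c)
    return h, c, d
```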
“…Bahdanau et al. [1] introduced the attention mechanism into the framework; their system, known as RNNsearch, jointly learns alignment and translation and significantly improves translation quality. This framework has also been used in natural language dialogue [13, 17, 12, 19, 18], where an end-to-end neural dialogue model is trained on a large amount of conversation data. Although promising, neural dialogue models still have problems and limitations, e.g., the lack of a mechanism to incorporate knowledge.…”
Section: Related Work
confidence: 99%
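Since this excerpt turns on Bahdanau-style attention, a compact sketch may help: the decoder state queries the encoder states through a small feed-forward scorer, and the context vector is their softmax-weighted sum. Dimensions and weight names below are assumptions for illustration.

```python
import numpy as np

def additive_attention(dec_state, enc_states, W_q, W_k, v):
    """Additive (Bahdanau-style) attention.
    dec_state: (H,); enc_states: (T, H); W_q, W_k: (A, H); v: (A,).
    Returns the context vector (H,) and attention weights (T,)."""
    scores = np.tanh(enc_states @ W_k.T + dec_state @ W_q.T) @ v  # e_t = v^T tanh(W_k h_t + W_q s)
    weights = np.exp(scores - scores.max())
    weights /= weights.sum()                                      # softmax over T source positions
    context = weights @ enc_states                                # weighted sum of encoder states
    return context, weights
```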