Proceedings of the 17th Annual Meeting of the Special Interest Group on Discourse and Dialogue 2016
DOI: 10.18653/v1/w16-3622

A Context-aware Natural Language Generator for Dialogue Systems

Abstract: We present a novel natural language generation system for spoken dialogue systems capable of entraining (adapting) to users' way of speaking, providing contextually appropriate responses. The generator is based on recurrent neural networks and the sequence-to-sequence approach. It is fully trainable from data which include preceding context along with responses to be generated. We show that the context-aware generator yields significant improvements over the baseline in both automatic metrics and a human pairwise preference test.
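The system described in the abstract conditions generation on the preceding user utterance so that the decoder can reuse (entrain to) the user's wording. As a rough illustration of that input scheme, here is a minimal sketch of how a context-prefixed encoder input could be assembled; the function names, the <CTX> separator token, and the flat token layout are assumptions for illustration, not TGen's actual API.

```python
# Minimal sketch: build a seq2seq encoder input that prepends the preceding
# user utterance to the linearized dialogue act, so a context-aware generator
# can entrain to the user's wording. All names here are illustrative.

def linearize_da(da_type, slots):
    """Flatten a dialogue act such as iconfirm(alternative=next)
    into a token sequence: ['iconfirm', 'alternative', 'next']."""
    tokens = [da_type]
    for slot, value in slots:
        tokens.extend([slot, value])
    return tokens

def build_encoder_input(context_utterance, da_type, slots, sep="<CTX>"):
    """Prepend the preceding user utterance, separated by a special token."""
    return context_utterance.lower().split() + [sep] + linearize_da(da_type, slots)

if __name__ == "__main__":
    print(build_encoder_input(
        "is there a later option",             # preceding user utterance
        "iconfirm", [("alternative", "next")]  # system dialogue act to realize
    ))
    # ['is', 'there', 'a', 'later', 'option', '<CTX>',
    #  'iconfirm', 'alternative', 'next']
```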

Cited by 56 publications (71 citation statements) | References 31 publications
“…An example MR-reference pair is shown in Figure 1; Table 1 lists all the attributes in our domain. In contrast to previous work (Mairesse et al., 2010; Wen et al., 2015a; Dušek and Jurčíček, 2016), we use different modalities of meaning representation for data collection: textual/logical and pictorial MRs. The textual/logical MRs (see Figure 1) take the form of a sequence with attribute-value pairs provided in a random order.…”
Section: The E2E NLG Dataset, 2.1 Data Collection Procedures (mentioning)
confidence: 99%
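Since the snippet above describes textual/logical MRs as attribute-value pairs serialized in a random order, a small sketch may help make that concrete. The attr[value] surface form follows the released E2E data; the helper function itself is hypothetical.

```python
import random

# Sketch: render an E2E-style meaning representation (MR) as a flat textual
# sequence of attribute-value pairs in random order, as described above.

def textual_mr(attributes, seed=None):
    pairs = list(attributes.items())
    random.Random(seed).shuffle(pairs)  # attribute order is randomized
    return ", ".join(f"{attr}[{value}]" for attr, value in pairs)

print(textual_mr({
    "name": "The Eagle",
    "eatType": "coffee shop",
    "priceRange": "moderate",
    "area": "riverside",
}, seed=1))
# e.g. "name[The Eagle], area[riverside], priceRange[moderate], eatType[coffee shop]"
```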
“…These models are completely or partially rule-based and focus on specific aspects of entrainment. Recent work aims at introducing entrainment into a fully trainable natural language generation system by exploiting the preceding user utterance (Dušek and Jurčíček, 2016).…”
Section: Related Work (mentioning)
confidence: 99%
“…Figure 1 summarizes the model architecture. This model builds on the open-source sequence-to-sequence (seq2seq) TGen system [11], which is implemented in TensorFlow [12]. The system is based on the seq2seq generation method with attention [14,15], and uses a sequence of LSTMs [16] for the encoder and decoder, combined with beam search and an n-best list reranker for output tuning.…”
Section: Data and Models (mentioning)
confidence: 99%
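The decoding scheme this snippet mentions, beam search followed by n-best reranking, can be illustrated independently of the neural model. Below is a minimal, self-contained beam-search sketch; `step_log_probs` is a toy stand-in for an LSTM decoder step, and the whole function is an illustration under those assumptions rather than TGen's implementation.

```python
import math

# Minimal beam search: keep the k best partial output sequences per step,
# collect finished hypotheses, and return them as an n-best list that a
# reranker could then reorder.

def beam_search(step_log_probs, start, end, beam_size=3, max_len=10):
    beams = [([start], 0.0)]                      # (token sequence, log prob)
    finished = []
    for _ in range(max_len):
        candidates = []
        for seq, score in beams:
            for tok, lp in step_log_probs(seq).items():
                cand = (seq + [tok], score + lp)
                (finished if tok == end else candidates).append(cand)
        beams = sorted(candidates, key=lambda c: -c[1])[:beam_size]
        if not beams:
            break
    return sorted(finished, key=lambda c: -c[1])  # n-best list for reranking

# Toy distribution: always prefers 'next', with '</s>' growing more likely
# as the sequence lengthens.
def toy_step(seq):
    p_end = min(0.9, 0.2 * len(seq))
    return {"next": math.log(1 - p_end), "</s>": math.log(max(p_end, 1e-9))}

print(beam_search(toy_step, "<s>", "</s>")[0])
# (['<s>', 'next', '</s>'], ...) -- the highest-scoring finished hypothesis
```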
“…The inputs to the model are dialog acts for each system action (such as inform) and a set of attribute slots (such as rating) and their values (such as high for the attribute rating). We refer the reader to the TGen publications [11,13] for model details. […] We encode personality as an additional dialog act, of type CONVERT, with personality as the key and the target personality as the value (see Figure 1).…”
Section: Data and Models (mentioning)
confidence: 99%
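To make the input encoding in this snippet concrete: the target personality is added as one more dialogue act of type CONVERT, placed alongside the content acts. The flat (act, slot, value) triple layout below is an assumption for illustration only; see the TGen publications cited above for the actual input format.

```python
# Sketch: encode the target personality as an extra CONVERT dialogue act
# prepended to the content dialogue acts, as described in the snippet above.
# The triple layout and example values are hypothetical.

def encode_input(personality, dialog_acts):
    """dialog_acts: list of (act_type, slot, value) triples."""
    acts = [("CONVERT", "personality", personality)] + list(dialog_acts)
    return [tok for act in acts for tok in act]

tokens = encode_input("EXTRAVERT", [
    ("inform", "name", "Fitzbillies"),
    ("inform", "rating", "high"),
])
print(tokens)
# ['CONVERT', 'personality', 'EXTRAVERT',
#  'inform', 'name', 'Fitzbillies', 'inform', 'rating', 'high']
```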