Proceedings of the 2015 Conference on Empirical Methods in Natural Language Processing
DOI: 10.18653/v1/d15-1199

Semantically Conditioned LSTM-based Natural Language Generation for Spoken Dialogue Systems

Abstract: Natural language generation (NLG) is a critical component of spoken dialogue and it has a significant impact both on usability and perceived quality. Most NLG systems in common use employ rules and heuristics and tend to generate rigid and stylised responses without the natural variation of human language. They are also not easily scaled to systems covering multiple domains and languages. This paper presents a statistical language generator based on a semantically controlled Long Short-term Memory (LSTM) struc…
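The semantically controlled LSTM described in the abstract augments a standard LSTM cell with a dialogue-act (DA) vector: a sigmoid "reading gate" consumes slot-value features of the DA as the sentence is generated, and the remaining DA vector feeds back into the cell. The following is a minimal pure-Python sketch of that reading-gate update only; all sizes, the random weights, and the function names are illustrative assumptions, not the paper's trained model.

```python
import math
import random

random.seed(0)

DA, H, X = 4, 8, 8  # toy sizes: dialogue-act slots, hidden units, input units


def rand_matrix(rows, cols):
    # Stand-in for learned parameters (illustrative only).
    return [[random.uniform(-0.1, 0.1) for _ in range(cols)] for _ in range(rows)]


W_r = rand_matrix(DA, X + H)  # reading-gate weights
W_d = rand_matrix(H, DA)      # DA-to-cell projection


def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))


def matvec(W, v):
    return [sum(w * x for w, x in zip(row, v)) for row in W]


def sc_reading_gate_step(d_prev, x_t, h_prev):
    """One reading-gate update: r_t = sigmoid(W_r [x_t; h_prev]),
    d_t = r_t * d_prev (elementwise), plus the extra cell input tanh(W_d d_t)."""
    r_t = [sigmoid(z) for z in matvec(W_r, x_t + h_prev)]
    d_t = [r * d for r, d in zip(r_t, d_prev)]
    cell_extra = [math.tanh(z) for z in matvec(W_d, d_t)]
    return d_t, cell_extra


d0 = [1.0] * DA                              # all DA slots still unexpressed
x0 = [random.uniform(-1.0, 1.0) for _ in range(X)]
h0 = [0.0] * H
d1, extra = sc_reading_gate_step(d0, x0, h0)
```

Because the gate is a sigmoid in (0, 1), each slot of the DA vector can only decay over time, which is how the model keeps track of which semantic content has already been realised.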

Cited by 672 publications (383 citation statements)
References 41 publications
“…Socher et al [62] show that sentiment classification can be learned with deep learning models by using word embeddings as an input layer to a recursive neural network (RNN). In the field of natural language generation, deep learning has also been shown to be successful through the modelling of semantic constraints [63]. Ma et al [64] propose a deep learning approach for sentence embedding, and show that their method can achieve state-of-the-art results for many sentence classification problems such as sentiment classification and question classification.…”
Section: Future Trends
confidence: 99%
“…However, in the case that no response in the database could adequately respond to a given utterance, this approach will fail. Response generation [15]-[17], which has the ability to generate new responses, is arguably more robust in handling user input than the retrieval-based approach; however, it sometimes generates unnatural responses that are incomprehensible to the user [9]. There have been a number of works on response generation for data-driven dialog systems.…”
Section: Related Work
confidence: 99%
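The failure mode the quoted passage describes — a retrieval-based system having no adequate stored response — can be sketched with a toy word-overlap retriever. The database entries, the `overlap` scorer, and the 0.5 threshold are all illustrative assumptions, not taken from any cited system.

```python
# Toy retrieval-based responder: picks the stored response whose trigger
# overlaps most with the user utterance, and fails (returns None) when no
# candidate is similar enough.
DATABASE = {
    "what time do you open": "We open at 9am every day.",
    "where are you located": "We are at 12 Main Street.",
}


def overlap(a, b):
    # Jaccard overlap of word sets (a crude similarity measure).
    wa, wb = set(a.split()), set(b.split())
    return len(wa & wb) / max(len(wa | wb), 1)


def retrieve(utterance, threshold=0.5):
    best, score = None, 0.0
    for trigger, response in DATABASE.items():
        s = overlap(utterance.lower(), trigger)
        if s > score:
            best, score = response, s
    return best if score >= threshold else None


print(retrieve("what time do you open"))    # matches a trigger -> stored response
print(retrieve("recommend a cheap hotel"))  # no close match -> None
```

A generation-based system would instead produce a novel response for the second query, which is the robustness advantage (and the fluency risk) the passage contrasts.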
“…Sordoni et al [15] employ an RNN architecture to generate responses from a social media corpus, and Vinyals et al [16] present a long short-term memory (LSTM) neural network encoder-decoder to generate dialog responses using movie subtitles or IT support line chats. More recently, Wen et al [17] demonstrate a more advanced LSTM that is able to control a response semantically by conditioning on dialogue act features.…”
Section: Related Work
confidence: 99%
“…Natural Language Generation (NLG) is a challenging problem that has drawn a lot of attention in the Natural Language Processing (NLP) community [15,17]. NLG is crucial for multiple NLP applications and problems, such as dialogue systems [20], text summarization [4], and text paraphrasing [7]. Poem generation is an instance of NLG that is particularly fascinating for its peculiar features.…”
Section: Introduction
confidence: 99%