Proceedings of the 58th Annual Meeting of the Association for Computational Linguistics 2020
DOI: 10.18653/v1/2020.acl-main.163

A Generative Model for Joint Natural Language Understanding and Generation

Abstract: Natural language understanding (NLU) and natural language generation (NLG) are two fundamental and related tasks in building task-oriented dialogue systems, with opposite objectives: NLU tackles the transformation from natural language to formal representations, whereas NLG does the reverse. A key to success in either task is parallel training data, which is expensive to obtain at a large scale. In this work, we propose a generative model which couples NLU and NLG through a shared latent variable. This approach …
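The abstract's central idea, a single latent variable shared between the NLU and NLG directions, can be sketched as follows. This is a minimal illustrative sketch, not the authors' implementation: the class name, layer choices, and the simplified one-shot decoders are all assumptions.

```python
# A minimal sketch, not the authors' code: one latent variable z is shared
# between the NLU direction (text -> meaning representation) and the NLG
# direction (meaning representation -> text). Names, sizes, and the
# simplified decoders are assumptions for illustration.
import torch
import torch.nn as nn

class JointNLUNLG(nn.Module):
    def __init__(self, vocab_size=1000, mr_size=32, hidden=256, latent=64):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, hidden)
        self.text_enc = nn.GRU(hidden, hidden, batch_first=True)
        self.mr_enc = nn.Linear(mr_size, hidden)
        # Both encoders parameterize the *same* Gaussian latent space.
        self.to_mu = nn.Linear(hidden, latent)
        self.to_logvar = nn.Linear(hidden, latent)
        # One decoder per direction, both conditioned on the shared z.
        self.mr_dec = nn.Linear(latent, mr_size)       # NLU head: z -> MR slots
        self.text_dec = nn.Linear(latent, vocab_size)  # NLG head: z -> bag-of-words logits

    def posterior_from_text(self, tokens):             # q(z | text)
        _, h = self.text_enc(self.embed(tokens))
        h = h.squeeze(0)
        return self.to_mu(h), self.to_logvar(h)

    def posterior_from_mr(self, mr):                   # q(z | MR)
        h = torch.tanh(self.mr_enc(mr))
        return self.to_mu(h), self.to_logvar(h)

    def sample(self, mu, logvar):
        # Reparameterization trick: z = mu + sigma * eps.
        return mu + torch.randn_like(mu) * torch.exp(0.5 * logvar)

    def forward(self, tokens):
        mu, logvar = self.posterior_from_text(tokens)
        z = self.sample(mu, logvar)
        return self.mr_dec(z), self.text_dec(z), mu, logvar
```

Because both directions read and write the same latent space, supervision on either task shapes the representation used by the other, which is what makes the semi-supervised setting described below possible.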

Cited by 22 publications (35 citation statements). References 33 publications (41 reference statements).
“…Task-Oriented Dialogue Models: Most task-oriented dialogue systems break down the task into three components: belief tracking (Henderson et al., 2013; Mrkšić et al., 2016; Rastogi et al., 2017; Nouri and Hosseini-Asl, 2018; Wu et al., 2019a; Zhou and Small, 2019; Heck et al., 2020), dialogue act prediction (Wen et al., 2017a; Tanaka et al., 2019), and response generation (Budzianowski et al., 2018; Lippe et al., 2020). Traditionally, a modular approach is adopted, where these components are optimized independently (i.e., a pipeline design) or learned via multi-task learning (i.e., some parameters are shared among the components) (Wen et al., 2017b; Zhao et al., 2019; Mehri et al., 2019; Tseng et al., 2020). However, it is known that improvements in one component do not necessarily lead to overall performance improvements (Ham et al., 2020), and the modular approach suffers from error propagation in practice (Liu and Lane, 2018).…”
Section: Related Work
confidence: 99%
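The modular decomposition this passage describes can be pictured as a three-stage pipeline. The toy skeleton below (all function names and rules are illustrative assumptions) also shows why such designs suffer from error propagation: a mistake in the belief tracker flows unchecked into act prediction and response generation.

```python
# Illustrative skeleton of the modular (pipeline) design discussed above.
# The stage interfaces and toy rules are assumptions for exposition only.
from typing import Dict, List

def belief_tracker(user_utterance: str, state: Dict[str, str]) -> Dict[str, str]:
    """Update the dialogue belief state from the latest user turn."""
    if "cheap" in user_utterance.lower():
        state["price"] = "cheap"  # toy rule standing in for a learned tracker
    return state

def act_predictor(state: Dict[str, str]) -> List[str]:
    """Choose system dialogue acts from the belief state."""
    if "price" in state:
        return [f"inform(price={state['price']})"]
    return ["request(price)"]

def response_generator(acts: List[str]) -> str:
    """Realize dialogue acts as a natural-language response."""
    if acts and acts[0].startswith("inform"):
        return "Sure, here are some cheap options."
    return "What price range are you looking for?"

# An error in belief_tracker propagates unchecked through both later
# stages: the error-propagation problem the passage attributes to pipelines.
state = belief_tracker("I want something cheap", {})
print(response_generator(act_predictor(state)))
```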
“…Learning with Semi-Supervision. Work on semi-supervised learning considers settings with some labeled data and a much larger set of unlabeled data, and then leverages both labeled and unlabeled data, as in machine translation (Artetxe et al., 2017; Lample et al., 2017), data-to-text generation (Schmitt and Schütze, 2019; Qader et al., 2019), or, more relevantly, the joint learning framework for training NLU and NLG (Tseng et al., 2020). Nonetheless, these approaches all assume that a large collection of text is available, which is an unrealistic assumption for the task due to the need for expert curation.…”
Section: Related Work
confidence: 99%
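A hedged sketch of the kind of semi-supervised objective this passage refers to, reusing the illustrative JointNLUNLG model from above: a supervised term on the scarce (text, MR) pairs plus an unsupervised autoencoding term on unlabeled text. The bag-of-words reconstruction targets and the alpha weighting are assumptions, not the published method.

```python
# A hedged sketch only: supervised loss on paired (text, MR) data plus an
# unsupervised autoencoding loss on unlabeled text through the shared latent.
# The bag-of-words targets and the alpha weighting are assumptions.
import torch
import torch.nn.functional as F

def semi_supervised_loss(model, text, mr, unlabeled_text, alpha=0.5):
    # Supervised direction: the latent inferred from text must predict the gold MR.
    mr_logits, _, _, _ = model(text)
    sup = F.binary_cross_entropy_with_logits(mr_logits, mr)

    # Unsupervised direction: reconstruct unlabeled text through the latent.
    _, text_logits, mu, logvar = model(unlabeled_text)
    vocab = text_logits.size(-1)
    bow = F.one_hot(unlabeled_text, vocab).float().sum(dim=1).clamp(max=1.0)
    unsup = F.binary_cross_entropy_with_logits(text_logits, bow)

    # KL regularizer keeps the shared latent space well-formed.
    kl = -0.5 * torch.mean(1 + logvar - mu.pow(2) - logvar.exp())
    return sup + alpha * (unsup + kl)
```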
“…Natural language generation (NLG) is the task of transforming meaning representations (MRs) into natural language descriptions (Reiter and Dale, 2000; Barzilay and Lapata, 2005), while natural language understanding (NLU) is the opposite process, in which text is converted into an MR (Zhang and Wang, 2016). The two processes can thus constrain each other: recent exploration of the duality of neural NLG and NLU has led to successful semi-supervised learning techniques in which both labeled and unlabeled data can be used for training (Tseng et al., 2020; Schmitt and Schütze, 2019; Qader et al., 2019).…”
Section: Introduction
confidence: 99%
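The duality constraint mentioned here can be made concrete with a toy example: composing NLG with NLU should approximate the identity on MRs, which yields a training signal that requires no labels. The functions below are deliberately trivial stand-ins for learned models.

```python
# Toy illustration of the NLG/NLU duality: NLU applied after NLG should
# recover the original MR, a label-free consistency signal. These keyword
# rules are deliberately trivial stand-ins for learned models.
def nlg(mr: dict) -> str:
    """Toy MR -> text realization."""
    return f"a {mr['price']} restaurant serving {mr['food']} food"

def nlu(text: str) -> dict:
    """Toy text -> MR parser via keyword spotting."""
    slots = {"price": ["cheap", "expensive"], "food": ["thai", "italian"]}
    return {slot: v for slot, values in slots.items() for v in values if v in text}

mr = {"price": "cheap", "food": "thai"}
assert nlu(nlg(mr)) == mr  # the duality constraint: NLU after NLG is roughly identity
```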