Proceedings of the 57th Annual Meeting of the Association for Computational Linguistics 2019
DOI: 10.18653/v1/p19-1373

Pretraining Methods for Dialog Context Representation Learning

Abstract: This paper examines various unsupervised pretraining objectives for learning dialog context representations. Two novel methods of pretraining dialog context encoders are proposed, and a total of four methods are examined. Each pretraining objective is fine-tuned and evaluated on a set of downstream dialog tasks using the MultiWOZ dataset, and strong performance improvements are observed. Further evaluation shows that our pretraining objectives result in not only better performance, but also better convergence, mo…
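To make the abstract's setup concrete: one family of unsupervised objectives for dialog context encoders is next-utterance retrieval, where the encoder is trained to score the true next utterance above negatives drawn from the same batch. The snippet below is a minimal, illustrative sketch of such an objective in PyTorch; every module name, dimension, and the toy data are assumptions for demonstration, not the authors' implementation.

```python
# Illustrative next-utterance-retrieval pretraining sketch (assumed setup,
# not the paper's code): a context encoder and a response encoder are
# trained so each context ranks its true next utterance above in-batch
# negatives.
import torch
import torch.nn as nn
import torch.nn.functional as F


class ContextEncoder(nn.Module):
    """Encodes a (flattened) token sequence into a single vector."""

    def __init__(self, vocab_size: int, emb_dim: int = 64, hid_dim: int = 128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.gru = nn.GRU(emb_dim, hid_dim, batch_first=True)

    def forward(self, tokens: torch.Tensor) -> torch.Tensor:
        _, h = self.gru(self.embed(tokens))  # h: (1, batch, hid_dim)
        return h.squeeze(0)                  # (batch, hid_dim)


def retrieval_loss(ctx_vec: torch.Tensor, resp_vec: torch.Tensor) -> torch.Tensor:
    """In-batch negatives: the i-th response is the positive for the i-th
    context; all other responses in the batch serve as negatives."""
    scores = ctx_vec @ resp_vec.t()          # (batch, batch) score matrix
    targets = torch.arange(scores.size(0))   # diagonal entries are positives
    return F.cross_entropy(scores, targets)


# Toy usage: random token ids stand in for tokenized dialog data.
vocab, batch, ctx_len, resp_len = 1000, 8, 20, 10
ctx_enc, resp_enc = ContextEncoder(vocab), ContextEncoder(vocab)
ctx = torch.randint(1, vocab, (batch, ctx_len))
resp = torch.randint(1, vocab, (batch, resp_len))
loss = retrieval_loss(ctx_enc(ctx), resp_enc(resp))
loss.backward()  # gradients flow into both encoders during pretraining
```

In-batch negatives keep this kind of pretraining cheap: each batch of (context, next utterance) pairs supplies batch − 1 negatives per example for free, with no extra sampling pass.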

Cited by 63 publications (72 citation statements). References 27 publications.
“…When it comes to dialog systems, response selection provides a more suitable pretraining task for learning representations that can encapsulate conversational cues. Such models can be pretrained using large corpora of natural unlabelled conversational data (Henderson et al., 2019b; Mehri et al., 2019). Response selection is also directly applicable to retrieval-based dialog systems, a popular and elegant approach to framing dialog (Wu et al., 2017; Weston et al., 2018; Mazaré et al., 2018; Gunasekara et al., 2019; Henderson et al., 2019b).…”
Section: Introduction
confidence: 99%
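The quoted passage frames response selection as a ranking problem that transfers directly to retrieval-based systems. As a hedged sketch of that inference-time use, the snippet below pre-encodes a candidate pool and ranks it against a context vector by dot product; the pool size, dimensionality, and random vectors are stand-ins, not any cited system's setup.

```python
# Hedged sketch of inference-time retrieval with a response-selection model.
# A real system would pre-encode actual candidate responses with the
# pretrained response encoder; random vectors are used here so the
# snippet is self-contained.
import torch

num_candidates, hid = 10_000, 128
cand_vecs = torch.randn(num_candidates, hid)  # pre-encoded candidate pool


def rank_candidates(ctx_vec: torch.Tensor, top_k: int = 5) -> torch.Tensor:
    """Return indices of the top_k candidates by dot-product relevance."""
    scores = cand_vecs @ ctx_vec  # (num_candidates,)
    return torch.topk(scores, top_k).indices


top = rank_candidates(torch.randn(hid))  # ctx_vec would come from the context encoder
```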
“…This idea can also be applied to task-oriented dialog systems to transfer general natural language knowledge from large-scale corpora to a specific dialog task. Some early studies have shown the possibility of using pre-training models to model task-oriented dialogs [46, 99, 100, 130, 131].…”
Section: Discussion and Future Trends
confidence: 99%
“…In recent years, pretraining models have had a huge impact on the fields of natural language understanding and natural language generation. Mehri et al. [58] studied sentence representations based on pretraining models and employed the pretrained representations in a dialogue generation task. They showed that pretraining models can be utilized in open-domain dialogue generation.…”
Section: Pretraining-Model-Based Methods
confidence: 99%
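To make the quoted idea concrete, the sketch below seeds a generative decoder with a pretrained context representation, so the encoder learned during pretraining conditions downstream utterance generation. This is an assumption-laden illustration (names, sizes, and toy inputs are invented), not the setup of Mehri et al. [58].

```python
# Illustrative sketch (assumed, not the cited authors' code): a pretrained
# context vector becomes the initial hidden state of a GRU decoder that
# generates the next utterance token by token.
import torch
import torch.nn as nn


class Decoder(nn.Module):
    def __init__(self, vocab_size: int, emb_dim: int = 64, hid_dim: int = 128):
        super().__init__()
        self.embed = nn.Embedding(vocab_size, emb_dim, padding_idx=0)
        self.gru = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, vocab_size)

    def forward(self, ctx_vec: torch.Tensor, tgt_tokens: torch.Tensor) -> torch.Tensor:
        h0 = ctx_vec.unsqueeze(0)                      # (1, batch, hid_dim)
        out, _ = self.gru(self.embed(tgt_tokens), h0)  # teacher-forced decoding
        return self.out(out)                           # (batch, len, vocab)


vocab, batch, hid = 1000, 4, 128
dec = Decoder(vocab)
# ctx_vec would come from a pretrained context encoder; random here.
logits = dec(torch.randn(batch, hid), torch.randint(1, vocab, (batch, 12)))
```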