Proceedings of the 59th Annual Meeting of the Association for Computational Linguistics and the 11th International Joint Conference on Natural Language Processing, 2021
DOI: 10.18653/v1/2021.acl-short.111

Preview, Attend and Review: Schema-Aware Curriculum Learning for Multi-Domain Dialogue State Tracking

Abstract: Existing dialog state tracking (DST) models are trained with dialog data in a random order, neglecting rich structural information in a dataset. In this paper, we propose to use curriculum learning (CL) to better leverage both the curriculum structure and schema structure for task-oriented dialogs. Specifically, we propose a model-agnostic framework called Schema-aware Curriculum Learning for Dialog State Tracking (SaCLog), which consists of a preview module that pre-trains a DST model with schema information,…
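The abstract describes a schema-aware curriculum: pre-train a DST model on schema information (the "preview" step), then present training dialogs in an easy-to-hard order rather than at random. The paper's actual modules and difficulty estimator are not reproduced here; the sketch below is only a minimal illustration of that general idea, and the `DSTModel` interface (`pretrain_on_schema`, `train_step`), the `DialogExample` fields, and the length/domain-count difficulty score are all hypothetical stand-ins, not SaCLog's implementation.

```python
# Minimal sketch of schema-aware curriculum training for DST.
# All class/method names are illustrative assumptions, not the authors' code.
from dataclasses import dataclass
from typing import List


@dataclass
class DialogExample:
    turns: List[str]      # user/system utterances
    target_state: dict    # gold slot-value pairs
    num_domains: int      # crude difficulty proxy


def difficulty(example: DialogExample) -> float:
    """Score a dialog: longer, multi-domain dialogs are treated as harder."""
    return len(example.turns) + 2.0 * example.num_domains


def curriculum_train(model, schema: dict, data: List[DialogExample], epochs: int = 3):
    # "Preview": expose the model to slot names/descriptions from the schema
    # before it sees any dialogs (stand-in for schema-aware pre-training).
    model.pretrain_on_schema(schema)

    # "Attend": train on dialogs ordered from easy to hard, widening the
    # pool of admitted examples each epoch (a simple curriculum schedule).
    ordered = sorted(data, key=difficulty)
    for epoch in range(1, epochs + 1):
        cutoff = int(len(ordered) * epoch / epochs)
        for example in ordered[:cutoff]:
            model.train_step(example)

    # "Review": a final pass over the full dataset.
    for example in ordered:
        model.train_step(example)
```

The difficulty score here is deliberately simplistic (turn count plus domain count); any model-based or loss-based difficulty estimate could be substituted without changing the schedule's structure.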

Cited by 21 publications (12 citation statements)
References 30 publications

Citation statements:
“…Another benefit of using pretrained GPT2 is faster training time as we observed the VDTN+GPT2 converged much earlier than training it from scratch. From these observations, we are excited to see more future extension of SOTA unimodal DST models (Lin et al, 2021;Dai et al, 2021) and large pretrained LMs (Brown et al, 2020;Raffel et al, 2020), especially ones with multimodal learning such as (Lu et al, 2019;, to MM-DST task.…”
Section: Results
confidence: 99%
“…Traditional dialogue systems [17,33] usually consist of three components: natural language understanding (NLU) [28,30,58,59], dialogue management (DM) [6,7,18], and natural language generation (NLG) [50,63,65,66] modules. Empirically, NLU plays the most important role in task-oriented dialogue systems, including tasks such as intent detection [12,13,29,57], slot filling [61], and semantic parsing [19?…”
Section: Related Work
confidence: 99%
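The related-work excerpt above refers to the classic modular pipeline for task-oriented dialogue: natural language understanding, dialogue management, and natural language generation. As a rough illustration only, the sketch below wires three placeholder functions into a single turn of such a pipeline; the function names, slot keys, and templates are hypothetical and are not taken from any of the systems cited in that excerpt.

```python
# Minimal sketch of the NLU -> dialogue management -> NLG pipeline.
# Function bodies are placeholder illustrations only.
from typing import Dict


def nlu(utterance: str) -> Dict[str, str]:
    """NLU: map text to an intent and slot values (hard-coded here)."""
    return {"intent": "book_restaurant", "party_size": "2"}


def dialogue_manager(frame: Dict[str, str], state: Dict[str, str]) -> str:
    """DM: fold the new frame into the tracked state and pick a system action."""
    state.update({k: v for k, v in frame.items() if k != "intent"})
    return "request_time" if "time" not in state else "confirm_booking"


def nlg(action: str) -> str:
    """NLG: render the chosen action as a surface-form reply."""
    templates = {
        "request_time": "What time would you like to book?",
        "confirm_booking": "Your table is booked.",
    }
    return templates[action]


# One turn through the pipeline.
state: Dict[str, str] = {}
reply = nlg(dialogue_manager(nlu("book a table for two"), state))
```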
“…Broadly, multi-domain learning (Joshi et al, 2012) includes training and evaluation using data from multiple domains (e.g. Dredze and Crammer (2008); Qin et al (2020); Dai et al (2021b)). Sometimes, it is assumed that the input texts are accompanied by a domain label (e.g.…”
Section: Introduction
confidence: 99%