2018
DOI: 10.48550/arxiv.1810.00278
Preprint
MultiWOZ -- A Large-Scale Multi-Domain Wizard-of-Oz Dataset for Task-Oriented Dialogue Modelling

Cited by 82 publications (168 citation statements)
References 0 publications
“…the performance of different methods on the MultiWOZ dataset [12]. Consistent with intuition, the combination (i.e., PPO-OFF-Comb) of global and local rewards achieves the highest performance in both match rate (9% improvement) and success rate (8% improvement).…”
supporting
confidence: 71%
“…All models are evaluated on MultiWOZ [12], a multi-domain, multi-intent task-oriented dialog corpus that contains 7 domains, 13 intents, 25 different slots, and 10,483 dialog sessions. We report the average number of dialog turns, averaged over successful dialog sessions and over all dialog sessions respectively, to measure the efficiency of accomplishing a task.…”
Section: Dataset and Evaluation Metric
mentioning
confidence: 99%
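The turn-efficiency metric described in the excerpt above can be sketched as follows. This is a minimal illustration, assuming a hypothetical session record with a turn count and a success flag; the actual evaluation code of the cited work is not shown here.

```python
def average_turns(sessions):
    """Average dialog turns over successful sessions and over all sessions.

    Each session is assumed (hypothetically) to be a dict with:
      'turns'   -- number of dialog turns (int)
      'success' -- whether the task was accomplished (bool)
    """
    all_turns = [s["turns"] for s in sessions]
    ok_turns = [s["turns"] for s in sessions if s["success"]]
    avg_all = sum(all_turns) / len(all_turns)
    # Guard against the edge case of no successful sessions.
    avg_ok = sum(ok_turns) / len(ok_turns) if ok_turns else float("nan")
    return avg_ok, avg_all

# Toy example: two successful sessions (6 and 8 turns), one failed (10 turns).
sessions = [
    {"turns": 6, "success": True},
    {"turns": 10, "success": False},
    {"turns": 8, "success": True},
]
avg_ok, avg_all = average_turns(sessions)  # (7.0, 8.0)
```

A lower average over successful sessions indicates that completed tasks were accomplished in fewer turns, which is the efficiency notion the excerpt reports.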
“…Generation-based methods in dialogue systems are typically based on sequence-to-sequence modeling. These models are usually trained on a hand-labeled corpus of task-oriented dialogue [3]. Our proposed approach shares elements of both of these methods: it generates questions using a sequence-to-sequence model and stores them in a collection that can be queried using retrieval-based methods.…”
Section: Question Generation
mentioning
confidence: 99%