Proceedings of ACL 2018, System Demonstrations (2018)
DOI: 10.18653/v1/p18-4016

ScoutBot: A Dialogue System for Collaborative Navigation

Abstract: ScoutBot is a dialogue interface to physical and simulated robots that supports collaborative exploration of environments. The demonstration will allow users to issue unconstrained spoken language commands to ScoutBot. ScoutBot will prompt for clarification if the user's instruction needs additional input. It is trained on human-robot dialogue collected from Wizard-of-Oz experiments, where robot responses were initiated by a human wizard in previous interactions. The demonstration will show a simulated ground r…
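To make the clarification behavior described in the abstract concrete, the following is a minimal sketch (in Python, not the actual ScoutBot code) of a dialogue loop that executes a command only when the required pieces of information are present and otherwise prompts the user. The slot names, the keyword-based parser, and the function names are illustrative assumptions.

# Minimal sketch (not the ScoutBot implementation) of the clarification behavior
# described in the abstract: if a spoken command lacks the information needed to
# act, the system prompts the user instead of executing. All names below
# (parse_command, REQUIRED_SLOTS, the landmark list) are hypothetical.

# Slots a navigation command needs before the robot can act (assumed set).
REQUIRED_SLOTS = {"action", "target"}

def parse_command(utterance: str) -> dict:
    """Very rough keyword-based parse standing in for the real NLU."""
    slots: dict = {}
    words = utterance.lower().split()
    if "go" in words or "move" in words:
        slots["action"] = "navigate"
    for landmark in ("doorway", "table", "hallway"):
        if landmark in words:
            slots["target"] = landmark
    return slots

def respond(utterance: str) -> str:
    """Either acknowledge an executable command or ask for clarification."""
    slots = parse_command(utterance)
    missing = REQUIRED_SLOTS - slots.keys()
    if missing:
        return f"Can you clarify the {', '.join(sorted(missing))}?"
    return f"Executing: {slots['action']} to the {slots['target']}."

if __name__ == "__main__":
    print(respond("go forward"))            # asks to clarify the target
    print(respond("move to the doorway"))   # executes the command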

Cited by 14 publications (7 citation statements)
References 5 publications
“…The symptoms which the dialogue system inquires about should be related to the underlying disease and consistent with medical knowledge. Current task-oriented dialogue systems (Lei et al. 2018; Lukin et al. 2018; Bordes, Boureau, and Weston 2016) rely heavily on a complex belief tracker (Wen et al.; Mrkšić et al. 2016) and pure data-driven learning, and so cannot be applied directly to automatic diagnosis because they do not consider medical knowledge. A very recent work (Wei et al. 2018) made the first move to build a dialogue system for automatic diagnosis, which cast dialogue systems as a Markov Decision Process and trained the dialogue policy via reinforcement learning.…”
Section: Introduction (mentioning)
confidence: 99%
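The citing work quoted above casts the dialogue policy as a Markov Decision Process trained with reinforcement learning. As a rough, generic illustration of that idea (not the system from Wei et al. 2018, and not ScoutBot), the sketch below learns a tabular Q-learning policy for a toy symptom-asking dialogue; the toy patient model, reward values, and hyperparameters are all assumptions made for the example.

# Generic illustration of dialogue-as-MDP with tabular Q-learning: the state is
# which symptoms have been confirmed or denied, actions are "ask symptom" or
# "diagnose", and the reward favours a correct diagnosis in few turns.
# Everything here (the symptom list, the toy patient, the reward scheme) is
# an assumption for the example, not the cited system.

import random
from collections import defaultdict

SYMPTOMS = ["cough", "fever", "rash"]
ACTIONS = [f"ask:{s}" for s in SYMPTOMS] + ["diagnose"]

def simulate_patient():
    """Toy patient: disease A implies cough+fever, disease B implies rash."""
    disease = random.choice(["A", "B"])
    present = {"cough": disease == "A", "fever": disease == "A", "rash": disease == "B"}
    return disease, present

def guess(state):
    """Deterministic diagnosis rule used when the policy chooses 'diagnose'."""
    return "A" if state.get("cough") == "yes" or state.get("fever") == "yes" else "B"

def key(state):
    return tuple(sorted(state.items()))

Q = defaultdict(float)          # Q[(state_key, action)] -> estimated return
ALPHA, GAMMA, EPS = 0.1, 0.95, 0.2

for _ in range(5000):                        # training episodes
    disease, present = simulate_patient()
    state = {}
    for _turn in range(4):
        s = key(state)
        if random.random() < EPS:
            a = random.choice(ACTIONS)
        else:
            a = max(ACTIONS, key=lambda x: Q[(s, x)])
        if a == "diagnose":
            r = 1.0 if guess(state) == disease else -1.0
            Q[(s, a)] += ALPHA * (r - Q[(s, a)])
            break
        symptom = a.split(":")[1]
        state[symptom] = "yes" if present[symptom] else "no"
        r = -0.1                              # small per-turn cost
        s2 = key(state)
        best_next = max(Q[(s2, x)] for x in ACTIONS)
        Q[(s, a)] += ALPHA * (r + GAMMA * best_next - Q[(s, a)])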
“…This design permitted participants to instruct the robot without imposing artificial restrictions on the language used. As more data was collected, increasing levels of automated dialogue processing were introduced (Lukin et al., 2018a). We discuss the impact of further design details in Sections 4 and 5.…”
Section: Human-Robot Dialogue (mentioning)
confidence: 99%
“…The key milestones in the progression included using human wizards through Experiments 1-4 (prior and ongoing work) with different methods of performing the task (typing or pressing buttons that have predefined text messages) to build up the databases of dialogue interactions, a dialogue manager, and robot behaviors. The data collected in these prior experiments was used to train the ScoutBot system as an end-to-end, fully autonomous dialogue management and autonomous robot implementation (Lukin et al., 2018) for control of a single simulated robot in an indoor simulated building.…”
Section: Scenarios and Research Approach (mentioning)
confidence: 99%
“…Dialogue processing utilizes the NPCEditor dialogue manager (Leuski and Traum, 2011; Hartholt et al., 2013). The NLU/DM module interprets dialogue instructions and produces responses using statistical retrieval algorithms from prior dialogue system implementations (Traum et al., 2015; Lukin et al., 2018), which allow for a range of unconstrained speech input. This testing scenario uses a novel configuration with five categories of instructions: (1) wake (get a particular robot's attention), (2) waypoint navigation of one or more robots, (3) follow-behind commands, (4) inspection, and (5) patrol of a pre-defined area.…”
Section: Spoken Dialogue and Dialogue Management (mentioning)
confidence: 99%
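The retrieval-based NLU/DM step described in the statement above can be illustrated with a small sketch: an incoming utterance is compared against stored example utterances and inherits the category of its closest match. This is not NPCEditor's actual retrieval model; the example utterances and the bag-of-words cosine similarity are assumptions, with only the five instruction categories taken from the quoted text.

# Rough sketch of retrieval-based instruction classification: match the input
# against a tiny stored set of labelled utterances and return the category of
# the most similar one. The training pairs below are invented for illustration.

from collections import Counter
import math

TRAINING = [
    ("robot two are you there", "wake"),
    ("go to the second doorway on the left", "waypoint"),
    ("follow behind me", "follow"),
    ("take a picture of the table", "inspection"),
    ("patrol the east corridor", "patrol"),
]

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def classify(utterance: str) -> str:
    """Return the category of the most similar stored utterance."""
    query = Counter(utterance.lower().split())
    scored = [(cosine(query, Counter(u.split())), cat) for u, cat in TRAINING]
    return max(scored)[1]

print(classify("please patrol near the corridor"))  # -> patrol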