2009
DOI: 10.1162/coli.06-78-prep14
Applying Computational Models of Spatial Prepositions to Visually Situated Dialog

Abstract: This article describes the application of computational models of spatial prepositions to visually situated dialog systems. In these dialogs, spatial prepositions are important because people often use them to refer to entities in the visual context of a dialog. We first describe a generic architecture for a visually situated dialog system and highlight the interactions between the spatial cognition module, which provides the interface to the models of prepositional semantics, and the other components in the a…


Cited by 62 publications (43 citation statements)
References 27 publications
“…For example, integrating world-knowledge [32] and/or linguistic ontological knowledge [3]; integrating spatial semantics into compositional/attentional accounts of reference [23,24,31]; learning spatial semantics directly from sensor data using machine learning techniques [12,34]; modelling the functional aspects of spatial semantics in terms of predicting the dynamics of objects in the scene [10,42]; capturing the vagueness and gradation of spatial semantics [17,22,43]; and leveraging analogical reasoning mechanisms to enable agents to apply spatial semantics to new environments [13].…”
Section: Natural Language Processing and Spatial Reasoning
confidence: 99%
“…If the user's perception of the world and the robot's perception diverge (e.g. due to problems in the object recognition software used by the robot [25], or mismatches in the user's and the robot's understanding of spatial relations [4,17]), misunderstandings may arise in the dialogue. In this paper, we investigate the effect of perception-based errors on human-robot dialogue, and how misunderstandings that arise from such errors are resolved.…”
Section: Introduction
confidence: 99%
“…In varying the objects in the diagram, the study revealed that perceived distance was affected by the presence of neighbouring objects. In our work we do not attempt to consider such "distractor" effects [10], as they are beyond the scope of this study, though it is quite possible that they affected the decisions of participants in our human subject studies.…”
Section: Modelling the Applicability of Spatial Prepositions
confidence: 99%
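The graded applicability of spatial prepositions discussed in this citation statement can be illustrated with a minimal sketch: a distance-based field model that assigns a target object a score between 0 and 1 for how well "near" applies relative to a landmark. The function name, the exponential decay form, and the `scale` parameter are illustrative assumptions, not the actual model from the article or the cited studies.

```python
import math

def near_applicability(landmark, target, scale=1.0):
    """Illustrative field model of 'near' (assumed form, not the
    article's model): applicability is 1.0 when the target sits on
    the landmark and decays exponentially with Euclidean distance."""
    dx = target[0] - landmark[0]
    dy = target[1] - landmark[1]
    dist = math.hypot(dx, dy)  # Euclidean distance in 2-D
    return math.exp(-dist / scale)
```

A graded score like this, rather than a boolean, is what lets a dialog system capture the vagueness of "near": candidates can be ranked, and borderline cases receive intermediate values.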
“…There are many examples of spatial language generation in various domains, notably robotics [24,10]. In the geographic domain, most such systems have focused on navigational instructions, e.g.…”
Section: Natural Language Generation Systems
confidence: 99%
“…Computational field models of static relations have also been used in systems for object pick-and-place tasks on a tabletop [8], and for visually situated dialogue [11]. These works implemented pre-defined models of spatial relations; however, researchers have also designed systems capable of learning these types of static spatial relations automatically from training data (e.g., [10]).…”
Section: Related Work
confidence: 99%
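The "computational field models of static relations" mentioned above can be sketched for a projective preposition such as "above": the field assigns each target a score based on how far the landmark-to-target direction deviates from the vertical axis. The linear drop-off and the 90-degree cutoff below are simplifying assumptions for illustration, not the specific model used in the article or the cited systems.

```python
import math

def above_applicability(landmark, target):
    """Illustrative angular-deviation field for 'above' (assumed
    form): 1.0 when the target is directly above the landmark,
    falling linearly to 0.0 at 90 degrees of deviation or more."""
    dx = target[0] - landmark[0]
    dy = target[1] - landmark[1]  # y increases upward
    if dx == 0 and dy == 0:
        return 0.0  # coincident objects: 'above' does not apply
    deviation = abs(math.atan2(dx, dy))  # 0 rad when directly above
    return max(0.0, 1.0 - deviation / (math.pi / 2))
```

A pre-defined field like this can be hand-tuned, or, as the quoted work notes, the drop-off shape can instead be learned from human acceptability judgments in training data.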