2010
DOI: 10.3389/fnbot.2010.00008

Linking language with embodied and teleological representations of action for humanoid cognition

Abstract: The current research extends our framework for embodied language and action comprehension to include a teleological representation that allows goal-based reasoning for novel actions. The objective of this work is to implement and demonstrate the advantages of a hybrid, embodied-teleological approach to action–language interaction, both from a theoretical perspective, and via results from human–robot interaction experiments with the iCub robot. We first demonstrate how a framework for embodied language comprehe…

Cited by 25 publications (33 citation statements)

References 63 publications
“…In particular, the ability of children to take a 'bird's-eye view' and understand their own and their partners' actions as part of a shared plan [8] is a concept we have recently implemented in several robotic systems [9], [10], [11]. Similar work by Shah et al. [12] has recently demonstrated that human-robot team performance is improved by shared planning in comparison to robots that are verbally commanded by the human.…”
Section: Context
confidence: 89%
“…Yet the true notion of the actual final goal, the shared intention, to get that toy into the box, is currently not present. We have started to address this issue by linking actions to their resulting states, within the action representation [56]. We must go further, in order to now expand the language capability to address the expression and modification of internal representations of the intentional states of others.…”
Section: Discussion and Future Work
confidence: 99%
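The excerpt above describes linking actions to their resulting states within the action representation, so that a desired final state (e.g. the toy in the box) can drive goal-based reasoning. A minimal sketch of that idea, with invented predicate and action names purely for illustration: each action carries a precondition set and a result set, and a depth-limited search chains actions backward from a goal state.

```python
# Minimal sketch (hypothetical names) of an action representation that
# links each action to the state it produces, enabling goal-based
# reasoning: given a desired final state, search for an action chain.

from dataclasses import dataclass

@dataclass(frozen=True)
class Action:
    name: str
    precondition: frozenset   # facts that must hold before the action
    result: frozenset         # facts that hold after the action

def plan(actions, state, goal, depth=5):
    """Depth-limited search for a sequence of actions reaching the goal."""
    if goal <= state:
        return []
    if depth == 0:
        return None
    for a in actions:
        if a.result <= state:        # skip actions that add nothing new
            continue
        if a.precondition <= state:
            rest = plan(actions, state | a.result, goal, depth - 1)
            if rest is not None:
                return [a.name] + rest
    return None

actions = [
    Action("grasp(toy)", frozenset({"reachable(toy)"}), frozenset({"holding(toy)"})),
    Action("put(toy, box)", frozenset({"holding(toy)"}), frozenset({"in(toy, box)"})),
]
print(plan(actions, frozenset({"reachable(toy)"}), frozenset({"in(toy, box)"})))
# ['grasp(toy)', 'put(toy, box)']
```

Because goals are ordinary state descriptions here, the same machinery could in principle be driven by language that expresses a desired final state rather than a motor command, which is the direction the excerpt gestures toward.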
“…John gave Sally a flower), Subject maps onto the agent of the transitive action specified by Verb, and Recipient receives the Object via that transitive action. The current research demonstrates how language, based on these constructions, can be used to coordinate real-time learning of cooperative actions, providing the coordination of multiple demonstration modalities including vision-like perception, kinesthetic demonstration [13, 29, 55–58], and command execution via spoken language. In this sense, language serves a dual purpose: First and most important, it provides the mechanism by which a cooperative plan can be constructed and modified.…”
Section: Discussion and Future Work
confidence: 99%
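The construction-based mapping the excerpt describes (Subject maps to agent, Recipient receives the Object) can be sketched very simply: a grammatical construction is a template that assigns thematic roles by word-order slot. The helper below is an illustrative toy, not the paper's implementation; determiners and morphology are ignored, and all names are assumptions.

```python
# Hedged sketch of construction-based interpretation: a ditransitive
# construction maps word-order slots onto thematic roles, so the same
# template interprets any sentence with that surface form.
# (Determiners omitted for brevity: "John gave Sally flower".)

DITRANSITIVE = ("agent", "action", "recipient", "object")

def interpret(sentence, construction=DITRANSITIVE):
    """Map each word of a simple sentence onto the role in the same slot."""
    words = sentence.split()
    if len(words) != len(construction):
        raise ValueError("sentence does not match construction")
    return dict(zip(construction, words))

print(interpret("John gave Sally flower"))
# {'agent': 'John', 'action': 'gave', 'recipient': 'Sally', 'object': 'flower'}
```

The dual purpose noted in the excerpt falls out naturally: the same role bindings that describe an observed action can also parameterize a commanded one, letting language both describe and construct a cooperative plan.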
“…Learning from demonstration (Argall, Chernova, Veloso, & Browning, 2009) and imitation learning of movement primitives (Ijspeert, Nakanishi, & Schaal, 2002; Mochizuki, Nishide, Okuno, & Ogata, 2013) are among the most successful attempts to implement robotic learning systems. Using insights from work on human-robot cooperation (Dominey, Mallet, & Yoshida, 2009; Lallee, Madden, Hoen, & Dominey, 2010) and segmentation of observed action sequences (Guerra-Filho & Aloimonos, 2006; Wörgötter et al., 2013), in our work, we aim at more ''cognitive'' actions, which not only specify the desired trajectory of the robot's effectors, but also the properties of the target object and its desired final state (Cuijpers, Schie, Koppen, Erlhagen, & Bekkering, 2006). Learning such goal-directed behaviors and learning sequences of such behaviors simultaneously is an open problem.…”
Section: Related Work
confidence: 98%