2011
DOI: 10.1016/j.humov.2010.08.012

Neuro-cognitive mechanisms of decision making in joint action: A human–robot interaction study

Abstract: In this paper we present a model for action preparation and decision making in cooperative tasks that is inspired by recent experimental findings about the neuro-cognitive mechanisms supporting joint action in humans. It implements the coordination of actions and goals among the partners as a dynamic process that integrates contextual cues, shared task knowledge and predicted outcome of others' motor behavior. The control architecture is formalized by a system of coupled dynamic neural fields representing a di…
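The abstract describes a control architecture built from coupled dynamic neural fields. As a rough illustration of the basic building block, here is a minimal sketch of a single one-dimensional Amari-type field that forms a self-sustained activation peak at the location of a localized input; all parameter values and the kernel shape are illustrative assumptions, not those used in the paper.

```python
import numpy as np

# Minimal sketch of a one-dimensional Amari-type dynamic neural field.
# All parameters below are illustrative assumptions, not the paper's values.

N = 200                          # number of spatial samples
x = np.linspace(-10.0, 10.0, N)  # field dimension (e.g. an action parameter)
dx = x[1] - x[0]

tau = 1.0        # time constant of the field dynamics
h = -2.0         # resting level (below the output threshold)
dt = 0.05        # Euler integration step

def kernel(d, a_exc=3.0, s_exc=1.0, g_inh=0.5):
    """Lateral interaction: local excitation minus global inhibition."""
    return a_exc * np.exp(-d**2 / (2 * s_exc**2)) - g_inh

# Precompute the interaction matrix w(x - x')
W = kernel(x[:, None] - x[None, :])

def f(u):
    """Sigmoidal output nonlinearity."""
    return 1.0 / (1.0 + np.exp(-5.0 * u))

# Localized external input, e.g. evidence for one action alternative
S = 4.0 * np.exp(-(x - 2.0)**2 / (2 * 0.8**2))

u = h * np.ones(N)               # field starts at the resting level
for _ in range(400):             # Euler integration of the field equation
    du = (-u + h + S + dx * (W @ f(u))) / tau
    u = u + dt * du

# A peak of activation forms at the input location; in decision-making
# architectures this peak is read out as the selected alternative.
print(x[np.argmax(u)])           # peak close to x = 2
```

In a full architecture of the kind the abstract outlines, several such fields (e.g. for observed actions, task goals, and action selection) would be coupled by feeding the thresholded output of one field as input to another.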


Cited by 61 publications (77 citation statements)
References 45 publications (65 reference statements)
“…Efforts to model the perceptual, motor, and cognitive processes involved in joint action have been key in guiding experimentation and in providing constraints for theorizing (Bicho et al 2011;Butz and Pezzulo 2008;Cuijpers et al 2006). Three contributions in this special issue offer computational models that respectively target the role of shared representations for joint action, the processes governing continuous motor coordination, and the role of perception-action links.…”
Section: Computational Modeling
confidence: 99%
“…space of visual locations) to a discrete, categorical, and sparse representation of, e.g., perceived objects in space. The DNFs have been used in the past both to account for psychophysical data [25,7] and to control cognitive robots, in particular realising object recognition with fast, one-shot learning [8], scene representation for human-robot interaction [1], organisation of robotic behaviour [18], or action recognition [11].…”
Section: Introduction
confidence: 99%
“…Nevertheless, where tactile interaction might be particularly relevant in human-robot interaction is in the domain of Joint Action (cf. [47][48][49]) where interaction on a goal-directed task may benefit from human actors conveying both negative (rejection-based) and positive (e.g., gratitude) feedback. Again, in reference to [45], communicating emotions such as anger/frustration may be critically important to informing interactors as to how the task is perceived to be going and how to respond accordingly (e.g., approach the task in a different way, or try harder).…”
confidence: 99%