2011
DOI: 10.1007/s00221-011-2596-0

Testing the limits of optimal integration of visual and proprioceptive information of path trajectory

Abstract: Many studies provide evidence that information from different modalities is integrated following the maximum likelihood estimation model (MLE). For instance, we recently found that visual and proprioceptive path trajectories are optimally combined (Reuschel et al. in Exp Brain Res 201:853-862, 2010). However, other studies have failed to reveal optimal integration of such dynamic information. In the present study, we aim to generalize our previous findings to different parts of the workspace (central, ipsilateral, …)
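For context, the MLE model the abstract refers to has a standard closed form: each modality's estimate is weighted by its relative reliability (inverse variance), and the fused estimate is predicted to be at least as reliable as the better unimodal one. The following is a generic sketch of these cue-combination equations; the symbols are conventional notation, not taken from the paper:

    \hat{s}_{VP} = w_V \hat{s}_V + w_P \hat{s}_P,
    \qquad
    w_V = \frac{\sigma_P^2}{\sigma_V^2 + \sigma_P^2},
    \quad
    w_P = \frac{\sigma_V^2}{\sigma_V^2 + \sigma_P^2}

    \sigma_{VP}^2 = \frac{\sigma_V^2 \, \sigma_P^2}{\sigma_V^2 + \sigma_P^2}
    \;\le\; \min\!\left(\sigma_V^2, \, \sigma_P^2\right)

The variance prediction is the usual diagnostic for optimality: integration is called "optimal" when the empirically measured bimodal variance matches this predicted reduction.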

Cited by 8 publications (4 citation statements) · References: 58 publications
“…van Beers et al (1999) asked their participants to move their target-hand to a position as close to the final target location as possible and then touch the underside of the experimental set up until they found a tactile marker. Previous work in our laboratory, a reaching task with gaze deviated, using a robot manipulandum for proprioceptive target placement also did not find support for Bayesian integration of remembered visual information with online or remembered proprioceptive information, nor did previous studies that have employed precise placement of the target-hand using a mechanical apparatus (Monaco et al, 2010; Reuschel, Rosler, Henriques, & Fiehler, 2011). Last, in our study, the last target site was the start of the path to the next target site.…”
Section: Combining Vision and Proprioception · Citation type: contrasting
confidence: 83%
“…This distribution does not support a dichotomous error attribution [14], [15] to either internal or, alternatively, external causes, but rather suggests a continuous attribution mechanism on the level of single trials. Such partial attribution of prediction errors to internal causes would be consistent with the notion that the perception of one’s actions builds on the integration of internal and external action-related cues [26], [27], [28], [29].…”
Section: Discussion · Citation type: supporting
confidence: 78%
“…Studies on shape perception (Helbig and Ernst 2007), hand position sense (van Beers et al 1996), and path trajectory integration (Reuschel et al 2010) showed results consistent with a maximum-likelihood estimation (MLE) model. However, recent work on reaching and multimodal asynchronous curvature detection was not in agreement with a MLE model of visual-proprioceptive integration (Jones and Henriques 2010; Reuschel et al. 2011; Winges et al. 2010).…”
Citation type: mentioning
confidence: 82%
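To make the test discussed in these citation statements concrete, below is a minimal Python sketch of how MLE predictions are typically derived from unimodal noise estimates and then compared against measured bimodal performance. It assumes independent Gaussian noise per modality; the function name and the numeric values are illustrative, not taken from any of the cited studies.

    import numpy as np

    def mle_prediction(sigma_v: float, sigma_p: float) -> tuple[float, float]:
        """Predicted visual weight and bimodal SD under the MLE model,
        assuming independent Gaussian noise in each modality."""
        var_v, var_p = sigma_v**2, sigma_p**2
        w_v = var_p / (var_v + var_p)                        # reliability-based visual weight
        sigma_vp = np.sqrt(var_v * var_p / (var_v + var_p))  # predicted fused SD
        return w_v, sigma_vp

    # Illustrative unimodal discrimination thresholds (e.g., degrees of
    # path angle); the numbers are made up for this sketch.
    sigma_v, sigma_p = 4.0, 6.0
    w_v, sigma_vp = mle_prediction(sigma_v, sigma_p)
    print(f"predicted visual weight: {w_v:.2f}")    # 0.69
    print(f"predicted bimodal SD:    {sigma_vp:.2f}")  # 3.33 < min(4.0, 6.0)

    # Optimal integration is supported when the empirically measured bimodal
    # SD matches sigma_vp; the contrasting studies quoted above report cases
    # where the observed bimodal SD did not fall below the best unimodal SD.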