2011
DOI: 10.1523/jneurosci.3404-11.2011

Multiple Reference Frames in Cortical Oscillatory Activity during Tactile Remapping for Saccades

Abstract: Single-unit recordings have shown that the brain uses multiple reference frames in spatial processing. The brain could use this neural architecture to implicitly create multiple modes of representation at the population level, with each reference frame weighted as a function of task demands. Using magnetoencephalography, we tested this hypothesis by studying the reference frames in rhythmic neuronal synchronization, a population measure, during tactile remapping for saccades. Human subjects fixated either to the…

Cited by 57 publications (78 citation statements). References 43 publications.
“…Rather, electrophysiological studies have demonstrated that the original, anatomical representation of touch is retained after remapping (Heed and Röder, 2010; Buchholz et al., 2011, 2013), supporting the notion that the brain uses both spatial codes, anatomical and external, to estimate tactile location (Shore et al., 2002; Cadieux et al., 2010). Thus, touch localization may indeed follow the principles of weighted integration (Eardley and van Velzen, 2011; de Haan et al., 2012; Heed et al., 2012; Badde et al., 2013; Heed and Azañón, 2014) which underlie many instances of crossmodal integration (Ernst and Bülthoff, 2004).…”
Section: Introduction
confidence: 85%
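As a rough illustration of the weighted-integration idea referenced in the statement above (not code from any of the cited papers), the sketch below shows a reliability-weighted average of an anatomical and an external location estimate, in the spirit of maximum-likelihood cue integration (Ernst and Bülthoff, 2004). The function name, locations, and variances are hypothetical and chosen only for illustration.

```python
def weighted_touch_location(loc_anatomical, var_anatomical, loc_external, var_external):
    """Fuse anatomical and external location estimates by inverse-variance weighting.

    Each estimate contributes in proportion to its reliability (1 / variance),
    as in standard maximum-likelihood cue integration.
    """
    w_anat = 1.0 / var_anatomical
    w_ext = 1.0 / var_external
    w_sum = w_anat + w_ext
    fused_loc = (w_anat * loc_anatomical + w_ext * loc_external) / w_sum
    fused_var = 1.0 / w_sum  # fused estimate is at least as reliable as either cue alone
    return fused_loc, fused_var

# Hypothetical example: if the external estimate is noisier (e.g., an unusual posture),
# the fused location is pulled toward the anatomical estimate.
loc, var = weighted_touch_location(loc_anatomical=0.0, var_anatomical=1.0,
                                   loc_external=2.0, var_external=4.0)
print(loc, var)  # -> 0.4 0.8
```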
“…Moreover, a single weight parameter for each reference frame was sufficient to account for uncrossed and for crossed performance, indicating that the integration follows the same principles in both postures. Effects of body posture on touch processing have often been attributed to the external reference frame of touch (Driver & Spence, 1998; Aglioti et al., 1999; Kennett et al., 2001; Yamamoto & Kitazawa, 2001a; Shore et al., 2002; Soto-Faraco et al., 2004; Röder et al., 2004; Eimer et al., 2004; Bolognini & Maravita, 2007; Azañón & Soto-Faraco, 2008; Heed et al., 2012; Buchholz et al., 2011). This is because body posture determines the location of the stimulated skin region in space, but should not influence the processing of any other tactile stimulus characteristics.…”
Section: Discussion
confidence: 99%
“…This process of coordinate transformation is addressed as tactile remapping (Driver & Spence, 1998) and has been associated with regions of the intraparietal sulcus in posterior parietal cortex (Azañón et al., 2010b; Bolognini & Maravita, 2007; Renzi et al., 2013). Crucially, both the original, anatomical representation as well as the remapped, external representation are maintained by the brain (Heed & Röder, 2010; Buchholz et al., 2011), that is, after transformation both reference frames are available to estimate the location of the touch. Consequently, the brain might base tactile localization by default on information coded in both reference frames, rather than basing responses on the external reference frame alone.…”
Section: Introduction
confidence: 99%
“…In contrast, when we grope in a dark room looking for a light switch, we may be aware of which part of our hand has contacted the switch, but our primary aim is to localise the switch as an object in external space. A large recent literature has begun to investigate this ability to localise tactile stimuli in external space (e.g., Azañón & Soto-Faraco, 2008; Azañón, Camacho, & Soto-Faraco, 2010a; Azañón, Longo, Soto-Faraco, & Haggard, 2010b; Bolognini & Maravita, 2007; Buchholz, Jensen, & Medendorp, 2011; Heed & Röder, 2010; Heed, Backhaus, & Röder, 2012; Overvliet, Azañón, & Soto-Faraco, 2011; Schicke & Röder, 2006). External spatial localisation requires that tactile information about the location of a stimulus in contact with the skin surface be integrated with proprioceptive or other information about body posture, a process known as tactile spatial remapping.…”
Section: Introduction
confidence: 99%