2005
DOI: 10.1152/jn.00021.2005
Eye-Centered, Head-Centered, and Complex Coding of Visual and Auditory Targets in the Intraparietal Sulcus

Abstract: The integration of visual and auditory events is thought to require a joint representation of visual and auditory space in a common reference frame. We investigated the coding of visual and auditory space in the lateral and medial intraparietal areas (LIP, MIP) as a candidate for such a representation. We recorded the activity of 275 neurons in LIP and MIP of two monkeys while they performed saccades to a row of visual and auditory targets from three different eye positions. We found 45% of these neurons to be…

Cited by 208 publications (277 citation statements)
References 90 publications
“…Monkey S had a left hemisphere chamber and showed a median weight of 0.80 with 43% gaze-centered, 17% hand-centered, and 24% intermediate cells. Finally, we applied three previously published classification schemes (7, 23–25) for distinguishing gaze- and hand-centered frames of reference, and these confirmed our conclusion that PRR shows a broad range of representations, from gaze-centered to hand-centered, with a bias for gaze-centered cells (Fig. 3B, Fig.…”
Section: Results (supporting)
confidence: 80%
“…For example, cells in the superior colliculus code auditory stimuli using complex representations that are idiosyncratic to individual cells and neither purely gaze- nor purely head-centered (18, 19). Similar complex and nonuniform coding may occur in the ventral intraparietal area (VIP) (20), the dorsal medial superior temporal area (MSTd) (21), the lateral intraparietal area (LIP) (22, 23), and PMd (24, 25). Modeling work suggests that these complex coding schemes may help convert reference frames, optimally combine sensory information from different modalities, or perform nonlinear computations (15, 17, 26, 27).…”
mentioning
confidence: 99%
“…Transient eccentric eye position (<5 s) has also been shown to modulate the spatial responses of auditory neurons in the inferior colliculus (Groh et al., 2001; Zwiers et al., 2004), superior colliculus (Jay and Sparks, 1987; Hartline et al., 1995; Peck et al., 1995; Zella et al., 2001; Populin et al., 2004), auditory cortex (Werner-Reiss et al., 2003; Fu et al., 2004), and the intraparietal sulcus (Stricanne et al., 1996; Mullette-Gillman et al., 2005). These short-term effects may reflect a step in the eye-to-head-centered transformation of sound source coordinates, their associated errors, or an early component of the auditory spatial adaptation reported here.…”
Section: Discussion (supporting)
confidence: 50%
“…Thus Spt is defined as a region within the posterior portion of the Sylvian fissure that exhibits both auditory and motor response properties. (…al., 1996), as well as the observation that sensory input from multiple modalities can drive neurons in PPC sensory-motor fields (Cohen, Batista, & Andersen, 2002; Mullette-Gillman, Cohen, & Groh, 2005). In sum, the response properties of area Spt, particularly that it shows both sensory and motor responses, are consistent with the hypothesis that it is a sensory-motor integration region similar to those found in the primate intraparietal sulcus (Buchsbaum et al., 2005b).…”
Section: Introduction (supporting)
confidence: 62%
“…These areas are organized around particular motor-effector systems (Andersen, 1997; Colby et al., 1999; Simon et al., 2002). While these areas can take input from multiple sensory modalities (Mullette-Gillman et al., 2005; Xing & Andersen, 2000), these systems may nonetheless be biased towards certain sensory modalities depending on the demands of the particular action systems involved. For example, manual action systems may be biased towards visual input in most individuals because of visually-guided reaching/grasping functions, whereas vocal tract systems may be biased toward auditory input for reasons discussed in the Introduction.…”
Section: Discussion (mentioning)
confidence: 99%