2005
DOI: 10.1093/cercor/bhi123
Two Distinct Neural Mechanisms for Category-selective Responses

Abstract: The cognitive and neural mechanisms mediating category-selective responses in the human brain remain controversial. Using functional magnetic resonance imaging and effective connectivity analyses (Dynamic Causal Modelling), we investigated animal- and tool-selective responses by manipulating stimulus modality (pictures versus words) and task (implicit versus explicit semantic). We dissociated two distinct mechanisms that engender category selectivity: in the ventral occipito-temporal cortex, tool-selective res…

Cited by 183 publications (186 citation statements)
References 41 publications (40 reference statements)
“…2 & 3). Previous studies have reported that viewing both pictures and words of tools commonly activated the left aIPS (Noppeney et al. 2006) and the IPL, with a greater activation response to pictures than to words (Boronat et al. 2005), without pantomiming or actual tool use. These results suggest that the left aIPS revealed by our cross-classification could also reflect the response to viewing pictures or letters of the tools.…”
Section: Discussion (mentioning)
confidence: 93%
“…Some of these studies used the movement of both right and left hands, and they revealed that the left-lateralized PPC and premotor network is activated for pantomiming with both the dominant and the non-dominant hand (Moll et al. 2000; Johnson and Grafton 2003; Johnson-Frey et al. 2005). Viewing or naming tools without motor execution also activates the left PPC, mainly in the IPL, and the premotor cortex, together with the regions representing specific object categories in the temporal cortex (Chao et al. 1999; Chao and Martin 2000; Noppeney et al. 2006; Mahon et al. 2007; Mruczek et al. 2013; Peeters et al. 2013). The IPL has repeatedly been shown to be integral to knowledge of skilled tool use or object manipulation (Kellenbach et al. 2003; Boronat et al. 2005; Ishibashi et al. 2011), with distinct functional connectivities with the premotor and temporal regions (Garcea and Mahon 2014).…”
Section: Introduction (mentioning)
confidence: 99%
“…Various gradients in patterns of connectivity and function have been reported within these three regions (Petrides, 1987, 1991; Wilson et al., 1993; Ó Scalaidhe et al., 1997; Hirsch et al., 2001; Denys et al., 2004; Barbas et al., 2005; Hagler and Sereno, 2006; Nelissen et al., 2005; Noppeney et al., 2005). However, there are no standardized quantitative data on pyramidal cell structure within these different gradients in the gPFC.…”
Section: Introduction (mentioning)
confidence: 99%
“…The neuronal dynamics are described by differential equations governing a single state that summarizes the neuronal or synaptic activity of each area; this activity then induces a hemodynamic response as described by an extended Balloon model (Buxton et al., 1998). Examples of DCM for fMRI can be found in Mechelli et al. (2004), Noppeney et al. (2006), Stephan et al. (2005), and Griffiths et al. (2007) (for a review of the conceptual basis of DCM and its implementation for functional magnetic resonance imaging data and event-related potentials, see Stephan et al., 2007).…”
Section: Introduction (mentioning)
confidence: 99%
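For context, the neuronal level that the statement above refers to is, in standard DCM for fMRI, the bilinear state equation (this is the textbook form from Friston et al., 2003, not quoted from the excerpt itself):

```latex
\dot{x} = \Bigl(A + \sum_{j} u_j\, B^{(j)}\Bigr)\, x + C u
```

Here \(x\) is the vector of per-region neuronal states, \(A\) encodes intrinsic (endogenous) coupling between regions, each \(B^{(j)}\) encodes how experimental input \(u_j\) modulates those couplings, and \(C\) encodes the direct driving influence of the inputs. The extended Balloon model then maps each element of \(x\) to a predicted BOLD response through hemodynamic state equations.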