2005
DOI: 10.1016/j.neuroimage.2004.09.039

Neural correlates of switching from auditory to speech perception

Abstract: Many people exposed to sinewave analogues of speech first report hearing them as electronic glissandos and, later, when they switch into a "speech mode", hearing them as syllables. This perceptual switch modifies their discrimination abilities, enhancing perception of differences that cross phonemic boundaries while diminishing perception of differences within phonemic categories. Using high-density evoked potentials and fMRI in a discrimination paradigm, we studied the changes in brain activity that are relate…

Cited by 233 publications (186 citation statements)
References: 61 publications
“…Thus, the above dissociations that we find in performance with the two stimulus types may be due not only to the fact that one is speech and the other not, but they may also be due to stimulus complexity. In support of the idea that different learning mechanisms exist for speech and non-speech sounds, there exists evidence from other studies for speech-specific neural mechanisms when stimulus complexity is controlled (Liebenthal, Binder, Spitzer, Possing, & Medler, 2005;Scott, Blank, Rosen, & Wise, 2000;Scott, Rosen, Lang, & Wise, 2006), or when the same stimuli are first perceived as nonspeech and then later as speech (Dehaene-Lambertz et al, 2005;Dufor, Serniclaes, Sprenger-Charolles, & Démonet, 2007). More generally, it is likely that the neural mechanisms underlying the processing of abstract, linguistically relevant properties versus of the underlying acoustic characteristics of stimuli interact in a complex and non-exclusive manner, and that they depend on linguistic experience as well as on neural top-down processing mechanisms which interact with afferent pathways which carry stimulus information (Zatorre & Gandour, 2007).…”
Section: Discussion
confidence: 84%
“…The left posterior superior temporal cortex shows greater responses to speech than to nonspeech sounds (Binder et al, 2000). Furthermore, sine-wave analogues of sounds elicit stronger activity in this region when they are perceived as speech (i.e., phonetically and categorically) than when they are perceived as nonspeech (i.e., nonphonetically and continuously) (Dehaene-Lambertz et al, 2005;Möttönen et al, 2006;Desai et al, 2008). The supramarginal gyrus is also thought be involved in categorical processing of speech sounds (Raizada and Poldrack, 2007).…”
Section: Discussion
confidence: 99%
“…Several recent studies comparing phonetic sounds to acoustically matched nonphonetic sounds (Dehaene-Lambertz et al, 2005;Liebenthal et al, 2005;Mottonen et al, 2006) or to noise (Binder et al, 2000;Rimol et al, 2005) have shown activation specifically in this brain region. Two factors might explain these discordant findings.…”
Section: Processing Of Speech Compared To Unfamiliar Rotated Speech S…
confidence: 99%