2005
DOI: 10.1016/j.neuroimage.2004.10.034

Identification of emotional intonation evaluated by fMRI

Cited by 299 publications (252 citation statements) | References 66 publications
“…2011), in line with a general suggestion of abstract internal representations that constrain the analysis of subsequent speech inputs (Wildgruber et al. 2005). Taken together these findings, in conjunction with earlier reports on the involvement of these areas, indicate that cortical audio‐visual integration begins at an early stage of processing in the auditory cortex, superior parietal cortex and the middle temporal gyrus, mostly in the left hemisphere.…”
Section: Discussion | Citation type: mentioning | Confidence: 99%
“…In addition, electroencephalographic findings showed audio‐visual interaction that speeds up cortical processing of auditory signals within 100 msec of signal onset (Wildgruber et al. 2005). Interestingly, electrophysiological studies on early (<200 msec) audio‐visual speech interactions in auditory cortex indicate a probable role of sensory, attentional and task‐related factors in modulating these early interactions (Besle et al.…”
Section: Introduction | Citation type: mentioning | Confidence: 99%
“…Consensus suggests a right hemisphere advantage for emotional (non-semantic) aspects of language and prosody (Mitchell and Crow, 2005), particularly in the right STS and MTG, whereas more semantic aspects of language recruit more left-sided regions (Mitchell et al, 2003). This hemispheric pattern has been seen using fMRI (Mitchell et al, 2003) (Wildgruber et al, 2005), transcranial Doppler ultrasonography (Vingerhoets et al, 2003), repetitive transcranial magnetic stimulation (van Rijin et al, 2005), event-related potential (Eckstein and Friederici, 2005), and in lesion studies (Pell, 2006). Therefore, it is possible, that the tone processing impairment on the right in the SPD subjects is related to non-semantic aspects of their odd speech.…”
Section: Discussion | Citation type: mentioning | Confidence: 90%