2016
DOI: 10.3389/fnhum.2016.00518
Recruitment of Language-, Emotion- and Speech-Timing Associated Brain Regions for Expressing Emotional Prosody: Investigation of Functional Neuroanatomy with fMRI

Abstract: We aimed to progress understanding of prosodic emotion expression by establishing brain regions active when expressing specific emotions, those activated irrespective of the target emotion, and those whose activation intensity varied depending on individual performance. BOLD contrast data were acquired whilst participants spoke nonsense words in happy, angry or neutral tones, or performed jaw movements. Emotion-specific analyses demonstrated that when expressing angry prosody, activated brain regions included…


Cited by 10 publications (8 citation statements)
References 138 publications
“…As already mentioned when the temporal pole and the uncinate fasciculus were addressed, left orbitofrontal cortex plays a role in emotional-semantic processing, particularly when emotions have to be perceived in a cognitive task (Ethofer et al, 2006). By contrast, the production of emotional speech seems to be served by the anterior cingulate cortex and subcortical mechanisms through the basal ganglia (Mitchell et al, 2016). Apart from emotion processing, orbitofrontal cortex comprises an area for secondary olfactory representations which is activated by lexical items that semantically refer to olfactory sensation (Pomp et al, 2018).…”
Section: Orbitofrontal Cortex and Emotion Processing
confidence: 96%
“…However, the detailed pathomechanisms of subcortical language symptoms are difficult to assess, and in many cases additional cortical or white matter lesions cannot strictly be ruled out (Radanovic and Mansur, 2017). Nevertheless, fMRI studies on language processing have shown that the basal ganglia are involved in a variety of specific language-related tasks such as perceptual category learning (Lim et al, 2014), the management of word categories (Bonhage et al, 2015), emotional prosody (Mitchell et al, 2016), ambiguity resolution (Ketteler et al, 2008), control of speaking rate (Riecker et al, 2005), and control of infant-directed speech (Matsuda et al, 2014).…”
Section: Subcortical Circuits
confidence: 99%
“…Dorsal mPFC is structurally connected with premotor and somatosensory areas (Öngür et al 2003), and has been associated with own choice execution (Nicolle et al 2012) and representing goal oriented social schemata (Krueger et al 2009). The basal ganglia are part of a subcortical network involved in emotional prosody production (Aziz-Zadeh et al 2010;Laukka et al 2011;Pichon and Kell 2013;Frühholz et al 2015;Mitchell et al 2016) and are thought to have regulatory functional connectivity with the amygdala, motor and auditory cortices during affective vocal control (Pichon and Kell 2013;Frühholz et al 2015;Klaas et al 2015).…”
Section: Linking Social Information With Motor Planning
confidence: 99%
“…The expression of such affective vocalizations has been proposed to rely on the interaction of a dual-pathway system consisting of the neocortical regions of the VMN and a phylogenetically older network of subcortical brain structures such as the basal ganglia and the amygdala (Ackermann et al 2014;Hage and Nieder 2016). In line with this, voluntary affective vocal expression engages both vocomotor areas related to volitional expression and areas related to processing affect, such as the IFG, BG, ACC, STC and amygdala (Barrett et al 2004;Aziz-Zadeh et al 2010;Laukka et al 2011;Pichon and Kell 2013;Frühholz et al 2015;Klaas et al 2015;Belyk and Brown 2016;Mitchell et al 2016;Klasen et al 2018). This interplay of affect-processing streams and the vocomotor network therefore suggests that some informational integration is necessary to achieve the successful expression of affect in the voice.…”
confidence: 99%
“…To cope with this complexity, and recognise the right prosody, the human brain needs to combine acoustic information with previous knowledge [2]. Thus, our first step was defining a conceptual model that formalises such knowledge as a set of factors affecting prosody emission.…”
Section: Introduction
confidence: 99%