2020
DOI: 10.1002/hbm.25309

Categorizing human vocal signals depends on an integrated auditory‐frontal cortical network

Abstract: Voice signals are relevant for auditory communication and suggested to be processed in dedicated auditory cortex (AC) regions. While recent reports highlighted an additional role of the inferior frontal cortex (IFC), a detailed description of the integrated functioning of the AC–IFC network and its task relevance for voice processing is missing. Using neuroimaging, we tested sound categorization while human participants either focused on the higher‐order vocal‐sound dimension (voice task) or feature‐based inte…

Citation Types: 3 supporting, 12 mentioning, 0 contrasting

Cited by 9 publications (18 citation statements)
References: 61 publications (125 reference statements)

“…This individualized peak approach allows targeting those regions on the single‐subject level that are most likely to drive ongoing neural processes in the group level while ensuring that individual regions remain comparable. The approach is in line with prior comparable studies examining task‐related DCM effects (Heim et al, 2009; Kleineberg et al, 2018; Roswandowitz, Swanborough, & Frühholz, 2021).…”
Section: Methods (supporting)
Confidence: 73%
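The excerpt above describes selecting subject-specific activation peaks, constrained by the group-level result, before defining the regions used for the DCM analysis. The paper's exact procedure is not reproduced here; the sketch below only illustrates how such a constrained peak search could look, assuming SPM-style t-maps in MNI space. The group coordinate, search radius, and file names are illustrative placeholders, not values from the study.

```python
# Minimal sketch of an "individualized peak" search: for each subject, find the
# peak voxel of a contrast t-map within a sphere around a group-level coordinate.
# All coordinates, radii, and file names below are assumptions for illustration.
import numpy as np
import nibabel as nib
from nibabel.affines import apply_affine

GROUP_PEAK_MM = np.array([54.0, 16.0, 22.0])  # assumed group-level IFC peak (MNI, mm)
SEARCH_RADIUS_MM = 10.0                       # assumed search-sphere radius

def individual_peak(tmap_path, group_peak_mm=GROUP_PEAK_MM, radius_mm=SEARCH_RADIUS_MM):
    """Return the subject-specific peak coordinate (MNI mm) of a t-map,
    restricted to a sphere centred on the group-level peak."""
    img = nib.load(tmap_path)
    data = img.get_fdata()

    # World (mm) coordinates of every voxel centre.
    ijk = np.indices(data.shape).reshape(3, -1).T   # (n_voxels, 3) voxel indices
    xyz = apply_affine(img.affine, ijk)             # (n_voxels, 3) mm coordinates

    # Keep only voxels inside the search sphere around the group peak.
    inside = np.linalg.norm(xyz - group_peak_mm, axis=1) <= radius_mm
    if not inside.any():
        raise ValueError("Search sphere does not overlap the image.")

    # Subject-specific peak = voxel with the largest t-value in the sphere
    # (NaNs outside the analysis mask are ignored).
    t_values = np.nan_to_num(data, nan=-np.inf).reshape(-1)[inside]
    return xyz[inside][np.argmax(t_values)]

# Example usage (placeholder file name):
# peak_mm = individual_peak("sub-01_task-voice_tmap.nii.gz")
# print("Subject-specific peak (MNI mm):", peak_mm)
```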
“…For the affective forced choice task, we created blends of vocally expressed anger and fear using a voice morphing procedure (1, 2) on a set of commonly available and validated vocal affective bursts, namely the Montreal affective voice database or 'MAV' (3). From this database we selected five male and five female voices.…”
Section: Results (mentioning)
Confidence: 99%
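The excerpt above refers to a parametric voice-morphing procedure applied to Montreal Affective Voices (MAV) bursts to create anger/fear blends. Such morphing relies on specialized tools and is not reproduced here; as a loose illustration of building a graded blend continuum, the sketch below uses a plain amplitude crossfade. File names, blend weights, and the crossfade method itself are assumptions, not the published procedure.

```python
# Toy illustration only: build a graded continuum between an anger burst and a
# fear burst by linear amplitude mixing. Real voice morphing operates on
# parametric voice representations and is not shown here.
import numpy as np
import soundfile as sf

def blend_continuum(anger_path, fear_path, weights=(0.0, 0.25, 0.5, 0.75, 1.0)):
    """Return crude anger-to-fear blends; weight w is the proportion of 'fear'."""
    anger, sr_a = sf.read(anger_path)
    fear, sr_f = sf.read(fear_path)
    assert sr_a == sr_f, "Recordings must share a sampling rate."

    # Trim both bursts to a common length before mixing.
    n = min(len(anger), len(fear))
    anger, fear = anger[:n], fear[:n]

    blends = {}
    for w in weights:
        mix = (1.0 - w) * anger + w * fear
        mix /= max(np.max(np.abs(mix)), 1e-9)   # peak-normalise to avoid clipping
        blends[w] = mix
    return blends, sr_a

# Example usage (placeholder MAV file names):
# blends, sr = blend_continuum("MAV_m1_anger.wav", "MAV_m1_fear.wav")
# for w, y in blends.items():
#     sf.write(f"blend_fear{int(w * 100):03d}.wav", y, sr)
```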
“…Unlike clear affective expressions in voices which represent categorical certainty as predicted by the second hypothesis, ambiguous vocal affect might trigger computations in the IFC given their challenging perceptual and cognitive processing according to the first hypothesis. Both cases might additionally require integrated IFC-STC functioning in terms of neural connectivity (Roswandowitz, Swanborough et al 2020), either for top-down IFC-to-STC facilitations or as co-representations of categorical affective information (Steiner, Bobin et al 2021). The use of transcranial magnetic stimulation (TMS) appears to be a reliable and non-invasive option to especially inhibit a proper functioning of computations performed in the IFC in this context.…”
Section: Introduction (mentioning)
Confidence: 99%