2016
DOI: 10.1016/j.cub.2015.12.056

Distinct Computational Principles Govern Multisensory Integration in Primary Sensory and Association Cortices

Abstract: Human observers typically integrate sensory signals in a statistically optimal fashion into a coherent percept by weighting them in proportion to their reliabilities. An emerging debate in neuroscience is to what extent multisensory integration emerges already in primary sensory areas or is deferred to higher-order association areas. This fMRI study used multivariate pattern decoding to characterize the computational principles that define how auditory and visual signals are integrated into spatial representa…
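
The "weighting in proportion to their reliabilities" the abstract refers to is the standard maximum-likelihood (inverse-variance) cue-combination rule. Below is a minimal illustrative sketch of that rule, not the paper's analysis pipeline; the function name and numeric values are assumptions for illustration only.

# Minimal sketch of reliability-weighted (inverse-variance) cue fusion.
# Illustrative only; names and values are not from the paper.
def fuse_cues(mu_a, sigma_a, mu_v, sigma_v):
    # Reliability of each cue is the inverse of its variance.
    r_a, r_v = 1.0 / sigma_a**2, 1.0 / sigma_v**2
    # Each cue is weighted by its reliability, normalized to sum to 1.
    w_a = r_a / (r_a + r_v)
    w_v = r_v / (r_a + r_v)
    mu_fused = w_a * mu_a + w_v * mu_v
    # Fused variance is the inverse of the summed reliabilities.
    sigma_fused = (1.0 / (r_a + r_v)) ** 0.5
    return mu_fused, sigma_fused

# Hypothetical example: a noisy auditory location and a sharper visual one.
print(fuse_cues(mu_a=10.0, sigma_a=4.0, mu_v=6.0, sigma_v=2.0))

With these illustrative numbers, the fused estimate (about 6.8) lies closer to the more reliable visual cue, and its standard deviation (about 1.79) is smaller than either unisensory value, which is the behavioral signature of statistically optimal integration.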

Cited by 141 publications (180 citation statements) | References: 38 publications
“…At the neural level, this is in line with studies of visual perceptual learning that observed changes in activity patterns in the anterior cingulate cortex to track changes in decision-making during visual perceptual learning [34, 37]. Furthermore, neural evidence suggests that prediction error signals during perceptual learning refine and strengthen neural connectivity between sensory neurons and those neurons required for the perceptual response and thus may support changes in higher-order regions [58]. Thus, in the absence of an informative reinforcement signal, rapid but transient changes in perceptual plasticity are likely due to changes in low-level sensory areas.…”
Section: Discussion (supporting)
confidence: 79%
“…Thus, in the absence of an informative reinforcement signal, rapid but transient changes in perceptual plasticity are likely due to changes in low-level sensory areas. Future investigations will be necessary to determine if changes in the connectivity of higher-order cortical areas and low-level sensory processes underlie the observed changes in temporal recalibration and if these changes are durable or transient (see [58] for a helpful review in this regard).…”
Section: Discussion (mentioning)
confidence: 99%
“…However, recent studies suggest that there may be no generic answer to this question, as multisensory processing likely involves a distributed set of task- and function-specific regions (Bizley et al, 2016, Werner and Noppeney, 2010). In line with this hypothesis, two recent fMRI studies have illustrated how the computational nature of Audio-visual interactions changes from low-level sensory to high-level parietal cortices (Rohe and Noppeney, 2014, Rohe and Noppeney, 2016). …”
Section: Discussion (mentioning)
confidence: 79%
“…While there is an emerging consensus that the underlying neural correlates likely involve multiple stages of the sensory decision making pathways, it remains a challenge to uncover the dynamic processes that implement the multisensory benefit for an upcoming decision in the human brain (Bizley et al, 2016, Kayser and Shams, 2015, Rohe and Noppeney, 2014, Rohe and Noppeney, 2016). For example, many studies have shown that judgements about visual motion can be influenced by simultaneous sounds (Alais and Burr, 2004, Beer and Roder, 2004, Lewis and Noppeney, 2010, Schmiedchen et al, 2012) or vestibular information (Fetsch et al, 2010, Gu et al, 2008), even so when the multisensory stimulus is not directly task relevant (Gleiss and Kayser, 2014b, Kim et al, 2012, Sekuler et al, 1997).…”
Section: Introduction (mentioning)
confidence: 99%