2018
DOI: 10.1523/jneurosci.3650-17.2018

The Motor Network Reduces Multisensory Illusory Perception

Abstract: Observing mouth movements has striking effects on the perception of speech. Any mismatch between sound and mouth movements will result in listeners perceiving illusory consonants (McGurk effect), whereas matching mouth movements assist with the correct recognition of speech sounds. Recent neuroimaging studies have yielded evidence that motor areas are involved in speech processing, yet their contributions to multisensory illusion remain unclear. Using functional magnetic resonance imaging (fMRI) and tran…

Cited by 13 publications (26 citation statements)

References 44 publications (36 reference statements)
“…Several previous studies have tied the IFG to AV fusion, but its specific contribution is still unclear 15,26,59,60. The current results suggest that the early IFG involvement in AV integration does not relate to feature identification 61, but is specific to AV timing. We found that IFG activity tracked temporal AV asynchrony, at least within the range used in the experiment (from −120 ms auditory lead to 320 ms auditory lag).…”
Section: The Inferior Frontal Gyrus Tracks AV Temporal Asynchrony
mentioning
confidence: 52%
“…Indeed, unimodal neural entrainment to the speech envelope or mouth movements is driven by neural sources located in the left motor and premotor cortex (Park et al., 2015; Park et al., 2018). A precentral origin of these top-down modulations is also suggested by the fact that corticobulbar excitability is modulated by passive listening to speech (Fadiga et al., 2002; Watkins et al., 2003; D'Ausilio et al., 2014; Schmitz et al., 2018) and that transient perturbation of activity in premotor and motor areas produces somatotopically organized modulation of speech discrimination performance (D'Ausilio et al., 2009; Mottonen et al., 2009; Sato et al., 2009; D'Ausilio et al., 2012; Bartoli et al., 2015; Murakami et al., 2018).…”
Section: Discussion
mentioning
confidence: 99%
“…For correlation analyses of brain connectivity with McGurk illusion rates, the same auditory seed was used, as well as two other areas related to audio-visual integration: the left motor lip area and the left STS. The seed coordinates for the lip area were based on the average individual activation peaks in response to the McGurk task from Murakami and associates 41. For the seed in the left STS, coordinates from the same study, used there in an effective connectivity measure, were adopted.…”
Section: Data Acquisition
mentioning
confidence: 99%
“…Early auditory and visual areas are involved in the generation of the fused McGurk percept in healthy participants 38. Additionally, areas crucial for speech production, including the lip area of the primary motor cortex, have been found to be involved in the perception of audio-visual speech stimuli [39][40][41][42]. Bottom-up sensory input is thereby believed to be integrated with top-down information from frontal and motor areas in the posterior superior temporal sulcus (STS) 38,43,44.…”
mentioning
confidence: 99%