Abstract: Excitability of articulatory motor cortex is facilitated when listening to speech in challenging conditions. Beyond this, however, we have little knowledge of what listener-specific and speech-specific factors engage articulatory facilitation during speech perception. For example, it is unknown whether speech motor activity is independent of, or dependent on, the form of distortion in the speech signal. It is also unknown if speech motor facilitation is moderated by hearing ability. We investigated these questions …
“…This is in keeping with previous reports of behavioural changes post-TMS, which predominantly manifest in a change in response time (Devlin, Matthews, & Rushworth, 2003; Krieger-Redwood, Gaskell, Lindsay, & Jefferies, 2013; Pobric, Jefferies, & Ralph, 2007; Whitney, Kirk, O'Sullivan, Lambon Ralph, & Jefferies, 2011). Surprisingly, MEPs were not modulated by distorted sentences, despite this form of distortion modulating MEPs to pre-lexical stimuli (Nuttall, Kennedy-Higgins, Devlin, & Adank, 2017; Nuttall et al., 2016). MEPs were not affected by cTBS when considered at group level.…”
Primary motor (M1) areas for speech production activate during speech perception. It has been suggested that such activation may be dependent upon modulatory inputs from premotor cortex (PMv). If and how PMv differentially modulates M1 activity during perception of speech that is easy or challenging to understand, however, is unclear. This study aimed to test the link between PMv and M1 during challenging speech perception in two experiments. The first experiment investigated intra-hemispheric connectivity between left hemisphere PMv and left M1 lip area during comprehension of speech under clear and distorted listening conditions. Continuous theta burst stimulation (cTBS) was applied to left PMv in eighteen participants (aged 18-35). Post-cTBS, participants performed a sentence verification task on distorted (imprecisely articulated) and clear speech, whilst also undergoing stimulation of the lip representation in the left M1 to elicit motor evoked potentials (MEPs). In a second, separate experiment, we investigated the role of inter-hemispheric connectivity between right hemisphere PMv and left hemisphere M1 lip area. Dual-coil transcranial magnetic stimulation was applied to right PMv and left M1 lip in fifteen participants (aged 18-35). Results indicated that disruption of PMv during speech perception affects comprehension of distorted speech specifically. Furthermore, our data suggest that listening to distorted speech modulates the balance of intra- and inter-hemispheric interactions, with a larger sensorimotor network implicated during comprehension of distorted speech than when speech perception is optimal. The present results further understanding of PMv-M1 interactions during auditory-motor integration.
“…Based on research into the relationship between action and language, the view that both phonetic and semantic processing share cognitive and neural resources with the sensorimotor system has gained increasing acceptance (Barsalou; Fischer & Zwaan; Willems & Hagoort). For example, comprehending the meanings of words semantically related to body parts (e.g., eat, mouth) activates brain regions also engaged in moving these body parts (Andrews, Frank, & Vigliocco; Meteyard, Cuadrado, Bahrami, & Vigliocco; Pulvermüller), just as decoding sounds recruits regions responsible for the production of these sounds (Nuttall, Kennedy-Higgins, Devlin, & Adank; Skipper, Devlin, & Lametti).…”
Section: Hierarchical Structure in Language and Action
This review compares how humans process action and language sequences produced by other humans. On the one hand, we identify commonalities between action and language processing in terms of cognitive mechanisms (e.g., perceptual segmentation, predictive processing, integration across multiple temporal scales), neural resources (e.g., the left inferior frontal cortex), and processing algorithms (e.g., comprehension based on changes in signal entropy). On the other hand, drawing on sign language with its particularly strong motor component, we also highlight what differentiates (both oral and signed) linguistic communication from nonlinguistic action sequences. We propose the multiscale information transfer framework (MSIT) as a way of integrating these insights and highlight directions into which future empirical research inspired by the MSIT framework might fruitfully evolve.
This article is categorized under:
Psychology > Language
Linguistics > Language in Mind and Brain
Psychology > Motor Skill and Performance
Psychology > Prediction
“…Changes in the size of MEPs reflect changes in the excitability of the motor pathways connecting the cortical representations with the corresponding muscles. Using this technique, several studies have demonstrated that the excitability of the primary motor cortex, which controls articulatory gestures to produce speech, is enhanced during listening to speech (Fadiga et al., 2002; Murakami et al., 2011; Murakami et al., 2013; Nuttall et al., 2017; Nuttall et al., 2016; Watkins et al., 2003).…”
Comprehending speech can be particularly challenging in a noisy environment and in the absence of semantic context. It has been proposed that the articulatory motor system is recruited especially in difficult listening conditions. However, it remains unknown how signal-to-noise ratio (SNR) and semantic context affect the recruitment of the articulatory motor system when listening to continuous speech. The aim of the present study was to address the hypothesis that involvement of the articulatory motor cortex increases when the intelligibility and clarity of the spoken sentences decrease, because of noise and the lack of semantic context. We applied Transcranial Magnetic Stimulation (TMS) to the lip and hand representations in the primary motor cortex and measured motor evoked potentials from the lip and hand muscles, respectively, to evaluate motor excitability when young adults listened to sentences. In Experiment 1, we found that the excitability of the lip motor cortex was facilitated during listening to both semantically anomalous and coherent sentences in noise relative to non-speech baselines, but neither SNR nor semantic context modulated the facilitation. In Experiment 2, we replicated these findings and found no difference in the excitability of the lip motor cortex between sentences in noise and clear sentences without noise. Thus, our results show that the articulatory motor cortex is involved in speech processing even in optimal and ecologically valid listening conditions and that its involvement is not modulated by the intelligibility and clarity of speech.
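The measure underlying these studies is the peak-to-peak amplitude of the motor evoked potential recorded from the target muscle, compared across listening conditions. The sketch below is a minimal illustration of that computation, not the authors' analysis pipeline; the sampling rate, post-stimulus window, and simulated EMG traces are assumptions chosen for demonstration only.

```python
# Illustrative sketch: peak-to-peak MEP amplitude from an EMG trace,
# and a facilitation ratio between a speech-listening trial and a
# non-speech baseline trial. All parameters are assumed values.
import numpy as np

def mep_amplitude(emg, fs, tms_onset_s, win=(0.015, 0.045)):
    """Peak-to-peak amplitude within an assumed 15-45 ms
    post-stimulus window (typical latency range for lip/hand MEPs)."""
    start = int((tms_onset_s + win[0]) * fs)
    stop = int((tms_onset_s + win[1]) * fs)
    segment = emg[start:stop]
    return segment.max() - segment.min()

fs = 5000                               # sampling rate in Hz (assumed)
t = np.arange(0, 0.2, 1 / fs)           # 200 ms of simulated recording
rng = np.random.default_rng(0)
noise = rng.normal(0, 0.01, t.size)     # background EMG noise (mV)

# Simulated MEP: a Gaussian deflection peaking ~25 ms after TMS at 0.1 s.
mep = 0.5 * np.exp(-((t - 0.125) ** 2) / (2 * 0.003 ** 2))
trial_speech = noise + mep              # trial with a clear MEP
trial_baseline = noise                  # baseline trial, noise only

amp_speech = mep_amplitude(trial_speech, fs, tms_onset_s=0.1)
amp_baseline = mep_amplitude(trial_baseline, fs, tms_onset_s=0.1)

# Ratio > 1 would indicate enhanced corticobulbar excitability in the
# speech condition relative to baseline.
facilitation = amp_speech / amp_baseline
```

In practice such amplitudes are averaged over many trials per condition before condition contrasts (e.g., noise vs. clear, anomalous vs. coherent) are tested statistically.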