In less than three decades, the concept of “cerebellar neurocognition” has evolved from a mere afterthought into an entirely new and multifaceted area of neuroscientific research. A close interplay between three main strands of contemporary neuroscience has substantially modified the traditional view of the cerebellum as a mere coordinator of autonomic and somatic motor functions. Indeed, the wealth of current evidence derived from detailed neuroanatomical investigations, functional neuroimaging studies of healthy subjects and patients, and in-depth neuropsychological assessment of patients with cerebellar disorders shows that the cerebellum plays a cardinal role in affective regulation, cognitive processing, and linguistic function. Although considerable progress has been made in models of cerebellar function, controversy remains regarding the exact role of the “linguistic cerebellum” in a broad variety of nonmotor language processes. This consensus paper brings together a range of viewpoints and opinions on the contribution of the cerebellum to language function. Recent developments and insights into the nonmotor modulatory role of the cerebellum in language and related disorders are discussed. The role of the cerebellum in speech and language perception, in motor speech planning (including apraxia of speech), in verbal working memory, in phonological and semantic verbal fluency, in syntax processing, in the dynamics of language production, in reading, and in writing is addressed. In addition, the functional topography of the linguistic cerebellum and the contribution of the deep cerebellar nuclei to linguistic function are briefly discussed. As such, this consensus paper offers a framework for debate and discussion.
These data provide evidence for two levels of speech motor control, most likely bound to motor preparation and execution processes, respectively. They also help to explain clinical observations such as an unimpaired or even accelerated speaking rate in Parkinson’s disease, and a slowed speech tempo in cerebellar disorders that nevertheless does not fall below a rate of 3 Hz.
In addition to the propositional content of verbal utterances, significant linguistic and emotional information is conveyed by the tone of speech. To differentiate brain regions subserving processing of linguistic and affective aspects of intonation, discrimination of sentences differing in linguistic accentuation and emotional expressiveness was evaluated by functional magnetic resonance imaging. Both tasks yielded rightward lateralization of hemodynamic responses at the level of the dorsolateral frontal cortex as well as bilateral thalamic and temporal activation. Processing of linguistic and affective intonation, thus, seems to be supported by overlapping neural networks comprising partially right-sided brain regions. Comparison of hemodynamic activation during the two different tasks, however, revealed bilateral orbito-frontal responses restricted to the affective condition as opposed to activation of the left lateral inferior frontal gyrus confined to evaluation of linguistic intonation. These findings indicate that distinct frontal regions contribute to higher level processing of intonational information depending on its communicational function. In line with other components of language processing, discrimination of linguistic accentuation seems to be lateralized to the left inferior-lateral frontal region whereas bilateral orbito-frontal areas subserve evaluation of emotional expressiveness.
Rapid syllable repetitions require alternating articulatory movements and thus provide a test of oral diadochokinesis. The present study performed an acoustic analysis of rapid syllable repetitions in patients suffering from idiopathic Parkinson’s disease (n = 17), Huntington’s chorea (n = 14), Friedreich’s ataxia (n = 9), or a purely cerebellar syndrome (n = 13). Four parameters were considered: the mean number of syllables per train, the median syllable duration with its variation coefficient, and articulatory imprecision in terms of the percentage of incomplete closures. Apart from a few subjects with only minor motor deficits, at least one of the four measures of diadochokinesis exceeded the normal range in every patient. Accordingly, discriminant analysis revealed a highly significant difference between controls and patients with respect to these parameters. Oral diadochokinesis tasks thus represent a sensitive measure of orofacial motor impairment. Moreover, multivariate analysis showed that Parkinson’s disease and Friedreich’s ataxia are each characterized by a highly specific profile of diadochokinesis performance.
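The four acoustic parameters named above are straightforward to compute once the syllable trains have been segmented. The following minimal sketch illustrates this; the input format, the example values, and the function name are assumptions for illustration, not taken from the study itself.

```python
import statistics

# Hypothetical illustration of the four diadochokinesis (DDK) parameters:
# mean syllables per train, median syllable duration, its variation
# coefficient, and the percentage of incomplete closures.

def ddk_parameters(trains):
    """trains: list of repetition trains; each train is a list of
    (duration_in_seconds, closure_complete) tuples, one per syllable."""
    syllables = [syl for train in trains for syl in train]
    durations = [d for d, _ in syllables]

    mean_syllables_per_train = len(syllables) / len(trains)
    median_duration = statistics.median(durations)
    # variation coefficient: SD of syllable duration relative to its mean
    variation_coeff = statistics.stdev(durations) / statistics.mean(durations)
    pct_incomplete = 100 * sum(1 for _, ok in syllables if not ok) / len(syllables)

    return {
        "syllables_per_train": mean_syllables_per_train,
        "median_syllable_duration_s": median_duration,
        "variation_coefficient": variation_coeff,
        "incomplete_closures_pct": pct_incomplete,
    }

# Example: two short repetition trains with made-up durations; the third
# syllable of the first train has an incomplete articulatory closure.
trains = [
    [(0.19, True), (0.21, True), (0.20, False), (0.22, True)],
    [(0.18, True), (0.20, True), (0.23, True)],
]
print(ddk_parameters(trains))
```

In practice the syllable boundaries and closure judgments would come from acoustic segmentation of the recorded repetition trains; the computation of the summary parameters is the simple part shown here.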
Background: Individuals suffering from vision loss of a peripheral origin may learn to understand spoken language at a rate of up to about 22 syllables (syl) per second - exceeding by far the maximum performance level of normal-sighted listeners (ca. 8 syl/s). To further elucidate the brain mechanisms underlying this extraordinary skill, functional magnetic resonance imaging (fMRI) was performed in blind subjects of varying ultra-fast speech comprehension capabilities and sighted individuals while listening to sentence utterances of a moderately fast (8 syl/s) or ultra-fast (16 syl/s) syllabic rate.
Results: Besides left inferior frontal gyrus (IFG), bilateral posterior superior temporal sulcus (pSTS) and left supplementary motor area (SMA), blind people highly proficient in ultra-fast speech perception showed significant hemodynamic activation of right-hemispheric primary visual cortex (V1), contralateral fusiform gyrus (FG), and bilateral pulvinar (Pv).
Conclusions: Presumably, FG supports the left-hemispheric perisylvian “language network”, i.e., IFG and superior temporal lobe, during the (segmental) sequencing of verbal utterances whereas the collaboration of bilateral pulvinar, right auditory cortex, and ipsilateral V1 implements a signal-driven timing mechanism related to syllabic (suprasegmental) modulation of the speech signal. These data structures, conveyed via left SMA to the perisylvian “language zones”, might facilitate – under time-critical conditions – the consolidation of linguistic information at the level of verbal working memory.
During speech perception, acoustic correlates of syllable structure and pitch periodicity are directly reflected in electrophysiological brain activity. Magnetoencephalography (MEG) recordings were made while 10 participants listened to natural or formant-synthesized speech at a moderately fast or ultra-fast rate. Cross-correlation analysis was applied to show brain activity time-locked to the speech envelope, to an acoustic marker of syllable onsets, and to pitch periodicity. The envelope yielded a right-lateralized M100-like response, syllable onsets gave rise to M50/M100-like fields with an additional anterior M50 component, and pitch (ca. 100 Hz) elicited a neural resonance bound to a central auditory source at a latency of 30 ms. The strength of these MEG components showed differential effects of syllable rate and of natural versus synthetic speech. Presumably, such phase-locking mechanisms serve as neuronal triggers for the extraction of information-bearing elements.
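The cross-correlation logic used to detect activity time-locked to the speech envelope can be sketched in a few lines. This is a toy illustration, not the authors' analysis pipeline: both the "envelope" and the "MEG channel" below are synthetic signals, with the neural response simulated as a delayed, noisy copy of the envelope so that the cross-correlation peak recovers the imposed latency.

```python
import numpy as np

# Minimal sketch of cross-correlation between a speech envelope and a
# simulated neural signal. All signals and parameters here are assumptions.

fs = 1000                       # sampling rate in Hz
t = np.arange(0, 2.0, 1 / fs)   # 2 s of signal
rng = np.random.default_rng(0)

# Synthetic speech envelope: ~5 Hz syllabic modulation
envelope = 0.5 * (1 + np.sin(2 * np.pi * 5 * t))

# Simulated MEG channel: envelope delayed by 100 ms plus noise,
# mimicking an envelope-following (M100-like) response
lag_samples = 100               # 100 ms at 1 kHz
neural = np.roll(envelope, lag_samples) + 0.1 * rng.standard_normal(t.size)

# Cross-correlate at non-negative lags (0..300 ms) and locate the peak
max_lag = 300
env0 = envelope - envelope.mean()
neu0 = neural - neural.mean()
xcorr = np.array([np.dot(env0[: t.size - k], neu0[k:]) for k in range(max_lag + 1)])
peak_latency_ms = int(np.argmax(xcorr))   # should land near the imposed 100 ms
print(peak_latency_ms)
```

In a real MEG analysis the same principle applies per sensor or per source, with the latency and strength of the cross-correlation peak characterizing how tightly the brain response is phase-locked to the envelope, syllable-onset, or pitch signal.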