The human language faculty has been claimed to be grounded in the ability to process hierarchically structured sequences. This human ability goes beyond the capacity, observed in non-human primates, to process sequences governed by simple transitional probabilities between adjacent elements. Here we show that the processing of these two sequence types is supported by different areas in the human brain. Processing of local transitions is subserved by the left frontal operculum, a region that is phylogenetically older than Broca's area, which is specifically responsible for computing hierarchical dependencies. Tractography data revealing differential structural connectivity signatures for these two brain areas provide additional evidence for the segregation of two areas in the left inferior frontal cortex.
In contrast to simple structures in animal vocal behavior, hierarchical structures such as center-embedded sentences manifest the core computational faculty of human language. Previous artificial grammar learning studies found that the left pars opercularis (LPO) subserves the processing of hierarchical structures. However, it is not clear whether this area is activated by the structural complexity per se or by the increased memory load entailed in processing hierarchical structures. To dissociate the effect of structural complexity from the effect of memory cost, we conducted a functional magnetic resonance imaging study of German sentence processing with a 2-way factorial design tapping structural complexity (with/without hierarchical structure, i.e., center-embedding of clauses) and working memory load (long/short distance between syntactically dependent elements, i.e., subject nouns and their respective verbs). Functional imaging data revealed that the processes for structure and memory operate separately but co-operatively in the left inferior frontal gyrus; activity in the LPO increased as a function of structural complexity, whereas activity in the left inferior frontal sulcus (LIFS) was modulated by the distance over which the syntactic information had to be transferred. Diffusion tensor imaging showed that these 2 regions were interconnected through white matter fibers. Moreover, functional coupling between the 2 regions was found to increase during the processing of complex, hierarchically structured sentences. These results suggest a neuroanatomical segregation of syntax-related aspects represented in the LPO from memory-related aspects reflected in the LIFS, which are, however, highly interconnected functionally and anatomically.

Keywords: DTI | fMRI | hierarchical structure

Language appears to be a trait specific to humans, at least in its core computational component, that is, grammar.
Defining language as a sequence of symbols, Chomsky (1) proposed a hierarchy of grammars as language production mechanisms with increasing generative power. The lowest-level grammar is finite state grammar (FSG). FSG can be fully specified by transition probabilities between a finite number of states (e.g., words), but it is not powerful enough to generate the structures of natural human languages. Phrase structure grammar (PSG) has more generative power than FSG. A key difference between FSG and PSG is that only PSG can generate the sequence AⁿBⁿ, where A and B denote symbols and n the number of repetitions. The ability to process the sequence AⁿBⁿ is crucial for the processing of center-embedded sentences, such as "The man the boy the dog bit greeted is my friend," where the subjects (i.e., the man, the boy, and the dog) are A-symbols and the verbs (bit, greeted, and is) are B-symbols. Surprisingly, tests on monkeys (2) and on songbirds (3) showed that whereas songbirds can process AⁿBⁿ sequences, monkeys cannot. However, even if the birds could correctly discriminate AⁿBⁿ sequences from AⁿBᵐ (4 > n, m > 0, n ≠ m),...
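The FSG/PSG distinction can be made concrete with a small sketch: recognizing AⁿBⁿ requires remembering how many A's preceded the B's (an unbounded counter, beyond finite-state power), whereas a pattern such as (AB)ⁿ is purely finite-state. A minimal illustration in Python; the function names are ours, for illustration only:

```python
import re

def matches_anbn(seq: str) -> bool:
    """True if seq has the form A^n B^n (n >= 1).

    The regex alone is finite-state; the extra length comparison
    supplies the counting that no finite state grammar can do
    for unbounded n.
    """
    m = re.fullmatch(r"(A+)(B+)", seq)
    return bool(m) and len(m.group(1)) == len(m.group(2))

def matches_ab_repeated(seq: str) -> bool:
    """True if seq has the form (AB)^n -- a finite-state pattern,
    recognizable with no counting at all."""
    return bool(re.fullmatch(r"(AB)+", seq))
```

For example, `matches_anbn("AABB")` holds while `matches_anbn("AAB")` does not, mirroring the discrimination task posed to the songbirds and monkeys.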
Speech is an important carrier of emotional information. However, little is known about how different vocal emotion expressions are recognized in a receiver's brain. We used multivariate pattern analysis of functional magnetic resonance imaging data to investigate the degree to which distinct vocal emotion expressions are represented in the receiver's local brain activity patterns. Specific vocal emotion expressions are encoded in a right fronto-operculo-temporal network involving temporal regions known to subserve suprasegmental acoustic processes and a fronto-opercular region known to support emotional evaluation, and, moreover, in left temporo-cerebellar regions covering sequential processes. The right inferior frontal region, in particular, was found to differentiate distinct emotional expressions. The present analysis reveals vocal emotion to be encoded in a shared cortical network reflected by distinct brain activity patterns. These results shed new light on theoretical and empirical controversies about the perception of distinct vocal emotion expressions at the level of large-scale human brain signals.
The study investigates to what extent the posterior superior temporal cortex is involved in processing complex sentences. Using functional MRI, we show that hierarchically structured sentences activate the superior temporal cortex bilaterally to a greater extent than sentences with a linear structure. The activation in the left hemisphere comprises the superior temporal gyrus and sulcus, whereas the activation in the right hemisphere is confined to the superior temporal sulcus. As earlier studies using similar syntactic structures in semantic-free grammars did not show activation in the superior temporal cortex but instead only in the prefrontal cortex, we conclude that the role of the posterior superior temporal cortex is to integrate lexical-semantic and syntactic information during sentence comprehension.
In this paper, we present two novel perspectives on the function of the left inferior frontal gyrus (LIFG). First, a structured sequence processing perspective facilitates the search for functional segregation within the LIFG and provides a way to express common aspects across cognitive domains including language, music and action. Converging evidence from functional magnetic resonance imaging and transcranial magnetic stimulation studies suggests that the LIFG is engaged in sequential processing in artificial grammar learning, independently of the particular stimulus features of the elements (whether letters, syllables or shapes are used to build up sequences). The LIFG has been repeatedly linked to the processing of artificial grammars across all grammars tested, whether they include non-adjacent dependencies or only adjacent dependencies. Second, we apply the sequence processing perspective to understand how the functional segregation of semantics, syntax and phonology in the LIFG can be integrated in the general organization of the lateral prefrontal cortex (PFC). Recently, it was proposed that the functional organization of the lateral PFC follows a rostro-caudal gradient, such that more abstract processing in cognitive control is subserved by more rostral regions of the lateral PFC. We explore the literature from the viewpoint that functional segregation within the LIFG can be embedded in a general rostro-caudal abstraction gradient in the lateral PFC. If the lateral PFC follows a rostro-caudal abstraction gradient, then the LIFG should follow the same principles, but this prediction has not yet been tested or explored in the LIFG literature. Such integration might provide further insights into the functional architecture of the LIFG and the lateral PFC.
The ability to process center-embedded structures has been claimed to represent a core function of the language faculty. Recently, several studies have investigated the learning of center-embedded dependencies in artificial grammar settings. Yet some of the results seem to question the learnability of these structures in artificial grammar tasks. Here, we tested under which exposure conditions learning of center-embedded structures in an artificial grammar is possible. We used naturally spoken syllable sequences and varied the presence of prosodic cues. The results suggest that mere distributional information does not suffice for successful learning. Prosodic cues marking the boundaries of the major relevant units, however, can lead to learning success. Thus, our data are consistent with the hypothesis that center-embedded syntactic structures can be learned in artificial grammar tasks if language-like acoustic cues are provided.
The frontal cortex mediates cognitive control and motivation to shape human behavior. It is generally observed that medial frontal areas are involved in motivational aspects of behavior, whereas lateral frontal regions are involved in cognitive control. Recent models of cognitive control suggest a rostro-caudal gradient in lateral frontal regions, such that progressively more rostral (anterior) regions process more complex aspects of cognitive control. How motivation influences such a control hierarchy is still under debate. Although some researchers argue that both systems work in parallel, others argue in favor of an interaction between motivation and cognitive control. In the latter case it is yet unclear how motivation would affect the different levels of the control hierarchy. This was investigated in the present functional MRI study applying different levels of cognitive control under different motivational states (low vs high reward anticipation). Three levels of cognitive control were tested by varying rule complexity: stimulus-response mapping (low-level), flexible task updating (mid-level), and sustained cue-task associations (high-level). We found an interaction between levels of cognitive control and motivation in medial and lateral frontal subregions. Specifically, flexible updating (mid-level of control) showed the strongest beneficial effect of reward, and only this level exhibited functional coupling between dopamine-rich midbrain regions and the lateral frontal cortex. These findings suggest that motivation differentially affects the levels of a control hierarchy, influencing recruitment of frontal cortical control regions depending on specific task demands.

Key words: cognitive control; control hierarchy; fMRI; lateral frontal cortex; motivation; reward

Introduction

Our decisions are driven by multiple factors, such as attention, motivation, emotion, and cognitive control. However, how these factors interact is not well understood.
In particular, it is still under debate whether motivation and cognitive control operate independently to influence decision making (Kouneiher et al., 2009) or whether these cognitive systems can also interact (Dreisbach and Goschke, 2004; Pessoa and Engelmann, 2010; Aarts et al., 2011). The lateral frontal cortex is a critical node in brain networks involved in cognitive control (Miller and Cohen, 2001; Fuster, 2004; Petrides, 2005; Duncan, 2010). Recent models of goal-directed behavior propose a hierarchical organization of the lateral frontal cortex as a function of different levels of cognitive control (Koechlin et al., 2003; Fuster, 2004; Badre and D'Esposito, 2009; Bahlmann et al., 2014). These models suggest that lower levels of cognitive control, such as choosing a specific motor response, are integrated within higher levels of cognitive control that guide behavior over longer time lags and at more complex levels of action contingency. Importantly, different levels of cognitive control are proposed to be represented in lateral frontal cortex along a rostro-caudal gradi...