In humans, the ability to reason about mathematical quantities depends on a frontoparietal network that includes the intraparietal sulcus (IPS). How do nature and nurture give rise to the neurobiology of numerical cognition? We asked how visual experience shapes the neural basis of numerical thinking by studying numerical cognition in congenitally blind individuals. Blind (n = 17) and blindfolded sighted (n = 19) participants solved math equations that varied in difficulty (e.g., 27 − 12 = x vs. 7 − 2 = x), and performed a control sentence comprehension task while undergoing fMRI. Whole-cortex analyses revealed that in both blind and sighted participants, the IPS and dorsolateral prefrontal cortices were more active during the math task than the language task, and activity in the IPS increased parametrically with equation difficulty. Thus, the classic frontoparietal number network is preserved in the total absence of visual experience. However, surprisingly, blind but not sighted individuals additionally recruited a subset of early visual areas during symbolic math calculation. The functional profile of these "visual" regions was identical to that of the IPS in blind but not sighted individuals. Furthermore, in blindness, number-responsive visual cortices exhibited increased functional connectivity with prefrontal and IPS regions that process numbers. We conclude that the frontoparietal number network develops independently of visual experience. In blindness, this number network colonizes parts of deafferented visual cortex. These results suggest that human cortex is highly functionally flexible early in life, and point to frontoparietal input as a mechanism of cross-modal plasticity in blindness.

plasticity | blindness | number | development | vision

Numerical reasoning pervades modern human culture. We readily represent quantity, whether thinking about apples, hours, people, or ideas.
It has been suggested that this competence is rooted in a primitive nonsymbolic system of numerical representation that is shared among adults of diverse cultures, as well as with preverbal infants and nonhuman animals (1, 2). This nonsymbolic system allows these populations to estimate numbers of visual or auditory items and to compute over these quantities. For example, infants and monkeys can detect which of two arrays contains more items, and can add and subtract approximate quantities (1-4). The nonverbal, nonsymbolic system underlying this performance represents number in an inherently approximate way (5). However, numerate humans also have the unique ability to reason about quantities precisely using an acquired system of number symbols (5).

Reasoning about approximate and exact number depends on a frontoparietal network, a key node of which is the intraparietal sulcus (IPS) (6). The IPS is active when participants estimate the number of items in a nonsymbolic display as well as when they solve symbolic math problems (e.g., 23 − 19 = x), with more IPS activity during hard math problems than easier ones (6, 7). Temporary dea...
Human cortex comprises specialized networks that support functions such as visual motion perception and language processing. How do genes and experience contribute to this specialization? Studies of plasticity offer unique insights into this question. In congenitally blind individuals, "visual" cortex responds to auditory and tactile stimuli. Remarkably, recent evidence suggests that occipital areas participate in language processing. We asked whether in blindness, occipital cortices: (1) develop domain-specific responses to language and (2) respond to a highly specialized aspect of language, syntactic movement. Nineteen congenitally blind and 18 sighted participants took part in two fMRI experiments. We report that in congenitally blind individuals, but not in sighted controls, "visual" cortex is more active during sentence comprehension than during a sequence memory task with nonwords, or a symbolic math task. This suggests that areas of occipital cortex become selective for language, relative to other similar higher-cognitive tasks. Crucially, we find that these occipital areas respond more to sentences with syntactic movement but do not respond to the difficulty of math equations. We conclude that regions within the visual cortex of blind adults are involved in syntactic processing. Our findings suggest that the cognitive function of human cortical areas is largely determined by input during development.
Learning to read causes the development of a letter- and word-selective region known as the visual word form area (VWFA) within the human ventral visual object stream. Why does a reading-selective region develop at this anatomical location? According to one hypothesis, the VWFA develops at the nexus of visual inputs from retinotopic cortices and linguistic input from the frontotemporal language network because reading involves extracting linguistic information from visual symbols. Surprisingly, the anatomical location of the VWFA is also active when blind individuals read Braille by touch, suggesting that vision is not required for the development of the VWFA. In this study, we tested the alternative prediction that VWFA development is in fact influenced by visual experience. We predicted that in the absence of vision, the "VWFA" is incorporated into the frontotemporal language network and participates in high-level language processing. Congenitally blind (n = 10, 9 female, 1 male) and sighted control (n = 15, 9 female, 6 male) participants each took part in two functional magnetic resonance imaging experiments: (1) word reading (Braille for blind and print for sighted participants), and (2) listening to spoken sentences of different grammatical complexity (both groups). We find that in blind, but not sighted participants, the anatomical location of the VWFA responds both to written words and to the grammatical complexity of spoken sentences. This suggests that in blindness, this region takes on high-level linguistic functions, becoming less selective for reading. More generally, the current findings suggest that experience during development has a major effect on functional specialization in the human cortex.

The visual word form area (VWFA) is a region in the human cortex that becomes specialized for the recognition of written letters and words. Why does this particular brain region become specialized for reading?
We tested the hypothesis that the VWFA develops within the ventral visual stream because reading involves extracting linguistic information from visual symbols. Consistent with this hypothesis, we find that in congenitally blind Braille readers, but not sighted readers of print, the VWFA region is active during grammatical processing of spoken sentences. These results suggest that visual experience contributes to VWFA specialization, and that different neural implementations of reading are possible.
Language processing depends on a left-lateralized network of frontotemporal cortical regions. This network is remarkably consistent across individuals and cultures. However, there is also evidence that developmental factors, such as delayed exposure to language, can modify this network. Recently, it has been found that, in congenitally blind individuals, the typical frontotemporal language network expands to include parts of "visual" cortices. Here, we report that blindness is also associated with reduced left lateralization in frontotemporal language areas. We analyzed fMRI data from two samples of congenitally blind adults (n = 19 and n = 13) and one sample of congenitally blind children (n = 20). Laterality indices were computed for sentence comprehension relative to three different control conditions: solving math equations (Experiment 1), a memory task with nonwords (Experiment 2), and a "does this come next?" task with music (Experiment 3). Across experiments and participant samples, the frontotemporal language network was less left-lateralized in congenitally blind than in sighted individuals. Reduction in left lateralization was not related to Braille reading ability or amount of occipital plasticity. Notably, we observed a positive correlation between the lateralization of frontotemporal cortex and that of language-responsive occipital areas in blind individuals. Blind individuals with right-lateralized language responses in frontotemporal cortices also had right-lateralized occipital responses to language. Together, these results reveal a modified neurobiology of language in blindness. Our findings suggest that, despite its usual consistency across people, the neurobiology of language can be modified by nonlinguistic experiences.
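The abstract does not spell out how its laterality indices were calculated, but the conventional fMRI formula compares left- and right-hemisphere activation as (L − R) / (L + R), yielding +1 for fully left-lateralized and −1 for fully right-lateralized responses. A minimal sketch, assuming the index is computed over supra-threshold voxel counts (the specific counts below are hypothetical):

```python
def laterality_index(left_count, right_count):
    """Conventional laterality index: (L - R) / (L + R).
    +1 = fully left-lateralized, -1 = fully right-lateralized,
    0 = bilateral (also returned when no voxels survive threshold)."""
    total = left_count + right_count
    if total == 0:
        return 0.0
    return (left_count - right_count) / total

# Hypothetical example: 480 supra-threshold voxels on the left,
# 320 on the right, for the sentences > math contrast.
li = laterality_index(480, 320)
print(round(li, 2))  # 0.2 (moderately left-lateralized)
```

In practice, indices like this are often computed over a range of statistical thresholds (or over summed t-values rather than voxel counts) to avoid threshold-dependence; the abstract does not say which variant was used.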
Studies of sensory loss are a model for understanding the functional flexibility of human cortex. In congenital blindness, subsets of visual cortex are recruited during higher-cognitive tasks, such as language and math tasks. Is such dramatic functional repurposing possible throughout the lifespan or restricted to sensitive periods in development? We compared visual cortex function in individuals who lost their vision as adults (after age 17) to congenitally blind and sighted blindfolded adults. Participants took part in resting-state and task-based fMRI scans during which they solved math equations of varying difficulty and judged the meanings of sentences. Blindness at any age caused “visual” cortices to synchronize with specific frontoparietal networks at rest. However, in task-based data, visual cortices showed regional specialization for math and language and load-dependent activity only in congenital blindness. Thus, despite the presence of long-range functional connectivity, cognitive repurposing of human cortex is limited by sensitive periods.
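The resting-state result above rests on a standard analysis: correlating the spontaneous BOLD timecourse of a frontoparietal seed with timecourses in occipital cortex. As a rough illustration of the measure itself (not of the study's pipeline, which is not described here), seed-based functional connectivity is just a Pearson correlation between timecourses; the simulated data below are purely illustrative:

```python
import numpy as np

def seed_connectivity(seed_ts, region_ts):
    """Pearson correlation between one seed timecourse and each
    region timecourse (rows of region_ts). Standardizing both and
    taking the mean product gives the correlation coefficient."""
    seed = (seed_ts - seed_ts.mean()) / seed_ts.std()
    regions = (region_ts - region_ts.mean(axis=1, keepdims=True)) \
        / region_ts.std(axis=1, keepdims=True)
    return regions @ seed / len(seed)

rng = np.random.default_rng(0)
frontoparietal = rng.standard_normal(200)  # simulated seed timecourse
# Three simulated occipital regions that partially track the seed
# (true correlation 0.6) plus independent noise:
occipital = 0.6 * frontoparietal + 0.8 * rng.standard_normal((3, 200))
r = seed_connectivity(frontoparietal, occipital)
```

The paper's point is that this kind of synchronization emerges after blindness at any age, while task-driven specialization does not; connectivity alone is therefore not sufficient evidence of functional repurposing.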
The neural basis of reading is highly consistent across many languages and scripts. Are there alternative neural routes to reading? How does the sensory modality of symbols (tactile vs. visual) influence their neural representations? We examined these questions by comparing reading of visual print (sighted group, n = 19) and tactile Braille (congenitally blind group, n = 19). Blind and sighted readers were presented with written (words, consonant strings, non-letter shapes) and spoken stimuli (words, backward speech) that varied in word-likeness. Consistent with prior work, the ventral occipitotemporal cortex (vOTC) was active during Braille and visual reading. A posterior/anterior vOTC word-form gradient was observed only in sighted readers, with more anterior regions preferring larger orthographic units (words). No such gradient was observed in blind readers. Consistent with connectivity predictions, in blind compared to sighted readers, posterior parietal cortices were recruited to a greater degree and contained word-preferring patches. Lateralization of Braille in blind readers was predicted by laterality of spoken language and reading hand. The effect of spoken language increased along a cortical hierarchy, whereas the effect of reading hand waned. These results suggest that the neural basis of reading is influenced by symbol modality and spoken language, and support connectivity-based views of cortical function.
Developmental change in children's number-line estimation has been thought to reveal a categorical logarithmic-to-linear shift in mental representations of number. Some have claimed that the broad and rapid change in estimation patterns that occurs with corrective feedback provides strong evidence for this shift. However, quantitative models of proportion judgment may provide a better account of children's estimation patterns while also predicting broad and rapid change following feedback. Here we test the hypothesis that local corrective feedback provides children with additional reference points, rather than catalyzing a shift to a different mental representation of number. We tested 117 children from several second-grade classrooms in a number-line feedback study. Data indicate that the proportion-judgment framework accounts for individual differences in estimation patterns, and that the effects of feedback are consistent with the unique quantitative predictions of the framework. They do not provide evidence supporting the representational shift hypothesis or, more broadly, for the proposal that cognitive change can occur rapidly at the level of entire mental representations.
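The proportion-judgment framework referenced above has a concrete quantitative form: in the one-cycle cyclic power model, a number-line estimate is a power-transformed proportion of the line's endpoints, and adding reference points (as feedback is hypothesized to do) splits the line into more cycles rather than replacing the underlying representation. A minimal sketch of the one-cycle model (parameter values here are illustrative, not fit to the paper's data):

```python
def cyclic_power_estimate(x, upper, beta):
    """One-cycle proportion-judgment (cyclic power) model of
    number-line estimation on a 0-to-upper line. beta = 1 gives
    perfectly linear estimates; beta < 1 overestimates small
    numbers, producing the bowed, logarithmic-looking pattern
    often seen in children's pre-feedback data."""
    p = x / upper
    return upper * p**beta / (p**beta + (1 - p)**beta)

# Linear when beta = 1:
cyclic_power_estimate(25, 100, 1.0)  # 25.0
# Bowed overestimation of small numbers when beta < 1:
cyclic_power_estimate(25, 100, 0.4)  # > 25
```

Under this account, a "log-to-linear shift" in raw estimates can be re-described as a change in beta and in the number of available reference points, which is why the model can also predict the broad, rapid improvement after feedback.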