We studied activation magnitudes in core, belt, and parabelt auditory cortex in adults with normal hearing (NH) and unilateral hearing loss (UHL), using an interrupted single-event design and monaural stimulation with random spectrographic sounds. NH participants had one ear blocked and received stimulation on the side matching the intact ear of the UHL participants. The objective was to determine whether the side of deafness affected the lateralization and magnitude of evoked blood oxygen level-dependent (BOLD) responses across auditory cortical fields (ACFs). Regardless of the ear stimulated, NH participants showed larger contralateral responses in several ACFs. With right ear stimulation, UHL participants showed larger ipsilateral responses than NH in core and belt ACFs, indicating neuroplasticity in the right hemisphere. With left ear stimulation, only posterior core ACFs in UHL showed larger ipsilateral responses, suggesting that most ACFs in the left hemisphere were more resilient to reduced crossed inputs from a deafferented right ear. Parabelt regions posterolateral to core and belt auditory cortex showed reduced activation in UHL compared to NH, irrespective of right or left ear stimulation and the lateralization of inputs. Thus, the effects in UHL compared to NH differed by ACF and by ear of deafness.
We examined cortical activity during word recognition memory in early blind adults. Nine participants were blind from birth and one from age 1.5 years. In an event-related design, we measured blood oxygen level-dependent (BOLD) responses to studied (“old”) versus novel (“new”) words, presented either in Braille or as spoken words. Responses were larger for correctly identified “new” words read in Braille in bilateral lower- and higher-tier visual areas and primary somatosensory cortex. Responses to spoken “new” words were larger in bilateral primary and accessory auditory cortex. Auditory cortex was unresponsive to Braille words, and occipital cortex responded to spoken words but not differentially with “old”/“new” recognition. Left dorsolateral prefrontal cortex showed larger responses to “old” words only with Braille. The larger occipital cortex responses to “new” Braille words suggested verbal memory based on a mechanism of recollection. A previous report in sighted participants noted larger responses for “new” words studied in association with pictures, which created a distinctiveness heuristic source factor that enhanced recollection during remembering. Prior behavioral studies in early blind individuals noted an exceptional ability to recall words. Use of this skill by participants in the current study possibly engendered recollection that augmented remembering “old” words. The larger response when identifying “new” words possibly resulted from exhaustively recollecting the sensory properties of “old” words in modality-appropriate sensory cortices. The uniqueness of a memory role for occipital cortex lies in its cross-modal responses to coding the tactile properties of Braille, possibly reflecting a “sensory echo” that aids recollection.
The present fMRI study examined cortical activity to repeated vibrotactile sequences in 11 early blind and 11 sighted participants. All participants performed with >90% accuracy and showed practice-induced improvement, with faster reaction times in identifying matched and unmatched vibrotactile sequences. In blind participants only, occipital/temporal and parietal/somatosensory cortices showed practice-induced reductions in positive BOLD amplitudes, possibly reflecting repetition-induced learning effects. The significant findings in occipital cortex of blind participants indicated that perceptual processing of tactile inputs in visually deprived cortex is dynamic, as response amplitudes changed with practice; thus, stimulus processing became more efficient. We hypothesized that these changes in occipital cortex of blind participants reflected a life-long skill in processing somatosensory inputs. Both groups showed activity reductions with practice in mid/posterior ventrolateral prefrontal cortex, suggesting common stimulus-response learning associations for vibrotactile sequences in this region.
Individuals with profound sensorineural hearing loss from single-sided deafness (SSD) generally experience greater cognitive effort and fatigue in adverse sound environments. We studied cases of right-ear SSD compared to normal hearing (NH) individuals. SSD cases were significantly less accurate at naming the last words of spectrally degraded 8- and 16-band vocoded sentences, despite high semantic predictability. Group differences were not significant for less intelligible 4-band sentences, irrespective of predictability. SSD cases also had diminished BOLD percent signal changes to these same sentences in left hemisphere (LH) cortical regions: early auditory, association auditory, inferior frontal, premotor, inferior parietal, dorsolateral prefrontal, posterior cingulate, temporal-parietal-occipital junction, and posterior opercular cortex. Cortical regions with lower-amplitude responses in SSD than NH were mostly components of a LH language network previously noted as concerned with speech recognition. Recorded BOLD signal magnitudes were averages across all vertices within predefined parcels from these cortical regions. In SSD, parcels from all regions except early auditory and posterior cingulate cortex showed significantly larger signal magnitudes to sentences of greater intelligibility (e.g., 8- or 16- vs. 4-band). Significantly lower response magnitudes occurred in SSD than NH in regions that prior studies found responsible for the phonetics and phonology of speech, cognitive extraction of meaning, controlled retrieval of word meaning, and semantics. The findings suggest that reduced activation of a LH fronto-temporo-parietal network in SSD contributed to difficulty processing speech for word meaning and sentence semantics. The effortful listening experienced in SSD might reflect diminished activation to degraded speech in the affected LH language network parcels. SSD cases showed no compensatory activity in matched right hemisphere parcels.
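The parcel-level measure described above, averaging BOLD percent signal change over all surface vertices belonging to each predefined parcel, can be sketched as follows. This is a minimal illustration, not the study's actual pipeline; the function and variable names (`parcel_mean_psc`, `vertex_psc`, `parcel_labels`) are hypothetical.

```python
# Hypothetical sketch: average per-vertex BOLD percent signal change (PSC)
# within each predefined cortical parcel. Assumes one PSC value per vertex
# and an integer parcel label per vertex; names are illustrative only.
import numpy as np

def parcel_mean_psc(vertex_psc, parcel_labels):
    """Return {parcel ID: mean PSC over that parcel's vertices}."""
    vertex_psc = np.asarray(vertex_psc, dtype=float)
    parcel_labels = np.asarray(parcel_labels)
    return {
        int(p): float(vertex_psc[parcel_labels == p].mean())
        for p in np.unique(parcel_labels)
    }

# Toy example: six vertices split across two parcels.
psc = [0.2, 0.4, 0.3, 1.0, 1.2, 0.8]
labels = [1, 1, 1, 2, 2, 2]
print(parcel_mean_psc(psc, labels))
```

Averaging within a parcel trades spatial detail for a more stable per-region signal estimate, which is what allows parcel-wise group comparisons such as SSD versus NH.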