Although non-invasive techniques provide functional activation maps at ever-growing spatio-temporal precision, invasive recordings offer a unique opportunity for direct investigation of the fine-scale properties of neural mechanisms in focal neuronal populations. In this review we provide an overview of the field of intracranial electroencephalography (iEEG) and discuss its strengths, its limitations, and its relationship to non-invasive brain mapping techniques. We discuss the characteristics of invasive data acquired from implanted epilepsy patients using stereotactic electroencephalography (SEEG) and electrocorticography (ECoG), and the use of spectral analysis to reveal task-related modulations in multiple frequency components. Increasing evidence suggests that gamma-band activity (>40 Hz) might be a particularly efficient index for functional mapping. Moreover, the detection of high gamma activity may play a crucial role in bridging the gap between electrophysiology and functional imaging studies, as well as in linking animal and human data. The present review also describes recent advances in real-time invasive detection of oscillatory modulations (including gamma activity) in humans. Furthermore, the implications of intracerebral findings for future non-invasive studies are discussed.
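The spectral approach described above, extracting task-related high-gamma power from an iEEG channel, is commonly implemented as a band-pass filter followed by the analytic-signal amplitude envelope. The sketch below is illustrative only and is not taken from the review; the function name, band edges, and filter order are assumptions chosen to match the gamma range (>40 Hz) mentioned in the text.

```python
# Minimal sketch: estimate the high-gamma amplitude envelope of one
# iEEG channel via zero-phase band-pass filtering plus the Hilbert
# transform. Band edges and filter order are illustrative assumptions.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def high_gamma_envelope(x, fs, band=(50.0, 150.0), order=4):
    """Return the instantaneous amplitude of `x` within `band` (Hz).

    x  : 1-D voltage trace
    fs : sampling rate in Hz
    """
    nyq = fs / 2.0
    b, a = butter(order, [band[0] / nyq, band[1] / nyq], btype="bandpass")
    filtered = filtfilt(b, a, x)      # zero-phase band-pass (no lag)
    return np.abs(hilbert(filtered))  # Hilbert amplitude envelope

# Toy check: a 75 Hz burst in the second half of a 2 s trace should
# yield a larger mean envelope there than in the silent first half.
fs = 1000.0
t = np.arange(0, 2.0, 1.0 / fs)
x = np.where(t >= 1.0, np.sin(2 * np.pi * 75.0 * t), 0.0)
env = high_gamma_envelope(x, fs)
```

In practice the envelope would then be averaged over trials and compared between task conditions to reveal the task-related modulations the review discusses.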
It is generally accepted that during periods of attention to specific stimuli there are changes in the neural activity of central auditory structures; however, it remains controversial whether attention can modulate auditory responses at the cochlear level. Several studies in animals as well as in humans have attempted to find a modulation of cochlear responses during visual attention, with contradictory results. Here, we assessed cochlear sensitivity in behaving chinchillas by measuring, with a chronically implanted round-window electrode, sound-evoked auditory-nerve compound action potentials and cochlear microphonics, a measure of outer hair cell function, during selective attention to visual stimuli. Chinchillas were trained in either a visual discrimination or an auditory frequency discrimination two-choice task. We found a significant decrease of cochlear sensitivity during the period of attention to visual stimuli in the animals performing the visual discrimination task, but not in those performing the auditory task, demonstrating that this physiological effect is related to selective attention to visual stimuli rather than to an increase in arousal level. Furthermore, the magnitude of the cochlear-sensitivity reductions increased in sessions performed with shorter target-light durations (from 4 to 0.5 s), suggesting that this effect is stronger under higher attentional demands. These results demonstrate that afferent auditory activity is modulated by selective attention as early as sensory transduction, possibly through activation of olivocochlear efferent fibers.
As you might experience while reading this sentence, silent reading often involves an imagined speech component: we can hear our own "inner voice" pronouncing words mentally. Recent functional magnetic resonance imaging studies have associated that component with increased metabolic activity in the auditory cortex, including voice-selective areas. It remains to be determined, however, whether this activation arises automatically from early bottom-up visual inputs or whether it depends on late top-down control processes modulated by task demands. To answer this question, we worked with four epileptic human patients implanted with intracranial electrodes in the auditory cortex for therapeutic purposes, and measured high-frequency (50–150 Hz) "gamma" activity as a proxy of population-level spiking activity. Temporal voice-selective areas (TVAs) were identified with an auditory localizer task and monitored as participants viewed words flashed on screen. We compared neural responses depending on whether words were attended or ignored and found a significant increase of neural activity in response to words, strongly enhanced by attention. In one of the patients, we could record that response at 800 ms in TVAs, but also at 700 ms in the primary auditory cortex and at 300 ms in the ventral occipito-temporal cortex. Furthermore, single-trial analysis revealed considerable jitter between activation peaks in visual and auditory cortices. Altogether, our results demonstrate that the multimodal mental experience of reading is in fact a heterogeneous complex of asynchronous neural responses, and that auditory and visual modalities often process distinct temporal frames of our environment at the same time.