Local field potentials (LFPs) reflect subthreshold integrative processes that complement spike train measures. However, little is yet known about the differences between how LFPs and spikes encode rich naturalistic sensory stimuli. We addressed this question by recording LFPs and spikes from the primary visual cortex of anesthetized macaques while presenting a color movie. We then determined how the power of LFPs and spikes at different frequencies represents the visual features in the movie. We found that the most informative LFP frequency ranges were 1-8 Hz and 60-100 Hz. LFPs in the range of 12-40 Hz carried little information about the stimulus and may primarily reflect neuromodulatory inputs. Spike power was informative only at frequencies <12 Hz. We further quantified "signal correlations" (correlations in the trial-averaged power response to different stimuli) and "noise correlations" (trial-by-trial correlations in the fluctuations around the average) of LFPs and spikes recorded from the same electrode. We found positive signal correlation between high-gamma LFPs (60-100 Hz) and spikes, as well as strong positive signal correlation within high-gamma LFPs, suggesting that high-gamma LFPs and spikes are generated within the same network. LFPs <24 Hz shared strong positive noise correlations, indicating that they are influenced by a common source, such as a diffuse neuromodulatory input. LFPs <40 Hz showed very little signal or noise correlation with LFPs >40 Hz and with spikes, suggesting that low-frequency LFPs reflect neural processes that in natural conditions are fully decoupled from those giving rise to spikes and to high-gamma LFPs.
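The signal- and noise-correlation measures defined above can be sketched in a few lines. This is a minimal illustration on synthetic data, not the authors' analysis pipeline: the trial-by-stimulus power matrices, trial counts, and noise levels below are all assumptions made for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical band-power responses for two signals from the same electrode
# (e.g. high-gamma LFP power and spike power), shape (n_trials, n_stimuli).
# Both share a stimulus-driven component plus independent trial-by-trial noise.
n_trials, n_stimuli = 30, 50
stimulus_drive = rng.normal(size=n_stimuli)
power_a = stimulus_drive + 0.5 * rng.normal(size=(n_trials, n_stimuli))
power_b = stimulus_drive + 0.5 * rng.normal(size=(n_trials, n_stimuli))

def signal_correlation(a, b):
    """Correlation of the trial-averaged responses to different stimuli."""
    return np.corrcoef(a.mean(axis=0), b.mean(axis=0))[0, 1]

def noise_correlation(a, b):
    """Correlation of trial-by-trial fluctuations around each stimulus mean."""
    na = (a - a.mean(axis=0)).ravel()
    nb = (b - b.mean(axis=0)).ravel()
    return np.corrcoef(na, nb)[0, 1]

sig_c = signal_correlation(power_a, power_b)    # high: shared stimulus drive
noise_c = noise_correlation(power_a, power_b)   # near zero: independent noise
```

Because the two synthetic signals share only their stimulus-driven component, the sketch yields a high signal correlation and a near-zero noise correlation, mirroring the logic used to distinguish shared stimulus coding from shared trial-by-trial variability.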
Studies analyzing sensory cortical processing or trying to decode brain activity often rely on a combination of different electrophysiological signals, such as local field potentials (LFPs) and spiking activity. Understanding the relation between these signals and sensory stimuli and between different components of these signals is hence of great interest. We here provide an analysis of LFPs and spiking activity recorded from visual and auditory cortex during stimulation with natural stimuli. In particular, we focus on the time scales on which different components of these signals are informative about the stimulus, and on the dependencies between different components of these signals. Addressing the first question, we find that stimulus information in low frequency bands (<12 Hz) is high, regardless of whether their energy is computed at the scale of milliseconds or seconds. Stimulus information in higher bands (>50 Hz), in contrast, is scale dependent, and is larger when the energy is averaged over several hundreds of milliseconds. Indeed, combined analysis of signal reliability and information revealed that the energy of slow LFP fluctuations is well related to the stimulus even when considering individual or few cycles, while the energy of fast LFP oscillations carries information only when averaged over many cycles. Addressing the second question, we find that stimulus information in different LFP bands, and in different LFP bands and spiking activity, is largely independent regardless of time scale or sensory system. Taken together, these findings suggest that different LFP bands represent dynamic natural stimuli on distinct time scales and together provide a potentially rich source of information for sensory processing or decoding brain activity. Electronic supplementary material: The online version of this article (doi:10.1007/s10827-010-0230-y) contains supplementary material, which is available to authorized users.
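The time-scale comparison described above amounts to computing band-limited power over analysis windows of different lengths. The following is a rough sketch on a toy signal; the sampling rate, window lengths, and the FFT-based power estimate are assumptions for illustration, not the paper's actual method.

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 1000.0                                  # assumed sampling rate (Hz)
t = np.arange(0, 10.0, 1.0 / fs)
# Toy "LFP": a slow 4 Hz component, a fast 70 Hz component, and noise.
lfp = (np.sin(2 * np.pi * 4 * t)
       + 0.5 * np.sin(2 * np.pi * 70 * t)
       + rng.normal(0.0, 0.5, t.size))

def band_power(signal, fs, band, win_len):
    """Mean FFT power in `band` (Hz) for consecutive windows of `win_len` samples."""
    n_win = signal.size // win_len
    segs = signal[: n_win * win_len].reshape(n_win, win_len)
    freqs = np.fft.rfftfreq(win_len, 1.0 / fs)
    spec = np.abs(np.fft.rfft(segs, axis=1)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return spec[:, mask].mean(axis=1)        # one power value per window

# Low band estimated on short (100 ms) windows, high band on long (1 s) windows,
# echoing the contrast between millisecond- and second-scale energy estimates.
low_short = band_power(lfp, fs, (1, 12), 100)
high_long = band_power(lfp, fs, (50, 100), 1000)
```

Shorter windows trade frequency resolution for time resolution, which is why power in a fast band is estimated more reliably when averaged over many oscillation cycles, as the abstract reports.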
The Farwell and Donchin matrix speller is well known as one of the highest performing brain-computer interfaces (BCIs) currently available. However, its use of visual stimulation limits its applicability to users with normal eyesight. Alternative BCI spelling systems which rely on non-visual stimulation, e.g. auditory or tactile, tend to perform much more poorly and/or can be very difficult to use. In this paper we present a novel extension of the matrix speller, based on flipping the letter matrix, which allows us to use the same interface for visual, auditory or simultaneous visual and auditory stimuli. In this way we aim to allow users to utilize the best available input modality for their situation, that is, to use visual + auditory stimulation for best performance and to move smoothly to purely auditory stimulation when necessary, e.g. when disease causes the user's eyesight to deteriorate. Our results on seven healthy subjects demonstrate the effectiveness of this approach, with our modified visual + auditory stimulation slightly out-performing the classic matrix speller. The purely auditory system performance was lower than for visual stimulation, but comparable to other auditory BCI systems.
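At the decoding stage, a matrix speller selects the letter at the intersection of the row and column whose flashes evoked the strongest P300-like response. The sketch below illustrates only that selection step with simulated classifier scores; the 6x6 grid contents and score values are assumptions, and a real system would derive the scores from EEG epochs.

```python
import numpy as np

# Assumed 6x6 letter grid, as in the classic Farwell-Donchin layout.
grid = np.array([list("ABCDEF"), list("GHIJKL"), list("MNOPQR"),
                 list("STUVWX"), list("YZ1234"), list("56789_")])

def decode_letter(row_scores, col_scores, grid):
    """Return the letter at the intersection of the strongest row and column."""
    return grid[int(np.argmax(row_scores)), int(np.argmax(col_scores))]

# Simulated per-flash classifier scores (higher = more P300-like).
# The attended letter sits in row 2 ("MNOPQR") and column 3.
row_scores = np.array([0.1, 0.2, 0.9, 0.1, 0.3, 0.2])
col_scores = np.array([0.2, 0.1, 0.3, 0.8, 0.1, 0.2])
letter = decode_letter(row_scores, col_scores, grid)   # 'P'
```

Because rows and columns are scored independently, a 36-letter grid needs only 12 flash conditions, which is what makes the matrix design efficient and, as the paper exploits, adaptable to non-visual stimulus modalities.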