The entrainment of slow rhythmic auditory cortical activity to the temporal regularities in speech is considered to be a central mechanism underlying auditory perception. Previous work has shown that entrainment is reduced when the quality of the acoustic input is degraded, but has also linked rhythmic activity at similar time scales to the encoding of temporal expectations. To understand these bottom-up and top-down contributions to rhythmic entrainment, we manipulated the temporal predictive structure of speech by parametrically altering the distribution of pauses between syllables or words, thereby rendering the local speech rate irregular while preserving intelligibility and the envelope fluctuations of the acoustic signal. Recording EEG activity in human participants, we found that this manipulation did not alter neural processes reflecting the encoding of individual sound transients, such as evoked potentials. However, the manipulation significantly reduced the fidelity of auditory delta (but not theta) band entrainment to the speech envelope. It also reduced left frontal alpha power, and this alpha reduction was predictive of the reduced delta entrainment across participants. Our results show that rhythmic auditory entrainment in the delta and theta bands reflects functionally distinct processes. Furthermore, they reveal that delta entrainment is under top-down control and likely reflects prefrontal processes that are sensitive to acoustic regularities rather than the bottom-up encoding of acoustic features.
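As a toy illustration of how band-limited entrainment to a speech envelope can be quantified, the sketch below computes a phase-locking value (PLV) between a simulated EEG trace and a 2 Hz envelope, separately for the delta (1-4 Hz) and theta (4-8 Hz) bands. All signals, band edges, and parameters here are illustrative assumptions, not the study's actual analysis pipeline.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def band_plv(eeg, envelope, fs, band):
    """Phase-locking value between an EEG trace and a speech envelope
    within one frequency band (0 = no locking, 1 = perfect locking)."""
    b, a = butter(2, [band[0] / (fs / 2), band[1] / (fs / 2)], btype="band")
    ph_eeg = np.angle(hilbert(filtfilt(b, a, eeg)))
    ph_env = np.angle(hilbert(filtfilt(b, a, envelope)))
    return np.abs(np.mean(np.exp(1j * (ph_eeg - ph_env))))

# Synthetic demo: an "EEG" signal that tracks a 2 Hz envelope plus noise.
fs = 100                                  # sampling rate (Hz), assumed
t = np.arange(0, 60, 1 / fs)              # 60 s of data
envelope = np.sin(2 * np.pi * 2 * t)      # 2 Hz speech-rate envelope
rng = np.random.default_rng(0)
eeg = envelope + rng.standard_normal(t.size)

plv_delta = band_plv(eeg, envelope, fs, (1.0, 4.0))
plv_theta = band_plv(eeg, envelope, fs, (4.0, 8.0))
```

Because the simulated tracking rhythm sits at 2 Hz, the delta-band PLV comes out high while the theta-band PLV stays near chance, mirroring the kind of band-specific dissociation the abstract describes.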
The qualities of perception depend not only on the sensory inputs but also on the brain state before stimulus presentation. Although the collective evidence from neuroimaging studies for a relation between prestimulus state and perception is strong, the interpretation in the context of sensory computations or decision processes has remained difficult. In the auditory system, for example, previous studies have reported a wide range of effects in terms of the perceptually relevant frequency bands and state parameters (phase/power). To dissociate influences of state on earlier sensory representations and higher-level decision processes, we collected behavioral and EEG data in human participants performing two auditory discrimination tasks relying on distinct acoustic features. Using single-trial decoding, we quantified the relation between prestimulus activity, relevant sensory evidence, and choice in different task-relevant EEG components. Within auditory networks, we found that phase had no direct influence on choice, whereas power in task-specific frequency bands affected the encoding of sensory evidence. Within later-activated frontoparietal regions, theta and alpha phase had a direct influence on choice, without involving sensory evidence. These results delineate two consistent mechanisms by which prestimulus activity shapes perception. However, the timescales of the relevant neural activity depend on the specific brain regions engaged by the respective task.

perception | oscillatory brain activity | EEG | single-trial decoding | prestimulus effects

Sensory percepts depend not only on the environmental inputs but also on the internal brain state before stimulus presentation (1). Many studies have shown that the accuracy and speed of sensory performance change with the power and timing (phase) of rhythmic activity during a prestimulus period (2, 3).
Studies in the auditory system, for example, have demonstrated that performance in detecting sounds and gaps in noise, or the discrimination of lexical stimuli, varies with the power and phase of rhythmic activity between about 1 and 12 Hz (4-9). Although the collective evidence makes a strong case that prestimulus state shapes the processing and perceptual consequences of sensory inputs, the functional interpretation of these findings in the context of specific sensory computations or higher cognitive processes has remained difficult (7, 10, 11). Electrophysiological studies in animals have described the state dependency of firing rates relative to cortical oscillations (12-15). Hence, it is tempting to interpret the reported prestimulus effects in neuroimaging studies as direct evidence for a link between the neural gain of early sensory cortices and perception. However, this is difficult for two reasons. First, previous studies have used different behavioral protocols (detection and discrimination) and stimuli (tones in silence or noise, gaps in noise, or speech), and each has implied different frequency bands and state parameters as relevant (from 1 to 12 Hz, repo...
Sensory discriminations, such as judgements about visual motion, often benefit from multisensory evidence. Despite many reports of enhanced brain activity during multisensory conditions, it remains unclear which dynamic processes implement the multisensory benefit for an upcoming decision in the human brain. Specifically, it remains difficult to attribute perceptual benefits to specific processes, such as early sensory encoding, the transformation of sensory representations into a motor response, or to more unspecific processes such as attention. We combined an audio-visual motion discrimination task with the single-trial mapping of dynamic sensory representations in EEG activity to localize when and where multisensory congruency facilitates perceptual accuracy. Our results show that a congruent sound facilitates the encoding of motion direction in occipital sensory cortices, as opposed to parieto-frontal cortices, and facilitates later, as opposed to early (i.e., below 100 ms), sensory activations. This multisensory enhancement was visible as an earlier rise of motion-sensitive activity in middle-occipital regions about 350 ms from stimulus onset, which reflected the better discriminability of motion direction from brain activity and correlated with the perceptual benefit provided by congruent multisensory information. This supports a hierarchical model of multisensory integration in which the enhancement of relevant sensory cortical representations is transformed into a more accurate choice.
As we get older, perception in cluttered environments becomes increasingly difficult as a result of changes in peripheral and central neural processes. Given the aging society, it is important to understand the neural mechanisms constraining perception in the elderly. In young participants, the state of rhythmic brain activity prior to a stimulus has been shown to modulate the neural encoding and perceptual impact of this stimulus; yet it remains unclear whether, and if so how, the perceptual relevance of pre-stimulus activity changes with age. Using the auditory system as a model, we recorded EEG activity during a frequency discrimination task from younger and older human listeners. By combining single-trial EEG decoding with linear modelling, we demonstrate consistent statistical relations between pre-stimulus power and the encoding of sensory evidence in short-latency EEG components, and more variable relations between pre-stimulus phase and subjects' decisions in longer-latency components. At the same time, we observed a significant slowing of auditory evoked responses and a flattening of the overall EEG frequency spectrum in the older listeners. Our results point to mechanistically consistent relations between rhythmic brain activity and sensory encoding that emerge despite changes in neural response latencies and the relative amplitude of rhythmic brain activity with age.
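The logic of relating pre-stimulus power to the single-trial encoding of sensory evidence can be sketched with a small simulation. Everything below is a hypothetical toy model (the variable names, the assumed inverse power-gain relation, and the noise level are all made up for illustration); it only shows how a linear model can expose such a statistical relation, not what the study actually found.

```python
import numpy as np

rng = np.random.default_rng(1)
n_trials = 2000

# Hypothetical assumption: pre-stimulus alpha power modulates the gain
# with which sensory evidence is encoded in a post-stimulus component.
prestim_power = rng.uniform(0.5, 1.5, n_trials)
evidence = rng.choice([-1.0, 1.0], n_trials)      # stimulus category per trial
gain = 1.0 / prestim_power                        # assumed inverse relation
component = gain * evidence + 0.5 * rng.standard_normal(n_trials)

# Single-trial "encoding strength": component amplitude signed by the stimulus,
# so larger values mean the component represents the stimulus more faithfully.
encoding = component * evidence

# Linear model: does pre-stimulus power predict encoding strength?
slope, intercept = np.polyfit(prestim_power, encoding, 1)
```

With the inverse gain built into the simulation, the fitted slope is reliably negative, i.e., higher pre-stimulus power predicts weaker evidence encoding; in real data the same regression is run on decoded EEG components rather than simulated ones.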
A well-known effect in multisensory perception is that congruent information received by different senses usually leads to faster and more accurate responses. Less well understood are trial-by-trial interactions, whereby the multisensory composition of stimuli experienced during previous trials shapes performance during a subsequent trial. We here exploit the analogy of multisensory paradigms with classical flanker tasks to investigate the neural correlates underlying trial-by-trial interactions of multisensory congruency. Studying an audio-visual motion task, we demonstrate that congruency benefits for accuracy and reaction times are reduced following an audio-visual incongruent compared to a congruent preceding trial. Using single trial analysis of motion-sensitive EEG components we then localize current-trial and serial interaction effects within distinct brain regions: while the multisensory congruency experienced during the current trial influences the encoding of task-relevant information in sensory-specific brain regions, the serial interaction arises from task-relevant processes within the inferior frontal lobe. These results highlight parallels between multisensory paradigms and classical flanker tasks and demonstrate a role of amodal association cortices in shaping perception based on the history of multisensory congruency.