It is well known that the timing of brief static sounds can alter different aspects of visual motion perception. For instance, previous studies have shown that time intervals demarcated by brief sounds can modulate perceived visual speed such that apparent motions with short auditory time intervals are typically perceived as faster than those with long time intervals. Yet, little is known about the principles and cortical processes underlying such effects of auditory timing. Using a speed judgment paradigm combined with EEG recording, we aimed to identify when and where in the cortex auditory timing influences motion processing. Our results indicated significant effects of auditory timing over the medial parieto-occipital and parietal, right centro-parietal, and frontal scalp sites. In addition, these effects were not restricted to a single ERP component; we observed significant changes in both early and late components. Therefore, our findings suggest that auditory timing may take place at both early and late stages of motion processing, and that its influences on motion perception may be the outcome of a dynamic interplay between different cortical regions. Together with accumulating evidence, these findings also support the notion that audiovisual integration is a multistage process that may be achieved through more diversified processes than previously thought.
Accumulating evidence suggests that the timing of brief stationary sounds affects visual motion perception. Recent studies have shown that auditory time intervals can alter apparent motion perception not only through concurrent stimulation but also through brief adaptation. The adaptation after-effects for auditory time intervals were found to be similar to those for visual time intervals, suggesting the involvement of a central timing mechanism. To understand the nature of the cortical processes underlying such after-effects, we adapted observers to different time intervals using either brief sounds or visual flashes and examined the evoked activity elicited by subsequently presented visual apparent motion. Both auditory and visual time interval adaptation led to significant changes in the ERPs elicited by the apparent motion. However, the changes induced by each modality were in opposite directions. Also, they mainly occurred in different time windows and clustered over distinct scalp sites. The effects of auditory time interval adaptation were centred over parietal and parieto-central electrodes, while the visual adaptation effects were mostly over occipital and parieto-occipital regions. Moreover, the changes were much more salient when sounds were used during the adaptation phase. Taken together, our findings within the context of visual motion point to auditory dominance in the temporal domain and highlight the distinct nature of the sensory processes involved in auditory and visual time interval adaptation.
Adaptation is essential for interacting with a dynamic and changing environment, and can be observed on different timescales. Previous studies on a motion paradigm called the dynamic motion aftereffect (dMAE) showed that neural adaptation can arise even on very short timescales. However, the neural mechanisms underlying such rapid forms of neural plasticity are still debated. In the present study, short- and long-term forms of neural plasticity were investigated using the dynamic motion aftereffect combined with electroencephalography (EEG). Participants were adapted to directional drifting gratings for either short (640 msec) or long (6.4 sec) durations. Both adaptation durations led to motion aftereffects on the perceived direction of a dynamic and directionally ambiguous test pattern, but the long adaptation produced stronger dMAEs. In line with the behavioral results, we found robust changes in the event-related potentials elicited by the dynamic test pattern within the 64–112 msec time range. These changes were mainly clustered over occipital and parieto-occipital scalp sites. Within this time range, the aftereffects induced by long adaptation were stronger than those induced by short adaptation. Moreover, the aftereffects of each adaptation duration were in opposite directions. Overall, these EEG findings suggest that dMAEs reflect changes in cortical areas mediating low- and mid-level visual motion processing. They further provide evidence that short- and long-term forms of motion adaptation lead to distinct changes in neural activity, and hence support the view that adaptation is an active, time-dependent process involving different neural mechanisms.
The integration of information from different senses is central to our perception of the external world. Audiovisual interactions have been particularly well studied in this context, and various illusions have been developed to demonstrate strong influences of these interactions on the final percept. Using audiovisual paradigms, previous studies have shown that even task-irrelevant information provided by a secondary modality can change the detection and discrimination of a primary target. These modulations have been found to depend significantly on the relative timing between auditory and visual stimuli. Although these interactions in time have been commonly reported, we still have a limited understanding of the relationship between modulations of event-related potentials (ERPs) and final behavioral performance. Here, we aimed to shed light on this important issue by using a speeded discrimination paradigm combined with electroencephalography (EEG). During the experimental sessions, the timing between an auditory click and a visual flash was varied over a wide range of stimulus onset asynchronies, and observers were engaged in speeded discrimination of flash location. Behavioral reaction times were significantly changed by click timing. Furthermore, modulations of evoked activities over medial parietal/parieto-occipital electrodes were associated with this effect. These modulations fell within the 126–176 ms time range and, more importantly, were also correlated with the changes in reaction times. These results provide an important functional link between audiovisual interactions at early stages of sensory processing and reaction times. Together with previous research, they further suggest that early crossmodal interactions play a critical role in perceptual performance.
Memories benefit from sleep, and sleep loss immediately following learning has a negative impact on subsequent memory storage. Several prominent hypotheses ascribe a central role to hippocampal sharp-wave ripples (SWRs), and the concurrent reactivation and replay of neuronal patterns from waking experience, in the offline memory consolidation process that occurs during sleep. However, little is known about how SWRs, reactivation, and replay are affected when animals are subjected to sleep deprivation. We performed long-duration (~12 h), high-density silicon probe recordings from rat hippocampal CA1 neurons in animals that were either sleeping or sleep deprived following exposure to a novel maze environment. We found that SWRs showed a sustained rate of activity during sleep deprivation, similar to or higher than in natural sleep, but with decreased amplitudes for the sharp waves combined with higher frequencies for the ripples. Furthermore, while hippocampal pyramidal cells showed a log-normal distribution of firing rates during sleep, these distributions were negatively skewed with a higher mean firing rate in both pyramidal cells and interneurons during sleep deprivation. During SWRs, however, firing rates were remarkably similar between the two groups. Despite the abundant quantity of SWRs and the robust firing activity during these events in both groups, we found that reactivation of neurons was either completely abolished or significantly diminished during sleep deprivation compared to sleep. Interestingly, reactivation partially rebounded upon recovery sleep, but failed to reach the levels characteristic of natural sleep. Similarly, the number of replays was significantly lower during sleep deprivation and recovery sleep compared to natural sleep.
These results provide a network-level account for the negative impact of sleep loss on hippocampal function and demonstrate that sleep loss impacts memory storage by causing a dissociation between the amount of SWRs and the replays and reactivations that take place during these events.