The McGurk illusion is one of the most famous illustrations of cross-modal integration in human perception. It has often been used as a proxy for audiovisual (AV) integration and to infer the properties of the integration process under natural (congruent) AV conditions. Nonetheless, a blatant difference between McGurk stimuli and natural, congruent AV speech is the conflict between auditory and visual information in the former. Here, we hypothesized that McGurk stimuli (and AV incongruence in general) engage brain responses similar to those found in more general cases of perceptual conflict (e.g., Stroop), and we propose that the McGurk illusion arises from the resolution of such conflict. We used electroencephalography to measure variations in theta-band power, a well-known marker of the brain's response to conflict. The results showed that the perception of AV McGurk stimuli, just like AV incongruence in general, induces an increase in theta-band activity. This response was similar to that evoked by Stroop stimuli, as measured in the same participants. This finding suggests that the McGurk illusion is mediated by general-purpose conflict mechanisms and calls for caution when generalizing findings obtained with the McGurk illusion to the broader case of multisensory integration.
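To make the measure concrete, the sketch below computes theta-band power from epoched EEG via a zero-phase bandpass filter and the Hilbert envelope. This is a minimal illustration of the general technique, not the authors' pipeline: the sampling rate, band edges (4-8 Hz), filter order, variable names, and the synthetic data are all assumptions.

# Minimal sketch: theta-band (4-8 Hz) power from epoched EEG.
# Assumptions (not from the study): 256 Hz sampling, epochs stored as a
# (n_trials, n_samples) NumPy array, 4th-order Butterworth bandpass.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

FS = 256.0  # assumed sampling rate in Hz

def theta_power(epochs, low=4.0, high=8.0, fs=FS):
    """Mean theta-band power per trial via the Hilbert envelope."""
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    filtered = filtfilt(b, a, epochs, axis=-1)     # zero-phase bandpass
    envelope = np.abs(hilbert(filtered, axis=-1))  # instantaneous amplitude
    return (envelope ** 2).mean(axis=-1)           # power averaged over time

# Toy usage: compare theta power between two synthetic conditions.
rng = np.random.default_rng(0)
incongruent = rng.standard_normal((40, 512))  # e.g., McGurk trials
congruent = rng.standard_normal((40, 512))    # e.g., congruent AV trials
print(theta_power(incongruent).mean(), theta_power(congruent).mean())

A contrast of this per-trial power between conditions (here just a print of the means) is the kind of comparison the abstract describes.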
In natural spoken communication, speakers often accompany speech with spontaneous beat gestures. These gestures are usually aligned with lexical stress and can modulate the saliency of their affiliate words. Here, we addressed the consequences of beat gestures for the neural correlates of speech perception. Previous studies have highlighted the role played by theta oscillations in the temporal prediction of speech. We hypothesized that the sight of beat gestures may influence ongoing low-frequency neural oscillations around the onset of the corresponding words. Electroencephalographic (EEG) recordings were acquired while participants watched a continuous, naturally recorded discourse. The phase-locking value (PLV) at word onset was calculated from the EEG for pairs of identical words that had been pronounced with and without a concurrent beat gesture in the discourse. We observed an increase in PLV in the 5-6 Hz theta range, as well as a desynchronization in the 8-10 Hz alpha band, around the onset of words preceded by a beat gesture. These findings suggest that beats help tune low-frequency oscillatory activity at relevant moments during natural speech perception, providing new insight into how speech and paralinguistic information are integrated.
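For intuition, the PLV over N trials at time t is PLV(t) = |(1/N) Σ_n exp(i φ_n(t))|, where φ_n(t) is the instantaneous phase on trial n; values near 1 indicate consistent phase alignment across trials. The sketch below computes a 5-6 Hz PLV time course under the same kind of illustrative assumptions as above: the array layout, sampling rate, and filter settings are ours, not the study's.

# Minimal sketch: phase-locking value (PLV) across trials in the
# 5-6 Hz theta range, PLV(t) = |mean over trials of exp(i * phase(t))|.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def plv_time_course(epochs, low=5.0, high=6.0, fs=256.0):
    """PLV over trials at each time point for one electrode.

    epochs: (n_trials, n_samples) array, time-locked to word onset.
    """
    b, a = butter(4, [low / (fs / 2), high / (fs / 2)], btype="band")
    phase = np.angle(hilbert(filtfilt(b, a, epochs, axis=-1), axis=-1))
    # Unit vectors on the circle, averaged over trials; result in [0, 1].
    return np.abs(np.exp(1j * phase).mean(axis=0))

# Toy usage: words with vs. without a preceding beat gesture.
rng = np.random.default_rng(1)
with_beat = rng.standard_normal((30, 512))
without_beat = rng.standard_normal((30, 512))
print(plv_time_course(with_beat).max(), plv_time_course(without_beat).max())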
Electrical brain oscillations reflect fluctuations in neural excitability. Fluctuations in the alpha band (α, 8-12 Hz) in the occipito-parietal cortex are thought to regulate sensory responses, leading to cyclic variations in visual perception. Inspired by this theory, past and recent studies have addressed the relationship between the α-phase of extracranial EEG and behavioural responses to visual stimuli in humans. The latest studies have used offline approaches to confirm α-gated cyclic patterns. However, a particularly relevant implication is the possibility of using this principle online for real-time neurotechnology, whereby stimuli are time-locked to specific α-phases, leading to predictable outcomes in performance. Here, we aimed to provide a proof of concept for such real-time neurotechnology. Participants performed a speeded response task to visual targets that were presented upon a real-time estimation of the α-phase via an EEG closed-loop brain-computer interface (BCI). According to the theory, we predicted a modulation of reaction times (RTs) along the α-cycle. Our BCI system achieved reliable trial-to-trial phase-locking of stimuli to the phase of individual occipito-parietal α-oscillations. Yet, the behavioural results did not support a consistent relation between RTs and the phase of the α-cycle at either the group or the single-participant level. We conclude that although the α-phase might play a role in perceptual decisions from a theoretical perspective, its impact on EEG-based BCI applications appears negligible.
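The closed-loop logic can be sketched as follows: bandpass the most recent EEG buffer around alpha, read the Hilbert phase at a point safely away from the filter edge, and extrapolate it forward at the individual alpha frequency to schedule the stimulus at a target phase. This filter-and-forecast scheme is one common approach to real-time phase estimation, not necessarily the authors' exact BCI; the buffer length, sampling rate, frequencies, thresholds, and names below are all assumptions.

# Minimal sketch of a filter-and-forecast real-time alpha-phase estimator.
# Parameters are illustrative, not the authors' actual BCI pipeline.
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

FS = 500.0       # assumed amplifier sampling rate (Hz)
ALPHA_HZ = 10.0  # assumed individual alpha frequency
EDGE = 50        # samples discarded at the buffer edge (filter artifacts)

def predict_alpha_phase(buffer, horizon_s):
    """Predict the alpha phase `horizon_s` seconds past the buffer end."""
    b, a = butter(2, [8 / (FS / 2), 12 / (FS / 2)], btype="band")
    phase = np.angle(hilbert(filtfilt(b, a, buffer)))
    anchor = phase[-EDGE]            # last trustworthy phase sample
    elapsed = EDGE / FS + horizon_s  # time from anchor to stimulus onset
    return (anchor + 2 * np.pi * ALPHA_HZ * elapsed) % (2 * np.pi)

# Toy usage: trigger the stimulus only near a target phase (here 0 rad).
rng = np.random.default_rng(2)
t = np.arange(int(FS)) / FS  # 1 s of synthetic data
buffer = np.sin(2 * np.pi * ALPHA_HZ * t) + 0.5 * rng.standard_normal(t.size)
phase_at_onset = predict_alpha_phase(buffer, horizon_s=0.02)
trigger = np.abs(np.angle(np.exp(1j * phase_at_onset))) < 0.3
print(phase_at_onset, trigger)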
The interactions between the senses are essential for cognitive functions such as perception, attention, and action planning. Past research has advanced our understanding of multisensory processes in the laboratory, yet efforts to extrapolate these findings to the real world remain scarce. Such extrapolation is important for both practical and theoretical reasons: multisensory phenomena might be expressed differently in real-world settings than in simpler laboratory situations. Some effects might become stronger, others may disappear, and new outcomes could be discovered. This Element discusses research that uncovers multisensory interactions in complex environments, with an emphasis on the interplay of multisensory mechanisms with other processes.