The aim of this study was (1) to provide behavioral evidence for multimodal feature integration in an object recognition task in humans and (2) to characterize the processing stages and the neural structures where multisensory interactions take place. Event-related potentials (ERPs) were recorded from 30 scalp electrodes while subjects performed a forced-choice reaction-time categorization task: on each trial, the subjects had to indicate which of two objects was presented by pressing one of two keys. The two objects were defined by auditory features alone, visual features alone, or the combination of auditory and visual features. Subjects were more accurate and faster at identifying multimodal than unimodal objects. Spatiotemporal analysis of ERPs and scalp current densities revealed several auditory-visual interaction components that were temporally, spatially, and functionally distinct before 200 msec poststimulus. The effects observed were (1) in visual areas, new neural activities (as early as 40 msec poststimulus) and modulation (amplitude decrease) of the N185 wave to the unimodal visual stimulus, (2) in the auditory cortex, modulation (amplitude increase) of subcomponents of the unimodal auditory N1 wave around 90 to 110 msec, and (3) new neural activity over the right fronto-temporal area (140 to 165 msec). Furthermore, when the subjects were separated into two groups according to their dominant modality for performing the task in unimodal conditions (shortest-reaction-time criterion), the integration effects were found to be similar for the two groups over the nonspecific fronto-temporal areas, but they clearly differed in the sensory-specific cortices, affecting predominantly the sensory areas of the nondominant modality.
Taken together, the results indicate that multisensory integration is mediated by flexible, highly adaptive physiological processes that can take place very early in the sensory processing chain and operate in both sensory-specific and nonspecific cortical structures in different ways.
It has been hypothesized that visual objects could be represented in the brain by a distributed cell assembly synchronized on an oscillatory mode in the γ-band (20-80 Hz). If this hypothesis is correct, then oscillatory γ-band activity should appear in any task requiring the activation of an object representation, and in particular when an object representation is held active in short-term memory: sustained γ-band activity is thus expected during the delay of a delayed-matching-to-sample task. EEG was recorded while subjects performed such a task. Induced (i.e., appearing with a jitter in latency from one trial to the next) γ-band activity was observed during the delay. In a control task, in which no memorization was required, this activity disappeared. Furthermore, this γ-band activity during the rehearsal of the first stimulus representation in short-term memory peaked at both occipitotemporal and frontal electrodes. This topography fits with the idea of a synchronized cortical network centered on prefrontal and ventral visual areas. Activities in the α band, in the 15-20 Hz band, and in the averaged evoked potential were also analyzed. The γ-band activity during the delay can be distinguished from all of these other components of the response, on the basis of either its variations or its topography. It thus seems to be a specific functional component of the response that could correspond to the rehearsal of an object representation in short-term memory.
Does mental imagery involve the activation of representations in the visual system? Systematic effects of imagery on visual signal detection performance have been used to argue that imagery and the perceptual processing of stimuli interact at some common locus of activity (Farah, 1985). However, such a result is neutral with respect to the question of whether the interaction occurs during modality-specific visual processing of the stimulus. If imagery affects stimulus processing at early, modality-specific stages of stimulus representation, this implies that the shared stimulus representations are visual, whereas if imagery affects stimulus processing only at later, amodal stages of stimulus representation, this implies that imagery involves more abstract, postvisual stimulus representations. To distinguish between these two possibilities, we repeated the earlier imagery-perception interaction experiment while recording event-related potentials (ERPs) to stimuli from 16 scalp electrodes. By observing the time course and scalp distribution of the effect of imagery on the ERP to stimuli, we can put constraints on the locus of the shared representations for imagery and perception. An effect of imagery was seen within 200 ms following stimulus presentation, at the latency of the first negative component of the visual ERP, localized at the occipital and posterior temporal regions of the scalp, that is, directly over visual cortex. This finding provides support for the claim that mental images interact with percepts in the visual system proper and hence that mental images are themselves visual representations.