Perceptual experience results from a complex interplay of bottom-up input and prior knowledge about the world, yet the extent to which knowledge affects perception, the neural mechanisms underlying these effects, and the stages of processing at which these two sources of information converge are still unclear. In several experiments we show that language, in the form of verbal cues, both aids recognition of ambiguous "Mooney" images and improves objective visual discrimination performance in a match/non-match task. We then used electroencephalography (EEG) to better understand the mechanisms of this effect. The improved discrimination of previously labeled images was accompanied by a larger occipital-parietal P1 evoked response to meaningful versus meaningless target stimuli. Time-frequency analysis of the interval between the cue and the target stimulus revealed increases in the power of posterior alpha-band (8-14 Hz) oscillations when the meaning of the stimuli to be compared had been trained. The magnitude of the pre-target alpha difference and the P1 amplitude difference were positively correlated across individuals. These results suggest that prior knowledge prepares the brain for upcoming perception via the modulation of alpha-band oscillations, and that this preparatory state influences early (~120 ms) stages of visual processing.

Knowledge effects on early perceptual processing appear to be reflected in alpha dynamics [36][37][38]. Recently, Mayer and colleagues demonstrated that when the identity of a target letter could be predicted, pre-target alpha power increased over left-lateralized posterior sensors [39]. These findings suggest that alpha-band dynamics are involved in establishing perceptual predictions in advance of sensory input.

Here, we examined whether verbal cues that offer no direct perceptual hints can improve visual recognition of indeterminate two-tone Mooney images (Experiment 1).
We then measured whether such verbally ascribed meaning affected an objective visual discrimination task (Experiments 2-3). Finally, we recorded electroencephalography (EEG) during the visual discrimination task (Experiment 4) to better understand the locus at which knowledge influenced perception. Our findings suggest that using language to ascribe meaning to ambiguous images impacts early visual processing by biasing pre-target neural activity in the alpha-band.
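As an illustration of the pre-target alpha-band measure described above, the following is a minimal single-channel sketch using band-pass filtering and the Hilbert envelope. The filter order, window, and method are illustrative assumptions; the study's actual time-frequency analysis and parameters are not specified here.

```python
import numpy as np
from scipy.signal import butter, filtfilt, hilbert

def alpha_power(eeg, fs, band=(8.0, 14.0)):
    """Estimate alpha-band (8-14 Hz) power of a 1-D EEG segment.

    Band-pass filters the signal, takes the analytic amplitude via the
    Hilbert transform, and returns the mean instantaneous power.
    A simplified sketch, not the study's actual pipeline.
    eeg: 1-D array of samples; fs: sampling rate in Hz.
    """
    nyq = fs / 2.0
    # 4th-order Butterworth band-pass (order chosen for illustration)
    b, a = butter(4, [band[0] / nyq, band[1] / nyq], btype="band")
    filtered = filtfilt(b, a, eeg)          # zero-phase filtering
    envelope = np.abs(hilbert(filtered))    # instantaneous amplitude
    return np.mean(envelope ** 2)           # mean band power
```

Applied to the cue-target interval of each trial, this yields the pre-target alpha power whose meaningful-versus-meaningless difference is compared across conditions.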
Materials and Methods
Experiment 1

Materials. We constructed 71 Mooney images by superimposing familiar images of easily nameable and common artefacts and animals onto patterned backgrounds. These composite images were blurred (Gaussian blur) and then thresholded to a black-and-white bitmap. Materials are available at https://osf.io/stvgy/.

Participants. All participants for Experiments 1A-1C were recruited from Amazon Mechanical Turk and were paid $1 (Experiments 1A and 1B) or $0.50 (Experiment 1C) for participating. Demographic information was not collected. All studies were approved by the University of Wisconsin-Madison Institutional Review Board...
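The Mooney-image construction pipeline described under Materials (superimpose, Gaussian-blur, threshold to two tones) can be sketched as follows. The blend ratio, blur width, and median-split threshold are illustrative assumptions; the study's exact parameter values are not reported in this excerpt.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def make_mooney(obj, background, sigma=2.0, threshold=None):
    """Create a two-tone Mooney image from two grayscale arrays (0-255).

    Superimposes the object on the patterned background, Gaussian-blurs
    the blend, then thresholds to black (0) / white (255).
    sigma, the 50/50 blend, and the median threshold are illustrative,
    not the study's actual parameters.
    """
    # Superimpose: simple equal-weight blend of object and background
    blended = 0.5 * obj.astype(float) + 0.5 * background.astype(float)
    blurred = gaussian_filter(blended, sigma=sigma)
    if threshold is None:
        # Median split gives a roughly balanced black/white bitmap
        threshold = np.median(blurred)
    return np.where(blurred > threshold, 255, 0).astype(np.uint8)
```

The returned array contains only the two tones (0 and 255), matching the black-and-white bitmap format described above.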