Previous research shows that a single visual working memory item can guide visual attention towards objects that match the feature held in mind, but results are mixed as to whether this attentional guidance also occurs when multiple working memory items are concurrently active. Findings favoring a single-item guidance account have been taken as support for a working memory architecture comprising multiple distinct states, in which one item is prioritized over all others by being placed within a special focus of attention. The present study was designed to test attentional guidance effects for single and multiple working memory items, and to test the hypothesis that working memory contains such special distinct states. To do so, we asked participants to remember one or two colors, then perform a visual search task, and then report the items held in mind. We demonstrate that a single working memory item robustly biases attention towards items that match the color maintained in working memory during visual search (Exp. 1). Although we found reliable guidance when participants remembered two items, we show that these effects can largely be explained by a single item guiding attention on a proportion of trials (Exp. 2). Next, by precisely measuring memory for individual items, we show that items naturally vary in their representational fidelity, and that only the item with the strongest representation guides attention (Exp. 3). Importantly, we demonstrate that no special focus of attention is necessary to explain these single-item guidance effects; natural variation in fidelity across items, arising through independent noise, can account for them (Exp. 4).
These findings challenge current models of working memory guidance and suggest a simpler account of how working memory and attention interact: through natural variation in the representational fidelity of memories, one item tends to dominate attentional guidance on any individual trial.
Visual search benefits from advance knowledge of non-target features. However, it is unknown whether these negatively cued features are suppressed in advance (proactively) or during search (reactively). To test this, we presented color cues that varied from trial to trial and predicted target or non-target colors. Experiment 1 (N = 96) showed that both target and non-target cues speeded search. To test whether attention proactively modified cued feature representations, in Experiment 2 (N = 200) we interleaved color-probe trials with search, asking participants to detect a briefly presented ring whose color either matched the cued color or did not. Interestingly, participants detected both positively and negatively cued colors better than other colors, indicating that to-be-attended and to-be-ignored features were both proactively enhanced. These results demonstrate that non-target features are not suppressed proactively, and instead support reactive accounts in which anticipated non-target features are ignored via strategic enhancement.
While many theories of attention highlight the importance of similarity between target and distractor items for selection, few studies have directly quantified the function underlying this relationship. Across two commonly used tasks, visual search and sustained attention, we investigated how target-distractor similarity impacts feature-based attentional selection, asking in particular whether stimulus-based or psychological similarity better explains performance. We found that both similarity measures were non-linearly related to task performance, although psychological similarity explained a large portion of the non-linearities observed in the data, suggesting that measures of psychological similarity are more appropriate when studying effects of target-distractor similarity. Importantly, we found comparable patterns of performance in both visual search and sustained feature-based attention tasks, with performance (RTs and d', respectively) plateauing at medium target-distractor distances, and with exponential functions capturing the relationship between both similarity measures and performance well. In contrast, visual search efficiency, as measured by search slopes, was affected by only a narrow range of similarity levels (10-20°). These findings place novel constraints on models of selective attention and emphasize the importance of considering the similarity structure of the feature space. Broadly, the non-linear effects of similarity on attention are consistent with accounts proposing that attention exaggerates the distance between competing representations, possibly through enhancement of off-tuned neurons.