Visual working memory (VWM) is a cognitive memory buffer for temporarily holding, processing, and manipulating visual information. Previous studies have reported mixed results regarding the effect of depth perception on VWM, with some showing a beneficial effect and others showing none. In this study, we employed an adapted change detection paradigm to investigate the effects of two depth cues, binocular disparity and relative size. The memory array consisted of a set of pseudo-randomly positioned colored items, and the task was to judge whether the test item had changed compared with the memory item after a retention interval. We found that presenting the items in stereoscopic depth alone hardly affected VWM performance. When the two depth cues were combined coherently, a significantly larger VWM capacity was observed for the perceptually closer-in-depth items than for the farther items, but the capacity in the two-depth-planes condition did not differ significantly from that in the one-plane condition. Placing the two depth cues in conflict cancelled the beneficial effect of presenting items at a closer depth plane. These results indicate that depth perception can affect VWM, and that the visual system may have an advantage in maintaining closer-in-depth objects in working memory.
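Capacity in change detection tasks of this kind is commonly summarized with Cowan's K, defined for single-probe displays as K = N × (hit rate − false-alarm rate). As a minimal sketch of the computation (the set size and rates below are hypothetical illustrations, not data from this study):

```python
def cowans_k(set_size, hit_rate, false_alarm_rate):
    """Cowan's K capacity estimate for a single-probe change
    detection task: K = N * (H - FA)."""
    return set_size * (hit_rate - false_alarm_rate)

# Hypothetical example: set size 6, 80% hits, 20% false alarms
k = cowans_k(6, 0.80, 0.20)
print(round(k, 2))  # an estimated capacity of 3.6 items
```

Comparing K between the closer and farther depth planes is one way the capacity difference described above could be quantified.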
Most studies on visual working memory (VWM) and spatial working memory (SWM) have employed visual stimuli presented in the fronto-parallel plane, and few have involved depth perception. VWM is often considered a memory buffer for temporarily holding and manipulating visual information relating to the visual features of an object, and SWM one for holding and manipulating information concerning the spatial location of an object. Although previous research has investigated the effect of stereoscopic depth on VWM, the question of how depth positions are stored in working memory has not been systematically investigated, leaving gaps in the working memory literature. Here, we explore working memory for depth using a change detection task. The memory items were presented at various stereoscopic depth planes perpendicular to the line of sight, with one item per depth plane. Participants judged whether the depth position of the target (one of the memory items) had changed. The results showed a conservative response bias: observers tended to give 'no change' responses when detecting changes in depth. In addition, we found that, as in VWM, change detection accuracy degraded with the number of memory items presented, but accuracy was much lower than that reported for VWM, suggesting that the storage of depth information is severely limited and less precise than that of visual features. Detection sensitivity was higher for the nearest and farthest depths and was better when the probe was presented along with the other items originally in the memory array, indicating that how well a given depth can be stored in working memory depends on its relation to the other depth positions.
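The sensitivity and conservative response bias reported here are naturally expressed in signal detection terms: sensitivity d′ = z(H) − z(FA) and criterion c = −(z(H) + z(FA)) / 2, where a positive c indicates a bias toward 'no change' responses. A minimal sketch using Python's standard library (the hit and false-alarm rates below are hypothetical):

```python
from statistics import NormalDist

def sdt_measures(hit_rate, false_alarm_rate):
    """Signal detection sensitivity (d') and criterion (c)
    from hit and false-alarm rates."""
    z = NormalDist().inv_cdf  # inverse of the standard normal CDF
    d_prime = z(hit_rate) - z(false_alarm_rate)
    criterion = -0.5 * (z(hit_rate) + z(false_alarm_rate))
    return d_prime, criterion

# Hypothetical rates: modest sensitivity, conservative bias (c > 0)
d, c = sdt_measures(0.60, 0.30)
print(round(d, 2), round(c, 2))
```

With these illustrative rates, c comes out positive, the signature of the conservative 'no change' tendency described in the abstract.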
Working memory is a cognitive memory buffer for temporarily holding, processing, and manipulating information. Although working memory for verbal and visual information has been studied extensively, few studies have systematically investigated how depth information is stored in working memory. Here, we show that memory performance for detecting changes in stereoscopic depth is low when relative depth order is unchanged, and reliably better when depth order changes. Increasing the magnitude of the change improves memory performance only when depth order is kept constant; when depth order changes, performance remains high even for a small change magnitude. Our findings suggest that relative depth order is a better predictor of working memory performance than absolute metric depth. The memory representation of individual depths is not independent but inherently relational, revealing a fundamental organizing principle for depth information in the visual system.
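The key distinction here, between a metric depth change that preserves relative depth order and one that alters it, can be made concrete with a small helper. The depths below are hypothetical values for illustration (ties between depths are not handled):

```python
def rank(depths, i):
    """Depth-order rank of item i (0 = nearest)."""
    return sorted(range(len(depths)), key=lambda j: depths[j]).index(i)

def order_changed(depths, i, new_depth):
    """True if moving item i to new_depth alters its relative depth order."""
    moved = list(depths)
    moved[i] = new_depth
    return rank(depths, i) != rank(moved, i)

# Hypothetical depths (arbitrary units); item 1 starts in the middle
depths = [10, 20, 30]
print(order_changed(depths, 1, 25))  # same order: a purely metric change
print(order_changed(depths, 1, 35))  # order reversed with item 2
```

On the abstract's account, only the second kind of change (order-altering) would be detected reliably, regardless of its metric size.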
Depth perception is essential for effective interaction with the environment. Although the accuracy of depth perception has been studied extensively, it is unclear how accurately depth information is stored in working memory. In this study, we investigated the accuracy and systematic biases of depth representation using a delayed estimation task. The memory array consisted of items presented at various stereoscopic depth positions, and participants were instructed to estimate the depth position of one target item after a retention interval. We examined the effect of spatial configuration by comparing memory performance in the whole-display condition, where non-target memory items were present during retrieval, with that in the single-display condition, where they were absent. In the single-display condition, we found an overestimation bias, in which depth estimates were farther than the corresponding depth positions defined by disparity, and a contraction bias, in which stored depth positions near the observer were overestimated and those far from the observer were underestimated. The magnitude of these biases increased with the number of to-be-stored items. In the whole-display condition, however, the overestimation bias was corrected and the contraction bias did not increase with the number of to-be-stored items. Our findings suggest that the number of to-be-stored items can affect the accuracy of working memory for depth, and that its effect depends crucially on whether information about the spatial configuration of the memory display is available at retrieval.
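A contraction bias of the kind described, estimates pulled toward the mean of the presented depths, combined with a constant overestimation offset, can be illustrated with a toy linear model. The weight, offset, and depths below are hypothetical choices for illustration, not values fitted to the data:

```python
def biased_estimate(true_depth, prior_mean, w, offset=0.0):
    """Toy model: the reported depth shrinks toward the prior mean
    by weight w (contraction bias) and is shifted farther by a
    constant offset (overestimation bias)."""
    return (1 - w) * true_depth + w * prior_mean + offset

# Hypothetical depth positions (cm from the observer)
depths = [10, 20, 30, 40, 50]
prior_mean = sum(depths) / len(depths)  # 30 cm
estimates = [biased_estimate(d, prior_mean, w=0.3, offset=2.0) for d in depths]
print(estimates)
```

With these illustrative parameters the nearest depth is overestimated and the farthest underestimated, reproducing the qualitative pattern reported for the single-display condition; increasing w mimics the larger bias seen with more to-be-stored items.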
The capacity of visual working memory (VWM) is extremely limited. Past research shows that VWM can be facilitated by Gestalt principles of grouping; however, it remains controversial whether factors such as the type of Gestalt principle, the characteristics of the stimuli, and the nature of the experimental design modulate this beneficial effect. In particular, studies have shown that perceptual grouping can improve memory performance for a feature that is relevant to the grouping, but it is unclear whether the same improvement exists for a feature that is irrelevant to the grouping. In this article, an empirical study and a meta-analysis were conducted to investigate the effect of perceptual grouping on VWM. In the empirical study, we examined the grouping effect using a Kanizsa illusion in which memory items were grouped by an illusory contour. We found that memory performance improved for the grouped items even though the tested feature was grouping-irrelevant, and the improvement was not significantly different from the effect of grouping by physical connectedness or by solid occlusion. In the meta-analysis, we systematically and quantitatively examined the effect of perceptual grouping on VWM by pooling the results of all eligible studies, and found that the beneficial grouping effect was robust but that its magnitude was affected by several moderators. The type of grouping method, the duration and layout of the memory display, and the characteristics of the tested feature moderated the grouping effect, whereas the use of a cue or a verbal suppression task did not. Our study suggests that the mechanism underlying the grouping benefit may differ with the grouping relevancy of the to-be-stored feature: the grouping effect on VWM may be independent of attention for a grouping-relevant feature but may rely on attentional prioritization for a grouping-irrelevant feature.
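Pooling effect sizes across eligible studies, as in a meta-analysis of this kind, is typically done by inverse-variance weighting. A minimal fixed-effect sketch (the effect sizes and variances below are hypothetical; a full random-effects model would additionally estimate between-study variance):

```python
def pooled_effect(effects, variances):
    """Fixed-effect inverse-variance pooling: each study is weighted
    by 1/variance; returns the pooled estimate and its standard error."""
    weights = [1.0 / v for v in variances]
    estimate = sum(w * e for w, e in zip(weights, effects)) / sum(weights)
    standard_error = (1.0 / sum(weights)) ** 0.5
    return estimate, standard_error

# Hypothetical per-study grouping effects (e.g., Hedges' g) and variances
g, se = pooled_effect([0.61, 0.45, 0.30], [0.05, 0.08, 0.04])
print(round(g, 3), round(se, 3))
```

More precise studies (smaller variances) pull the pooled estimate toward their values, which is the design rationale for inverse-variance weighting.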