Human vision is an active process in which information is sampled during brief periods of stable fixation between gaze shifts. Foveal analysis serves to identify the currently fixated object and has to be coordinated with a peripheral selection process that determines the next fixation location. Models of visual search and scene perception typically focus on the latter, without considering foveal processing requirements. We developed a dual-task noise classification technique that enables identification of the information uptake for foveal analysis and peripheral selection within a single fixation. Human observers had to use foveal vision to extract visual feature information (orientation) from different locations for a psychophysical comparison. The selection of to-be-fixated locations was guided by a different feature (luminance contrast). We inserted noise in both visual features and identified the uptake of information by correlating the noise at different points in time with behavior. Our data show that foveal analysis and peripheral selection proceeded completely in parallel. Peripheral processing stopped some time before the onset of an eye movement, but foveal analysis continued during this period. Variations in the difficulty of foveal processing did not influence the uptake of peripheral information or the efficacy of peripheral selection, suggesting that foveal analysis and peripheral selection operated independently. These results provide important theoretical constraints on how to model target selection in conjunction with foveal object identification: in parallel and independently.

Almost all human visually guided behavior relies on the selective uptake of information, due to sensory and cognitive limitations. On the sensory side, the sampling of visual input by the retinal mosaic of photoreceptors becomes increasingly sparse and irregular away from central vision (1).
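The logic of the noise-classification analysis described in the abstract, correlating the noise injected at each moment with the eventual behavioral response, can be sketched in a minimal simulation. Everything here is a hypothetical illustration (the number of trials, the number of noise frames per fixation, and the simulated observer's temporal weighting are assumptions, not the study's actual parameters):

```python
import numpy as np

rng = np.random.default_rng(0)
n_trials, n_frames = 2000, 10  # hypothetical: trials x noise frames within a fixation

# Feature noise injected on each frame of each trial (e.g., orientation jitter)
noise = rng.normal(0.0, 1.0, size=(n_trials, n_frames))

# Simulated observer whose decision is driven only by the early frames,
# mimicking information uptake that stops partway through the fixation
internal_weights = np.array([1, 1, 1, 1, 0, 0, 0, 0, 0, 0], dtype=float)
decision_var = noise @ internal_weights + rng.normal(0.0, 2.0, size=n_trials)
response = (decision_var > 0).astype(float)  # binary choice on each trial

# Temporal classification image: correlate each frame's noise with behavior;
# frames whose noise influenced the choice show elevated correlations
r = np.array([np.corrcoef(noise[:, t], response)[0, 1] for t in range(n_frames)])
print(np.round(r, 2))
```

In this sketch, the correlation profile recovers which time points contributed to the decision: it is elevated over the early frames the simulated observer actually used and falls to chance thereafter, which is the signature the study exploits to time-stamp foveal and peripheral information uptake separately.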
In addition, fewer cortical neurons are devoted to the analysis of peripheral visual information (cortical magnification) (2, 3). Humans and other animals with so-called foveated visual systems have evolved gaze-shifting mechanisms to overcome these limitations. Saccadic eye movements serve to rapidly and efficiently deploy gaze to objects and regions of interest in the visual field. Sampling the environment appropriately with gaze is the starting point of adaptive visual-motor behavior (4, 5).

Studies have shown that saccadic eye movements are guided by analysis of information in the visual periphery up to 80-100 ms before saccade execution (6-8). However, active vision typically requires humans not only to analyze information in the visual periphery to decide where to fixate next (peripheral selection), but also to analyze the information at the current fixation location (foveal analysis). Not much is known about how foveal analysis and peripheral selection are coordinated and interact. In this regard, we need to know (i) whether and to what extent foveal analysis and peripheral selection are constrained by a common bo...