Searching for things is an essential part of everyday life, and the way we search offers clues about how our cognitive processes function. Scientists have used the visual search task to study attention, perception, and memory. Visual search performance depends on a combination of stimulus-driven (bottom-up) information, goal-driven (top-down) information, and selection-history biases. Because these factors interact closely, they are difficult to separate. The current study presents a paradigm to isolate the effects of top-down factors in visual search. In our experiments, subjects performed two different search tasks. A subset of the trials in each task carried identical bottom-up information; that is, they had the same target, the same distractors, and the same target-distractor arrangement. We controlled for selection-history bias by keeping the proportion of target types equivalent across tasks and randomizing trial order for each subject. We then compared mean response times on these critical trials, which shared identical bottom-up information between the paired tasks. In both experiments, mean response times on critical trials differed significantly between tasks. This paradigm therefore allows us to compare differences in top-down guidance while controlling for bottom-up and selection-history factors. Pairwise comparison of top-down guidance for different features, given the same bottom-up information, lets us ask interesting questions such as: for which features can search guidance be readily enhanced by top-down processes, and for which can it not? Answers to these questions can further shed light on the ecological and evolutionary importance of such features in perception.