2019
DOI: 10.3758/s13414-019-01807-3
Conjunction search: Can we simultaneously bias attention to features and relations?

Abstract: Attention allows selection of sought-after objects by tuning attention in a top-down manner to task-relevant features. Among other possible search modes, attention can be tuned to the exact feature values of a target (e.g., red, large), or to the relative target feature (e.g., reddest, largest item), in which case selection is context dependent. The present study tested whether we can tune attention simultaneously to a specific feature value (e.g., specific size) and a relative target feature (e.g., relative c…

Cited by 11 publications (8 citation statements)
References 67 publications (175 reference statements)
“…However, these benefits were quite small and not reliable across all conditions, which indicates no severe limitations in tuning attention to two different targets. With this, the present study is more aligned with the results of previous studies showing that (covert) attention can be tuned simultaneously to two different target features (e.g., Becker, Atalla, & Folk, 2020; Grubert & Eimer, 2015, 2016; Irons et al., 2012).…”
Section: Discussion (supporting)
confidence: 89%
“…Collectively, previous studies show a preference for a relational search over feature-specific search, across multiple different target features and dimensions (e.g., color, shape, luminance, and size; Becker, 2010a, 2013a, 2013b; Schönhammer, Grubert, Kerzel, & Becker, 2016), and also in conjunction search tasks (e.g., Becker, Harris, et al., 2017, 2020).…”
mentioning
confidence: 53%
“…Previous studies had already shown that attention is tuned to the relative feature of the target, and not an optimal feature value or the target feature value in the spatial cueing paradigm (e.g., Becker et al., 2010, 2013, 2017; Becker, Atalla & Folk, in press; Harris et al., 2013; Schoenhammer et al., 2016). Moreover, previous visual search studies demonstrated that a relatively matching distractor captures attention and the gaze more strongly than a target-similar distractor, both when the colors vary along the red-yellow dimension (e.g., Becker, 2010; Becker et al., 2014) and when they vary along the green-blue dimension (Martin & Becker, 2018).…”
Section: Discussion (mentioning)
confidence: 99%
“…We suggest that observers fixated mostly on sharper areas, as is also evident from the data (PropFixInAOI), and did not need to adjust their fixation duration significantly. It is important to note that the nature of the task and the type of search target also play a role in fixation duration [12,71,80]. In the current study, the search target was a simple Gabor-cross, in contrast to some sophisticated objects.…”
Section: Discussion (mentioning)
confidence: 84%