2011
DOI: 10.1167/11.1.24

Visually guided pointing movements are driven by the salience map

Abstract: Visual salience maps are assumed to mediate target selection decisions in a motor-unspecific manner; accordingly, modulations of salience influence yes/no target detection or left/right localization responses in manual key-press search tasks, as well as ocular or skeletal movements to the target. Although widely accepted, this core assumption is based on little psychophysical evidence. At least four modulations of salience are known to influence the speed of visual search for feature singletons: (i) feature co…

Cited by 20 publications (16 citation statements)
References 70 publications
“…Zehetleitner, Hegenloh, and Müller (2011) observed that the pattern of RTs and MTs in a reaching task mirrored RTs in a detection task. In the reaching task, participants had to touch the odd element in a search display, while a buttonpress was required in the detection task.…”
Section: Discussion
Citation type: mentioning
Confidence: 94%
“…Despite these empirical shortcomings of the original implementation of the salience model (and of similar models), conspicuity-based accounts continue to feature prominently in much of the recent work on eye guidance (e.g., Xu, Yang, & Tsien, 2010; Yanulevskaya, Marsman, Cornelissen, & Geusebroek, 2010; Zehetleitner, Hegenloh, & Mueller, 2011; Zhao & Koch, 2011, and many others). Recent special issues of Cognitive Computation (Taylor & Cutsuridis, 2011) and of Visual Cognition (Tatler, 2009) reflect the continuing prominence of image salience and similar conspicuity-based factors in current research. Indeed, even recent emerging models often continue to retain a key role for visual conspicuity (e.g., Ehinger, Hidalgo-Sotelo, Torralba, & Oliva, 2009; Kanan, Tong, Zhang, & Cottrell, 2009), a point we will return to later in this article.…”
Section: Image Salience and Eye Movement Behavior
Citation type: mentioning
Confidence: 99%
“…Furthermore, PoP shows similar effects for attention, eye movements, and reach movements respectively, and even transfers from one type of hand movement to another (Moher & Song, 2014). Thus, it is plausible that a shared, motor-unspecific priority map (e.g., Zehetleitner, Hegenloh, & Müller, 2011; see also, Song, Takahashi, & McPeek, 2007) is responsible for biasing attention towards recently selected target features regardless of the mode of action required. However, some theories of PoP suggest that it is not just a target feature that is encoded in memory, but rather an entire set of events from a previous trial that is encoded and biases subsequent target selection (e.g., Hillstrom, 2000; Huang, Holcombe, & Pashler, 2004).…”
Citation type: mentioning
Confidence: 99%