2013
DOI: 10.1098/rstb.2013.0058

Modelling eye movements in a categorical search task

Abstract: We introduce a model of eye movements during categorical search, the task of finding and recognizing categorically defined targets. It extends a previous model of eye movements during search (target acquisition model, TAM) by using distances from a support vector machine classification boundary to create probability maps indicating pixel-by-pixel evidence for the target category in search images. Other additions include functionality enabling target-absent searches, and a fixation-based blurring of the search…
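The core mechanism named in the abstract, converting signed distances from an SVM classification boundary into a pixel-wise probability map, can be illustrated with a short sketch. This is not the authors' implementation: the sliding-window size, stride, HOG features, and sigmoid calibration constants below are illustrative assumptions, and `svm` is assumed to be a classifier pre-trained on matching features.

```python
# Minimal sketch (assumed parameters, not the paper's code): score sliding
# windows with a trained SVM and squash the signed distances into [0, 1].
import numpy as np
from skimage.feature import hog

def target_probability_map(image_gray, svm, win=64, stride=16, a=-1.5, b=0.0):
    """Return a per-pixel map of evidence for the target category.

    `svm` is assumed to be a pre-trained classifier (e.g., sklearn LinearSVC)
    whose decision_function gives a signed distance to the boundary.
    """
    h, w = image_gray.shape
    prob = np.zeros((h, w))
    count = np.zeros((h, w))
    for y in range(0, h - win + 1, stride):
        for x in range(0, w - win + 1, stride):
            patch = image_gray[y:y + win, x:x + win]
            feat = hog(patch, pixels_per_cell=(8, 8)).reshape(1, -1)
            dist = svm.decision_function(feat)[0]    # signed distance to boundary
            p = 1.0 / (1.0 + np.exp(a * dist + b))   # Platt-style sigmoid calibration
            prob[y:y + win, x:x + win] += p          # spread evidence over the window
            count[y:y + win, x:x + win] += 1
    return prob / np.maximum(count, 1)               # average overlapping windows
```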

Cited by 43 publications (42 citation statements) | References 64 publications
“…However, when the oculomotor system computes a saliency map for each saccade, it can only work with retinotopic representations of local image statistics in the visual cortex. Thus, models of priority that take into account degraded representations of peripheral information (e.g., Hooge & Erkelens, 1999; Zelinsky et al., 2013) and analysis methods that look for effects of visual features at different saccade lengths might provide insight into the phenomena underlying the computation of priority in the brain. To model the peripheral sensitivity of the retina in a more realistic manner, we implemented a simple blurring filter.…”
Section: Modeling Peripheral Visual Acuity With a Retinal Transformmentioning
confidence: 99%
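The "simple blurring filter" mentioned in this excerpt points to a fixation-based degradation of peripheral information. The sketch below is one hedged approximation, not the retinal transform used in the cited work: the number of eccentricity rings, the maximum blur width, and the assumption of a grayscale image are all illustrative choices.

```python
# Minimal sketch (assumed parameters): blur increases with eccentricity from
# the current fixation, so acuity falls off toward the periphery.
import numpy as np
from scipy.ndimage import gaussian_filter

def foveated_blur(image, fixation_xy, n_rings=5, max_sigma=8.0):
    """Blend progressively blurred copies of a grayscale image, choosing the
    blur level at each pixel from its distance to the fixation point."""
    h, w = image.shape
    fx, fy = fixation_xy
    yy, xx = np.mgrid[0:h, 0:w]
    ecc = np.hypot(xx - fx, yy - fy)
    ecc = ecc / ecc.max()                                   # eccentricity in [0, 1]
    blurred = [gaussian_filter(image.astype(float),
                               sigma=max_sigma * k / (n_rings - 1))
               for k in range(n_rings)]                     # k = 0 is unblurred
    ring = np.clip((ecc * (n_rings - 1)).astype(int), 0, n_rings - 1)
    out = np.zeros((h, w))
    for k in range(n_rings):
        out[ring == k] = blurred[k][ring == k]              # pick blur level per pixel
    return out
```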
“…For instance, Zelinsky’s Target Acquisition Model (TAM; Zelinsky, 2008) relies on visual similarity to drive its behavior. Specifically, TAM computes the similarity between the search target and visual information in the search display (using image processing techniques that represent scenes in a biologically plausible way) to determine where the model “looks.” Similarity can be computed with respect to a particular target exemplar (Zelinsky, 2008) or in reference to a target category (Zelinsky, Adeli, Peng, & Samaras, 2013). TAM has been shown to successfully capture the eye-movement behavior of human observers across a range of manipulations (e.g., differences in target-distractor similarity) and ranges of complexity (e.g., simple alphabetic letter search, complex real-world scenes).…”
Section: Visual Search and Similaritymentioning
confidence: 99%
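As a rough illustration of the similarity-driven selection described in this excerpt (not Zelinsky's TAM code, which uses a biologically plausible feature representation), a mean-subtracted cross-correlation between a target template and the search image can serve as a stand-in similarity map, with the next fixation taken at its peak. The function name and the use of raw pixel intensities are assumptions made for the sketch.

```python
# Hedged stand-in for TAM's similarity computation: correlate a target
# template with the search image and fixate the location of the best match.
import numpy as np
from scipy.signal import fftconvolve

def next_fixation(search_image, target_template):
    """Return (x, y) of the peak of a mean-subtracted cross-correlation map."""
    t = target_template - target_template.mean()
    s = search_image - search_image.mean()
    corr = fftconvolve(s, t[::-1, ::-1], mode='same')   # cross-correlation via FFT
    y, x = np.unravel_index(np.argmax(corr), corr.shape)
    return x, y
```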
“…In laboratory search experiments, participants can often form a near-perfect target template, because they are typically shown, prior to search, a veridical representation of the target as it will appear in the search display. However, in the real world, targets are less well defined because specific details often are hard to predict (e.g., you are looking for a pepper, but do not know what kind), because things often change appearance relative to the last time they were encountered (e.g., when picking your friend up from the airport, you may be surprised to find he shaved his beard), and so on (Zelinsky et al, 2013a). …”
Section: Documented Examples Of Usagementioning
confidence: 99%
“…Additionally, there is a research area focusing on human eye-movement patterns during the perception of scenes and objects. Such patterns can depend on factors ranging from culture-specific properties [186] to specific search tasks [187], and they are in high demand for Big Data visualization purposes.…”
Section: Integration With Augmented and Virtual Realitymentioning
confidence: 99%