2019
DOI: 10.1371/journal.pcbi.1006829

Modeling second-order boundary perception: A machine learning approach

Abstract: Visual pattern detection and discrimination are essential first steps for scene analysis. Numerous human psychophysical studies have modeled visual pattern detection and discrimination by estimating linear templates for classifying noisy stimuli defined by spatial variations in pixel intensities. However, such methods are poorly suited to understanding sensory processing mechanisms for complex visual stimuli such as second-order boundaries defined by spatial differences in contrast or texture. We introduce a n…
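The linear-template approach the abstract contrasts against can be made concrete. Below is a minimal Python sketch of estimating a linear template from noisy stimuli via logistic regression, in the spirit of classification-image methods; the simulated stimuli, template shape, and noise scale are illustrative assumptions, not details from the paper.

import numpy as np
from sklearn.linear_model import LogisticRegression

# Illustrative only: 2000 noisy 16x16 stimuli whose binary label depends on
# the dot product with a hidden linear template (here, a vertical edge).
rng = np.random.default_rng(0)
n_trials, size = 2000, 16
template = np.ones((size, size))
template[:, size // 2:] = -1.0   # +1 on the left half, -1 on the right
template = template.ravel()

stimuli = rng.normal(size=(n_trials, size * size))
labels = (stimuli @ template + rng.normal(scale=4.0, size=n_trials) > 0).astype(int)

# Fitting a linear classifier to the noisy stimuli recovers an estimate of
# the hidden template; this is the logic behind classification-image-style
# analyses of pattern detection and discrimination.
clf = LogisticRegression(max_iter=2000).fit(stimuli, labels)
estimate = clf.coef_.ravel()
print("correlation with true template:", np.corrcoef(estimate, template)[0, 1])

As the abstract notes, a single linear template of this kind cannot capture second-order boundaries, which is what motivates the nonlinear models discussed in the citing papers below.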


Cited by 18 publications (20 citation statements)
References 116 publications
“…We found that many units were identified as blur selective across all levels of the network and we defined a class of units that we refer to as blur-contrast boundary (BCB) units that respond best to oriented or shaped boundaries between regions of high and low blur levels. Our results raise the testable prediction that similar units may be prevalent in vivo, related to second-order boundary processing (Mareschal and Baker, 1998; Dakin and Mareschal, 2000; Baker and Mareschal, 2001; DiMattina and Baker, 2019), and they provide insight into improving simple stimulus sets for testing for blur selectivity.…”
Section: Introduction (supporting)
confidence: 66%
“…We define and fit a "filter-rectify-filter" (FRF) model positing two stages of filtering to data from Experiment 4, and show that this model successfully accounts for observer performance in the task. Previous studies of second-order vision have fit psychophysical data with FRF models (DiMattina & Baker, 2019; Zavitz & Baker, 2013), but here we show that the FRF model can also account for the ability of observers to extract first-order (luminance) information in the presence of masking LSB stimuli.…”
supporting
confidence: 46%
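For readers unfamiliar with FRF models, here is a minimal Python sketch of the two-stage filter-rectify-filter cascade the statement describes; the difference-of-Gaussians filters, squaring rectifier, and parameter values are illustrative assumptions, not the fitted model from any of the cited papers.

import numpy as np
from scipy.ndimage import gaussian_filter

def frf_response(image, sigma1=1.0, sigma2=8.0):
    """Toy filter-rectify-filter (FRF) cascade; parameters are illustrative."""
    # First stage: fine-scale band-pass filtering (difference of Gaussians).
    first = gaussian_filter(image, sigma1) - gaussian_filter(image, 2 * sigma1)
    # Rectification: squaring converts signed filter outputs into local
    # energy, which is what makes contrast-defined structure recoverable.
    energy = first ** 2
    # Second stage: a coarse filter pools the rectified energy, so a
    # contrast boundary now appears as an ordinary edge in the output.
    return gaussian_filter(energy, sigma2)

# Demo: noise with a vertical contrast-defined (second-order) boundary.
rng = np.random.default_rng(0)
noise = rng.normal(size=(128, 128))
contrast = np.where(np.arange(128) < 64, 0.25, 1.0)  # low left, high right
stimulus = noise * contrast[None, :]
response = frf_response(stimulus)
print("mean pooled energy, left vs. right:",
      float(response[:, :64].mean()), float(response[:, 64:].mean()))

The stimulus has constant mean luminance everywhere, so a purely linear template would average it to zero; the rectification between the two filtering stages is what exposes the contrast boundary.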
“…Although natural surfaces may have luminance differences which arise due to luminance texture boundaries, many other textural differences do not involve changes in luminance. Micro-pattern orientation, density, and contrast, among others, all provide powerful segmentation cues (Dakin & Mareschal, 2000; DiMattina & Baker, 2019; Zavitz & Baker, 2013, 2014; Wolfson & Landy, 1995; Motoyoshi & Kingdom, 2007), which must be combined with luminance cues to enable segmentation in natural vision. It is of great interest for future research to understand how luminance textures combine with other cues.…”
Section: Discussion (mentioning)
confidence: 99%
“…Natural texture boundaries will not only contain differences in luminance, but also differences in various second-order texture cues like orientation (Wolfson & Landy, 1998), micro-pattern density (Zavitz & Baker, 2014), and contrast (Dakin & Mareschal, 2000; DiMattina & Baker, 2019). It is of interest for future work to better understand how luminance cues (step and boundary) interact with these various second-order texture cues for boundary segmentation.…”
Section: Limitations and Future Directions (mentioning)
confidence: 99%