2013
DOI: 10.3758/s13414-013-0436-y
The speed and accuracy of material recognition in natural images

Abstract: We studied the time course of material categorization in natural images relative to superordinate and basic-level object categorization, using a backward-masking paradigm. We manipulated several low-level features of the images, including luminance, contrast, and color, to assess their potential contributions. The results showed that the speed of material categorization was roughly comparable to the speed of basic-level object categorization, but slower than that of superordinate object categorization. The perfor…

Cited by 45 publications (42 citation statements). References 35 publications (36 reference statements).
“…Since we first presented these findings (Sharan, Rosenholtz, & Adelson, 2009), others have validated our results and gone on to demonstrate that while material categorization is fast and accurate, it is less accurate than basic-level object categorization (Wiebel, Valsecchi, & Gegenfurtner, 2013) and that visual search for material categories is inefficient (Wolfe & Myers, 2010). It has been shown that correlations exist between material categories and perceived material qualities such as glossiness, transparency, roughness, hardness, coldness, etc.…”
Section: Introduction (supporting)
confidence: 70%
“…We have also examined the relative speeds of object and material categorization in subsequent work (Xiao, Sharan, Rosenholtz, & Adelson, 2011). However, unlike Wiebel et al (2013), we used the same set of images for the object and material tasks, which ensured that low-level image properties stayed the same in both conditions. We found that object categorization (glove vs. handbag) was faster for close-up views and material categorization (leather vs. fabric) was faster for regular views.…”
Section: Materials Categorization Can Be Fast (mentioning)
confidence: 99%
“…84 images were taken from the material samples used in the visuo-haptic studies of Gegenfurtner (2013, 2015), all of which were photographed under standardized laboratory lighting. 320 images were used in Wiebel, Valsecchi, and Gegenfurtner (2013) and were taken using a Nikon D70 camera (Nikon, Tokyo, Japan) under various indoor and outdoor illumination conditions. In addition, we used the 500 close-up images from the Flickr material database established by Sharan et al (2009) and the 588 images collected from various internet sources for the fMRI study by Jacobs, Baumgartner, and Gegenfurtner (2014).…”
Section: Stimuli and Apparatus (mentioning)
confidence: 99%
“…Human observers are remarkably good at perceiving the material qualities of objects (for review, see Adelson, 2001; Anderson, 2011; Fleming, 2014) and at making fast and accurate judgments of material categories (Sharan, 2009; Sharan, Rosenholtz, & Adelson, 2009, 2014; Wiebel, Valsecchi, & Gegenfurtner, 2013). This is true even though the underlying physical processes, including ray optics and differential geometry, can be quite complex (Blake & Bülthoff, 1990).…”
Section: Introduction (mentioning)
confidence: 96%