2006
DOI: 10.1016/j.visres.2006.08.022
Depth scaling in phantom and monocular gap stereograms using absolute distance information

Abstract: The present study aimed to investigate whether the visual system scales apparent depth from binocularly unmatched features by using absolute distance information. In Experiment 1 we examined the effect of convergence on perceived depth in phantom stereograms [Gillam, B., & Nakayama, K. (1999). Quantitative depth for a phantom surface can be based on cyclopean occlusion cues alone. Vision Research, 39, 109-112.], monocular gap stereograms [Pianta, M. J., & Gillam, B. J. (2003a). Monocular gap stereopsis: manipu…
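For orientation (this is textbook binocular geometry, not an equation taken from the paper, and the symbols D, I, and δ are generic labels): a fixed retinal disparity is ambiguous about depth until it is scaled by an estimate of absolute viewing distance. For small angles,

```latex
% Standard small-angle relation between retinal disparity and depth.
% D = viewing distance, I = interocular separation, \delta = disparity.
\Delta d \approx \frac{\delta\, D^{2}}{I}
```

Because convergence changes the distance estimate D while leaving all retinal quantities fixed, depth percepts scaled by absolute distance information should grow roughly with D². This is the signature the experiments probe in stimuli whose depth comes from unmatched (monocular) features rather than from disparity.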

Cited by 4 publications (6 citation statements)
References 32 publications
“…Another hint that there may be a separate mechanism for monocular gap stereopsis comes from work on how depth from monocular regions is scaled by changes in accommodation and/or viewing distance. Kuroki & Nakamizo (2006) showed that depth does not scale with distance for monocular gap stereopsis, as it does for other examples of monocular occlusion depth, and as it does for standard binocular disparity. Recently, models have been developed that use the output of disparity-detectors in ways that could make use of monocular gaps (see modelling section below, in particular Grossberg and Howe, 2003; Cao and Grossberg, 2005).…”
Section: Do Monocular Regions Provide Evidence For a Separate Sophis… (mentioning)
confidence: 96%
“…Häkkinen and Nyman (2001) showed that it supports visual capture (Figure 3.23). It also shows scaling with changes in vergence that are very similar to those found with an equivalent disparity-defined rectangle (Kuroki and Nakamizo, 2006). These many similarities with disparity-based stereopsis for a stimulus that has no disparity anywhere are particularly challenging to the view that depth based on monocular regions is a distinct process from disparity-defined depth.…”
Section: Phantom Stereopsis (mentioning)
confidence: 69%
“…The phantom rectangle “accounts for” the monocular gaps. The perceived depth of the phantom increases with increasing width of the bars, although it is seen at a considerably greater depth than that predicted by the minimum depth constraint (Grove, Gillam, & Ono, 2002; Kuroki & Nakamizo, 2006; Mitsudo, Nakamizo, & Ono, 2005).…”
Section: Phantom Surfaces and Monocular Regions (mentioning)
confidence: 74%
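The minimum depth constraint mentioned in the excerpt above follows from simple occlusion geometry. The derivation below is the standard textbook analysis (in the style of Nakayama and Shimojo's occlusion geometry), not a formula quoted from the citing papers; w, D, I, and Δd are generic labels.

```latex
% An occluder at distance D in front of a background at distance
% D + \Delta d, viewed by two eyes separated by I, hides a strip of
% background of width w from one eye only:
w = \frac{I\,\Delta d}{D}
% Inverting gives the smallest depth separation consistent with a
% monocular region of width w -- the minimum depth constraint:
\Delta d_{\min} = \frac{w\,D}{I}
```

The point of the excerpt is that the phantom surface is seen at considerably more depth than this lower bound, so the visual system is not merely adopting the minimal interpretation consistent with the monocular regions.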
“…These two forms of constraint are involved to different extents in the depth perceived in experimental stimuli with monocular regions. At one extreme is the phantom rectangle (Grove, Gillam, & Ono, 2002; Kuroki & Nakamizo, 2006; Mitsudo, Nakamizo, & Ono, 2005), which generates quantitative depth based on monocular regions despite a complete absence of disparities in the stimulus. At the other extreme are stimuli initially thought to demonstrate depth from monocular regions where the depth has since been attributed to matched disparate features (Liu, Stevenson, & Schor, 1994).…”
Section: Discussion (mentioning)
confidence: 99%