2021 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)
DOI: 10.1109/cvpr46437.2021.01436

Monocular Depth Estimation via Listwise Ranking using the Plackett-Luce Model

Cited by 11 publications (6 citation statements). References 19 publications.
“…In relative depth estimation, the scale-invariant loss [13] and its variants [33,34,40,48] have been used to cope with the scale ambiguity of depth labels. Recently, listwise ranking [35], instead of pairwise ranking, and depth normalization [23] have been considered.…”
Section: Relative Depth Estimation
confidence: 99%
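
The listwise ranking referred to here ([35], the paper indexed on this page) is built on the Plackett-Luce model. As a rough illustration only, the sketch below shows a generic Plackett-Luce negative log-likelihood over a handful of sampled pixels; the function name `plackett_luce_loss`, the use of predicted depth scores as Plackett-Luce worths, and the pixel-sampling setup are assumptions for illustration, not the authors' exact formulation.

```python
# Minimal sketch of a Plackett-Luce listwise ranking loss over sampled pixels.
# This is a generic formulation, not necessarily the exact loss of [35]:
# `scores` holds predicted (relative) depth scores at sampled locations and
# `gt_order` the ground-truth ranking (item indices from first- to last-ranked,
# whichever ordering convention is chosen).
import torch

def plackett_luce_loss(scores: torch.Tensor, gt_order: torch.Tensor) -> torch.Tensor:
    """Negative log-likelihood of the ground-truth ordering under the
    Plackett-Luce model, with exp(score) as the item worth."""
    # Reorder predicted scores so that position k holds the k-th ranked item.
    s = scores[gt_order]                               # (n,)
    # log of the "remaining worth" at each stage of sequential selection:
    # log sum_{j >= k} exp(s_j), via a reversed cumulative logsumexp.
    rev = torch.flip(s, dims=[0])
    denom = torch.flip(torch.logcumsumexp(rev, dim=0), dims=[0])
    # NLL = -sum_k [ s_k - log sum_{j >= k} exp(s_j) ]
    return -(s - denom).sum()

# Example with 4 sampled pixels and ground-truth ordering [2, 0, 3, 1]:
# the loss decreases as the predicted scores agree more with that ordering.
scores = torch.tensor([0.3, -1.2, 2.0, 0.1], requires_grad=True)
gt_order = torch.tensor([2, 0, 3, 1])
loss = plackett_luce_loss(scores, gt_order)
loss.backward()
print(float(loss), scores.grad)
```

The sequential factorization of the Plackett-Luce likelihood keeps the loss tractable for any number of ranked pixels, which is what makes a listwise formulation practical compared with enumerating full permutations.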
“…There have been some attempts for this conversion. For instance, relative depths were fitted to metric depths directly using least-squares in [35,40], and a relative depth estimator was fine-tuned to a metric depth estimator in [39]. Also, relative and metric depths were jointly learned through depth map decomposition in [23].…”
Section: Relative-to-metric Depth Conversion
confidence: 99%
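
The least-squares fitting mentioned above can be illustrated with a closed-form affine (scale and shift) alignment of a relative depth map to metric ground truth. The sketch below is a minimal version of such a fit, assuming the alignment is done in the depth domain; the function name `align_lstsq` and the synthetic data are illustrative assumptions, not the exact procedure of [35, 40].

```python
# Minimal sketch of aligning a relative depth map to metric ground truth with
# a closed-form least-squares fit of scale s and shift t, i.e. minimizing
# || s * d_rel + t - d_metric ||^2 over valid pixels. Whether the fit is done
# in depth or inverse-depth (disparity) space is an assumption made here.
import numpy as np

def align_lstsq(d_rel: np.ndarray, d_metric: np.ndarray, valid: np.ndarray):
    """Return (scale, shift) and the aligned prediction."""
    x = d_rel[valid].reshape(-1)
    y = d_metric[valid].reshape(-1)
    # Design matrix [x, 1] for the affine model y ~ s * x + t.
    A = np.stack([x, np.ones_like(x)], axis=1)
    (s, t), *_ = np.linalg.lstsq(A, y, rcond=None)
    return s, t, s * d_rel + t

# Example: a synthetic relative map that is a scaled/shifted version of the
# metric one plus noise; the recovered (s, t) should be close to (2, 0.5).
rng = np.random.default_rng(0)
d_metric = rng.uniform(1.0, 10.0, size=(48, 64))
d_rel = (d_metric - 0.5) / 2.0 + 0.01 * rng.standard_normal((48, 64))
valid = d_metric > 0
s, t, d_aligned = align_lstsq(d_rel, d_metric, valid)
print(s, t, np.abs(d_aligned - d_metric).mean())
```

Because such a fit has a closed form, it is cheap enough to compute per image, which is how relative predictions are typically compared against metric ground truth.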
“…As a result, modern general-purpose SVDE methods trained on stereo data output predictions that cannot be used to recover 3D geometry [17,26,35]. Hence, we describe these methods as not geometry-preserving and their predictions as geometrically incorrect.…”
Section: Introduction
confidence: 99%
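
The claim that such predictions are "geometrically incorrect" can be made concrete with a small numeric example: if disparity (inverse depth) is recovered only up to an unknown shift, back-projected 3D structure is distorted, e.g. right angles are no longer right. The intrinsics, 3D points, and shift value in the sketch below are arbitrary illustrative choices, not taken from the cited methods.

```python
# Numeric illustration of the "not geometry-preserving" point: back-projecting
# a right-angle corner with a shifted disparity no longer yields a right angle.
# The camera intrinsics, points, and shift value are illustrative assumptions.
import numpy as np

fx = fy = 500.0
cx, cy = 320.0, 240.0
K = np.array([[fx, 0.0, cx], [0.0, fy, cy], [0.0, 0.0, 1.0]])

def backproject(uv, disparity):
    """Back-project pixels to 3D using depth = 1 / disparity."""
    z = 1.0 / disparity
    x = (uv[:, 0] - cx) / fx * z
    y = (uv[:, 1] - cy) / fy * z
    return np.stack([x, y, z], axis=1)

def angle_at_corner(p):
    """Angle (degrees) at p[0] between edges p[0]->p[1] and p[0]->p[2]."""
    a, b = p[1] - p[0], p[2] - p[0]
    cosang = a @ b / (np.linalg.norm(a) * np.linalg.norm(b))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

# A corner in metric 3D space whose two edges meet at exactly 90 degrees.
P = np.array([[0.0, 0.0, 2.0],     # corner vertex
              [0.5, 0.0, 2.5],     # endpoint of edge 1
              [-0.4, 0.0, 2.4]])   # endpoint of edge 2

# Project to pixels and compute the true disparity.
uv = (K @ (P / P[:, 2:]).T).T[:, :2]
disp_true = 1.0 / P[:, 2]

print(angle_at_corner(backproject(uv, disp_true)))        # ~90 degrees
print(angle_at_corner(backproject(uv, disp_true + 0.3)))  # noticeably off 90
```

With the true disparity the corner is recovered exactly; with the shifted disparity the recovered angle deviates noticeably from 90 degrees, which is the sense in which shift-ambiguous predictions do not preserve geometry.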