2018
DOI: 10.48550/arxiv.1811.01791
Preprint

Confidence Propagation through CNNs for Guided Sparse Depth Regression

Abdelrahman Eldesokey,
Michael Felsberg,
Fahad Shahbaz Khan

Abstract: Generally, convolutional neural networks (CNNs) process data on a regular grid, e.g. data generated by ordinary cameras. Designing CNNs for sparse and irregularly spaced input data is still an open research problem with numerous applications in autonomous driving, robotics, and surveillance. In this paper, we propose an algebraically-constrained normalized convolution layer for CNNs with highly sparse input that has a smaller number of network parameters compared to related work. We propose novel strategies fo…
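The normalized convolution that the abstract builds on can be sketched as follows. This is a minimal, illustrative NumPy implementation under assumed conventions (the function name and signature are hypothetical), not the authors' constrained layer: each output pixel is the confidence-weighted local average of the sparse signal, and the summed confidence is propagated alongside it so later layers know where data was observed.

```python
import numpy as np

def normalized_conv2d(x, conf, kernel, eps=1e-8):
    """Basic normalized convolution: convolve the confidence-weighted
    signal, then divide by the convolved confidence.
    x, conf : 2D arrays of equal shape (signal and per-pixel confidence).
    kernel  : 2D array of non-negative applicability weights.
    Returns (output, propagated_confidence)."""
    kh, kw = kernel.shape
    ph, pw = kh // 2, kw // 2
    # Zero-pad both the weighted signal and the confidence map.
    xp = np.pad(x * conf, ((ph, ph), (pw, pw)))
    cp = np.pad(conf, ((ph, ph), (pw, pw)))
    out = np.zeros_like(x, dtype=float)
    out_conf = np.zeros_like(x, dtype=float)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            num = (xp[i:i + kh, j:j + kw] * kernel).sum()
            den = (cp[i:i + kh, j:j + kw] * kernel).sum()
            out[i, j] = num / (den + eps)          # confidence-weighted mean
            out_conf[i, j] = den / kernel.sum()    # fraction of mass observed
    return out, out_conf
```

On a constant signal with full confidence the output reproduces the signal everywhere, while the propagated confidence drops near the border where the padded confidence is zero.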


Cited by 8 publications (28 citation statements); references 26 publications.
“…Deep Neural Networks (DNNs) have become the standard paradigm within most computer vision problems due to their astonishing predictive power compared to previous alternatives. Current applications include many safety-critical tasks, such as street-scene semantic segmentation [10,50,6,52], automotive 3D object detection [48,44,29,42,55] and depth completion [46,33,12]. Since erroneous predictions can have disastrous consequences, such applications require an accurate measure of the predictive uncertainty.…”
Section: Introduction
confidence: 99%
“…Error Metrics (lower, better) [10,15,16,39,50,54,55] using the validation set in [54]. Despite not being the primary focus, our completion approach remains competitive with the state of the art.…”
Section: Methods
confidence: 99%
“…Our sparse generator follows an encoder/decoder architecture with every layer containing modules of convolution, BatchNorm and leaky ReLU (slope = 0.2) with skip connections [46] between every pair of corresponding layers in the encoder and the decoder (Figure 2 -SG). The Figure 5: Comparing our depth completion results against [15,40,54,50]. The depth images have been adjusted for better visualization.…”
Section: Implementation Details
confidence: 99%
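The generator described in the quote above (convolution, BatchNorm, leaky ReLU with slope 0.2, and skip connections between mirrored encoder/decoder layers) follows a common U-Net-style pattern. A minimal NumPy sketch of the two ingredients the quote names explicitly, assuming channels-first tensors (the array shapes here are illustrative, not taken from the paper):

```python
import numpy as np

def leaky_relu(x, slope=0.2):
    # Leaky ReLU with negative slope 0.2, as used in the cited generator.
    return np.where(x >= 0, x, slope * x)

# Skip connection: a decoder feature map is concatenated with the
# matching encoder feature map along the channel axis (axis=1).
enc_feat = np.ones((1, 64, 32, 32))   # feature map saved from the encoder
dec_feat = np.ones((1, 64, 32, 32))   # feature map produced by the decoder
merged = np.concatenate([dec_feat, enc_feat], axis=1)  # shape (1, 128, 32, 32)
```

Concatenating (rather than adding) the encoder features doubles the channel count of the merged map, so the next decoder convolution must accept the widened input.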
“…However, their performance on the ranking metric (RMSE) is much worse than other state-of-the-art approaches. The approaches in [33,34] run much faster than other state-of-the-art methods, but their performance on these metrics is much worse. Both the proposed method and the method in [17] achieve a good balance between accuracy and speed.…”
Section: F Comparison With Existing Methods
confidence: 99%