On Feature Collapse and Deep Kernel Learning for Single Forward Pass Uncertainty
2021 · Preprint · DOI: 10.48550/arxiv.2102.11409

Cited by 25 publications (71 citation statements) · References 8 publications
“…A major problem for this task and for OOD detection in general is that outliers and inliers may be indistinguishable in the feature space [17]. Feature collapse [16] can be alleviated by forcing the model to learn more informative features. This can be implemented by supplying an auxiliary self-supervised loss [17,28], as well as by training on negative data, which can be sourced from real datasets [14,29,15] or sampled from generative models [18,20]…”
Section: Open-set Recognition (mentioning)
confidence: 99%
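The negative-data strategy described in this statement is often realized as outlier exposure: the classifier is trained to predict a near-uniform distribution on auxiliary outliers. A minimal sketch, assuming a standard PyTorch classifier; the function name and the weight `lambda_oe` are illustrative, not taken from the cited papers:

```python
import torch.nn.functional as F

def outlier_exposure_step(model, x_in, y_in, x_out, lambda_oe=0.5):
    """One training step: cross-entropy on inliers plus a term that
    pushes predictions on negative (outlier) data toward uniform."""
    loss_in = F.cross_entropy(model(x_in), y_in)
    # Cross-entropy to the uniform distribution (the KL from uniform,
    # up to a constant): average -log p over batch and classes.
    loss_out = -F.log_softmax(model(x_out), dim=1).mean()
    return loss_in + lambda_oe * loss_out
```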
“…However, the assumption that MC dropout corresponds to Bayesian model sampling may not be satisfied in practice. Anomalies can also be detected by estimating the likelihood in feature space [12]; however, that approach is vulnerable to feature collapse [16].…”
Section: Dense Open-set Recognition (mentioning)
confidence: 99%
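Feature-space likelihood estimation of the kind referenced here is commonly implemented by fitting a simple density model, for example a single Gaussian, to penultimate-layer features of the training data and scoring new samples by their log-likelihood. A minimal sketch under that assumption (the function names are illustrative, not from the cited work):

```python
import numpy as np
from scipy.stats import multivariate_normal

def fit_feature_density(train_feats):
    """Fit a Gaussian to penultimate-layer training features (N x D)."""
    mu = train_feats.mean(axis=0)
    cov = np.cov(train_feats, rowvar=False) + 1e-6 * np.eye(train_feats.shape[1])
    return multivariate_normal(mean=mu, cov=cov)

def anomaly_score(density, feats):
    """Higher score = more anomalous. Under feature collapse, outliers
    can land near the inlier mean and evade this score, which is the
    failure mode the quoted passage points out."""
    return -density.logpdf(feats)
```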
“…During inference, each sample gets logit predictions as well as logit variances, both of which are utilized to compute predictive probabilities via a mean-field approximation [Lu et al., 2020]. GNN-SNGP: incorporating distance-preserving feature extraction. Due to feature collapse in feature extraction [Liu et al., 2020, van Amersfoort et al., 2021], neural representations may not faithfully preserve distances in the input manifold. Liu et al. [2020] propose to preserve input distance in feature extraction by applying Spectral Normalization (SN) [Gouk et al., 2021, Miyato et al., 2018] to the residual networks.…”
Section: GNN Baseline (mentioning)
confidence: 99%
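Both ingredients quoted here have compact, standard forms: the mean-field approximation shrinks each logit mean by its variance before the softmax (the λ = π/8 factor is the usual logistic-probit matching constant), and spectral normalization is available as a built-in PyTorch wrapper. A sketch under those assumptions; the linear layer is an illustrative stand-in for a residual block, not the cited architecture:

```python
import math
import torch
import torch.nn as nn

# Spectral normalization bounds the layer's Lipschitz constant so the
# feature extractor cannot collapse input distances arbitrarily.
sn_layer = nn.utils.spectral_norm(nn.Linear(128, 128))

def mean_field_probs(logit_mean, logit_var, lam=math.pi / 8):
    """Predictive probabilities from per-logit means and variances:
    shrink uncertain logits toward zero, then apply softmax."""
    scaled = logit_mean / torch.sqrt(1.0 + lam * logit_var)
    return torch.softmax(scaled, dim=-1)
```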
“…However, the problem of uncertainty estimation (UE) in the context of VMP has not been widely covered yet. Previous works [9], [10], [11] consider a limited number of methods and are disconnected from state-of-the-art approaches in VMP, from recent advances in the UE field [12], [13], [14], or from both. On the other hand, the Bayesian Deep Learning field would benefit from benchmarking on large industrial datasets.…”
Section: Introduction (mentioning)
confidence: 99%