2020
DOI: 10.48550/arxiv.2002.02797
Preprint

Variational Depth Search in ResNets

Cited by 1 publication (2 citation statements); references 0 publications.
“…Dikov et al. [17] proposed to estimate the neural architecture's width and depth with a Bayesian neural network (BNN). Antorán et al. [18] proposed to search the depth of residual networks in an efficient one-shot NAS framework, where the neural weights and the architecture are jointly learned. Antorán et al. [19] estimated depth uncertainty through probabilistic reasoning over a sequential structure of feed-forward networks.…”
Section: B. The Current VI-based Methods for NDS
Confidence: 99%
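The depth-search idea described in the quoted passage can be sketched in a few lines. The following is a minimal NumPy illustration, not the authors' implementation: it assumes a residual MLP whose depth d is a latent variable with a categorical variational posterior q(d), and forms the predictive as the mixture E_{q(d)}[f_d(x)] over truncation depths. All names (`blocks`, `depth_logits`, `forward`) are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(z):
    # Numerically stable softmax over a 1-D logit vector.
    e = np.exp(z - z.max())
    return e / e.sum()

max_depth, dim, n_classes = 5, 8, 3
# One weight matrix per residual block, plus a shared output head.
blocks = [rng.normal(scale=0.1, size=(dim, dim)) for _ in range(max_depth)]
head = rng.normal(scale=0.1, size=(dim, n_classes))
# Unnormalised log-probabilities of q(d) over depths 1..max_depth
# (learnable parameters in a real model; fixed to uniform here).
depth_logits = np.zeros(max_depth)

def forward(x):
    q_d = softmax(depth_logits)
    h = x
    expected = np.zeros((x.shape[0], n_classes))
    # Marginalise the prediction over depth: sum_d q(d) * f_d(x),
    # where f_d applies the head after the first d residual blocks.
    for d, W in enumerate(blocks):
        h = h + np.maximum(h @ W, 0.0)   # residual block with ReLU
        expected += q_d[d] * (h @ head)  # accumulate the depth-(d+1) prediction
    return expected

out = forward(rng.normal(size=(4, dim)))
print(out.shape)  # (4, 3)
```

Because every truncation depth reuses the same blocks, a single forward pass evaluates all candidate depths at once, which is what makes the one-shot formulation efficient.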
“…Most of the current methods [17], [18], [19] approximate the architecture-depth posterior with VI under the mean-field assumption [8], [9], where the neural weights and the depth variables are independent. The mean-field assumption can limit the approximation fidelity and introduce the rich-get-richer problem, i.e., shallow networks come to dominate the search.…”
Section: B. The Current VI-based Methods for NDS
Confidence: 99%
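The mean-field assumption referred to in the quote can be written out explicitly. The following is a generic formulation of the standard mean-field VI setup with weights w and depth d; the notation is ours, not taken from the cited papers:

```latex
% Mean-field factorisation: weights and depth are independent under q
q(w, d) = q_\phi(w)\, q_\theta(d)

% Resulting ELBO for data (x, y)
\mathcal{L}(\phi, \theta) =
  \mathbb{E}_{q_\phi(w)\, q_\theta(d)}\!\left[\log p(y \mid x, w, d)\right]
  - \mathrm{KL}\!\left(q_\phi(w) \,\|\, p(w)\right)
  - \mathrm{KL}\!\left(q_\theta(d) \,\|\, p(d)\right)
```

Because q_\phi(w) is shared across all depths, weights that serve the currently favoured (typically shallow) depths improve fastest, which in turn pushes more mass of q_\theta(d) onto those depths: the rich-get-richer effect the passage describes.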