2020
DOI: 10.1007/978-3-030-58523-5_26
NAS-DIP: Learning Deep Image Prior with Neural Architecture Search

Abstract: [Fig. 1: Applications. Super-Res, Denoising, Inpainting, Dehazing, Translation.] We propose to learn a deep image prior using neural architecture search. The resulting network can be applied to solve various inverse image problems without training the model on a large-scale dataset with ground truth. Through extensive experimental evaluations, we show that our model compares favorably against existing hand-crafted CNN models for learning-free image restoration tasks and in some cases even reaches competitive performance…

Cited by 40 publications (21 citation statements). References 69 publications (164 reference statements).
“…Design of Architecture Search Space: The intrinsic capability of an evolutionary DNN construction method depends heavily on the search space, which is often manually specified. Recently, there has been growing interest in automatically designing the search space of the model architecture [200]-[202]. On the one hand, a simpler search space may facilitate the optimization process carried out therein.…”
Section: Trends
confidence: 99%
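The manual specification described in this statement can be made concrete with a small sketch. The `SEARCH_SPACE` dictionary, the operation names, and `enumerate_architectures` below are hypothetical illustrations, not taken from [200]-[202]; the point is that the size of the cross-product is exactly what a "simpler search space" shrinks.

```python
# Minimal sketch of a manually specified NAS search space (hypothetical names).
# A search algorithm explores the cross-product of these per-component choices.
from itertools import product

SEARCH_SPACE = {
    "op": ["conv3x3", "conv5x5", "sep_conv3x3", "skip", "none"],
    "upsample": ["bilinear", "nearest", "transposed_conv"],
    "depth": [3, 4, 5],
}

def enumerate_architectures(space):
    """Yield every candidate architecture as a dict of choices.

    Illustrates why a smaller space eases optimization: removing one
    option from any list directly shrinks the product below.
    """
    keys = list(space)
    for values in product(*(space[k] for k in keys)):
        yield dict(zip(keys, values))

print(sum(1 for _ in enumerate_architectures(SEARCH_SPACE)))  # 5 * 3 * 3 = 45
```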
“…The output at the first iteration contains randomness due to the random code vector z, and the optimizer iteratively optimizes the neural network's parameters θ for the given input image using backpropagation. It has been shown in the literature [19], [36] that the choice of neural network architecture has a direct impact on performance; e.g., we can design/handcraft a particular neural network architecture for modeling a specific image [21]. This serves as a solution space when modeling images using untrained neural network priors.…”
Section: Untrained and Pretrained CDIPs
confidence: 99%
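A minimal sketch of the optimization loop this statement describes, assuming a PyTorch setting; the tiny stand-in network, image size, learning rate, and iteration count are illustrative placeholders, not the handcrafted or searched architectures studied in [19], [21], or [36]:

```python
# DIP-style fitting of an untrained network to a single degraded image.
import torch
import torch.nn as nn

net = nn.Sequential(                       # deliberately small stand-in CNN
    nn.Conv2d(32, 64, 3, padding=1), nn.ReLU(),
    nn.Conv2d(64, 64, 3, padding=1), nn.ReLU(),
    nn.Conv2d(64, 3, 3, padding=1),
)

x_corrupted = torch.rand(1, 3, 64, 64)     # the single degraded image to fit
z = torch.randn(1, 32, 64, 64)             # fixed random code vector z
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(2000):
    opt.zero_grad()
    out = net(z)                           # early iterations look random...
    loss = ((out - x_corrupted) ** 2).mean()
    loss.backward()                        # ...then backprop fits θ to the image
    opt.step()

restored = net(z).detach()                 # early-stopped output serves as the prior
```

The architecture of `net` is the "solution space" the statement refers to: swapping in a different topology changes which images the untrained network can fit quickly, which is what motivates searching over architectures rather than handcrafting one.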