2019
DOI: 10.1145/3306346.3322996

Hyperparameter optimization in black-box image processing using differentiable proxies

Abstract: Nearly every commodity imaging system we directly interact with, or indirectly rely on, leverages power-efficient, application-adjustable black-box hardware image signal processor (ISP) units, running either in dedicated hardware blocks or as proprietary software modules on programmable hardware. The configuration parameters of these black-box ISPs often have complex interactions with the output image and must be adjusted prior to deployment according to application-specific quality and performance metrics…


Cited by 50 publications (37 citation statements)
References 52 publications
“…With surrogate optimization, programmers develop a surrogate of a program, optimize input parameters of that surrogate, then finally plug the optimized input parameters back into the original program. The key benefit of this approach is that surrogate optimization can optimize inputs faster than optimizing inputs directly against the program, due to the potential for faster execution speed of the surrogate and the potential for the surrogate to be differentiable even when the original program is not (allowing for optimizing inputs with gradient descent) [71,77,87].…”
Section: Case Study: Surrogate Optimization
confidence: 99%
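The workflow this quote describes can be sketched in a few lines. Everything below — the toy `black_box`, its hand-written `surrogate`, and the learning rate — is a hypothetical illustration of the pattern, not code from any of the cited works:

```python
# Surrogate optimization sketch: approximate a non-differentiable program
# with a differentiable surrogate, optimize the input against the
# surrogate by gradient descent, then plug the result back into the
# original program.

def black_box(x):
    # Non-differentiable "program": quantizes its input before scoring it.
    xq = round(x * 10) / 10
    return (xq - 0.7) ** 2

def surrogate(x):
    # Differentiable stand-in (imagined as fitted offline); here we simply
    # drop the quantization so an analytic gradient exists.
    return (x - 0.7) ** 2

def surrogate_grad(x):
    return 2 * (x - 0.7)

def optimize_input(x, lr=0.2, steps=100):
    # Gradient descent on the surrogate, not on the black box itself.
    for _ in range(steps):
        x -= lr * surrogate_grad(x)
    return x

x_opt = optimize_input(0.0)
# Finally, evaluate the surrogate-optimized input on the original program.
loss = black_box(x_opt)
```

The plug-back step at the end is the crux: the surrogate only guides the search, while the original program remains the ground truth for the final loss.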
“…Typical examples of surrogates include neural networks [27,71], Gaussian processes [3,69], linear models [22,25], and random forests [38,62]. Of these model architectures, neural surrogates have emerged as a popular design for surrogates in the literature [23,42,71,77,84,87] because for many tasks neural networks are state-of-the-art models that lead to high accuracy [20,47].…”
Section: Introduction
confidence: 99%
“…We also adopt a Bayesian optimization framework, but for hyperparameter optimization. To the best of our knowledge, the only previous work on hyperparameter optimization in graphics is [Tseng et al 2019], where parameters of image processing hardware, such as threshold values in the denoising module, are optimized with a non-Bayesian approach. Hereafter we will focus on hyperparameter optimization literature from machine learning and robotics.…”
Section: Hyperparameter Optimization
confidence: 99%
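As a minimal illustration of the kind of hyperparameter search discussed in this quote, consider tuning a single denoising threshold of a black-box filter against an image-quality metric. The toy filter, signals, and grid search below are invented for this sketch (the cited works use Bayesian or proxy-based optimizers, not grid search):

```python
# Hypothetical "ISP module": hard-threshold small fluctuations to zero.
def denoise(signal, threshold):
    return [0.0 if abs(v) < threshold else v for v in signal]

# Quality metric: negative mean squared error against a clean reference.
def quality(output, reference):
    return -sum((o - r) ** 2 for o, r in zip(output, reference)) / len(reference)

clean = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0]
noisy = [0.05, 1.02, -0.04, 0.98, 0.03, 1.01]

# Black-box hyperparameter search: sweep the threshold over a grid
# and keep the setting with the best quality score.
best_t, best_q = None, float("-inf")
for t in [i / 20 for i in range(21)]:  # grid over [0, 1]
    q = quality(denoise(noisy, t), clean)
    if q > best_q:
        best_t, best_q = t, q
```

The found threshold zeroes the small noise samples while leaving the signal samples untouched; a threshold that is too large would clip the signal itself and score worse.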
“…Step 5: the atmospheric light intensity G is used to calculate and optimize the transmission image, and then to optimize the depth of field of the IoT surveillance video image. In practice, even in sunny or foggy weather, the IoT surveillance video is still affected by atmospheric particles. Removing all of the fog would reduce the authenticity of the image, so an adjustment coefficient σ is used to retain a small amount of fog and preserve the image's realism [24]. The projection is therefore set to the following relationship.…”
Section: Calculation Of Atmospheric Dissipation Function
confidence: 99%
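The haze-retention idea in this quote resembles the standard dark-channel transmission estimate, where a coefficient below one deliberately keeps a little fog in the output. A per-pixel sketch under that assumption — `sigma`, `t0`, and the atmospheric light `A` are illustrative names, not taken from the cited paper:

```python
# Transmission estimate t(x) = 1 - sigma * min_c(I_c(x) / A_c).
# sigma < 1 retains a small amount of haze so the result looks natural.
def transmission(pixel, A, sigma=0.95):
    return 1.0 - sigma * min(c / a for c, a in zip(pixel, A))

# Recover scene radiance J = (I - A) / t + A, per channel, with a
# floor t0 on the transmission to avoid division blow-up in dense haze.
def dehaze_pixel(pixel, A, sigma=0.95, t0=0.1):
    t = max(transmission(pixel, A, sigma), t0)
    return tuple((c - a) / t + a for c, a in zip(pixel, A))

A = (0.9, 0.9, 0.9)        # estimated atmospheric light (illustrative)
hazy = (0.6, 0.65, 0.7)    # one hazy RGB pixel
clear = dehaze_pixel(hazy, A)
```

Setting `sigma = 1.0` would remove the haze completely; the adjustment coefficient trades a small residual of fog for a more authentic-looking image, as the quote describes.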