Proceedings of the 27th ACM SIGKDD Conference on Knowledge Discovery & Data Mining, 2021
DOI: 10.1145/3447548.3467061

OpenBox: A Generalized Black-box Optimization Service

Abstract: Black-box optimization (BBO) has a broad range of applications, including automatic machine learning, engineering, physics, and experimental design. However, it remains a challenge for users to apply BBO methods to their problems at hand with existing software packages, in terms of applicability, performance, and efficiency. In this paper, we build OpenBox, an open-source and general-purpose BBO service with improved usability. The modular design behind OpenBox also facilitates flexible abstraction and optimiz…

Cited by 48 publications (20 citation statements)
References 30 publications (35 reference statements)
“…DDPG is implemented using the PyTorch library [64], with its neural network architecture borrowed from CDBTune [91]. We adopt OpenBox [50]'s implementation for mixed-kernel BO and GA. For SMAC [34] and TuRBO [22], we adopt the implementations released by the authors. Knowledge transfer frameworks.…”
Section: Methods
confidence: 99%
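The citation above uses OpenBox's mixed-kernel BO over a search space that mixes continuous, integer, and categorical hyperparameters. As a minimal, library-free illustration of optimizing over such a mixed space, here is a random-search sketch with an OpenBox-style minimize loop; all names (`SPACE`, `minimize`, `toy_objective`) are hypothetical and are not OpenBox's actual API.

```python
import random

# Illustrative mixed search space (continuous + integer + categorical).
# The spec format here is invented for this sketch, not OpenBox's.
SPACE = {
    "learning_rate": ("real", 1e-4, 1e-1),      # continuous
    "buffer_size":   ("int", 1000, 100000),     # integer
    "optimizer":     ("cat", ["adam", "sgd"]),  # categorical
}

def sample(space, rng):
    """Draw one configuration uniformly from the mixed space."""
    cfg = {}
    for name, spec in space.items():
        kind = spec[0]
        if kind == "real":
            cfg[name] = rng.uniform(spec[1], spec[2])
        elif kind == "int":
            cfg[name] = rng.randint(spec[1], spec[2])
        else:  # categorical
            cfg[name] = rng.choice(spec[1])
    return cfg

def minimize(objective, space, n_trials=50, seed=0):
    """Random search over the mixed space; returns the best (cfg, value)."""
    rng = random.Random(seed)
    best_cfg, best_val = None, float("inf")
    for _ in range(n_trials):
        cfg = sample(space, rng)
        val = objective(cfg)
        if val < best_val:
            best_cfg, best_val = cfg, val
    return best_cfg, best_val

# Toy objective: prefer learning rates near 0.01 and the "adam" choice.
def toy_objective(cfg):
    return abs(cfg["learning_rate"] - 0.01) + (0.0 if cfg["optimizer"] == "adam" else 0.5)

best_cfg, best_val = minimize(toy_objective, SPACE)
```

A BO service replaces the uniform `sample` with a surrogate-guided proposal (e.g. a GP with a mixed kernel over the continuous and categorical dimensions), but the outer ask-evaluate-update loop has the same shape.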
“…For workload mapping, we implement the methodology in OtterTune. For RGPE [26], we adopt the implementation in OpenBox [50]. We fine-tune the weights of DDPG's pretrained models when obtaining the target observations.…”
Section: Methods
confidence: 99%
“…For MovieLens, ZOOMER and all graph-based baselines use one-hop neighbors for aggregation. We use OpenBox [45] or follow the original papers to obtain the optimal hyperparameters of our baselines. For MCCF and FGNN, the regularization loss weight is set to 5e-7.…”
Section: A Experimental Setup
confidence: 99%
“…The experiments are run on 4 machines with 14 Intel(R) Xeon(R) CPUs (Gold 5120 @ 2.20GHz) and four NVIDIA TITAN RTX GPUs. The code is written in Python 3.6, and the multi-objective algorithm is implemented based on OpenBox [25]. We use PyTorch 1.7.1 on CUDA 10.1 to train the model on GPU.…”
Section: A4 Experiments Setup
confidence: 99%
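The snippet above builds a multi-objective algorithm on OpenBox. The central notion in multi-objective optimization is Pareto dominance; as a minimal, library-free illustration (not OpenBox code), here is a Pareto-front filter for minimization problems:

```python
def dominates(a, b):
    """True if point `a` dominates `b`: no worse in every objective, strictly better in one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(points):
    """Return the non-dominated subset of `points` (each a tuple of objective values)."""
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Example: jointly minimize (latency, error).
points = [(1.0, 5.0), (2.0, 3.0), (3.0, 4.0), (4.0, 1.0)]
front = pareto_front(points)  # (3.0, 4.0) is dominated by (2.0, 3.0)
```

A multi-objective optimizer maintains exactly such a front of non-dominated configurations and proposes new points intended to push it outward (e.g. by maximizing expected hypervolume improvement).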