JUWELS Booster -- A Supercomputer for Large-Scale AI Research
Preprint, 2021
DOI: 10.48550/arxiv.2108.11976

Cited by 2 publications (2 citation statements) | References: 0 publications
“…For this case, the CoP improves even more rapidly than for the TSP, and here too, for the BPP with 6 and 7 items, the solutions using slack variables (61 and 78 qubits) are beyond the range of what can be simulated with quantum computer simulators. The largest simulations, up to 43 qubits, were performed on JUWELS Booster [52], as they required more than 1/8 PiB of distributed memory and took more than one million core hours. Finally, figure 16(c) shows the CoP for the KP with 5 to 41 items (5 to 41 qubits) in increments of two (same number of qubits).…”
Section: The CoP (mentioning, confidence: 99%)
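As a back-of-the-envelope check on the memory figure quoted above: a dense state-vector simulator stores 2^n complex amplitudes, so at double precision (16 bytes per amplitude) 43 qubits require 2^43 × 16 B = 2^47 B = 128 TiB, i.e. exactly 1/8 PiB. The sketch below is an illustrative helper, not code from the cited paper:

```python
# Minimal sketch (hypothetical helper, not from the cited paper):
# memory footprint of a dense quantum state-vector simulation.

def state_vector_bytes(num_qubits: int, bytes_per_amplitude: int = 16) -> int:
    """Memory for 2**num_qubits complex amplitudes.

    bytes_per_amplitude=16 assumes double-precision complex (complex128).
    """
    return (2 ** num_qubits) * bytes_per_amplitude

if __name__ == "__main__":
    n = 43
    total = state_vector_bytes(n)
    # 2**43 amplitudes * 16 B = 2**47 B = 128 TiB = 1/8 PiB,
    # matching the figure quoted in the citation statement.
    print(f"{n} qubits: {total / 2**40:.0f} TiB = {total / 2**50:.3f} PiB")
```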
“…where ‖·‖₁ denotes the L1-norm and w represents the weights (see the following section for how the weights are chosen). In practice, we use the implementation of the BFGS algorithm from TensorFlow Probability [60], running on the NVIDIA A100 GPUs of JUWELS Booster [94, 95]. If necessary, a suitable initial value for the BFGS algorithm is obtained from Newton-LB applied to the case in which only some elements of f are used to ensure x = f.…”
Section: Hampiep (mentioning, confidence: 99%)
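For readers unfamiliar with the optimizer mentioned in this quote, the following is a minimal, self-contained sketch of how the BFGS implementation in TensorFlow Probability (tfp.optimizer.bfgs_minimize) is typically invoked. The toy quadratic objective is an assumption for illustration only; the citing paper's weighted L1 objective would in practice need a smooth surrogate for gradient-based BFGS:

```python
# Sketch of a typical tfp.optimizer.bfgs_minimize call; the objective
# below is a smooth stand-in, not the cited paper's weighted L1 norm.
import tensorflow as tf
import tensorflow_probability as tfp


def objective(x):
    # Illustrative weighted least-squares objective with minimum at x = 1.
    w = tf.constant([1.0, 2.0, 3.0])
    return tf.reduce_sum(w * tf.square(x - 1.0))


def value_and_grad(x):
    # BFGS expects a function returning (objective value, gradient).
    return tfp.math.value_and_gradient(objective, x)


result = tfp.optimizer.bfgs_minimize(
    value_and_grad,
    initial_position=tf.zeros(3),
    tolerance=1e-8,
)
print("converged:", bool(result.converged))
print("minimizer:", result.position.numpy())  # expected ~[1, 1, 1]
```

On GPU-equipped systems such as the A100 nodes mentioned in the quote, TensorFlow places these operations on the accelerator automatically; no code changes are required for the sketch above.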