2021
DOI: 10.1038/s41598-021-82197-1

Adaptive hyperparameter updating for training restricted Boltzmann machines on quantum annealers

Abstract: Restricted Boltzmann Machines (RBMs) have been proposed for developing neural networks for a variety of unsupervised machine learning applications such as image recognition, drug discovery, and materials design. The Boltzmann probability distribution is used as a model to identify network parameters by optimizing the likelihood of predicting an output given hidden states trained on available data. Training such networks often requires sampling over a large probability space that must be approximated during gra…
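For context, the energy function and Boltzmann distribution the abstract refers to can be written in the standard RBM form below. This is a textbook formulation, not quoted from the paper, and the symbol names are our own choice.

```latex
% Standard RBM formulation (textbook notation, not taken verbatim from the paper):
% visible units v, hidden units h, weight matrix W, biases a (visible) and b (hidden).
E(v, h) = -a^{\top} v - b^{\top} h - v^{\top} W h,
\qquad
p(v, h) = \frac{e^{-E(v, h)}}{Z},
\qquad
Z = \sum_{v, h} e^{-E(v, h)} .
```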

Cited by 9 publications (6 citation statements). References 12 publications (13 reference statements).
“…There is the possibility to treat β as a learnable parameter rather than having to choose a value empirically, as detailed by Xu and Oates in [22]. The method is based on a log-likelihood maximization approach leading to parameter updates of the form (see Appendix C.4 for how one arrives at this result)…”
Section: Learning the Effective Inverse Temperature (mentioning, confidence: 99%)
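For orientation, the block below sketches the kind of update such a log-likelihood argument yields, assuming the sampler draws from a distribution of the form p_β(v,h) ∝ exp(−β E(v,h)). The exact expression derived by Xu and Oates in [22] (and in the cited Appendix C.4) may differ in details such as sign conventions and scaling.

```latex
% Assumed model distribution at effective inverse temperature beta:
%   p_beta(v, h) = exp(-beta * E(v, h)) / Z(beta)
\frac{\partial}{\partial \beta} \log p_{\beta}(v)
  = \langle E \rangle_{\text{model}} - \langle E \rangle_{\text{data}},
\qquad
\beta \;\leftarrow\; \beta + \eta \bigl( \langle E \rangle_{\text{model}} - \langle E \rangle_{\text{data}} \bigr).
```

Here ⟨E⟩_data denotes the energy averaged with the visible units clamped to the data and ⟨E⟩_model the unclamped model average, so β is adjusted by gradient ascent on the same log-likelihood that drives the weight updates.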
“…In recent years, a number of researchers have studied using D-Wave quantum annealers to train Boltzmann machines [22, 26–33]. The most common approach is to train a classical RBM with quantum assistance, i.e., using the annealer to generate the samples in the negative phase rather than using Gibbs sampling.…”
Section: Previous Work In This Field (mentioning, confidence: 99%)
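The following is a minimal sketch, not taken from any of the cited works, of the quantum-assisted training pattern described above: the positive phase is computed from the data batch, while the negative-phase samples come from a pluggable sampler. Block Gibbs sampling stands in here; in the quantum-assisted setting those samples would instead be read out from the annealer. All function and variable names are our own.

```python
# Quantum-assisted RBM update sketch: positive phase from data, negative phase
# from whichever sampler is plugged in (block Gibbs here; an annealer readout
# could be substituted by swapping the `sampler` argument).
import numpy as np

rng = np.random.default_rng(0)

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def gibbs_negative_samples(W, b, c, v0, steps=10):
    """Block Gibbs chain started from the data batch v0 (shape: batch x n_visible)."""
    v = v0.copy()
    for _ in range(steps):
        h = (rng.random((v.shape[0], W.shape[1])) < sigmoid(v @ W + c)).astype(float)
        v = (rng.random(v0.shape) < sigmoid(h @ W.T + b)).astype(float)
    return v

def rbm_update(W, b, c, v_data, sampler=gibbs_negative_samples, lr=0.05):
    """One likelihood-gradient step; `sampler` supplies the negative-phase samples."""
    h_data = sigmoid(v_data @ W + c)      # positive phase, clamped to the data
    v_model = sampler(W, b, c, v_data)    # negative phase: Gibbs here, annealer in the QA setting
    h_model = sigmoid(v_model @ W + c)
    W += lr * (v_data.T @ h_data - v_model.T @ h_model) / len(v_data)
    b += lr * (v_data.mean(axis=0) - v_model.mean(axis=0))
    c += lr * (h_data.mean(axis=0) - h_model.mean(axis=0))
    return W, b, c

# Toy usage: 6 visible and 4 hidden units, one random binary batch of 16 samples.
W = 0.01 * rng.standard_normal((6, 4))
b, c = np.zeros(6), np.zeros(4)
batch = (rng.random((16, 6)) < 0.5).astype(float)
W, b, c = rbm_update(W, b, c, batch)
```

Keeping the sampler as an argument is what makes the approach "quantum-assisted" rather than a new model: the gradient expressions are unchanged, and only the source of the negative-phase statistics differs.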
“…Another possibility for the fair sampling of ground and degenerate states is the introduction of parallel tempering with isoenergetic cluster updates in Monte Carlo methods [29] or the combination with simulated annealing on a quantum annealer [30, 31]. We mention that Boltzmann machines, which can serve as a link between machine learning and statistical thermodynamics [32], are investigated in the context of QA [33–36] from a computer science perspective, but to the best of our knowledge, the direct application of QA for classical finite temperature modeling for statistical physics and materials science has not yet been accomplished and is the subject of the present paper.…”
Section: Introduction (mentioning, confidence: 99%)